The Puchi Herald Reblog

A reblogging blog

A GDPR guide for those who'd rather not know: who did you give your data to (“the data's gone missing”)?

If you recall, in the previous post (let me pretend somebody actually reads them) I wrote about what you should do to start tackling this GDPR nuisance. The first thing was appointing a DPO; the second was about the data…

But what is this data, anyway?

I mean, everybody talks about data, but what does that mean? Where is it? What is it? What does it do?

Now, since I'm writing in Italian, I'll assume that you, the reader, are Italian and probably interested in the Italian version of the story.

So let's see what the Garante (the Italian Data Protection Authority) says about it:

 


http://www.garanteprivacy.it/web/guest/home/diritti/cosa-intendiamo-per-dati-personali

Personal data is any information that identifies or makes identifiable a natural person and that can provide details about his or her characteristics, habits, lifestyle, personal relationships, state of health, economic situation, and so on.

Particularly important are:

  • identification data: data allowing direct identification, such as personal details (for example: first and last name), images, etc.;
  • sensitive data: data that can reveal racial or ethnic origin; religious, philosophical or other beliefs; political opinions; membership of parties, trade unions, or associations or organizations of a religious, philosophical, political or trade-union nature; state of health; and sex life;
  • judicial data: data that can reveal the existence of certain judicial measures subject to registration in the criminal record (for example, final criminal convictions, conditional release, prohibition or obligation of residence, alternative measures to detention) or a person's status as defendant or suspect.

With the evolution of new technologies, other personal data has taken on a significant role, such as data relating to electronic communications (via the Internet or telephone) and data enabling geolocation, which provides information on the places people frequent and on their movements.

THE PARTIES INVOLVED

The data subject (interessato) is the natural person to whom the personal data refers. So if a processing operation concerns, for example, the address or tax code of Mario Rossi, that person is the “data subject” (Article 4(1)(i) of the Code);

The data controller (titolare) is the natural person, company, public or private body, association, etc. responsible for the decisions on the purposes and methods of the processing, as well as on the tools used (Article 4(1)(f) of the Code);

The data processor (responsabile) is the natural person, company, public or private body, association or organization to which the controller entrusts, even outside its organizational structure, specific, defined tasks of managing and controlling the processing (Article 4(1)(g) of the Code). Appointing a processor is optional (Article 29 of the Code);

The person in charge of processing (incaricato) is the natural person who, on behalf of the controller, materially processes or uses the personal data following the instructions received from the controller and/or the processor (Article 4(1)(h) of the Code).

____________________________________________________________________________________________________

Let's start with the definition:

“Data that identifies or makes a person identifiable” means any information that lets us trace back to a natural person, extended to understanding what that person does, what they like…

The range of data falling into this category is extremely broad; the Garante has ruled on the matter several times, even putting IP addresses in this category. What does that mean?

We handle an enormous amount of data every day: we hand it around, ours and other people's, often without even realizing it. If you use a phone, e-mail, chat or social media, then you may know what I'm talking about even without being fully aware of it. If you want to know where your friend, colleague or customer is, you can check one of the geolocation features offered by various apps, or directly share coordinates, or…

Oops, sorry, I'm digressing.

The point is that you may not even realize this is happening.

So what is this mysterious data?

Let's move the question into the corporate world and see whether we can widen your horizons a little.

These days, much of what happens in a company is tightly bound to digitalization.

Think about it:

  • You communicate mainly by e-mail: offers, contracts, proposals, CVs for hiring, internal communications, chit-chat… a lot of stuff goes through it
  • You use the web both to retrieve information (you know that Google page you query all the time) and to communicate externally (maybe you sell online, maybe you have a website, maybe you do online marketing…)
  • Perhaps you also have a social presence (meaning things like Facebook or LinkedIn)
  • You probably use a computerized accounting system
  • Maybe a CRM
  • You have a list of customers, with their e-mail addresses, phone numbers, maybe even Skype and social handles and other information stored somewhere… if you're really on the ball you may even have their birth dates (you know, for sending wishes) and other details.
  • You have a list of employees with their data: bank account, family contact, salary…
  • You transport this data, you store it, you process it, and sometimes you may even sell it (there are companies that do this for a living)

And the interesting thing is that you may not realize that, in that exchange of information, you have mixed business and personal elements.

Now here comes the problem of Mr. GDPR and his wicked assistant the DPO, who demand to know where this data is, so they can make you manage and protect it.

Where is it?

I wrote a few days ago that the first two things you should do are to start mapping your data, to understand where it is and whether it matters.

The simplest way to do this is to go, slowly, through the various functions, operations and tools you use, and map the related data in terms of the points below (a minimal sketch of such a register follows the list):

  1. what it is
  2. how I collect it
  3. where it is
  4. how I manage it
  5. whether it matters for the GDPR
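To make this concrete, here is a minimal sketch in Python of what one entry of such a register could look like. The structure and field names are my own illustration, not any mandated GDPR format:

# One entry of a hypothetical data-mapping register.
# Field names are illustrative only, not a mandated GDPR format.
customer_list_entry = {
    "what": "customer contact data (name, e-mail, phone, birth date)",
    "how_collected": "sales contracts, web forms, business cards",
    "where": ["CRM database", "shared spreadsheet", "sales laptops"],
    "how_managed": "CRM has access control; the spreadsheet does not (!)",
    "gdpr_relevant": True,  # identifies natural persons, so in scope
}

data_register = [customer_list_entry]  # one entry per function/tool pair
for entry in data_register:
    if entry["gdpr_relevant"]:
        print(entry["what"], "->", ", ".join(entry["where"]))

Even a spreadsheet with these five columns would do; the point is having one entry per function and tool, so nothing is forgotten.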

I suggest a twofold approach, one by function and one by technology, and then crossing the two.

For example, the functional view might be:

  1. sales – how I handle a sale, how the offer is made, how it is communicated, what customer data I keep, whether I offer after-sales services…
  2. marketing – which channels I use to communicate, whether I run events, whether I use databases…
  3. HR management – how I handle employee data, where I keep CVs, whether I ask for personal information
  4. production – …

while the technological view might be:

  1. what I communicate via e-mail
  2. whether I manage employees' web access, proxies
  3. whether I use apps, chat, videoconferencing
  4. whether I use cloud services
  5. whether I use databases
  6. whether I use paper archives (those count too)

The advice, of course, is then to cross the two views (see the sketch after this list) to help you understand:

  1. which data you actually use
  2. what you need it for
  3. how you manage it
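Again a purely illustrative sketch: each cell of the function × technology matrix records which personal data flows there, so empty rows or columns quickly expose the pieces you forgot:

# Hypothetical function x technology matrix: each cell lists the
# personal data that a given function pushes through a given tool.
cross_map = {
    ("sales", "e-mail"): ["offers containing customer contacts"],
    ("sales", "CRM"): ["customer master data"],
    ("HR", "paper archive"): ["CVs", "signed contracts"],
    ("marketing", "cloud service"): ["newsletter subscriber list"],
}

for (function, tech), data in sorted(cross_map.items()):
    print(f"{function:10} via {tech:15} -> {', '.join(data)}")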

Since you have never thought about this before, working on two fronts helps you avoid forgetting pieces, and it will prove very useful later when you have to do the PIA, the Privacy Impact Assessment… (not “pia” in the Italian sense of a very pious person…)

The idea is to help you understand your data as a whole.

Remember my post on handling CVs? Well, that is one example.

But let me give you more examples: if you let your users browse the Internet you are recording, even if you don't know it, a lot of data that you really ought to manage properly:

the logs of your firewalls or proxies contain at-risk data, such as the IP address, the user and the visited site (a possible mitigation is sketched below)
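As a hedged illustration of one possible mitigation (a sketch, not a complete solution: keyed hashing is pseudonymization, not anonymization, and the key itself then needs protecting), you could pseudonymize the user and IP fields before the logs leave the collection point:

import hashlib
import hmac

SECRET_KEY = b"rotate-me-regularly"  # hypothetical key; store it safely

def pseudonymize(value: str) -> str:
    # Replace an identifier with a truncated keyed hash (pseudonymization).
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:12]

ip, user, site = "10.0.0.42", "m.rossi", "www.example.com"
print(pseudonymize(ip), pseudonymize(user), site)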

if you have a website with public e-mail addresses, various services, a blog or a newsletter, you may receive communications that must be handled properly, or subscriptions that must be managed.

And even the simple database of your employees, should it be attacked and its data made public, would expose you to the risk of a resounding (and deserved) fine, if you have not put the minimum protection measures in place.

 

What next?

OK, once you have built your data map you have a new tool in your hands that tells you:

  • what data you have
  • what you need it for
  • how you use it

At this point you can start to figure out what you should do to be compliant with the GDPR.

The first step is understanding what you risk; this is where the PIA comes into play.

The second step is defining the value of this data.

The third step is starting to define the corrective actions needed to manage the data.

Corrective actions cover:

  1. defining the operating procedures for collecting and managing the data
    1. measurement and control metrics for auditing
    2. definition of operational responsibilities
  2. introducing the right technologies
    1. assessment of the current technologies
    2. definition of any security or data-management technologies to introduce
  3. defining the monitoring and auditing procedures
  4. defining the procedures for handling exceptions and incidents

Now, I don't want to (and can't) go into further detail here, if only because:

  1. there is no one-size-fits-all solution
  2. even if there were, if I do the work here for free I earn nothing
  3. I'm not writing a book, just a series of friendly tips.

Come on, smile: at least I gave you some warning, and you know what they say: forewarned is forearmed…

 


A GDPR guide for those who'd rather not know: who did you give your data to (“the data's gone missing”)? was originally published on The Puchi Herald Magazine

GDPR and the technology market

Question: will the new privacy policies and laws impact the technology market?

This is an interesting question to ask ourselves: whether we are consumers of technology or technology vendors, the privacy regulations covering the new technologies (from cloud to IoT, from Industry 4.0 to big data, to name just the best known from a marketing point of view) can heavily affect our behaviour and the market.

So let's try to understand the possible implications of this new focus on privacy and data protection.

First of all, we should try to understand what we are talking about.

Privacy, GDPR and the rest.

Privacy: the state of being alone, or the right to keep one's personal matters and relationships secret.

In today's environments the presence of data-related technology is pervasive: from business to personal life, technology plays a big part in our lives. Data-related technology means we use technologies able to manipulate information: information is collected, changed, communicated and shared, all in the form of data, bits and bytes that describe our jobs, our businesses, our personal lives.

Although in the past privacy was mainly a physical issue, and legislation therefore focused on those aspects, the increasing presence of data collection and sharing has made people realize that there is a new abstraction layer to privacy, no longer about being alone in a confined physical space, but about an undefined, borderless digital virtual space.

E-mail, blogs, social networks, chat, e-commerce, electronic payments, smartphones: all this and more has shifted the very perception of privacy from a simple concept to something much harder to define.

Regulators and consumers have started to deal with these issues in recent years, while the enterprise and technical world has remained almost frozen, waiting for indications. The first sign that this would be a wake-up call for enterprises was the end of the Safe Harbor agreement: privacy was no longer a secondary issue, even for the economy.

The latest development is the European Union's General Data Protection Regulation (GDPR), which comes into effect in May 2018 and has far-reaching implications that extend well beyond the EU.

Businesses that fail to meet the new mandates aimed at protecting personal data face severe consequences: they can be fined up to €20 million, or 4 percent of global annual turnover, whichever is higher, a cost that makes this regulation impossible to ignore.
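To make the scale concrete, here is a back-of-envelope sketch of that upper fine tier; the turnover figures are invented for illustration:

# GDPR Art. 83 upper tier: up to EUR 20M or 4% of worldwide annual
# turnover, whichever is HIGHER. Figures below are illustrative.
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    return max(20_000_000, 0.04 * annual_turnover_eur)

print(max_gdpr_fine(300_000_000))    # 20,000,000 (the flat cap dominates)
print(max_gdpr_fine(2_000_000_000))  # 80,000,000 (4% of turnover dominates)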

And it is not only Europe: other areas of the world are moving toward a more cautious approach to data privacy. While it is not yet clear what the new US administration's approach to the subject will be, there is no doubt that data privacy will be a major issue in the coming years; how this will impact business, though, is not yet clear.

What is certain is that GDPR will force companies to deal with a tremendous amount of data to be protected. Any data used to make inferences linked, however tenuously, to a living person is personal data under GDPR. Cookie IDs, IP addresses, any device identifier? All personal data. Even metadata with no obvious identifier is caught by the GDPR's definition of personal data. Truth be told, such assertions are not entirely new; the difference under GDPR is that they will be enforced and non-compliance fined.

Today swathes of business practices that unlock data monetization rely on data not being considered personal, so they apply weak consent, onward-transfer and data-reuse concepts. These models are going to change, either by choice or by obligation.

Data Privacy, Data Protection and Cyber Security

One aspect not yet completely perceived and understood is the correlation between data privacy, data security and cyber security. The requirements that compel companies to respect data privacy law are intrinsically bound to the explicit request for data protection and, therefore, cyber security.

GDPR clearly states that data should be fairly processed and protected: the implications are not only procedural, inside the enterprise, but also technical, in terms of data manipulation, retention, storage and security.

Recent security outbreaks such as ransomware are an example of how basic cyber security threats can directly impact this area, as are the common, well-known cyber attacks aimed at data exfiltration.

This is a growing phenomenon affecting not only classic online services (think of the classic dating-site attacks to collect usernames and passwords) but also, extensively, the healthcare industry.

While in the past those outbreaks could have been a relatively minor issue, the new GDPR fine structure can hit any company hard, regardless of sector; and departments that never considered these issues a business imperative, such as marketing or Human Resources, will face a difficult transition in terms of awareness, policies to implement and technology approach.

It is easy to forecast that this situation will shape the technology market in different areas over the next years.

Impact on the technology market

When we talk about the technology market we face different aspects; “technology” as a term can cover a wide range of things. We can talk about hardware vendors or software vendors. We can talk about service vendors (cloud, CRM or whatever you like most), IT enterprise or carrier hardware providers, security vendors, end-user hardware providers (such as smartphone makers).

The recent trend is to aggregate functions and offerings, making those areas overlap within the same company, although they are not often integrated.

Since the whole industry will have to face the new privacy requirements, we can expect an increase in demand for data privacy expertise, and a growing demand for IT solutions that help companies manage the requirements. This could, for example, give a small boost to historically neglected areas such as DLP and data-categorization solutions.

Some effort will probably also go into more traditional areas such as backup.

A heavier impact will be seen in the growing online market, with the need not only to protect users' privacy but also to secure the economic transactions; content providers and social or gaming platforms will be heavily affected too.

In a second phase we will probably see renewed interest in baseline security solutions, as stakeholders sooner or later realize that there is no compliance without data protection, and no data protection without cyber security.

The demand for expertise and consulting services will mostly be redirected outward, to technology vendors (meaning HW/SW vendors such as Cisco, HP, Huawei, SAP, Microsoft; service vendors such as the cloud providers Azure, AWS and Google, but also app stores and online CRM providers), consulting companies and technology integrators.

On the other hand, technology vendors will face a strange situation: they will be asked to provide solutions compliant with the new rules, to be the drivers of the new requirements and implementations (this is basically what public-private partnership means), and at the same time to implement solutions to protect themselves in different areas:

Product and Services development

Here vendors will have to start developing products and services with data protection as a major concern. The impact on cloud and services, where data protection is easy to identify, is clear, but the hardware side will face issues too. Although it may seem trivial, remember the GPS-tracking problem that hit Apple (and, to some extent, Android) a few years ago. The privacy implications of products can be wider than expected, since we have to protect not only the data itself but also the metadata (this is the wider reach of GDPR and the new privacy regulations).

We usually tend not to consider system logs, for example, a privacy problem; but in effect they are one, if they contain data that can point to a natural person and be used to track that person's behaviour.

Firewall and router logs, for example, could be used to determine what someone is doing online, and can therefore expose information that falls within the GDPR's realm. Apparently minor features, but the truth is that metadata too is covered by GDPR.

Privacy by Design and Privacy Enhancing Technology will be mandatory components of any product or service development.

Marketing and Sales

Marketing (and/or sales) has always been considered agnostic toward technology, but the ultimate purpose of marketing is to get in touch with the market, which means customers and, ultimately, people. Marketing activities will be hugely affected by GDPR requirements both in terms of operations, since it is up to marketing to manage a large amount of data coming from outside the company, and of communication.

Technology vendors will somehow be expected to lead and drive the demand, both through consulting and by example. A breach or misinterpretation of GDPR guidance would severely impact the business from a brand point of view and undermine the vendor's credibility.

Internal protection

As with any other company, there will be a direct impact on the business operations of any vendor in the technology field. In this case, though, the problem will not be limited to standard cyber security procedures: since technology vendors enter, almost directly, customers' IT or data-processing infrastructures, they will be asked to implement an end-to-end protection system that includes GDPR compliance and cyber security. This will require technology vendors to work on:

  1. supply chain
  2. production and vulnerability disclosure
  3. product and service delivery

All three areas are still trying to develop standards and good practices, although something is moving.

So what are the changes expected under the new regulation?

There are around a dozen headline changes which technology companies should be aware of.

Some of the key areas include:

  • Privacy by design and Privacy enhancing technology – privacy by design calls for the inclusion of data protection from the outset of system design. Companies must also hold and process only the data that is absolutely necessary.

Privacy enhancing technology (PET) and Privacy by Design (PbD) are obligatory and mandated requirements under the GDPR. There remains no generally accepted definition of PET or PbD, but PbD is considered an evidencing step for software development processes to take account of privacy requirements. So the incorporation of what can broadly be defined as PET in such solutions represents PbD.

Two particular PET techniques that control downside and enable upside risk are differential privacy & homomorphic encryption.

  • Differential privacy counters re-identification risk and can be applied to anonymous data mining of frequent patterns. The approach obscures data specific to an individual by algorithmically injecting noise. More formally: for a given computational task T and a given value of ϵ there will be many differentially private algorithms for achieving T in an ϵ-differentially private manner. This makes it possible to define computable optima of privacy and of data utility by modifying either the data (the inputs to query algorithms), the outputs (of the queries), or both. (A minimal sketch of the idea follows this list.)
  • Searchable/homomorphic encryption allows encrypted data to be analyzed through information-releasing algorithms. Considered implausible only recently, advances in axiomatizing computable definitions of both privacy and utility have enabled companies such as IBM and Fujitsu to pioneer the approach commercially.
  • Data processors – those who process data on behalf of data controllers, including cloud providers, data centres and processors. Liability will extend to them as well as to the businesses that collect and use personal data.
  • Data portability: empowers customers to port their profiles and segmentation inferences from one service provider to another. This is a recognition by lawmakers that data is relevant to competition law, while not conceding an imbalance between a company's ability to benefit from data at the expense of us all as citizens.
  • Data protection officers – internal record keeping and a data protection officer (DPO) will be introduced as a requirement for large scale monitoring of data. Their position involves expert knowledge of data protection laws and practices, and they will be required to directly report to the highest level of management.
  • Consent – explicit permission to hold any personal data in electronic systems will become mandatory; it will no longer be possible to rely on implied consent, and individuals will have the option to opt out. Customers consent to privacy policies that change, so being able to prove which contract was agreed to, in court or to a regulator, requires registration time-stamping, and tamper-resistant logs become de rigueur. As we move into an opt-in world of explicit consent and ubiquitous personal data, data transmissions beyond a website visit must be explicitly permissioned and controlled; in this world, default browser values de-link machine identifiers from search queries. In other words, online advertising to EU citizens is in line for fundamental change. And given the particular regulatory emphasis on profiling, explicit consent will require loyalty programs to differentiate between general and personalized marketing consents. Those consent flags must cascade through registration, reporting and analysis, targeting and profiling, contact-center operations and all other processes that handle such data.
  • Breach notifications – a breach, where there is a risk that the rights and freedoms of individuals could be compromised, must be reported within 72 hours of being identified. The relationship between breach notification and vulnerability disclosure is underestimated: while to an end user the two aspects may seem unrelated, the impact on vendors could be higher for at least a couple of reasons:
    • The breach notification could expose the vendor as the main source of the breach itself, due to a lack of vulnerability management and disclosure.
    • The victim could pursue liability against the vendors whose “vulnerabilities” caused the breach, redirecting part of the costs to them.
  • Right to access – data subjects will now have the right to obtain confirmation from you of what personal data is held concerning them, how it is being processed, where, and for what purpose.
  • Right to be forgotten – data subjects will now have the right to be forgotten, which entitles the data subject to have you ensure that the information is deleted from every piece of IT equipment, portable device, server backup and cloud facility. A framework to comply with this obligation would include the following steps:
    • Spot the identifiers which tie datasets together, e.g., machine identifiers link together our social media experiences;
    • Prescribe how re-identifiable data flows in and outside the organization;
    • Document a scalable process to overwrite identifiers in all datasets where re-identification can be established, upon the validated request of a user, and
    • Third party contracts and SLAs should be adjusted to ensure compliance with validated requests.
  • Data bookkeeping: field-level data, linked to an identifier, flows across geographies and legal entities, processed by machines and people. Organizations will account for these flows with evergreen reporting. It stands to reason that these flows will be threat-modeled for integrity and confidentiality, so that controls can be readily evidenced upon request.
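To make the differential privacy bullet above concrete, here is a minimal, purely illustrative sketch of the Laplace mechanism, a textbook way to answer a counting query in an ϵ-differentially private manner (the dataset and query are invented for the example):

import numpy as np

def dp_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one person
    # changes the true count by at most 1, so Laplace noise with scale
    # 1/epsilon yields epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

ages = [23, 35, 41, 29, 52, 38, 27, 61, 45, 33]  # hypothetical data
print(dp_count(ages, lambda a: a > 40, epsilon=0.5))  # noisy count near 4

Smaller values of ϵ mean more noise and more privacy at the cost of utility; that is exactly the privacy/utility trade-off mentioned above.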

 

GDPR impact

Privacy regulations such as GDPR, and the growing awareness of and concern for data privacy and security, are tied to the expanding presence in everyday life and business of smart mobile devices able to process data, to the growing online market, to consolidated trends such as cloud services, and to newcomers such as IoT.

The technology market faces this transition on the front line, and will feel the impact of the new regulations and of customer reactions in several ways. This is both an opportunity and a problem: implementing the new mandatory requirements will affect all areas, from design and production to sales and delivery. But it also means new business in consulting and in the technologies supporting GDPR and privacy compliance, a market where data analysis, artificial intelligence and other high-end technologies could provide a competitive, price-insensitive advantage over the consolidated technology market.

The key success factor is to embrace this change and drive it: acquiring the needed competences internally, implementing the right corrections, and driving the necessary improvements to the products and services provided.

Future trends will see a prevalence of technologies related to “data” processing, and of data-related services over products. The new data paradigm is already visible today, for example in the Big Data market (take data lake implementations). For the technology market this will mean a focus on Data Science, which will pose a new and somewhat unpredictable relationship with privacy regulations.

GDPR Risks and “Data Science”

The term data science describes a process that goes from data discovery, to providing access to data through technologies such as Apache Hadoop (open source software for large data sets) in the case of Big Data, to distilling the data through architectures such as Spark, with in-memory and parallel processing. That data science creates value is understood. What is not understood are the risks it exposes investors to under the GDPR, of which there are principally three:

Risk 1: The Unknown Elephant in the Room – Unicity: a general misunderstanding in monetization strategies is that stripping the identifiers off a data model renders the data set anonymous. That belief is flawed: so-called anonymous data sets can often be re-identified without implausible effort. Unicity is a measure of how easy it is to re-identify data: it quantifies the additional data needed to re-identify a user. The higher a data set's unicity, the easier it is to re-identify. Transactional and geo-temporal data yield not only high monetization potential; they carry statistically unique patterns which give rise to high unicity.

Risk 2: Relevance and Quality: income, preferences and family circumstances routinely change, and processing preference data on children is difficult to justify ethically. This creates a problem for predictive analytics: the data, and the inferences it engenders, can be considered inaccurate at a given point in time, which creates a GDPR cause of action. Data quality needs to stay aligned to business objectives.

Risk 3: Expecting the Unexpected: when data science creates unexpected inferences about us, it tends to invalidate the consent that allowed the data to be captured in the first place, which, again, is a big deal. Data collected today, particularly from mobile devices, is subject to a constant stream of future inferences that neither the customer nor the collector can reasonably comprehend. Consider a car-sharing app that can model the propensity for one-night stands from usage patterns. While that data may not result in propositions today, the market will consider upside risk/option value to have been created (the market still does not seem to believe in the GDPR's impact), but this incremental data coming into existence creates downside risk: such data is difficult to find a legal basis for, given the vagaries of a given consented disclosure.

More generally, the problem of negative correlations is brought to the fore by algorithmic flaws, biased data and ill-considered marketing or risk practices, the enduring example being U.S. retailer Target's predictive campaigns to pregnant teenagers, spotted by their parents. These are examples of a new form of systemic control failure, leading to potentially actionable GDPR claims.

 


GDPR and the technology market was originally published on The Puchi Herald Magazine

NFV network function virtualization security considerations

I have been asked to write down a few things about NFV and security. NFV is a relatively new thing in the IT world: it hit the news in 2012 and has since followed the development path common to virtualization technologies.

Virtualization has improved dramatically in recent years. It all started with simple virtualization platforms (VMware first comes to mind, but not only) whose idea was to abstract the hardware platform from the software.

As the idea developed, the abstraction grew to cover multiple hardware platforms, then moved to multi-site, WAN and geographical deployments. Nowadays we call this sort of implementation cloud, but the whole cloud story started from the old virtualization idea.

While this platform change was taking place, the world of services was experimenting with different client/server options (web services and so on).

With the new platforms in place, it was clear the network would follow the same trend, moving to software and virtual shores.

From the network point of view, the first step was SDN (Software Defined Networking).

Software-defined networking (SDN) allows dynamic changes of network configuration that can alter network function characteristics and behaviors; for example, SDN can render real-time topological changes of a network path. An SDN-enabled network provides a platform on which to implement a dynamic chain of virtualized network services that make up an end-to-end network service.

SDN basically allows network services to be centrally administered, managed and configured, creating policies tied to different needs and able to adapt to a changing environment; a sketch of what this looks like in practice follows.
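As a purely illustrative sketch: most SDN controllers expose some REST API through which policies and flow rules are pushed centrally. The endpoint, the rule schema and the controller URL below are invented for the example; real controllers (OpenDaylight, ONOS and others) each define their own API:

import json
import urllib.request

# Hypothetical controller endpoint and flow-rule schema.
CONTROLLER = "http://sdn-controller.example.com:8181/flows"

rule = {
    "switch": "edge-01",
    "match": {"dst_port": 443},
    "action": "forward",
    "out_port": 2,
    "priority": 100,
}

req = urllib.request.Request(
    CONTROLLER,
    data=json.dumps(rule).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:  # controller pushes rule to switch
    print(resp.status)

The point is the operational model: one central, programmable control plane instead of box-by-box configuration.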

But this level of abstraction was not enough to provide the flexibility needed by modern datacenter, cloud and virtualized environments.

In an SDN environment the network gear remains mostly real, solid boxes inside an environment that is far more virtualized.

The first attempt to hybridize the physical network with the virtual one was the introduction of the first virtual network elements, such as switches and firewalls. Those components were sometimes part of the hypervisor of the virtualization platform, sometimes virtual appliances able to run inside the virtual environment.

Those solutions were (and are, since they still exist) good at targeting specific needs, but they did not deliver the flexibility, resilience and scalability required by modern virtualization systems. Products like VMware's vShield, Cisco's ASA 1000v and F5 Networks' vCMP brought improvements in management and licensing better suited to service-provider needs. Each used a different architecture to accomplish those goals, making a blend of approaches difficult, and the lack of a comprehensive approach made it hard to expand those services extensively.

The natural next step of the virtualization process was to define something that addressed, in a more comprehensive way, the need to move part of the network functions inside the virtual environment.

Communications service providers and network operators came together through ETSI to try to address the management issues around virtual appliances that handle network functions.

NFV decouples the software implementation of network functions from the underlying hardware by leveraging virtualization techniques. NFV offers a variety of network functions and elements, including routing, content delivery networks, network address translation, virtual private networks (VPNs), load balancing, intrusion detection and prevention systems (IDPS), and firewalls. Multiple network functions can be consolidated into the same hardware or server. NFV allows network operators and users to provision and execute network functions on demand on commodity hardware or CSP platforms.

NFV does not depend on SDN (and vice versa) and can be implemented without it. However, SDN can improve performance and enable a rich feature set known as Dynamic Virtual Network Function Service Chaining (or VNF Service Chaining), which simplifies and accelerates the deployment of NFV-based network functions.

Based on the framework introduced by the European Telecommunications Standards Institute (ETSI), NFV is built on three main domains:

  • VNF,
  • NFV infrastructure, and
  • NFV management and orchestration (MANO).

A VNF can be considered a container of network services provisioned in software, very similar to the operational model of a VM. The infrastructure part of NFV includes all the physical resources (e.g., CPU, memory and I/O) required for storage, computing and networking before the execution of VNFs. The management of all virtualization-specific tasks in the NFV framework is performed by the management and orchestration domain, which, for instance, orchestrates and manages the lifecycle of resources and VNFs, and controls the automatic remote installation of VNFs.
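The separation of the three domains can be pictured with a minimal, purely illustrative model; the field names below are invented for the example (real deployments use ETSI-defined descriptors):

# Hypothetical, minimal model of the three ETSI NFV domains.
vnf = {                      # VNF domain: the virtualized function itself
    "name": "vFirewall",
    "image": "vfw-1.2.qcow2",
}

infrastructure = {           # NFVI domain: physical/virtual resources
    "vcpus": 4,
    "memory_gb": 8,
    "storage_gb": 40,
}

mano_request = {             # MANO domain: lifecycle orchestration
    "operation": "instantiate",
    "vnf": vnf,
    "resources": infrastructure,
    "placement": "datacenter-eu-1",
}

print(mano_request["operation"], mano_request["vnf"]["name"])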

The resulting environment is now a little more complicated than it was a few years ago.

Where in the past we used to have:

  • physical servers running operating systems such as Linux, Unix or Windows, bound to their specific hardware platform, with almost monolithic services running on top,
  • physical storage units running on different technologies and networks (Ethernet, iSCSI, fiber optic and so on),
  • networks connected through physical devices, with specific units providing external access (VPN servers),
  • protected by some sort of security units providing some sort of control (firewall, IPS/IDS, 802.1x, AAA and so on),
  • all managed quite independently through different interfaces or programs,

we have now moved to a world with:

  • a virtualized environment where services (think, for example, of Docker implementations) or entire operating systems run on virtual machines (VMs) that manage the abstraction from the hardware and can allocate resources dynamically, in terms of performance and even of geographic location,
  • a network environment whose services are partially virtualized (as in VNF implementations) and partially physical, interacting with the virtual environment dynamically,
  • a network configured dynamically through control software (SDN), which can easily modify the network topology itself in response to the changing requests coming from the environment (users, services, processes).

Nowadays the impressive effects of network functions virtualization (NFV) are evident in a wide range of applications, from IP node implementations (e.g., future Internet architectures) to mobile core networks. NFV allows network functions (e.g., packet forwarding and dropping) to be performed in virtual machines (VMs) on a cloud infrastructure rather than in dedicated devices. NFV as an agile and automated network is desirable for network operators because it makes it easy to develop new services and provides self-management and network programmability via software-defined networking (SDN). Furthermore, coexistence with current networks and services improves the customer experience, and reduces complexity, capital expenditure (CAPEX) and operational expenditure (OPEX).

In theory, virtualization broadly describes the separation of resources or requests for a service from the underlying physical delivery of that service. In this view, NFV involves the implementation of network functions in software that can run on a range of hardware and can be moved without the need to install new equipment. All low-level physical network details are therefore hidden, and users are provided with the dynamic configuration of network tasks.

Everything seems better and easier, but all these transformations do not come without a price in terms of security.

Every step into virtualization brings security concerns related to the control plane (think of hypervisor or orchestrator security), the communication plane, the virtual environment itself (which often inherits the same problems as the physical platform), and the transition interface between the physical and the virtual world.

Despite its many advantages, therefore, NFV introduces new security challenges. Since all software-based virtual functions in NFV can be configured or controlled by an external entity (e.g., a third-party provider or user), the whole network could potentially be compromised or destroyed. For example, in order to reduce hosts' heavy workloads, a hypervisor in NFV can dynamically load-balance the assigned loads across multiple VMs through a flexible, programmable networking layer known as a virtual switch; however, if the hypervisor is compromised, all network functions can be disabled completely (a good old DDoS) or some services can be given priority over others.

Also, NFV's attack surface is considerably larger than that of traditional network systems. Besides the network resources (routers, switches, etc.) of traditional networks, the virtualization environment, live migration and the multi-tenant common infrastructure can also be attacked in NFV. For example, an attacker can snare a dedicated virtualized network function (VNF) and then spread its bots across a victim's whole network using NFV's migration and multicast abilities. To make matters worse, access to a common infrastructure in a multi-tenant NFV network inherently brings other security risks, due to the resources shared between VMs. For example, in a data center network (DCN), side-channel attacks (e.g., cache-based) and/or operational interference can arise unless the shared resources between VMs are securely controlled with proper security policies. In practice, it is not easy to provide complete isolation of VNFs in DCNs.

The challenges of securing a VNF are complex because they involve all the elements that compose the environment: physical, virtual and control.

According to the CSA, securing this environment is challenging for at least the following reasons:

  1. Hypervisor dependencies: Today, only a few hypervisor vendors dominate the marketplace, with many vendors hoping to become market players. Like their operating system vendor counterparts, these vendors must address security vulnerabilities in their code, and diligent patching is critical. These vendors must also understand the underlying architecture, e.g., how packets flow within the network fabric, the various types of encryption and so forth.
  2. Elastic network boundaries: In NFV, the network fabric accommodates multiple functions. The placement of physical controls is limited by location and cable length. These boundaries are blurred or non-existent in NFV architecture, which complicates security matters because of the unclear boundaries. VLANs are not traditionally considered secure, so physical segregation may still be required for some purposes.
  3. Dynamic workloads: NFV's appeal is in its agility and dynamic capabilities. Traditional security models are static and unable to evolve as the network topology changes in response to demand. Inserting security services into NFV often involves relying on an overlay model that does not easily coexist across vendor boundaries.
  4. Service insertion: NFV promises elastic, transparent networks, since the fabric intelligently routes packets that meet configurable criteria. Traditional security controls are deployed logically and physically inline. With NFV, there is often no simple insertion point for security services that are not already layered into the hypervisor.
  5. Stateful versus stateless inspection: Today's networks require redundancy at a system level and along a network path. This path redundancy causes asymmetric flows that pose challenges for stateful devices that need to see every packet in order to provide access controls. Security operations during the last decade have been based on the premise that stateful inspection is more advanced than, and superior to, stateless access controls. NFV may add complexity where security controls cannot deal with the asymmetries created by multiple, redundant network paths and devices.
  6. Scalability of available resources: As noted earlier, NFV's appeal lies in its ability to do more with less data center rack space, power and cooling.

Dedicating cores to workloads and network resources enables resource consolidation. Deeper inspection technologies—next-generation firewalls and Transport Layer Security (TLS) decryption, for example—are resource intensive and do not always scale without offload capability. Security controls must be pervasive to be effective, and they often require significant compute resources.

Together, SDN and NFV create additional complexity and challenges for security controls. It is not uncommon to couple an SDN model with some method of centralized control to deploy network services in the virtual layer. This approach leverages both SDN and NFV as part of the current trend toward data center consolidation.

The NFV Security Framework tries to address these problems.

If we want to dig a little deeper into the security part, we can analyze:

  • Network function-specific security issues

and

  • Generic virtualization-related security issues

Network function-specific threats refer to attacks on network functions and/or resources (e.g., spoofing, sniffing and denial of service).

The foundation of NFV is network virtualization. In the NFV environment, a single physical infrastructure is logically shared by multiple VNFs, and providing a shared, hosted network infrastructure for these VNFs introduces new security vulnerabilities. The general platform of network virtualization consists of three entities: the providers of the network infrastructure, the VNF providers, and the users. Since the system involves different operators, their cooperation cannot, of course, be perfect, and each entity may behave in a non-cooperative or greedy way to gain benefits.

Virtualization threats in NFV can originate from any of these entities and may target the whole system or part of it.

In this view, we need to consider threats such as side-channel or flooding attacks among the common attacks, and hypervisor, malware-injection or VM-migration attacks among those specific to virtualization and cloud.

Basically, VNFs add a new layer of security concerns to virtualized/cloud platforms for at least three reasons:

  • They inherit all the classic network security issues and expand them to cloud level

This means that once a VNF is compromised there is a good chance it can spread the attack or problem to the whole environment, affecting not only the resources directly assigned to it but anything connected to the virtual environment. Think, for example, of the damage a DDoS could do by rapidly depleting all the cloud network resources simply by modifying the QoS parameters, rather than using the traditional flooding techniques (which remain available anyway).

  • They depend on several layers of abstraction and control

Orchestrators and hypervisors are, as a matter of fact, prime attack points, since compromising them can affect every function they control (as noted above for the hypervisor case).

  • They require a better-planned implementation than the classic physical one,

with tighter control over who manages the management interfaces, since, in common with SDN, VNFs are more exposed to unauthorized access and configuration-related issues.

VNF still requires study and analysis from a security perspective; the good part is that this is a new technology under development, so there is plenty of room for improvement.


NFV network function virtualization security considerations was originally published on The Puchi Herald Magazine

Happy new insecure 2017: my resolutions and wishlist for new year

Here we are: a new year comes, and we, as cyber security experts, will keep warning the world about the deeply insecure times we live in.

And we will announce new technologies and new devastating scenarios tied to them. IoT and cloud will raise their evil faces, while bad people lurk in the dark waiting to attack the innocent lamb crossing the road.

But in all of this, most of the damage will still be done by badly designed systems, by managers who do not understand what living in a digital world means, by politicians who understand cyber security issues only when they have something to gain, by entrepreneurs who still treat investment in security as a disturbing side effect.

If I can make a wish for the new year, it is to finally see a different approach to information security, an approach that takes into account that:

1) to be secure you need well-designed systems first, and only then can you cover them with some geeky security technology. If the design is crap, all your security is crap, no matter what you put on top;

2) there is no security if your devices are not designed with security in mind; good code and a good code lifecycle are the best insurance. So if you buy the cheapest, don't cry: it is your job to look for what you need, so yes, it is your fault if something goes wrong;

3) companies, managers and entrepreneurs must finally understand that security lives within processes, and is not just a bunch of technologies put on top of something you don't have the slightest idea about: you can't protect what you don't understand;

4) if people do not understand, they will not follow even the most basic rules, so training is not optional but fundamental. And to be sure, the first who have to learn are the “CxOs”, who should get off the throne and start learning about the world they crafted;

5) if we keep thinking IoT is wonderful but do not understand what it will bring in terms of cultural and technical problems, we will never understand what securing it means;

6) if you hire an expert and then don't listen to him/her, you are wasting his/her time and yours; don't blame the messenger;

7) if you think this complex field we call security can be covered by a junior who knows it all, you are probably wrong, unless the junior is a genius;

8) if you, security expert, think your counterpart has the slightest idea what you are talking about, you are probably wrong, because you did not realize that they do not understand what they do not know;

9) all of this is part of the business, and therefore the business should take it as one of its elements, not just a nasty, annoying add-on;

10) next time someone talks about APTs, let them tell you the truth: the only way to stop an APT is to stop the attacker, otherwise… it would not be an APT.

I know, I know, I am a bit naive and still believe in fairy tales…

 

Happy, safe and secure 2017 to you all!


Happy new insecure 2017: my resolutions and wishlist for new year was originally published on The Puchi Herald Magazine

Our memories are all we have

Umberto Eco in Naples (SUD photo: Sergio Siano)

I am in China for work, with few connections to the real world outside, so Italian news usually reaches me late, when I manage to connect to the Internet from the hotel, Great Firewall allowing.

Being isolated from Italian reality puts things into a different perspective: you get less news, with more time to digest it and think it over.

So it was that a few days ago I learned of the death of Umberto Eco, one of the greatest Italian thinkers of our age. He was a great writer, a great thinker, a truly free spirit.

The first thing I felt when I learned of his death was great sorrow: at a moment when my country desperately needs to turn back to its origins, loving culture and what culture means, losing such a man was a great loss.

I write to my daughter every day, and make her do the same. No matter what, even the silliest thing: I want her to learn commitment, and to use an old way of communicating such as writing (although by e-mail).

After the news, instead of writing the usual nonsense we love to share with each other, I asked my daughter, 12, to read the letter Eco wrote to his grandson, just to make her understand the importance of living, learning, knowing and remembering.

http://espresso.repubblica.it/visioni/2014/01/03/news/umberto-eco-caro-nipote-studia-a-memoria-1.147715

I am trying, with what success I don't know, to raise her with a critical eye on reality, trying to give her the tools to understand where things come from, not just to see the moment but to live it, knowing and understanding why things are what they are.

The letter said everything I am trying to teach my daughter, but with far better words and meaning. But I am not Umberto, and I don't have a shred of his incredible knowledge.

I remember that when I was younger I did not understand the need to learn things and memorize them; I saw the reason growing older, when I understood that my experience (and therefore my memories) is the metric with which I analyze the world. And now I would probably say I should have memorized more.

The sad news about Eco moved something in me, so I started re-reading some of his interviews, and it made me think: what is the meaning of our lives? Memories.

In the end it is memories that shape our life, and growing old we add the memories that will be the reason we lived.

How we shape those memories is our job; we can build them good or bad, silly or deep. It is all up to us. But to make memories we have to live them, somehow.

Reading, travelling, doing things: those memories are the building blocks of what we are and will be. And to make memories we need to understand what we see and what we do.

I carry the vision of my daughter when she was just born, a wonderfully ugly conehead. So small, and such a great responsibility with the lightest weight.

I remember when we discovered my wife was pregnant: I was in the kitchen when she told me the result of the test, and I was shaking.

And I remember my teenage friends and our nightly talks about politics or music.

I remember the good and bad parts of my jobs, and the people I worked with.

I remember my mistakes (this is why I write about management so much).

I remember what I would have liked to know at work (this is why I write about technology so much).

I remember falling in love with Japan through anime and manga, and then going deeper into that country's history and culture, though alas not the language, shame on me; so I have had to enjoy Banana Yoshimoto only in Italian, surely losing a lot (with all due respect to the translator).

I remember my first trip to the USA, where everything was not, in the end, so big, and not all the food was McDonald's.

I remember how eye-opening it was to rediscover my Latin heritage (thanks, Rika), and to start understanding the good (and bad) of the Spanish-speaking world.

I remember how incredibly rewarding it was to read and understand Joyce, Tolkien, Agatha Christie and Conan Doyle in the original language, to see what only the original language can give you. I can see Holmes's home, as well as Miss Marple's smile as she looks out of her window. They are part of my life.

I remember how amazing it was to open my life to the Spanish language, its writers, music and culture (I could not have understood and appreciated Orozco or Frida Kahlo without knowing that culture: seeing, watching, talking, smelling, listening to, breathing that culture).

I usually tell my daughter that the more you know, the more things you will find to enjoy. Reading is a wonderful way to find new wonderful things. Studying history, and its implications too, gives you the ability to look at the world with different eyes, understanding different cultures, languages, foods and so on.

I disagree with those who claim that ignorance is the best way to happiness; ignorance is the easiest way, and the easy path is never (or seldom) the best path. I disagree with those who, out of fear, close themselves in a shell, wasting their lives in useless fears; and ultimately I disagree that, to preserve your own identity, you have to close yourself off from the different, the foreign and the new.

So different from the world we are shaping for our children. A world of people with no memory of the past is doomed to live through the same errors again and again, isn't it? Isn't that what we see every day? Do we still care (has humanity ever cared) about historical memory?

C’è poi la memoria storica, quella che non riguarda i fatti della tua vita o le cose che hai letto, ma quello che è accaduto prima che tu nascessi.

Then there is the historical memory, one that isn’t about the facts of your life or things you’ve read, but what happened before you were born.

Life is a learning path, and memories are the foundation of this learning. Without memory of the past we cannot build good memories for the future, unless we like living in a lie (but so many have, haven't they?).

There is more truth in a novel than in any political speech, and more truth in a joke than in any serious comment. This is probably why novelists, writers and comedians usually have the sharpest vision of our world: they work with memory for a living.

The moment we stop making memories, for ourselves and for others, we simply stop living.

 

La memoria è un muscolo come quelli delle gambe, se non lo eserciti si avvizzisce e tu diventi (dal punto di vista mentale) diversamente abile e cioè (parliamoci chiaro) un idiota. E inoltre, siccome per tutti c’è il rischio che quando si diventa vecchi ci venga l’Alzheimer, uno dei modi di evitare questo spiacevole incidente è di esercitare sempre la memoria.

Memory is a muscle like those of the legs: if you do not exercise it, it withers and you become (mentally speaking) differently abled, that is (let's be frank) an idiot. And besides, since we all run the risk of getting Alzheimer's when we grow old, one of the ways to avoid this unpleasant accident is to always exercise our memory.

How many times have I seen people who stopped using their "brain" muscle, closed to learning and understanding (I hope you can appreciate the difference between knowing and understanding, although the first is a mandatory step towards the second).

I wish there were more human beings like Umberto Eco, who was so proud and joyful when playing with memories, and I hope my daughter will learn something from that letter.

Maybe, when she is my age…

She will try to write the same post, just better.


Our memories are all we have was originally published on The Puchi Herald Magazine

The IoT Files – Infrastructure

The IoT Files – Infrastructure


IoT is a complex subject, as we already know. In my previous introductory post I tried to explain the privacy and security concerns that IoT brings to us (https://www.linkedin.com/pulse/iot-files-privacy-antonio-ieranò).

Most of those concerns are interconnected, but they also have a strong relationship with the next point: the infrastructure needed.

By infrastructure I mean many things that go beyond the purely technical aspects, because a real IoT infrastructure goes far beyond access protocols or wireless links.

Wireless Outdoors/Indoors Connections

But since we mentioned access protocols and wireless, let us start with the first, most obvious infrastructure need: connectivity.

In an IoT world devices need to be connected, and only a few of them will be able to connect through a cable.

We can imagine a cabled connection to our SCADA environment, sure, but things get a little harder if we consider our smartwatch or our self-driving car; I can't imagine an Ethernet cable plugged into those devices :)

Connectivity will be an issue in IoT for several reasons, and sometimes I get the impression we underestimate it.

Let's think about our homes: in most cases they are connection-unfriendly. It is not just a matter of the connectivity and bandwidth offered by our provider (that will be the subject of a later point).

We should probably start designing new homes with wireless access points in mind, and with enough network capacity to get everything connected; but what about the old homes (which are the majority)?

At the moment the offering is still far from exhaustive; some steps have been made with power-line and home wireless, but, if only for the security and privacy concerns mentioned before, this is still not enough.

The routers we use in our homes are anything but efficient, and surely not able to deal with hundreds of devices, neither in terms of connection nor in terms of protection.

But it is when we leave our homes that things get harder.

What should we expect? Wireless coverage of urban areas is something we can imagine, but as we move out of the urban area things get harder. Sure, we can rely on our phone providers, which offer connectivity (at the moment through 3/4/4.5G), but will this be enough?

Does every device need a SIM and a contract? And what happens when we have to leave our city or our country? Will roaming knock us down?

Infrastructure, from this point of view, is anything but simple: we need to be able to transport petabytes of data (that is what billions of connected devices mean, my friends) in a multinational context, providing access to different devices with different capabilities. A quick back-of-envelope calculation below shows the scale.
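
To make the scale concrete, here is a minimal back-of-envelope sketch; the device count and the per-device traffic are purely illustrative assumptions, not forecasts.

```python
# Back-of-envelope IoT traffic estimate (all figures are assumptions).
devices = 10e9        # assume 10 billion connected devices
bytes_per_day = 1e6   # assume 1 MB of telemetry per device per day

total = devices * bytes_per_day               # bytes per day
print(f"{total / 1e15:.0f} PB per day")       # -> 10 PB per day
print(f"{total * 8 / 86400 / 1e9:.0f} Gbit/s sustained")  # -> ~926 Gbit/s
```

Even with these modest per-device numbers, the aggregate sits in the petabyte-per-day range, which is exactly why connectivity cannot be an afterthought.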

Digital Divide

This will be a key factor in making the digital divide dramatically clear. We are already struggling with the digital divide today, and the problem will be bigger in the IoT future: if we struggle to put the infrastructure in place now, what will happen when we need bigger, faster, stronger and more complex infrastructures?

Large areas, even in the most advanced countries, will be cut off: countryside, mountains, islands…

5G

The promised mitigation of those concerns is called 5G. But let me be clear: 5G is far from being a completely standardized technology at the moment; it is still under development, and most of the issues we named for IoT are issues for 5G as well, from access to security to business models…

I personally think that 5G will be a useful step forward, but it will not replace the heterogeneous environment: wireless and Ethernet will keep playing a big role, and therefore interaction between the different technologies will be mandatory.

Bandwidth

Like it or not, even in the 5G world we will have to fight the bandwidth problem, because bandwidth will be the issue: data requires bandwidth, and IoT means data; without data, IoT does not exist.

Bandwidth is not an easy issue, because it means deciding how to prioritize traffic, how to manage traffic coming from different sources, and so on (a minimal sketch of one classic building block follows below).

The amount of bandwidth available, and its management, will be a key issue; it requires clear infrastructures and models that we still lack.
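
To give a concrete flavour of what "prioritizing traffic" can mean in practice, here is a minimal token-bucket rate limiter, a classic building block of traffic shaping; the class, the rates and the two traffic categories are my own illustrative assumptions, not something prescribed by any IoT standard.

```python
import time

class TokenBucket:
    """Minimal token-bucket shaper: bursts up to `capacity` tokens,
    sustained throughput limited to `rate` tokens per second."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # refill speed, e.g. bytes per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, cost: float) -> bool:
        now = time.monotonic()
        # Refill the tokens accrued since the last check, up to capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True             # packet may be sent now
        return False                # packet must wait or be dropped

# Hypothetical policy: alarms get ten times the budget of routine telemetry.
alarms = TokenBucket(rate=10_000, capacity=20_000)      # bytes/s
telemetry = TokenBucket(rate=1_000, capacity=2_000)
```

A real network would do this in hardware or in the kernel, per flow and per class of service; the point is only that every such mechanism needs a policy, and it is exactly those policies and models that we still lack.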

E-Government

Even government services will have to face the IoT revolution and become compliant: we cannot imagine that a hyperconnected life still requires a form manually filled in by the user, can we? Government infrastructure will have to shift dramatically towards a new model where informatization is not just a way to gain efficiency, but the only way to provide the service.

It may not seem an infrastructural problem, unless you remember your last experience dealing with a government office… we lack tools (HW/SW), personnel, culture, policies, knowledge. Isn't this infrastructure?

Old Issues

While we deal with the new issues, we should not forget that we have a lot of old issues that can make the transition to IoT hard.

Let's name a couple that are so big (and so neglected) that I wonder why we still talk about IoT at all.

Old Issues – DNS

Billions of devices will try to connect to the Internet, and every device will look for partners to communicate with. Unless we think all those devices have their partners' addresses hardcoded (which is unlikely and highly impractical for a thousand good reasons, one for all: flexibility), each device will need to translate a logical name into an IP address, as in the small sketch below.
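
As a trivial illustration of the operation that each of those billions of devices will perform over and over, here is a minimal name-resolution sketch using Python's standard library; "example.com" is just a placeholder for whatever endpoint a device would actually contact.

```python
import socket

# Ask the system's DNS resolver to translate a logical name
# into one or more IP addresses ("example.com" is a placeholder).
for family, _, _, _, sockaddr in socket.getaddrinfo(
        "example.com", 443, proto=socket.IPPROTO_TCP):
    print(family.name, sockaddr[0])   # e.g. AF_INET 93.184.216.34
```

Every one of those lookups lands on the DNS infrastructure discussed next, which is precisely why its health matters so much.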

This service, nowadays, is provided by the DNS infrastructure. The DNS infrastructure is an area of big concern, because it is subject to attacks and is an easy victim of geopolitical issues: a government closing the root, for example, or entire zones being poisoned for censorship or mass-surveillance purposes.

At the moment DNS servers around the world are really in bad shape: most of the carriers that offer DNS resolution do not even size the service properly, let alone protect it. The reason is that DNS resolution is not perceived as a key service, and it is not a direct source of revenue.

If, from a security perspective, something is moving with the DNSSEC extensions, from a performance perspective this is still a pain in the butt. Most of the time, when you blame your provider's bandwidth because a page does not load, the real culprit is a poor DNS resolution service.

In a world of billions of devices this infrastructure, easy prediction, will collapse. Name resolution will need support, and what I fear is the development of custom-made legacy protocols (peer-to-peer style) that will address the problem in the absence of a commonly accepted solution; this will affect the security and interoperability of IoT.

So if you think DNS is not a problem, in IoT it will be.

TCP/IP

But DNS is the victim of a deeper problem. We all know that TCP/IP v4 will not be able to scale to IoT, but where are we with TCP/IP v6? Let's face the truth: we are still at the beginning. This is a big infrastructure concern, because most infrastructures are not yet ready to move to IPv6; otherwise we would already be there. The numbers below show why v4 simply cannot cope.
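
A one-line comparison of the two address spaces makes the scaling argument self-evident:

```python
# IPv4 vs IPv6 address space.
ipv4 = 2 ** 32     # 4,294,967,296 addresses: not even one per human being
ipv6 = 2 ** 128    # about 3.4e38 addresses

print(f"IPv4: {ipv4:,}")
print(f"IPv6: {ipv6:.1e}")   # -> 3.4e+38
```

With billions of devices on the horizon, a 32-bit address space is arithmetically exhausted before IoT even starts; the obstacle is not the protocol design but the migration.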

There are big issues related to legacy/old hardware, a lack of knowledge among technical people, and an absolute lack of understanding among decision makers, who do not consider it an issue. So we are, in short, in a big sea of troubles at the moment.

When we talk about infrastructure we always face the same issue: who will pay for it?

We have to realize that all the infrastructure needed for IoT comes at a cost that someone has to pay.

The public and private sectors will have to find a way to deal with this, because big investments will be needed.

Another pain point is timing: how long will this infrastructure take to set up? If forecasts say we will have billions of devices by 2020, I suppose the infrastructure should be ready by then.

But wait, it is 2016 now, and I still cannot see the big investments needed to cover and solve the issues discussed above. So maybe timing will be the issue we will run into sooner or later.

And we should remember that the infrastructures needed are not local but international. The lack of standards and agreements will make things harder.

So we are looking at a big opportunity as well as a big headache.

But cheer up: as Murphy told us, smile, tomorrow will be worse.

Next IoT Files: business models…


The IoT Files – Infrastructure was originally published on The Puchi Herald Magazine
