The Puchi Herald Reblog

A reblogging blog

It is time for research to think about security and privacy

We usually talk about cyber security and privacy in relation to industry and personal life, but today I would like to make some points about research in universities.

how security aware are universities?

This is an interesting topic: looking at the statistics on cyber security attacks, I would say security and privacy awareness is not at the top of their list of needs.

So bad …

Well, first of all, let's make a little distinction:

engineering vs the rest

It is beyond doubt that engineering universities and research groups are more cyber security savvy than the rest. Some of them are also actively working on and studying the issue.

Nevertheless, the overall cyber security and privacy approach, beyond the groups actively working on the subject, is poorly implemented. On the other hand, engineering universities are full of people playing with fire … some will be the defenders of tomorrow, some are the hackers of today (hacker is not necessarily a bad term).

The rest are in a questionable situation: both cyber security and privacy suffer from a lack of vision and willingness to address the point, even in areas that deal with very sensitive data, think of healthcare.

The result is under our nose: a lot of people with great skills and knowledge on many different subjects, completely unaware of the consequences of digitalization…. Why do you think it is so easy to break into healthcare systems, law firms and so on?

The research issue

There was a time when being a scientist meant putting your life at stake; it was not easy to be Galileo Galilei in his time. But I hope anyone with a brain can agree that science was essential to developing our society and way of life. Science plays an important role in human development, and I take science in its broadest meaning… not only technology or physics, but medicine, economics, social science, history, literature, philosophy… in a way, culture… the connections and ramifications of science with art, as an example, are undeniable… so we should ask ourselves whether there can be a world without science.

But science is based on theories more than faith, trials more than prayers, and therefore needs a solid trust base …

the trust is no longer here

In this security and privacy unaware environment, researchers who are not security focused seldom pay attention to security, but today's research environment, criminal landscape and geopolitical warfare would suggest a different approach. If some years ago the word of a scientist was respected, nowadays it seems that politics takes over science: data and results are no longer what they are, the consequences of studies and trials, but what your political beliefs want them to be.

So we see the rise of "creationists" and other religious para-science accredited as "scientific", as well as the denial of scientific evidence in the name of political or religious beliefs (think of global warming as an example).

When you start a research project you basically need to collect and manage data, use some computational power, and share those data with peers…. but those data, those exchanges, are exactly what we should look at in terms of privacy and security.

Depending on the nature of the research you can have direct, evident privacy and security implications, but even if you are working in apparently non-critical areas you should put some precautions on the table. Let me quickly try to explain why:

data are important

Data are what you work on: you sample, collect, store, analyze and transform data.

In a trusted environment you can avoid caring too much: come on, I trust you and you trust the others, so what can go wrong… but this is no longer the reality.

  1. if your data have some kind of value (and I think they have, or you would not use them), you should protect them
  2. if your data are needed to prove your point, you should be able to ensure they are reliable
  3. if your data need to be exchanged with others, you should be sure that what you transmit is what they get, that what you receive comes from an equally trusted source, and that the data themselves are trustworthy
  4. if your work is worth something, maybe you want some intellectual property on it, and therefore you have to be sure your results are non-repudiable and cannot be copied, used and/or modified without your knowledge

Those four points are the main areas where you should put privacy and security into the equation, no matter what your research is.
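
To make points 2 and 3 concrete, here is a minimal sketch (my illustration, not a prescribed tool; the file name is hypothetical) of how you could record a cryptographic fingerprint of a dataset, so that any later corruption or tampering becomes detectable:

```python
import hashlib

def fingerprint(path: str, algo: str = "sha256") -> str:
    """Return a hex digest of a file, read in chunks to handle large datasets."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Record the digest when the dataset is frozen; re-check it before analysis
# or after receiving the file from a peer.
print(fingerprint("trial_results.csv"))
```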

what is the value?

Every time you have to invest in something you make a tradeoff between the invested monetary resources and the expected output. In science this is a hard exercise, so I understand that most of the time you do not want to spend on data protection, but try to think how much you depend on those data..

What happens to your research if ransomware encrypts your data?

What happens if an attacker or an incident poisons your data with some bias?

Sometimes you can also be "collateral damage" and not the direct target, but does it make any difference to you?

If you are not able to put those considerations on the table, you can start wondering what the value of your job is.

what does protecting mean?

Usually you set things up using whatever comes to hand. This does not necessarily mean crappy things, but… how much planning have you put into this?

Have you considered what happens if you lose your data to a mechanical crash?

Or to a hacking attempt?

Or to a genuine, honest mistake by the developer who writes the code that manages your data?

Or if your shared repository has to give up space to something more important?

And what if someone tampers with your data?

And what if someone copies your data?

and what if ….

These kinds of scenarios are not your research field, I know, but they are nevertheless connected to your job and you should start considering them.

Backup, storage, encryption, access management, intellectual property protection, data exchange, computational requirements… all those things should be managed in a sound, reliable plan that foresees current and future needs…
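
As a flavor of what the "encryption" piece of such a plan can look like in practice, here is a minimal sketch assuming Python's cryptography package; the file names are hypothetical and the key handling is deliberately oversimplified:

```python
# Encryption at rest for a research dataset, using the Python "cryptography"
# package (pip install cryptography). In a real plan the key must live in a
# key store or vault, never next to the data it protects.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # store this securely, separate from data
f = Fernet(key)

with open("samples.csv", "rb") as src:
    token = f.encrypt(src.read())    # authenticated encryption (AES + HMAC)

with open("samples.csv.enc", "wb") as dst:
    dst.write(token)

# Later: f.decrypt(token) restores the original bytes, and it fails loudly
# if the ciphertext was tampered with.
```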

the problem of exchange

Another really critical aspect is how you can be sure that the data you are exchanging are managed correctly.

The first point, when there is an exchange between two parties, is being able to trust the party itself. This basically means you want to exchange data with this subject, but maybe not with another one (I know you are not all friendly to one another).

So the point is how you can be sure you are sending the data to the correct recipient…

When you send something you should assure the counterpart that what they receive is what you are sending: data should be managed in a non-repudiable, tamper-evident way, and should also maintain ownership if needed.

Now, the data could be the genome of a rock, a clinical trial result on the effect of Mars on alopecia, a dataset on the relationship between gun distribution and bird control rates, the climate data of the last 100 years in Neverland… whatever… you need your data to be recognized as:

1) yours

2) truthful even after the transfer

The point here is that otherwise anyone can change the assumptions, and therefore the conclusions, making you part of a fraud. You should always be able to say: hey, those were not my data….
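
A minimal sketch of what this can look like in practice, again assuming Python's cryptography package (the dataset name is hypothetical): the sender signs the data with a private key, and any peer can verify both origin and integrity with the published public key:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()   # kept by the data owner
public_key = private_key.public_key()        # published to peers

data = open("climate_last_100_years.csv", "rb").read()
signature = private_key.sign(data)

# On the receiving side: verify() raises InvalidSignature if a single
# byte of the data was changed in transit.
try:
    public_key.verify(signature, data)
    print("data verified: they are yours, and unmodified")
except InvalidSignature:
    print("signature check failed: do not trust this dataset")
```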

And in a moment where politics and science collide once again, this is not a minor issue.

food for thought

Privacy and cyber security are children of the current expansion of digitalization. Those issues are not a side thought but a real component of your everyday job, even if you are a researcher in areas far away from cyber security, information technology or whatever.

You should also start thinking, if those data are to be kept public, about how to maintain, store and allow access to them in a consistent and secure way. Sure, you can post them on Facebook and tweet them, but maybe, just maybe, this would not be the optimal solution.

And you should start thinking about those things before it's too late. No matter who you are or what you do, digital life is here for you too, and you should start acting accordingly.

just think about it.

Antonio

It is time for research to think about security and privacy was originally published on The Puchi Herald Magazine

Dear CISO, please talk about business with your board, not technicality.


Dear CISO and Board

I think we should always consider our job as part of the business. We have finally started to treat cyber security and data protection as a serious issue, but now the question is how we evaluate a risk in our analyses and business plans…

Current risk analysis documentation and reports presented to most boards use just a flag (high, medium, low risk) but do not seem to specify any metric. Without metrics it is hard to make sound evaluations and comparisons, so the question raised by any member of the board, "is a high risk in XYZ as dangerous as a high risk in ABC?", cannot have a credible answer other than "perception", which is subjective if not backed up by facts.

Security metrics are, as of now, subject to interpretation and discussion, but we can simplify the approach to make security analysis credible and understandable.

First of all, to answer the board's question, what is needed is a common evaluation framework that includes easy-to-read metrics and makes comparisons understandable even to non-experts in cyber security, which is what most of the board members who have to take decisions based upon those inputs are.

This is something that goes beyond the Cyber and Information Security Officer's tasks; it requires the whole company to start thinking about its cyber security and digital assets. But unless the approach is to do things reactively, your inputs should be provided to start outlining this framework and its metrics.

Alas, cyber security risk analysis is anything but simple, especially when related to business impact, since it requires an understanding of both cyber security issues and the business in which the risk is analyzed.

There are two main aspects that need sound and readable metrics:

  1. Risk evaluation
  2. Risk consequences

The first item is used to define how "risky" something is. Measuring a risk requires, to simplify a complex matter, being able to evaluate the probability that something happens, the magnitude of the damage, and the cost of fixing things. Magnitude of the damage and cost of fixing things are bound to risk consequences, which are, basically, the metrics that can be used in a board meeting to describe the risk in terms understandable to a non-cyber-security-aware audience.
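
One common simplification from the risk literature (my addition, the letter does not prescribe a formula) is to express a risk as an annualized expected loss: the expected number of occurrences per year times the cost of a single occurrence:

```python
def annualized_loss_expectancy(occurrences_per_year: float,
                               cost_per_occurrence: float) -> float:
    """Classic ALE = ARO x SLE simplification: expected yearly loss."""
    return occurrences_per_year * cost_per_occurrence

# Fictitious numbers: a breach expected once every 4 years, costing 800k
print(annualized_loss_expectancy(0.25, 800_000))   # -> 200000.0
```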

I will not enter deeply into the realm of risk evaluation here; you have a deep knowledge and understanding of the issue and I do not want to bore you with my considerations. But let me note that there is, apparently, no common evaluation framework yet spread through your company's groups and BUs on the matter.

If risk evaluation is one key, but mostly technical, aspect, let me point out something on risk consequences that can be of use in future business plans, to make them useful from a business perspective and not just a sterile exercise.

Risk consequences can be presented, basically, along a few related dimensions. The aim here is to understand, if a cyber security incident occurs, what measures allow your company to describe it and, therefore, compare it with another event.

It would make sense, in my view, to present any risk analysis to the board and other managers in these terms:

1)     Monetary cost in terms of lost revenues

2)     Monetary cost in terms of live costs

3)     Impact on market penetration

4)     Impact on brand perception

This would allow comparing an XYZ incident to an ABC incident and somehow answering the board's question, and, moreover, would give a metric to understand where and why to invest in one area instead of another.

Let me quickly describe the 4 points.

1)     Monetary cost in terms of lost revenues

This is a dimension that can be easily perceived by sales and financial managers. It basically means being able to estimate how many direct selling activities will be impacted by the incident. The timeframe taken into account is key, of course, since events can have different effects in the immediate, medium and long term.

The evaluation can be presented either as a net amount of money or as a % compared to budget. Both make sense for understanding the impact.

2)     Monetary costs in terms of live costs

This basically means accounting for all the live costs related to the incident, such as fines, legal issues, HW/SW replacements, people working on the issue and so on. It is important to separate the costs related to the incident from the lost revenue related to the incident.

3)     Impact on market penetration

This is a metric that makes sense for a vendor trying to expand its market footprint, as your company is trying to do. It is strictly connected to direct revenues but also to growth expectations. It can be represented as a % of market share.

4)     Impact on brand perception

This last item is the hardest to measure, since it depends on the metric used to value the brand inside your company. Since I have never been told what metrics are used, I can only suggest presenting the % variation relative to the value before the incident.

As far as I know this has not been done before in Cyber and Information Security business plans. It could be either something sound to present in your future BP, or a task for the Cyber and Information Security Office to implement this year if the structure is not yet able to do this kind of analysis and presentation.

With those four points it would be possible to both:

make comparisons between risks

and

provide the board with an output that can objectively be used to take decisions.

Let's take, as an example, the privacy risk related to GDPR non-compliance.

This approach would allow you to present in the BP a set of data to justify expenses and investments every time a risk is presented; something like:

Let me explain the table to you; of course the values are fictitious and the timeframes can be adjusted to your reality, but I think this can give you a basic understanding of what I suggest.

GDPR non-compliance:

1)     Customer personal data breach: column headers

Short term impact (1-3 months)

This is what happens immediately after the problem, where you have to set up the required operations to get things running again somehow. If you have an Emergency Response Team (you should), this is where you put the costs…

Midterm impact (3 months – one year)

Let's be honest: if it is a minor outbreak, maybe things will be solved quickly, but if the problem is bigger, such as your marketing database being exposed, you will also start considering legal costs, fines and the impact on your market…

Long Term Impact (1-3 years)

Things have an impact also after your BP: life is not restricted to your date range, business is not restricted to your date range, and you should be able to make predictions and analyses over a much longer horizon than the simple one-year timeframe. It is common in any business, so it should be here too.

2)     Customer personal data breach: row headers

Revenue losses

These are the revenue losses that you will have to face against your budget expectations.

Live costs

This contains what you have to pay: your direct costs, covering, as an example:

  • HW/SW replacement
  • Fines
  • Estimated legal costs if a "damaged user" sues you
  • Ransoms paid
  • Possible cyber security insurance premium increases
  • Production stoppage costs
  • People working to solve the problem (forensic analysts, cyber experts, lawyers …)

Impact on Market Penetration

This is where you describe how the incident will damage your business in terms of market presence and future outlook.

Impact on Brand Perception

This is how your credibility will be affected.
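
To make the idea tangible, here is a sketch of the matrix as code; the structure follows the rows and columns described above, and the values are fictitious, as in the original example:

```python
# Each incident is scored on the four dimensions across the three
# timeframes, so two risks can be compared on the same scale.
TIMEFRAMES = ("short (1-3m)", "mid (3m-1y)", "long (1-3y)")

gdpr_breach = {
    "revenue losses":            (100_000, 400_000, 250_000),
    "live costs":                (150_000, 600_000, 100_000),
    "market penetration impact": (0.001,   0.010,   0.005),   # % market share
    "brand perception impact":   (0.02,    0.05,    0.01),    # % brand value
}

def total_monetary_impact(incident: dict) -> float:
    """Sum the two monetary rows over all timeframes for a quick comparison."""
    return sum(incident["revenue losses"]) + sum(incident["live costs"])

print(f"GDPR breach, monetary impact: {total_monetary_impact(gdpr_breach):,.0f}")
```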

With this kind of matrix it would be easy to make correct evaluations and comparisons. I am not sure this is something that can be done with the current analysis tools, but it would ultimately be a sound element to put in a BP for a future sound approach to cyber security risk evaluation.

regards

Antonio


Dear CISO, please talk about business with your board, not technicality. was originally published on The Puchi Herald Magazine

NFV network function virtualization security considerations

I have been asked to write down a few things related to NFV and security. NFV is a relatively new thing in the IT world: it hit the news in 2012 and since then it has followed the development path common to virtualization technologies.

Virtualization has made dramatic improvements in recent years. It all started with simple virtualization platforms, VMware of course top of mind, but not only. The idea was to abstract hardware platforms from software ones.

As the idea developed, the abstraction grew to cover multiple hardware platforms, moving also to multisite, WAN and geographical deployments. Nowadays we call this sort of implementation cloud, but the whole cloud story started from the old virtualization idea.

While this platform change was taking place, the world of services was experimenting with different client/server options (web services and so on).

With the new platforms taking hold, it was clear the network part would follow this trend, moving to software and virtual shores.

From the network point of view, the first step was SDN (Software Defined Networking).

Software defined networking (SDN) allows dynamic changes of network configuration that can alter network function characteristics and behaviors. For example, SDN can render real-time topological changes of a network path. An SDN-enabled network provides a platform on which to implement a dynamic chain of virtualized network services that make up an end-to-end network service.

SDN basically allows centrally administering, managing and configuring network services, creating policies that can relate to different needs and adapt to a changing environment.

But this level of abstraction was not enough to deliver the flexibility needed by modern datacenter, cloud and virtualized environments.

In an SDN environment the network gear remains mainly real, solid boxes in an environment that is far more virtualized.

The first attempt to hybridize the physical network with the virtual one was the introduction of the first virtual network elements, such as switches and firewalls. Those components were sometimes part of the virtualization platform's hypervisor, sometimes virtual appliances able to run inside a virtual environment.

Those solutions were (are, since they still exist) good at targeting specific needs, but they did not provide the flexibility, resilience and scalability required by modern virtualization systems. Products like VMware's vShield, Cisco's ASA 1000v and F5 Networks' vCMP brought improvements in management and licensing more suited to service provider needs. Each used a different architecture to accomplish those goals, making a blending of approaches difficult, and the lack of a comprehensive approach made it difficult to expand those services extensively.

The natural next step of the virtualization process was to define something that addressed, in a more comprehensive way, the need to move part of the network functions inside the virtual environment.

Communications service providers and network operators came together through ETSI to try to address the management issues around virtual appliances that handle network functions.

NFV represents a decoupling of the software implementation of network functions from the underlying hardware by leveraging virtualization techniques. NFV offers a variety of network functions and elements, including routing, content delivery networks, network address translation, virtual private networks (VPNs), load balancing, intrusion detection and prevention systems (IDPS), and firewalls. Multiple network functions can be consolidated into the same hardware or server. NFV allows network operators and users to provision and execute on-demand network functions on commodity hardware or CSP platforms.

NFV does not depend on SDN (and vice-versa) and can be implemented without it. However, SDN can improve performance and enable a rich feature set known as Dynamic Virtual Network Function Service Chaining (or VNF Service Chaining). This capability simplifies and accelerates deployment of NFV-based network functions.
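
To illustrate the service chaining idea (a toy model of mine, not ETSI's or any controller's actual API): a chain is just an ordered list of virtual functions that each packet traverses, so reordering or extending the chain is a data change, not a hardware change:

```python
from typing import Callable, Optional

# A packet is modeled as a dict; each VNF is a function that may transform
# the packet or drop it (return None). This is only a conceptual model of
# VNF service chaining, not a real MANO or SDN controller interface.
Packet = dict
VNF = Callable[[Packet], Optional[Packet]]

def firewall(p: Packet) -> Optional[Packet]:
    return None if p.get("port") == 23 else p      # e.g., drop telnet

def nat(p: Packet) -> Optional[Packet]:
    p["src"] = "203.0.113.1"                        # rewrite source address
    return p

def run_chain(chain: list[VNF], p: Packet) -> Optional[Packet]:
    for vnf in chain:
        p = vnf(p)
        if p is None:
            return None                             # dropped mid-chain
    return p

# Re-chaining is just list surgery: insert an IDS, remove the NAT, etc.
print(run_chain([firewall, nat], {"src": "10.0.0.5", "port": 443}))
print(run_chain([firewall, nat], {"src": "10.0.0.5", "port": 23}))   # None
```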

Based on the framework introduced by the European Telecommunications Standards Institute (ETSI), NFV is built on three main domains:

  • VNF,
  • NFV infrastructure, and
  • NFV management and orchestration (MANO).

VNF can be considered a container of network services provisioned in software, very similar to a VM operational model. The infrastructure part of NFV includes all the physical resources (e.g., CPU, memory, and I/O) required for storage, computing and networking to prepare the execution of VNFs. The management of all virtualization-specific tasks in the NFV framework is performed by the NFV management and orchestration domain. For instance, this domain orchestrates and manages the lifecycle of resources and VNFs, and also controls the automatic remote installation of VNFs.
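
A toy sketch of how the three domains relate (my illustration of the roles, not the standard's actual reference points): MANO asks the infrastructure for resources and drives the VNF lifecycle:

```python
class NFVInfrastructure:
    """Physical resources (CPU, memory, I/O) shared by all VNFs."""
    def __init__(self, cpus: int):
        self.free_cpus = cpus

    def allocate(self, cpus: int) -> bool:
        if cpus > self.free_cpus:
            return False
        self.free_cpus -= cpus
        return True

class VNF:
    """A software network function (firewall, NAT, ...) hosted like a VM."""
    def __init__(self, name: str, cpus: int):
        self.name, self.cpus, self.running = name, cpus, False

class MANO:
    """Management and orchestration: owns the lifecycle of every VNF."""
    def __init__(self, infra: NFVInfrastructure):
        self.infra, self.inventory = infra, []

    def instantiate(self, vnf: VNF) -> bool:
        if self.infra.allocate(vnf.cpus):      # lifecycle step: deploy
            vnf.running = True
            self.inventory.append(vnf)
            return True
        return False

mano = MANO(NFVInfrastructure(cpus=8))
print(mano.instantiate(VNF("virtual-firewall", cpus=2)))   # True
```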

The resulting environment is now a little more complicated than it was a few years ago.

Where in the past we used to have

  • physical servers running operating systems such as Linux, Unix or Windows, bound to the specific hardware platform, with almost monolithic services running on them,
  • physical storage units running on different technologies and networks (Ethernet, iSCSI, fiber optic and so on),
  • networks connected through physical devices, with some specific units providing external access (VPN servers),
  • protected by some sort of security units providing some sort of control (firewalls, IPS/IDS, 802.1x, AAA and so on),
  • all managed quite independently through different interfaces or programs;

now we have moved to a world where we have

  • a virtualized environment where services (think, as an example, of Docker implementations) or entire operating systems run on virtual machines (VMs) that manage the abstraction from the hardware and can allocate resources dynamically in terms of performance and even geographic location,
  • a network environment whose services are partially virtualized (as in VNF implementations) and partially physical, interacting with the virtual environment dynamically,
  • a network configured dynamically through control software (SDN), which can easily modify the network topology itself in order to respond to changing requests coming from the environment (users, services, processes).

Nowadays, the impressive effects of network functions virtualization (NFV) are evident in a wide range of applications, from IP node implementations (e.g., future Internet architectures) to mobile core networks. NFV allows network functions (e.g., packet forwarding and dropping) to be performed in virtual machines (VMs) in a cloud infrastructure rather than in dedicated devices. NFV, as an agile and automated approach to networking, is desirable for network operators due to the ability to easily develop new services and the capabilities of self-management and network programmability via software-defined networking (SDN). Furthermore, co-existence with current networks and services improves customer experience and reduces complexity, capital expenditure (CAPEX), and operational expenditure (OPEX).

In theory, virtualization broadly describes the separation of resources or requests for a service from the underlying physical delivery of that service. In this view, NFV involves the implementation of network functions in software that can run on a range of hardware, which can be moved without the need for installation of new equipment. Therefore, all low-level physical network details are hidden and the users are provided with the dynamic configuration of network tasks.

Everything seems better and easier, but all those transformations do not come without a price in terms of security.

Every step into virtualization brings security concerns related to the control plane (think of hypervisor and orchestrator security), the communication plane, the virtual environment itself (which often inherits the same problems as the physical platform), and the transition interface between the physical and virtual worlds.

Despite its many advantages, therefore, NFV introduces new security challenges. Since all software-based virtual functions in NFV can be configured or controlled by an external entity (e.g., a third-party provider or user), the whole network could potentially be compromised or destroyed. For example, in order to reduce hosts' heavy workloads, a hypervisor in NFV can dynamically try to load-balance the assigned loads of multiple VMs through a flexible and programmable networking layer known as a virtual switch; however, if the hypervisor is compromised, all network functions can be disabled completely (a good old DDoS) or priority can be given to some services instead of others.

Also, NFV’s attack surface is considerably increased, compared with traditional network systems. Besides network resources (e.g., routers, switches, etc.) in the traditional networks, virtualization environments, live migration, and multi-tenant common infrastructure could also be attacked in NFV. For example, an at- tacker can snare a dedicated virtualized network function (VNF) and then spread out its bots in a victim’s whole network using the migration and multicast ability of NFV. To make matters worse, the access to a common infrastructure for a multi-tenant network based on NFV inherently allows for other security risks due to the shared resources between VMs. For example, in a data center network (DCN), side-channels (e.g., cache-based side channel) attacks and/or operational interference could be introduced unless the shared resources between VMs is securely controlled with proper security policies. In practice, it is not easy to provide a complete isolation of VNFs in DCNs.

The challenges related to securing a VNF are complex because they relate to all the elements that compose the environment: physical, virtual and control.

According to the CSA, securing this environment is challenging for at least the following reasons:

  1. Hypervisor dependencies: Today, only a few hypervisor vendors dominate the marketplace, with many vendors hoping to become market players. Like their operating system vendor counterparts, these vendors must address security vulnerabilities in their code. Diligent patching is critical. These vendors must also understand the underlying architecture, e.g., how packets flow within the network fabric, various types of encryption and so forth.
  2. Elastic network boundaries: In NFV, the network fabric accommodates multiple functions. Placement of physical controls is limited by location and cable length. These boundaries are blurred or non-existent in NFV architecture, which complicates security matters. VLANs are not traditionally considered secure, so physical segregation may still be required for some purposes.
  3. Dynamic workloads: NFV's appeal is in its agility and dynamic capabilities. Traditional security models are static and unable to evolve as network topology changes in response to demand. Inserting security services into NFV often involves relying on an overlay model that does not easily coexist across vendor boundaries.
  4. Service insertion: NFV promises elastic, transparent networks, since the fabric intelligently routes packets that meet configurable criteria. Traditional security controls are deployed logically and physically inline. With NFV, there is often no simple insertion point for security services that are not already layered into the hypervisor.
  5. Stateful versus stateless inspection: Today's networks require redundancy at a system level and along a network path. This path redundancy causes asymmetric flows that pose challenges for stateful devices that need to see every packet in order to provide access controls. Security operations during the last decade have been based on the premise that stateful inspection is more advanced and superior to stateless access controls. NFV may add complexity where security controls cannot deal with the asymmetries created by multiple, redundant network paths and devices.
  6. Scalability of available resources: As noted earlier, NFV's appeal lies in its ability to do more with less data center rack space, power, and cooling.

Dedicating cores to workloads and network resources enables resource consolidation. Deeper inspection technologies (next-generation firewalls and Transport Layer Security (TLS) decryption, for example) are resource intensive and do not always scale without offload capability. Security controls must be pervasive to be effective, and they often require significant compute resources.

Together, SDN and NFV create additional complexity and challenges for security controls. It is not uncommon to couple an SDN model with some method of centralized control to deploy network services in the virtual layer. This approach leverages both SDN and NFV as part of the current trend toward data center consolidation.

The NFV security framework tries to address those problems.

If we want to dig a little deeper into the security part, we can analyze:

  • Network function-specific security issues

and

  • Generic virtualization-related security issues

Network function-specific threats refer to attacks on network functions and/or resources (e.g., spoofing, sniffing and denial of service).

The foundation of NFV is set on network virtualization. In this NFV environment, a single physical infrastructure is logically shared by multiple VNFs. For these VNFs, providing a shared, hosted network infrastructure introduces new security vulnerabilities. The general platform of network virtualization consists of three entities: the providers of the network infrastructure, VNF providers, and users. Since the system consists of different operators, their cooperation undoubtedly cannot be perfect, and each entity may behave in a non-cooperative or greedy way to gain benefits.

The virtualization threats of NFV can originate from any of these entities and may target the whole system or part of it.

In this view, we need to consider threats such as side-channel or flooding attacks as common attacks, and hypervisor, malware injection or VM migration related attacks as virtualization- and cloud-specific attacks.

Basically, VNF adds a new layer of security concerns to virtualized/cloud platforms for at least three reasons:

  • It inherits all the classic network security issues and expands them to the cloud level

This means that once a VNF is compromised there is a good chance it can spread the attack or problem to the whole environment, affecting not only the resources directly assigned to it but anything connected to the virtual environment. Think, as an example, of the level of damage that can be done by a DDoS that rapidly depletes all the cloud network resources by modifying the QoS parameters rather than using the traditional flooding techniques (which are anyway available).

  • It depends on several layers of abstraction and control

Orchestrator and hypervisor are, as a matter of fact, great attack points, since compromising them means gaining control of the whole environment and every function that runs in it.

  • It requires a better-planned implementation than the classic physical one,

with tighter control over who manages the management interfaces, since, in common with SDN, VNF is more exposed to unauthorized access and configuration-related issues.

VNF still requires study and analysis from a security perspective; the good part is that this is a new technology under development, therefore there is big room for improvement.


NFV network function virtualization security considerations was originally published on The Puchi Herald Magazine

Happy new insecure 2017: my resolutions and wishlist for new year

Here we are: a new year comes, and we, as cyber security experts, will keep warning the world about the deeply insecure world we are living in.

And we will announce new technologies and new devastating scenarios related to them. IoT and cloud will raise their evil faces while bad people lurk in the dark, waiting to attack the innocent lamb crossing the road.

But, in all of this, most of the damage will still be done by badly designed systems, by managers who do not understand what it means to live in a digital world, by politicians who understand cyber security issues only when they have something to gain, by entrepreneurs who will still treat investing in security as a disturbing side effect.

If I can make a wish for the new year, it is to finally see a different approach to information security, an approach that takes into account that:

1) to be secure you need well-designed systems first, and only then can you cover them with some geeky security technologies. If the design is crap, all your security is crap, no matter what you use on top

2) there is no security if your devices are not designed with security in mind; good code and a good code lifecycle are the best insurance, so if you buy the cheapest, do not cry … it is your job to look for what you need, so yes, it is your fault if something goes wrong

3) companies, managers and entrepreneurs must finally understand that security lives within processes, and is not just a bunch of technologies put on top of something you do not have the slightest idea about; you can't protect what you don't understand

4) if people do not understand, they will not follow even the most basic rules, so training is not optional but the very foundation. And to be sure, the first who have to learn are the "CxOs", who should get off the throne and start learning about the world they crafted

5) if we keep thinking that IoT is wonderful but do not understand what IoT will bring in terms of cultural and technical problems, we will never understand what it means to put security on it

6) if you hire an expert and then don't listen to him/her, you are wasting his/her time and yours; then do not blame the messenger

7) if you think this complex field we call security can be covered by a junior who knows it all, you are probably wrong, unless the junior is a genius

8) if you, security expert, think your counterpart has the slightest idea what you are talking about, you are probably wrong, because you did not realize they do not understand what they do not know

9) all of this is part of the business, and therefore the business should take it all as one of its elements, and not just a nasty, annoying add-on

10) next time someone talks about APTs, they should tell you the truth: the only way to stop an APT is to stop the attacker, otherwise…. it would not be an APT

I know, I know, I am a bit naive and still believe in fairy tales…

happy, safe and secure 2017 to you all


Happy new insecure 2017: my resolutions and wishlist for new year was originally published on The Puchi Herald Magazine

The IoT Files: Culture


In the previous IoT Files I tried to outline what are, from my point of view, some key factors that have to be taken into account when talking about IoT.

The last, but not least, point I would like to make some notes on is culture.

Since IoT is something that will shape our way of life in many aspects, we have to agree that culture is a key element in order to embrace it positively and safely.

Culture refers to billions of things, from language structure to literature, from how we share information to how we get it. In all of those aspects IoT will have great impact and relevance.


IoT awareness.

From a cultural point of view, embracing IoT means, first of all, being aware of what IoT is and of its implications.

This awareness and understanding will be shaped as IoT grows and becomes part of our life, but if we start to talk about the cultural impact of something only when it is already there, it is too late.

If we weigh our experience, nowadays we still have not coped, from a cultural point of view, with all the technological advances. Sometimes we simply refuse to accept them and label them as bad, or we use them without real comprehension.

The result is in front of everyone's eyes: from the rise of cybercrime to the rise of internet dependency and the apparent shrinking of interpersonal relationships, the literature is full of examples of how badly we still cope with new technology.

Laws are also affected by this difficulty in comprehending the new environment, as is management culture.

IoT awareness is therefore far more important, since IoT is far more pervasive than our current technology.

A new privacy

IoT will be so pervasive that it will dramatically change our perception of privacy. As a matter of fact, in the IoT world there is nothing like privacy at all: somehow there is always a sensor monitoring you, and this could drive unexpected behavioral reactions. For sure a new approach to privacy will be necessary, as well as a new approach to privacy protection. In a world where everything is turned into data, those data become the paradigm of our reality, and we will have to deal with that accordingly.

Communication Issues

But the changes are also related to the way we will communicate. New jargon comes out every moment, millennials have a different language from Generation X or the baby boomers, and so IoT will develop its own language. How we will incorporate it and drive it is still to be defined, but in IoT the sheer scale of communication and data interchange will move all this to a worldwide level. Language will no longer be a local issue, simply because exchanging data requires a common communication framework. As with privacy, without a common understanding of the rules this will soon turn into chaos.

Censorship and cultural constraints

One of the main issues IoT will bring with it is how to deal with communication restrictions, in other words censorship. We have already mentioned censorship as one of the big issues that can affect IoT; to stress the idea further, it will be not only a business problem but also a cultural one. A world of sensors monitoring everything (this is the downside of IoT) can heavily affect belief systems and force some cultures to close in on themselves. If we do not understand how to cope with it, all relationships could be pushed to the extreme.

We see nowadays, with the rise of hate speech, bullying, urban legends and fake stories on social media, how difficult it is to cope with more open communication channels; can you imagine what IoT will bring? We have to assume that the amount of data will be far greater, and so will the ways people interact with those data.

Who is left behind?

And the cultural issues will affect most the technologically illiterate and those who will be left behind, marking a wider distance between the citizens of the IoT world and the ones left out. The digital divide is already a cultural problem; IoT will widen it. Without the proper tools to understand this world, the level of incomprehension will rise dramatically, widening tensions.

And this is not just a problem between rich and poor countries; even inside rich countries the level of familiarity with technology varies dramatically across social groups and areas.

Illiteracy today does not just mean being unable to write or do math, but also being unable to use the internet and technology such as computers or smartphones. Now widen that gap with the introduction of new technologies….

How to teach all this

The root of the problem will become: how to teach all this?


Today's school systems do not, generally speaking, engage with the current technology environment. Schools are, roughly, a century behind the modern world. Access to technology, and how to deal with technology, is not common in most of the world's school systems. It is not just a problem of technology in place (give a computer to every student) but also of how to teach with the new tools, and what to teach.

Cyber security basics, as an example, should be a mandatory introduction in any school of any grade, considering the age at which our children approach technology without the proper mindset. But schools are slow to cope with the new world.

But at the corporate level too, illiteracy about cyber security, technology use and the implications between technology and communication is the common reality, and this lack of knowledge spreads through every level, from the lowest to the highest. Very few exceptions can be made here.

This issue should cover all aspects of education, from first grade to university to corporate training. We can no longer afford children who do not know how to protect themselves in the cyber world, university graduates who face the real world completely illiterate about what they will find in the corporate environment, developers who do not have the slightest idea what privacy and security mean, management that is not able to evaluate the impact of technology on their business, and so on.

Not being able to deal with this will mean being overwhelmed by the impact of those technologies and, in the last analysis, being ruled out as dinosaurs.

TBD

And the list could go on and on. We can make predictions, but we can't see the future clearly (unless we use a crystal ball). We need new cultural, linguistic and philosophical tools to help us cope with the new reality.

What to do?

We should start now, not wait for some higher action. Sharing knowledge and awareness, talking and thinking about those issues is the first step to finding solutions and addressing them.

This is also a call to be active in associations, think tank groups or whatever you can, to help raise awareness. And where you feel gaps in your own knowledge, you can try to discuss them and ask for support.

good thinking

Antonio

The IoT Files: Culture was originally published on The Puchi Herald Magazine
