
Monday, 19 November 2018

C&ESAR 2018, day 1

During this week, Rennes will be hosting the European Cyber Week. This is a great opportunity for communication service providers (CSP), manufacturers, industry players and end users to meet and present current research on Artificial Intelligence and cyber security.

The opening speeches stressed the importance of AI for defense, since networks and services experience attacks every day, not only from external actors but also from inside our own networks. The goal is not only to disrupt service but to steal data, jeopardizing research and intellectual property. This constitutes the rise of adversarial AI, and to react to this threat we need to improve our understanding and use of generative adversarial networks.

Another interesting topic was "opening the black box" to achieve explainable AI, whose goal is to explain the reasoning behind the decisions taken by AI systems. This is important to gain the confidence of users and to speed up the adoption of these techniques in various use cases.
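
To make the idea a bit more concrete, here is a minimal sketch (my own illustration, not code from the talks) that trains a black-box classifier on synthetic data and uses permutation importance to see which features drive its decisions:

```python
# Minimal sketch: "opening the black box" with permutation importance.
# My own illustration on synthetic data, not code from the conference talks.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic dataset standing in for security telemetry (e.g. flow features).
X, y = make_classification(n_samples=2000, n_features=8, n_informative=3,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Permutation importance: shuffle one feature at a time and measure how much
# the score drops, i.e. how much the model's decisions rely on that feature.
result = permutation_importance(model, X_test, y_test, n_repeats=10,
                                random_state=0)
for i, importance in enumerate(result.importances_mean):
    print(f"feature {i}: importance {importance:.3f}")
```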

Some other topics that caught my attention:

  • The diverse malicious AI techniques used to break cyber security, such as data poisoning, which aims to induce errors in the machine learning model (see the first sketch after this list).
  • The bias problem, which most of the time originates in the training data and leads to erroneous decisions at the end of the AI pipeline.
  • The use of different frameworks for behavioral analysis, very useful to detect deviations from the usual usage patterns of an entity. This helps to detect compromised entities or users that deviate from, or abuse, their privileges (see the second sketch after this list).
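
To make the data poisoning point concrete, here is a minimal sketch (my own illustration, assuming the simplest form of poisoning, label flipping) of how corrupting a fraction of the training labels degrades a model:

```python
# Minimal sketch of data poisoning by label flipping (my own illustration).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)

def train_and_score(flip_fraction):
    """Flip a fraction of training labels (the poisoning) and report accuracy."""
    y_poisoned = y_train.copy()
    n_flip = int(flip_fraction * len(y_poisoned))
    idx = rng.choice(len(y_poisoned), size=n_flip, replace=False)
    y_poisoned[idx] = 1 - y_poisoned[idx]          # attacker flips the labels
    model = LogisticRegression(max_iter=1000).fit(X_train, y_poisoned)
    return model.score(X_test, y_test)

for fraction in (0.0, 0.1, 0.3, 0.5):
    print(f"poisoned fraction {fraction:.0%}: "
          f"test accuracy {train_and_score(fraction):.3f}")
```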
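
And a second sketch for behavioral analysis, again my own illustration: a detector trained on an entity's usual sessions flags observations that deviate from that pattern (here with an IsolationForest and made-up features):

```python
# Minimal sketch of behavioral anomaly detection (my own illustration).
# Features could be, e.g., login hour, bytes transferred, number of commands.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# "Usual" behaviour of an entity: sessions clustered around a normal profile.
normal_sessions = rng.normal(loc=[9.0, 200.0, 15.0], scale=[1.0, 50.0, 5.0],
                             size=(500, 3))

detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(normal_sessions)

# New observations: two normal-looking sessions and one that deviates
# (3 a.m. login, huge transfer, many commands), e.g. a compromised account.
new_sessions = np.array([
    [9.5, 180.0, 12.0],
    [8.0, 250.0, 20.0],
    [3.0, 5000.0, 120.0],
])
print(detector.predict(new_sessions))   # 1 = normal, -1 = flagged as anomalous
```
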
I hope day 2 brings new insights and interesting approaches to better understand AI and its challenges.

Tuesday, 28 November 2017

Faire la pause: European Cyber Week in Rennes, day two

On day two, the approach was quite different but no less interesting: the topics covered training, penetration testing and protection from threats. Key points:
  • Simulation environments are very important because of their many use cases. For example, you can use a simulation to recreate an attack, leveraging virtualization and traffic generators to replay the packets and perform a post-mortem analysis (see the sketch after this list). Another use is training: a virtualised version of the real products, topology, traffic generators and controls provides a learning environment, something analogous to a flight simulator, and it is far cheaper than playing with the real equipment. This reminds me of when I learned networking protocols using Packet Tracer or GNS3.
  • Testing environments are also essential to train personnel to operate a platform properly and to run hacking exercises that look for vulnerabilities in the system. This last part, especially, involves not only technical expertise on protocols and commands but also the physical aspects of the infrastructure in buildings. Every attack surface, whether virtual or physical, can be exploited and used as an entry point to compromise an organization.
  • Businesses do not wait for communication service providers to help them implement security procedures or protection plans; they are taking a first approach to the problem on their own, deploying tests and self-penetration exercises. To them, the network is just a data pipe. This insight makes me think about the role of the infrastructure or slice provider for a company... Should a telecom care about what traffic the customer carries inside the slice? My responsibility as a telecom operator is to provide the resources and guarantee the SLAs agreed with my customer, the same way we did when we provided E1s, VPLS or VPNs...
  • An authorized penetration test is a procedure that involves a lot of administrative planning! The presenters (from SODIFRANCE) even told a fun anecdote about a "get out of jail" card (a pun on the Monopoly card). Everything has to be set up properly.
  • The approach proposed by the presenter (from Thales Communications and Security) covered a test-bed for a service. I wonder if the same could be done for the infrastructure. I think it is possible, since virtualisation techniques span the different layers of the anatomy of a service.
  • There is a saying that if the only tool you have is a hammer, every problem looks like a nail. The key point of Franck Sicard's presentation is that people have tried to apply the same techniques used to secure an IT system to an ICS (Industrial Control System). Every system, service and industry has its own equipment, protocols and processes, and the security approach is different in each case.
  • The future telecommunication architecture must provide administrative rights to create snapshots of a slice, enabling security features, configuration rollback and resilience to failures. It could be interesting to think about this scenario.
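
On the simulation point above, here is a minimal sketch of what packet replay for post-mortem analysis can look like, assuming Scapy and a hypothetical capture file named attack.pcap; this is my own example, not the tooling shown by the presenters:

```python
# Minimal sketch: replaying a captured attack for post-mortem analysis.
# Assumes Scapy is installed and a capture file "attack.pcap" exists;
# this is a hypothetical example, not the setup shown at the conference.
from scapy.all import rdpcap, sendp

packets = rdpcap("attack.pcap")        # load the recorded attack traffic

# Quick offline inspection before replaying anything.
for pkt in packets[:5]:
    print(pkt.summary())

# Replay the traffic on an isolated lab interface so monitoring and
# detection tools can be exercised against the recorded attack.
sendp(packets, iface="eth0", inter=0.01, verbose=False)
```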

Monday, 27 November 2017

Faire la pause: European Cyber Week in Rennes, day one

During this week, Rennes is hosting the European Cyber Week. This event, in its second edition, covers several programs related to cyber security threats in scenarios such as connected vehicles, naval environments, e-health and IoT.

The event began with the Journées C&ESAR, which focus on data protection in the face of cyber threats. Today's conferences covered the following use cases:

  • Naval environment
  • Research laboratories
  • Autonomous connected vehicles
  • e-health applications and the privacy of the patient data
  • IoT
  • Government / enterprise reputation management
Each vertical has its own point of view about the threats and the value of its data: different core businesses, different kinds of data generators and consumers, different networking requirements regarding QoS and QoE, and different types of information, metadata and associated value. Compromising these businesses would create havoc at different scales: measurements not arriving on time, theft of sensitive research results, vehicle crashes, leakage of confidential medical data, or economic and trust issues among countries around the world, to name a few examples.

This variety of use cases and requirements ultimately lands on the tangible entity people only notice (and blame) when it fails: the network. Oh well, the human factor has to be taken into account too, but it is out of scope for the moment ;)

The great challenge is to have a complete, end-to-end view of all the components that make the service possible, and to establish ground rules that enable coexistence in a peaceful ecosystem: a common architecture that holds them all and provides the communication capabilities users demand.
  • How will the operation and management chain of command operate all the components and abstractions of the underlying control and infrastructure entities?
  • How involved should the communication service provider be in the data management of the segment (better: slice)?
  • To what extent must the communication service provider comply with certifications such as HIPAA (Health Insurance Portability and Accountability Act) in order to provide services to a hospital? Or to an aeronautical enterprise?
I love these conferences because they provide food for thought: a lot of questions, motivated by real-world scenarios, that telecoms (and the rest of the players in the industry) must be able to answer. I hope to contribute to this process during my doctoral studies.