EU Cyber Resilience Act: Enforcing cyber norms far beyond Europe
Updated on 04 April 2024
The EU is drafting new cybersecurity rules, titled the Cyber Resilience Act (CRA), which could significantly change the IT sector, in particular the development and distribution of software, hardware, and internet of things (IoT) products. ('Software as a service' (SaaS) is explicitly excluded from the proposal.)
If you have already heard of the CRA and expect a detailed legal analysis of the proposal, this post won't cover that. Instead, we draw connections between the law of a state (or, in Europe's case, of a supranational structure) and the operationalisation of cyber norms, namely the UN Group of Governmental Experts (UN GGE) norms¹, which were negotiated by a group of states and endorsed by the entire UN membership in the interest of international security and peace.
In the never-ending discussion on how to bring more cyber stability through the implementation of norms, we ask what each country specifically needs to do. Is every country required to play an active role in the operationalisation of norms, or can a few major players deliver a greater impact for the entire international community on their own?
The CRA seems to be another example, after the EU General Data Protection Regulation (GDPR), of 'food for thought' for answering these questions. If adopted, Europe's future law wouldn't simply change industry rules by introducing horizontal² cybersecurity requirements for 'products with digital elements'; it could also contribute to the implementation of the UN GGE norms across multiple jurisdictions, far beyond Europe, creating a positive net security effect for other countries and relevant stakeholders.
How can this be possible?
Keeping in mind that we can currently only analyse the proposed text, not the final one, some preliminary observations can be made. The current proposal:
- Establishes essential cybersecurity requirements for products with digital elements, as well as conformity assessment requirements for manufacturers, importers, and distributors of such products before putting them on the market
- In such requirements, covers a broad range of obligations: from ensuring the ‘minimisation of data’, to establishing vulnerability management and disclosure processes, and conducting ‘due diligence when integrating components sourced from third parties’
- Regulates vulnerability reporting to the EU Agency for Cybersecurity (ENISA) and/or national market surveillance authorities (depending on the role: manufacturers vs importers and distributors)
- Addresses the existing 'end-of-life' gap³ by introducing an expected product lifetime: in particular, manufacturers are expected to provide security updates for their products for the 'expected product lifetime or for a period of five years after the placing on the market of a product with digital elements'. (The proposal, however, doesn't specify who defines the 'expected product lifetime'. Could it be an indefinite period of time? Whether this is a realistic expectation, particularly for small and medium-sized enterprises, is, of course, another question.)
- Introduces a differentiated approach to compliance and defines 'critical products with digital elements' (so far, the list of such products is quite diverse)
- Introduces security labelling (i.e. CE marking) for consumers and others
To illustrate the potential positive net effect for other jurisdictions (if the law is adopted), let's imagine a hypothetical situation: a company headquartered in Country A in the Asia-Pacific region develops a password manager app that it wants to make available to European citizens (and thus intends to place the product on the EU market). This company, Company A, has no employees, offices, or legal entities in the EU, but would like to provide the product globally (the app can be downloaded from any country unless that country imposes its own restrictive measures).
If the CRA is adopted and keeps the current definition of 'manufacturer' as 'any legal or natural person'⁴, Company A would first need to ensure that the mandatory essential cybersecurity requirements are implemented. Given the wide scope of such requirements, it would need to provide a far higher level of security for its app than is currently required. Moreover, the CRA proposal treats password managers as 'critical products', so stricter conformity assessments would also apply.
By imposing and enforcing security in products, the EU will, to some extent, do the 'homework' of other countries' national authorities (i.e. help other jurisdictions make their contributions to the implementation of the UN GGE norms), since non-EU countries can be less advanced in terms of cybersecurity posture, budgets, and regulatory frameworks. The EU thus pushes non-EU companies to significantly enhance security for both EU and non-EU users. (Of course, a company may prefer to keep far less secure versions of its products for other markets, but let's imagine that companies will treat all their users equally.)
In this hypothetical situation, the quality and security of software development could be significantly enhanced far beyond Europe, no matter how advanced other governments are in understanding cybersecurity risks and the ways to manage them. The positive net effect is that the security for non-EU end users can also be ensured. This higher security of products with digital elements can potentially decrease vulnerabilities and cybersecurity threats, and therefore potentially bring more security and stability (this is an assumption which is difficult to measure in quantitative terms).
In this regard, when EU member states adopt such rules, they make their contribution to global cyber stability by implementing the following cyber norms, endorsed by UN member states and agreed in the 2021 UN GGE report (parts of the quoted text are particularly relevant to the CRA proposal):
- Norm 1 (para. 13a): ‘Consistent with the purposes of the United Nations, including to maintain international peace and security, States should cooperate in developing and applying measures to increase stability and security in the use of ICTs and to prevent ICT practices that are agreed to be harmful or that may pose threats to international peace and security.’
- Norm 9 (para. 13i): ‘States should take reasonable steps to ensure the integrity of the supply chain, so end users can have confidence in the security of ICT products. States should seek to prevent the proliferation of malicious ICT tools and techniques and the use of harmful hidden functions.’
- Norm 10 (para. 13j): 'States should encourage responsible reporting of ICT vulnerabilities and share associated information on available remedies to such vulnerabilities, in order to limit and possibly eliminate potential threats to ICTs and ICT-dependent infrastructure.'
Is this a ‘happy ending’ for those seeking to achieve cyber stability?
Even if all this happens in the way we have just illustrated, a 'happy ending' is still not guaranteed.
Since cyberspace has become a field of competition between states, the adoption of the CRA rules may (or may not) trigger friction with other states. Essentially, some questions would still remain open:
- What if other states come up with similar laws/requirements with extraterritorial effects? And what if such states are not considered like-minded?
- How would other states react to the proposed vulnerability reporting requirements to ENISA and national authorities? Would this be viewed as a national security matter?
- In such cases, would states' decisions to introduce competing laws increase fragmentation in cyberspace, and thus lead to less security and stability?
If we look at the history of safety regulation in the car industry, there is a good chance that the security and safety of digital products will be treated in the same way, i.e. some baseline mandatory requirements could become universal, or at least interoperable across jurisdictions (which would nonetheless increase the costs of competing and decrease the number of market players).
But what if the chances of achieving universal or interoperable requirements turn out to be low? And what if defining and enforcing the security and safety of digital products becomes more a question of national security and sovereignty? In other words, if only one obvious player (e.g. the EU) establishes the rules, would other states automatically and amicably fall in line as followers? And would there be security and economic implications for stakeholders that didn't follow?
These questions are difficult to answer at the moment, but now is definitely the time to track this groundbreaking EU legislative process and see whether other states soon follow.
—
1. By 'cyber norms', we mean the norms agreed by the UN GGE and later reaffirmed by the UN Open-Ended Working Group (OEWG). See the 2021 UN GGE report (A/76/135).
2. I.e. not specific to particular sectors or certain products with digital elements.
3. The 'end-of-life' (EOL) gap is the gap between the end of security support and the end of use. The definition is provided in the 2021 OECD report titled Enhancing the Digital Security of Products.
4. Article 3(18) of the CRA proposal defines ‘manufacturer’ as ‘any natural or legal person who develops or manufactures products with digital elements or has products with digital elements designed, developed or manufactured, and markets them under his or her name or trademark, whether for payment or free of charge’.