Author: Stephanie Borg Psaila
Improving the practice of cyber diplomacy: Training, tools, and other resources – Final study
2021
Cyber diplomacy, the conduct of diplomacy with respect to a state’s interests in cyberspace, is too important to ignore. Yet, the participation of countries is far from ideal.
For some countries, diplomacy has adapted quickly, and cyber issues are now firmly on their diplomatic agendas. For other countries, especially developing countries and small states, challenges linked to scarce human and financial resources constrain their participation and render them largely inactive in the cyber diplomacy policy space. Naturally, countries with limited resources are more likely to invest what little they have in areas they see as more essential – and most often, cyber is not on that list.
Yet, many cyber issues transcend borders, and often prey on the weakest actors. Measures to protect against vulnerabilities need to be implementable – and implemented – everywhere. And no country should consider itself exempt from the norms of state behaviour.
This study, prepared in two phases, analyses the aspects of capacity development that can increase the engagement of every country: the availability of training opportunities, tools, and other resources, and their reach and take-up.
While we appreciate that technical training (such as how to set up a Computer Emergency Response Team) is extremely important, this study focuses mainly on the need for diplomats to engage in cyber diplomacy. By that, we mean the need to understand: the cybersecurity challenges countries and organisations face, the laws that can address them, and how cross-border investigations work; how some countries engage in dubious activities to try to cripple each other's critical infrastructure, and how laws can be interpreted to justify this behaviour; the policy measures a country needs to undertake to bring its hospitals back online if they are attacked, and how other countries can assist; and the foreign policy a country's ministry of foreign affairs (MFA) needs to develop to guide its diplomats. The list goes on.
The survey we conducted as part of this study confirmed that training and tools are indeed available (whether there are thematic gaps is a slightly different story), but they are certainly not reaching everyone. The findings also uncovered the reasons why practitioners are often not taking any, or further, training, and why they were not making use of the whole range of tools available to help them in their cyber diplomacy work.
There are three main reasons. The first is simple: if practitioners are not aware of training and tools in the first place, they cannot make use of them. The second is that even when practitioners know about existing training, they often lack the financial means to enrol. The third is that practitioners are often too busy to spend time on training or exploring tools, and are not always encouraged to do so.
We then used five case studies to look at good practices, identify gaps, and determine solutions. We based our recommendations on the findings, and on our own experience of training diplomats for close to 20 years.
When it comes to the recommendations, we have steered clear of one-size-fits-all suggestions, in the knowledge that practitioners, providers, and funders all have different aims and needs.
For instance, a practitioner who has received a scholarship to undertake training should follow up with the training provider on how the training has impacted their work, or their institution’s work, even in cases where there’s no obligation for them to give feedback.
Providers should help instil a culture of institutional capacity development by incorporating this message into training programmes, for instance during the feedback stage.
Funders should support practitioners in analysing what they really need, and involve providers in the process, as it can be more cost-effective in the long run. When analysing needs, the main goal of capacity development should be kept in mind: it’s not only about what people learn, but how practitioners apply the knowledge in practice.
This report, which we refer to as the ‘Final study’, is the culmination of two phases:
• Phase I, completed in September 2021, which examined the availability of training opportunities and other types of support, and their take-up.
• Phase II, completed in December 2021, which identifies the gaps and makes recommendations on how to close them.