LevelUp Webinar Notes: Evaluating DigiSec Trainings for Effectiveness

During this webinar, which took place on February 23rd, we:

  • Looked at the key challenges we face in evaluating digital security trainings
  • Explored how we measure our effectiveness according to what our beneficiaries are trying to achieve
  • Heard from our guest speakers: Natasha and Tawanda from Digital Society of Zimbabwe, and Erin from IREX
  • Looked at follow-up best practices
  • Asked: how can we improve our own training curricula?

You can view a recording of the full webinar.

What are we evaluating?

Establishing whether or not a digital security training has been effective is challenging, as it can be hard to prove that the tools and methodologies learned during a training have actually been put to use. It’s difficult to show a correlation between training, behaviour change and increased security. Likewise, when both individuals and those in their networks are at equal risk due to certain actions or behaviours, it’s difficult to ascertain how secure they actually are. Relying solely on quantifiable information connected to metrics is not very helpful in understanding training effectiveness in a digital security context.

What is the change they are hoping to contribute to? It’s important for digital security trainers to clearly articulate what they are trying to achieve. Are they:

  • trying to make a community more secure?
  • helping individuals to understand how things work and how certain technologies or behaviours can make themselves and their networks vulnerable?
  • showing how behaviour increases risk and contributes to vulnerability?
  • increasing competency to use a tool? (And also to get their networks to use it as well?)

Critical to understanding your beneficiaries is knowledge and awareness of the current political, cultural and infrastructural contexts they are operating in. And what are their goals? Are they fighting for rights, or advocating for change in laws or policy? Are they trying to end violence or conflict? Above all, it is often vital for trainers to help participants understand how digital security tools and methodologies intersect with and impact their ability to achieve change.

We also need to establish indicators of success for providing digital security training:

  • Is greater confidence in their ability to use technologies safely an indicator?
  • Do we want them to become advocates for security and be able to engage and teach others?
  • Is there a connection between security and the capacity to accomplish change?
  • Do we want them to consider us an integral member of their network or community?

Our Guest Speakers:

Natasha Msonza is a co-founder of Digital Society of Zimbabwe and a digital security trainer based in Zimbabwe. She has a background in media and communication strategy, and works primarily at the intersections of gender, information technology and human rights. She can be followed on Twitter at @NatashaMsonza and @digisoczim.

Tawanda Mugari is a co-founder of Digital Society of Zimbabwe and a digital security trainer based in Zimbabwe. He has a background in management information systems and is a researcher of ICT innovations for developing countries. Passionate about human rights and gender issues, Tawanda is also a lecturer on Management Information Systems and Internet Fundamentals at a local women’s university. He can be followed on Twitter at @tawmug and @digisoczim.

Tawanda and Natasha shared how they strive to support all citizens in Zimbabwe to be more digitally resilient. They are motivated by the threats that Human Rights Defenders face in Zimbabwe, and they utilise three layers of evaluation in gauging the effectiveness of their trainings:

  • Layer 1 focuses on immediate outcomes of the training itself, conducting an assessment at the beginning and the end of the training to measure changes in levels of knowledge.
  • Layer 2 goes deeper, looking at needs assessments and threat modelling. These are often the first exercises they run during a training, and their outcomes often determine the training’s content.
  • Layer 3 looks at how learnings have been applied and how behaviour has changed as a result of the training. These evaluations are conducted some time after the training, say three months. They find it’s better to keep these visits less formal, as people are then more comfortable being candid about their digital security behaviour and practice.

Tawanda and Natasha find that risk assessments are often a more fruitful way of getting to know participants, and they use the results of these to devise the content of their trainings. They try to identify champions within organisations they work with and train, who can help reinforce and instill digital security practices internally among their colleagues and peers.

Erin Murrock leads monitoring and evaluation efforts for the SAFE Initiative at IREX, working with her counterparts in five regional centers to better understand and integrate participant learning, behavior and attitude change, and risk perceptions into program design. Erin has also managed several media development programs in the Caucasus and the Balkans.

Erin shared how their project, the SAFE Initiative, integrates safety training for media practitioners and Human Rights Defenders from Central America, East Africa, Eurasia, Middle East/North Africa and South Asia. They are trying to reach people in rural areas, and to engage media outlets working in those areas, on how to be safer. They see security as comprising three components: psycho-social, physical and digital. During trainings, they run several exercises that help bring to the surface participant attitudes, awareness and practices regarding safety, and they try to assess how their training has impacted behaviour. They also focus on identifying more secure means of communication for follow-up before participants leave the training, especially in cases where it is otherwise difficult to maintain communication.

Follow Up and Improving Curriculum

It’s important to make follow-up a routine part of your training practice. Be consistent – whether it’s two weeks, two months or six months after the training – as this gives you a framework for comparing before, during, and various stages following a training. Make sure you know how to communicate and get in touch with participants in the future, in particular by establishing a communications plan with them in person.

In conclusion, we noted that an indicator of failure for a trainer is believing their curriculum and methodologies are perfect. Never set your curriculum in stone, and always ask before every training: how can this training be more effective, and how can I be a better trainer? In answering these questions, engage other digital security trainers on how they might answer them for themselves.

Participate in LevelUp!

LevelUp is the place to engage other trainers about how to make your trainings more effective. To continue the conversation about improving digital security trainings, please subscribe to the LevelUp list by sending an email to levelup (at) riseup (dot) net.