How do we evaluate learning new skills and growing confidence?

We’re often asked: how do you evaluate the Digital Action Plan?

Quite rightly, clients want to know their learning budget is being put to good use.

We don’t (yet) have a fancy evaluation framework, or a standard way of scoring people. This is because:

  1. this programme is about confidence as much as skills. Confidence is very difficult to measure in a meaningful, objective way, especially in the work environment. Confident-sounding participants often have as much to learn as those who consider themselves least confident, but who might be doing great things online, unrecognised.
  2. the needs of individuals, and the requirements of the teams they work in, vary so much between sectors that benchmarks become meaningless. We want participants to put their audiences, customers and colleagues first, rather than use channels or techniques simply because they're 'accepted' industry standards.

However, there are a variety of actions and data points that we can capture. Taken together, these help us determine whether a participant has learned something new and feels more confident.

1. Have participants told us they learned something or feel more confident?

In a workshop, over the phone, on email or in the forum, participants tell us what they’ve done, what they don’t get and what they think of their plan.

We don’t always share this feedback verbatim with clients. It’s fundamental to building confidence that participants trust us, and we rely on participants being honest with us about what they don’t know.

However, we do capture feedback and share it internally with the team here, so that we can help someone in the early stages of their plan, or celebrate their learning when they've fed back on something they've tested.

We’ll present this at the end of the programme as quotes (usually anonymised) grouped by theme.

2. What have participants achieved within their plan?

We can measure how much of a plan someone says they have read and completed. This programme relies on trust, and for the most part this works well. Wherever possible we'll try to find evidence of practice (see point 4, below) to back this up.

However, we’ve taken a conscious decision not to test people or demand evidence. We’re not authorised to dictate how people allot their working time. Our starting point is that motivated, curious staff will make time to read, experiment and practise the things they feel are relevant to them.

So, if someone says they read and completed a task, we believe them and we’ll count that as progress.

This data is available throughout the programme, and we’ll present a readout of completion percentages at the end.

3. How much effort has a participant put in?

Digital Action Plans are personal to each participant, and within any given cohort most of them will look different. That’s because when you lift the lid on digital skills and confidence, even the most accomplished teams contain huge disparities.

For this reason, measuring everyone on reading and completion rates (above) isn’t fair. We were a bit harsh on one participant who had only ticked a few goals, before realising that she had put maximum effort into the goals she had completed. This is the difference between someone who barely uses the internet outside of work joining a social media channel, and an accomplished Head of Digital Something being a bit better at planning.

We’ll present this at the end of the programme as a brief personal assessment, based on the exchanges we have and observations we’ve made.

4. What have participants put into practice?

This gets to the core purpose of the Digital Action Plan.

For us, this is the most rewarding and valuable progress that we look for: evidence of people doing things differently. This might include trying new channels, changing websites or internal processes based on data they are capturing, blogging openly about their work or testing and iterating content.

Alongside this, we’ll include the contributions they make to their cohort’s forum on the Digital Action Plan platform.

We encourage people to put what they’re learning into practice as part of their plan, and to tell us what happens.

Sometimes we’ll learn that an upcoming project is going to be delivered in a different way, or that some tougher evaluation is happening around digital.

We capture this evidence as the programme progresses.


[Image: someone skiing uphill. We want to recognise effort in proportion to experience and confidence.]

5. Has there been any impact on the organisation?

A surprising and very satisfying outcome from some cohorts has been teams coming together to change a project or working practice, off the back of something they’ve read collectively in their plan.

Sometimes this is an improvement, such as reviewing web stats and improving the content on a page. Other times it’s recognising a problem that needs addressing, most often when participants realise their work devices won’t let them access something linked from their plan.

We capture this evidence as the programme progresses.


As with the rest of the programme, we’re constantly improving our approach to evaluation. We’ll be testing new ideas with each new cohort of participants.


Image licensed under Creative Commons. Attributed to JM Fumeau.