This video cast is called Tips for Tools, because I'm always talking to learning leaders
about exciting tools. They look at me with a wistful look in their eye,
and they say, "Yes, yes. Very interesting, but in my organization,
the CIO, the firewall, the restrictions on software, impossible. Can't do it.
We just have to make do with what is supplied." Usually that means big enterprise programs, like SAP
or Oracle. They look depressed. How do you manage to persuade your
organization that these tools are not only useful, not only secure,
but that they can make a big difference? Hence, Tips for Tools. The first thing
I implore people to do is this: never find an interesting tool and then go straight to
the CIO and say, "I think this tool could be fantastic for our organization."
Maybe it could be. Maybe not. Your CIO will just, with a glazed look on their face, say,
"No chance. No way. No." Be smarter. The way you're smarter is the following.
Never have 50 tools in your head on the go. Find one, two, or three that
can make a difference, and they're the ones you're going to focus on. Focus on them
for a good period of time. It's very easy to flit from this Monday's tool to
this Wednesday's tool to this Friday's tool, and no one is impressed when everything
is wonderful, everything is exciting, everything is cool. Focus on what you
genuinely feel will make a difference in your organization. For example,
if you want to build a more communicative, more engaged learning environment,
and you're looking at Slack, look as well at
perhaps LinkedIn or Yammer or any number of other tools. Keep to three,
maximum of four. Then you start to be smart. Don't test them amongst your own staff.
They'll all love everything. Find people in the organization who are with you,
who are willing to go a little bit of an extra mile for you, and get
a little coalition of maybe only eight or 10 people. Set up a mock use of
those tools, not on the company intranet, because that just won't happen,
but on their own home-based or cellular connections.
Mock up how this software could be used in your organization. Get them
to play with the software. Get them to try to post things, try to communicate.
Whatever it might be. Ask them to work, perhaps a week on one, a week
on a second, a week on a third. At the end, listen to their advice.
What did they like? What was easiest to use? What worked best? What did they think would make
the biggest difference? Not what you think. What they think. With a bit of luck,
that conversation will allow you to narrow down the choice to two tools or
perhaps even better, to one tool. If you've got your one preferred tool,
you need a few more people, and you need a more extensive use. Spend a bit of time
working out how you might integrate it into your learning system if you were
able to do that. Build something that looks as close to reality as it's
possible to do, while still not breaking any rules. Get perhaps 50 people,
volunteers, to test that out. If you're going to run a new learning program,
get 50 of the participants if you've got hundreds, or 10 or 15 if it's a smaller cohort.
Get them to run the software alongside the learning program,
and then get them to tell you whether it made any difference,
whether the outcomes and the outputs from the program were better,
more successful, and reached more deeply into the organization than the outputs from
the majority of people who didn't use the software. If the answer is,
"Well, no real difference. No real impact," move on. There's no point in arguing
for something that makes no real impact. If your results are astonishing,
if there's clear use, and if those people scream when you say we've got to take
this software away, you've got the basis for a case. When you present your case,
you don't present it as "I, as CLO, head of learning and development
in this organization, think it's a good idea." You get the group
to present the case and say, "We, as users, as learners, have found this unbelievably helpful."
Then you can chip in and say, "These are the quantified benefits."
You build a very, very powerful case using a cohort of people who are quite
vociferous in your support. Then you begin to talk turkey.
How could we make this work in the organization? Could we punch a hole in
the firewall as long as it's secure? Could we run it separately in the Cloud
but lock it into the main learning system, or do we need to keep
them entirely separate? Each of those three alternatives can possibly work.
Build a coalition and build a consensus. Then, if you get agreement and you implement,
that's not the end of it. No, no. This is the beginning of it.
At this point, you start to gather real evidence that it works, real evidence
that it makes a difference, and then, when you've done that, you can present that
to the people who approved it: the CIO, the CEO, whoever it might be in the organization.
Once you've proved that it makes a difference and that you were
right in the piloting, you then have a platform
on which to build and on which to develop other tools later on down the track.
My model is investigate, experiment, check,
do it again. Investigate, experiment, check, do it again
before you start to shout about what is going to work and what isn't going to work
in your organization. To give you a case study, I work with a big company,
a big pharmaceutical company, who had a very conventional LMS that was on the servers
in the company, very secure, with a firewall around it. They wanted to use a new
Cloud-based LMS, and the answer was no. They negotiated that they'd
leave the existing LMS in place, put two or three very small, unimportant pieces
of learning on the new Cloud-based LMS, and enroll a limited number of staff.
In this case, it was several hundred people. They'd review the output.
They'd compare ease of use, compare benefits, compare data,
compare ease of managing the system, of adding user-generated content, for example.
They gathered together over a period of six months an almost unstoppable case
for the use of the Cloud-based LMS. Then they went to the CMO, the CEO,
and the CIO and said, "This is a step change in the way that we're gathering data.
It's a step change in performance. We've got to work out ways
of moving forward." That's exactly what they did. When they sat around the table,
it wasn't about getting permission. Sitting around the table,
they worked out the means and the ways in which they could actually
deliver this to get the benefits. It's so much better to take it steady,
to experiment regularly. Not every experiment will work.
Not every tool you choose is going to make that much of a difference.
But when you do ask, you've got overwhelming evidence for
moving forward. That is a really successful model that I promise
has worked in countless organizations. Those are my tips for tools.