Saturday, October 08, 2016

Move fast, fix things.

Fifty years ago, had we told people that citizens of the future would carry a supercomputer in their pocket, they might have imagined a race of ‘ubermensch’ – enlightened beings: polyglots, erudite and versed in everything from differential calculus to economic theory.

That this is not the case – that the vast majority of use to which our devices are put can be summed up as ‘3G’ (Gawking, Girls, Games) – will not come as news to you. The proportion of time we spend engaged in self-directed learning is vanishingly small.

Again, this should not come as a surprise – whilst I am not a fan of conventional learning theory, I do agree with the Piagetian notion of ‘equilibrium’: in essence, that we learn the minimum needed to adapt to our environment. It is true that humans have tremendous learning capacity, but also that learning is cognitively effortful, and so we tend to learn only to the point that learning is required. Learning, rather like reason, is "the slave of the passions", as David Hume famously remarked.

This explains why, when companies such as Google provide inspiring examples of learning using their technology, the example is so often someone for whom learning is a matter of survival.

Today, we should be mindful that, for this reason, the more we engineer our environment to be usable (self-driving cars, intelligent artificial assistants), the less we will learn – tending towards a ‘zero learning-curve world’. Learning is not a universal constant.

Against this backdrop, it is remarkable to me that we have persisted for so long with a misleading view of learning – one which has had a big impact on education (both corporate and public) and on investments in education. As I write, people are continuing to launch ‘online universities’ and ‘MOOCs’ – managing to attract investment, but failing to deliver success. The pattern is always the same: the vast majority of users will come and take a look, then move on. As I have explained elsewhere, people went to university to get a certificate, which in turn entitled them to a good job. Since most people do not end up working in their chosen field, it is largely a ‘cash for certificates’ scheme, with some side benefits. The vast majority of learning is not learning for learning’s sake – instead people get what they need at the point of need (and Google already does a pretty good job of that). Frankly, if we are going to give people certificates, it should be for the work they have done.

In the workplace the situation would be laughable were it not so costly: people often say they ‘don’t have time’ to do their learning. They are too busy doing their jobs. How did we arrive at a state where corporate learning is so misaligned that most employees see it as a hindrance and a distraction, rather than an enabler?

Once again, the answer is that we misunderstood what learning is for: when a person joins an organization they – naturally – learn a great deal in the course of adapting to their new environment. But the failure of educators to systematically map those concerns and address them directly – the tendency to build ‘courses, not resources’ – has resulted in stagnant pools of content that are more hindrance than help. It’s not hard to fix this – let’s get on and do it!

1 comment:

  1. Hi Nick, I totally agree with the challenge/problem that you address. However, I would like to challenge you on two things:
    1. Why do you think that providing resources necessarily supports learning? I think the idea that employees are strong self-directed learners only goes for a small % of people - basically the high performers. Many people don't know how to learn effectively (they never learned how to learn, basically). So, just providing resources is not necessarily going to help them perform better either?
    2. Where is the evidence that Google is doing "a pretty good job"? I have worked for Google myself and don't necessarily agree with your statement.
