There is often a tendency to show unbridled enthusiasm when the figures seem to show that an experiment is conclusive. "I was right," we say to ourselves, "the experiment worked, so let's implement it!"
We tend to listen only to people who have had the same kind of result, while the arguments of those who observed problems more or less fall through the cracks.
The problem is that an experiment validated at first can sometimes prove disappointing later on. How is that possible? It can all be explained!
It is above all a matter of cognitive bias! So, before making a hasty decision (especially a purchase), it may be worth learning a little more about these biases.
You are a tech-savvy trainer and want to introduce a large number of digital tools within your institution. "Breakthrough innovation," say you! You have your list in front of you and you feel strongly supported by your colleagues in this vast enterprise.
You want to take action, but you are frustrated not to understand why this undertaking meets obstacles. Why block technology and progress? "There are two victims in this case, my dear Watson," I declare.
The first is the instructor who wants peace and quiet and to keep their habits: after all, why change?
This is called cognitive dissonance! It means that your colleagues are probably irritated by your pro-tech enthusiasm. No matter how convincingly you demonstrate the effectiveness of EdTech tools through figures and case studies, they find it difficult to accept a reality that does not correspond to the one they have built for themselves.
Rather than forcing the initiative, be pedagogical, temper your message, offer clear and targeted experiments, and go step by step, so that you can maintain a constructive debate on the real effectiveness of certain educational tools for learners and instructors.
The second victim is yourself! You may not face direct refusals, but on a daily basis administrative obstacles prevent you from freeing up a budget or buying a tool.
You are the happy victim of conformity bias, that is, a bias that pushes you to accept almost exclusively the ideas emanating from a group with which you identify.
Being pro-innovation and pro-tech, you probably take into account only the opinions of colleagues who share your views, and not enough those of others!
To avoid this, favour meetings with all interested parties and LISTEN! Show empathy and try to understand why one person is for and another against. This will help you moderate your own thinking and decisions.
Have your learners' assessment results increased by 25%? Are most of them happy with the new educational tool? You jump for joy: the experiment seems conclusive and your enthusiasm is vindicated. Maybe you are right! However, you may also be a victim of the Hawthorne effect!
Hawthorne effect: when a new tool is tested, learners tend to receive extra attention from the trainer regarding the experiment. This attention causes a redoubling of effort among these learners, which produces a marked increase in their learning performance. The Hawthorne effect is transient, which is why it is important to test a tool many times before moving to the analysis and decision-making stage.
It may be worthwhile to test a tool over a quarter and to collect and analyze learners' feedback. How often during that quarter? It all depends on the tool.
Once a month, once per use: it all depends on how easy it is to give this feedback, and on the learners' motivation. From there, just look at the trend. If learner satisfaction is consistently positive, the tool can be considered comfortable for the learners in question. The same principle applies to other learning data, such as grades or comments from trainers and teachers. If scores increase significantly and durably, over a quarter or more, then the tool can be considered effective.
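The trend check described above can be sketched in a few lines: collect periodic feedback scores, then look at both the average level and the direction of the trend before judging the tool. This is a minimal illustration with hypothetical numbers, not a prescribed evaluation method.

```python
# Minimal sketch of the "look at the trend" step: the scores, the
# satisfaction scale, and the thresholds below are all hypothetical.

def trend_slope(scores):
    """Least-squares slope of scores over equally spaced check-ins."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# Hypothetical satisfaction scores (scale of 1 to 5) from three
# monthly check-ins over one quarter:
monthly_satisfaction = [3.8, 4.0, 4.1]

average = sum(monthly_satisfaction) / len(monthly_satisfaction)
slope = trend_slope(monthly_satisfaction)

# Only treat the tool as promising if satisfaction is both high on
# average and not declining over the quarter.
promising = average >= 3.5 and slope >= 0
print(promising)  # → True for these sample scores
```

The same pattern applies to grades or trainer comments scored on a scale: a one-off spike right after introducing the tool may just be the Hawthorne effect, while a positive slope sustained over a quarter or more is a stronger signal.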
Taking the Hawthorne effect into account does not mean blocking all enthusiasm! Indeed, nothing beats enthusiasm for engaging in a different learning experience. Rather, it means always keeping in mind the steps needed to validate an educational tool.
We may notice warning signs and problems, yet we often tend to stay the course. Example: you are in the subway. A breakdown has lasted five minutes. Rather than taking an alternative route that is usually a bit longer, you prefer to stay in the carriage, for fear of seeing the train start again the moment you leave it.
Minutes pass and this reasoning weakens, but you persist until the last moment. Result: you have lost four times more time than if you had reacted quickly. Such a pity.
We call this loss aversion! It is a reflex which, once we have invested time, energy or money in a project, prevents us from changing course. And just as in the subway, we can observe the same type of behaviour with educational tools.
If you see that an educational tool exceeds a certain failure threshold, it will not have a positive impact on the learning experience: go to plan B, or analyze the situation and move on. If, however, you believe you keep seeing ever more auspicious signs in your experiment, your enthusiasm may be playing tricks on you: that is the Pygmalion effect.
How, then, can you avoid making bad decisions? Once you know what can happen to you, there are three things to do before testing an educational tool:
Yes, applying these steps to decision making can sometimes slow a process down by a few weeks. However, we are talking about learning.
Succeeding in your digital transition under the best conditions is a crucial matter, involving people's futures as well as time and money!
So don't ignore these biases. Knowing that they exist, and how to manage them, can lead to better collaboration between the actors involved (teachers/trainers, learners, suppliers of educational tools), as well as to better change management.
To go further, I recommend reading: