
The power of situation is often underestimated
29 January 2020

Disclaimer requested by my university: students are reminded that copying this essay is plagiarism. A suggested citation appears at the end of the essay. Thank you.
James Wallace
Most of us can remember an occasion when someone, or some group, persuaded us to do something we did not want to do. On the face of it, this is understandable. People are social animals, and it is in our nature to co-operate within our social groups – and within society at large – for mutual benefit and order.
Psychologists call this social compliance, and its force is more powerful than many realise. Imagine your own behaviour in the following situation. You are in a hotel in the middle of the night and you wake to the sound of the fire alarm. You hear panic among the hotel guests. A man dressed in a firefighter’s uniform directs you where to go and what to do; would you obey his instructions? Most people – without question – would out of respect for the legitimacy of the person’s apparent authority.
But sometimes this instinctive social compliance can make us do bad things. A controversial social experiment carried out more than half a century ago by Professor Stanley Milgram demonstrated exactly that. Milgram showed that social compliance can be powerfully shaped by authority figures and group pressure, which can drive us to behave in ways antithetical to our values and personality.
What was Milgram’s obedience experiment?
Milgram grew up in the US under the long shadow of the Second World War’s horrors, which later influenced him to devise an experiment to attempt to explain the psychology behind why hundreds of thousands of Germans were co-opted into the mass murder of six million European Jews during the Holocaust (Brannigan, 2013).
Milgram (1963) asked 40 male volunteers to administer 30 successively stronger electric shocks – up to 450 volts – to an unseen male participant every time he failed in a pseudo-scientific experiment purportedly studying the relationship between punishment and learning. Unbeknownst to the subjects, both the electric shocks and the purported experiment were pretexts. The ‘learner’ was part of the researcher’s team and was faking being hurt. The real experiment gauged the level of obedience of the participants administering the shocks, who had the ultimate choice over what intensity to administer.
The real study subject shared the experiment room with two others who posed as fellow participants but were actually part of the research team. These “confederates” provided the social pressure by unanimously suggesting a higher shock level each time the ‘learner’ failed the word-pairing test. Also in the room, observing, was a 31-year-old man posing as the researcher – the authority figure – who firmly instructed participants to continue the experiment even as the learner’s audible (faked) anguish and protests rose as it progressed.
The results might surprise you. All 40 participants continued beyond the 20th electric shock, and 65% (26 of the 40 subjects) continued all the way to the 30th and final shock – 10 shocks after the unseen learner had feigned unconsciousness. But was this obedience to authority, conformity to group pressure, or a combination of the two?
The authority figure’s instruction played a significant role: the strength of his influence is related to his proximity (whether he is in the same room, for example) and his perceived legitimacy (is this person credible?). Less than a decade later, in 1971, Professor Philip Zimbardo demonstrated the power of institutional authority over obedience in his controversial simulated-prison study, known as the Stanford Prison Experiment (Haney, Banks & Zimbardo, 1973). Three further factors were at play in Milgram’s study: the gradual escalation of demands, the lack of information in a novel situation, and the diffusion of responsibility among the participants and the fake participants who unanimously agreed to increased shock levels. So, if these influences were reduced or removed, would the experiment still yield the same results today?
Would people obey today?
Jerry Burger carried out a partial replication in 2006 asking that very question. He reported that “obedience rates in the 2006 replication were only slightly lower than those Milgram found 45 years earlier” (Burger, 2009, p. 1). More recently, psychological illusionist and magician Derren Brown conducted an experiment as part of a TV show called ‘The Push’, which asked whether the power of social compliance could manipulate a person into committing murder. For those who have not watched the show, there will be no spoilers here. Watch it and imagine yourself in the shoes of the show’s protagonist. How do you think you would act in the same situation?
This essay was submitted as part of my MSc in Psychology at Leeds Beckett University in May 2019.
Suggested Citation
Wallace, J. (2019). The power of situation is often underestimated. Retrieved from www.james-wallace.uk
References
Brannigan, A. (2013). Stanley Milgram’s Obedience Experiments: A Report Card 50 Years Later. Society, 50(6), 623–628. doi:10.1007/s12115-013-9724-3
Burger, J. M. (2009). Replicating Milgram: Would People Still Obey Today? American Psychologist, 64(1), 1–11. doi:10.1037/a0010932
Haney, C., Banks, C., & Zimbardo, P. (1973). Interpersonal dynamics in a simulated prison. International Journal of Criminology and Penology, 1(1), 69–97.
Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67(4), 371–378.
Richards, J. (Director). (2016, January 12, original air date). Derren Brown: The Push (re-release title). Retrieved from http://www.netflix.com
Upton, C. L. (2009). Virtue Ethics and Moral Psychology: The Situationism Debate. Journal of Ethics, 13(2–3), 103–115. https://doi.org/10.1007/s10892-009-9054-2
Zimbardo, P. G. (2007). The Lucifer Effect: Understanding How Good People Turn Evil. Random House Trade Paperbacks.