On a cycle ride the other day I just made it through the traffic lights, but they caught my friend and she had to stop. This happens often. The last time it happened I wondered what the odds were that my friend would be caught at traffic lights so often. The answer is simple. But my friend pondered to herself “why do traffic lights hate me?”. And this behaviour is common. I ask myself most mornings why I don’t put my keys somewhere predictable so that I remember where they are every morning. My dad, just as often, asks why his keys go missing.
The name of this phenomenon is “locus of control” or “locus of causality” (depending on whether you read an English or American text, respectively). I have a locus of control very close to myself: I feel like I author my actions and their consequences (even though, intellectually, I don’t think that’s true). My friend and my dad have a rather distant, external locus of control: they feel things happen to them and, to account for that, they imbue things with sentience. In the words of my traffic-light-caught friend: “I don’t know if there is a god of traffic lights, but if there is it hates me”. An external locus of control makes a person behave as if inanimate objects have some sort of intelligence.
There was an experiment in which people were asked to complete a task in exchange for money. The task was to lie. Subjects were given a very tedious job to complete and were then paid to tell another person that the job they had completed was enjoyable. One group was paid well and the other was paid very little (in the original study, $20 versus $1). This is done to create cognitive dissonance: a disparity between what people believe and what they are saying. After the activity and the lying, the two groups were asked how they felt about the activity. The group that was paid the least reported the happiest feelings about their work, while the group that was paid more was more negative about it. This is considered evidence of cognitive dissonance. The idea is that, to motivate themselves, the group that was paid the least had to “internalise” the lie; they had to convince themselves it was true. Conversely, the group that was paid more could focus on the money as motivation. (Festinger, L. and Carlsmith, J. M. (1959). “Cognitive consequences of forced compliance”. Journal of Abnormal and Social Psychology, 58, 203-211 [if you can’t access that, you can Google “Festinger cognitive dissonance” and there are a lot of articles].)
This is called a cognitive consequence of forced compliance. Consider the impact of apostasy and heresy rules on a mind that does this. If you must lie every day about your beliefs to avoid being ostracised by your community or, worse, burned alive, you will eventually believe that thing. Evidence has nothing to do with it. I know a man who was a Catholic, quickly became a lazy Catholic and then an atheist. Eventually, he realised that he must seem to be a Muslim to stop his childhood love’s (now his wife’s) family from killing him and disowning her. After learning Arabic and attending the mosque, he is now a devout Muslim. He has no extra evidence; it’s just that he told the lie for long enough.
I have a hypothesis that has inspired a new experiment. My experiment would be to get two groups of people to do a boring but useful job, like stacking boxes or shelves, for a couple of hours. One group would be overpaid and the other would be encouraged to volunteer. The measured outcomes are the quality of the work done and how the subjects felt about it. My hypothesis, based on the research above, is that those who are encouraged to volunteer are forced to internalise the locus of control; they will see the value of the job as inherent to the job. I would even expect comments about how different subjects looked for more efficient or neater ways of completing the task. Conversely, I would expect the overpaid group to keep an external locus of control; they would value the external motivation of the money and care less about the job itself. I would expect this to manifest as lower-quality or slower work, coupled with comments that express dissatisfaction with the job.
Until I actually do this research, I cannot draw any conclusions. But if the evidence supports my hypothesis, I think it would make a very interesting point about “value”. I think I have an internal locus of control. When things go wrong for me I ask what the odds are or what I’ve done to precipitate them. I do not ask what I have done to deserve them, and I do not believe that traffic lights are out to get me. Equally, I value experience. I do not need an external locus of control (e.g. God) to give value to certain things. I value my contribution and efforts without needing external validation (too often).
I suspect God is a symptom of a society that does need an external locus of control. And to investigate that hypothesis, I would need to do another experiment. Assuming my first experiment supports my hypothesis (if it doesn’t, I’d abandon this line of research), my second experiment would collect qualitative data from theists (particularly those that believe in a personal God, excluding pantheism) and atheists (those that do not believe in any God, also excluding pantheism) about how they value morals. This would include questions about how they think laws, household rules and social expectations affect their behaviour. If people assume these things will affect their behaviour, that suggests they have an external locus of control. And I assume theists are more likely to respond that way. This hypothesis is informed by commenters on my blog who tell me morality is about God’s nature (regardless of what that nature is) and not the inherent intent, result or expected result of an action.
To do my experiments, all I need is an empty warehouse, a lot of boxes, and enough money to overpay some people (and maybe some eggs to measure how much care people take). Where can I get a grant for this stuff?
Love it!
Get your funding right here: http://www.templeton.org/
Let’s see if they truly are objective.
The next window for applications is 3rd Feb 2014. Remind me; I’ll genuinely try. I am thinking up a long line of experiments.
My basic hypothesis is that value is a mental construct and not a “real” thing. (Think: Daniel Dennett’s ‘what is sexy?’ http://www.ted.com/talks/dan_dennett_cute_sexy_sweet_funny.html)
I’d also propose re-doing the Obedience to Authority experiment… but spicing it up with subjects from different cultural backgrounds.
I seem to remember that one of those experiments demonstrated that social outcasts and people who choose to be alone were less obedient…
I wonder to what extent that comes back to locus of control (valuing another person, an authority figure, over your own integrity).
I can imagine how this research would be interpreted on Fox.
“Atheist tries to prove theists are mentally ill”
Question 1: Why do you feel the need to attack religion?
That would be cool.
In writing your abstract, you may also need to find a way of dealing with the tendency of people to act differently if they know they are under observation (the Hawthorne effect).
I think I’d not observe them. Instead I’d simply evaluate their work after they’ve left.
But I would have to account for possible self-selection bias. People who opt to volunteer may be people who already have an internal locus of control. So I’m not looking at people’s ability to internalise it, but simply at whether they can assign value…
Why not use history as hypothesis and conclusion? It’s cheaper and also reliable. Say, on pantheism, the thought of Baruch Spinoza can be referred to.
The thought of Georg Wilhelm Friedrich Hegel can also be referred to.
That doesn’t tell me how they valued things.
[…] I vowed to investigate the psychological differences between religious people and atheists my research threw a spanner in the works: there is more than one type of religious person. To be […]
[…] investigated whether locally produced honey cures hayfever, pondered the look at comparing the locus of control of the religious and irreligious and then, as the previously mentioned investigation became increasingly complicated, I looked at […]
‘But my friend pondered to herself “why do traffic lights hate me?”. ‘
Or maybe this is what she told you, but she would tell someone else something different.
When people make claims, it’s necessary to consider that some psychological defense mechanism (or something else) may be at work, so that what they say isn’t necessarily how they would think about a matter in a calm moment of clarity or in an epistemically friendly environment.