I am not sure the conclusions are particularly revelatory. The political system in the US appears to rely on using tribal association - Democrat/Republican, Conservative/Liberal - to get people to group themselves (by branding each tribe) then the political class just inserts whatever policies suit their goals and the masses fill in the blanks of justification. From the last decade alone there are countless examples of Bush policies opposed by D voters and supported by R voters yet when the same policies are enacted by the Obama administration people take the opposite positions. We all know it happens and see it every day.
Given that the study was funded by the Swedish research council and by a Swedish professor from Lund, and that the participants were recruited randomly from people in a park (presumably one in Sweden), I would guess they were Swedish. I share your disappointment about the current cliquish political environment in the US, but that may not be a good explanation of the results. It might well be that such behavior is universal, but that will need more substantiation/exploration.
I think the GP was just giving a local example (for them, at least) to support an argument for a universal phenomenon. They probably didn't mean to imply that anyone associated with the study was American or particularly influenced by American thought (or, for that matter, that tribalism itself is a uniquely American quality).
Right. That's exactly what I was saying. I think this is a universal phenomenon, for which we have clear evidence in front of our eyes played out on the most public of stages. It was not meant to link the US and the study, nor was it to indict/laud the US system. Merely a neutral commentary that popped into my mind reading this study. You said it more concisely than me - thanks :-)
It's not just the political system in the US - this has been going on since the patricians and the plebs, and is unlikely to change. I'm not sure that it is some form of elitist conspiracy though - people form tribal associations all the time even without the encouragement of leaders online or offline.
Here's one example: some twenty years ago, "science" was a dirty word among liberals, Newton's Principia was a "rape manual", quarks were "constructed", and "scientism" was but another tool in our evil western civilization's arsenal for oppressing minorities and other cultures.
Nowadays, liberals are all about science. Anyone who'd dare to question the scientific method is corrupt or crazy or both. All it took was climate change and the controversy around teaching evolution in schools.
Was Richard Rorty not a mainstream liberal? Anyway, http://en.wikipedia.org/wiki/Science_wars could be a good starting point if you care. The phenomenon was mainstream enough for say Steven Weinberg to have noticed and devoted some pages to it in his books.
Government surveillance. Government secrecy. Willingness to tolerate war crimes and "move on". Torture. Gitmo. The Bush Doctrine itself.
Not that I subscribe to the "no true Scotsman" style of argument, but ... Obama liberals, as opposed to liberals, don't seem to have noticed the lack of follow-through on what I thought were pretty cogent criticisms of Bush's entire oeuvre.
Obama changed his support for FISA updates and telecom immunity in the summer of 2008, well before the election. So some (I doubt more than a small vocal minority of progressives and civil liberty types) were upset about that, but I don't see them flip-flopping on it.
> move on
Obama did not campaign on bringing Yoo et al. to justice. Progressives are certainly angry about that.
> Torture
I'm unaware of any deviation by Obama from his proclamation to end support for the Bush/Yoo policies. I doubt myself that torture has actually ended, but we have no evidence like we did with Bush.
> Gitmo
Some democrats do indeed perceive Obama to be the king that Bush was, but he has a little problem with Congress there...
> Bush Doctrine
Neo-Wilsonians would argue they aren't Neocons. I'd agree; they're not as smart as the neocons, and that isn't a compliment. But given Democratic support for "containment" of Saddam via weekly bombings, and for Kosovo, I really don't think the mob of Dems have flip-flopped on this.
The question isn't whether Obama did X, Y or Z. It's whether Obama supporters will rationalize his actions and support them even when they are consistent with policies they opposed under Bush. Likewise whether Bush supporters now oppose similar actions by Obama (or previously under Clinton) that they supported under Bush.
Several of Obama's policies (not including the health care bill) have been exceedingly pro-capitalism, and Obama is making huge pushes to bring manufacturing back to the States and create American jobs. Both of which you would think would cater to Republican philosophies, but no, he's just seen as even more "socialist" with them.
And to try to lend a little credit to my statement, I'm independent, and tend to vote conservative/libertarian.
Well, I'd say Republicans mouth those ideals and then vote for NAFTA/WTO/etc. And so did many Democrats. So did Clinton. But I don't see bringing jobs back as anti-Democrat; I'd see it as more of a progressive/workers thing. The only people screaming about NAFTA were unions, socialists and paleocons.
edit: I'd add that as for economic philosophies, Reagan Democrats are still Reagan Democrats.
> credit
I'm sure that in your mind, democrats flip-flop all the time. I just don't see any evidence of that regardless of your personal enlightenments.
So here I think you're nicely highlighting the earlier point. You are more interested in defending Democrats from some imagined charge of flip-flopping all the time. That's a charge I never made. The point I was making is that there is a lot of continuity in US policy from one admin to the others. This leads voters to rationalize their support/opposition to such policies based on who's in power. If you don't see that happening, that's great.
The word continuity does not appear anywhere in your OP. The statement that voters take opposite positions depending on who is in power does.
I'm more interested in you documenting, factually with polls, countless examples of D's doing this flip-flopping. Exactly who was against Afghanistan but suddenly rationalized supporting it when they learned Obama was for it?
>> "I'm sure that in your mind, democrats flip-flop all the time. I just don't see any evidence of that regardless of your personal enlightenments."
I'm not entirely sure how I implied dems flip-flop, or do so more than repubs? I was more making the case that people will stick to their "tribe/pack mentality" even when the other side is doing something congruent with their "beliefs". I also wasn't attacking Obama above, but supporting some of his actions.
But now that you mention it... Obama pledged to bring change and transparency to the presidency, but he's been the harshest president on whistleblowers, and actually operates more behind closed doors than any before him. He also flipped on the Gitmo/prisoner thing. They all flip-flop (politicians, red or blue). This is politics: say whatever it takes to get yourself in office, then say whatever it takes to keep yourself there. The American public is buying it.
Sure. Just without thinking about it I can pop off a couple...
The economic rescue packages enacted during the period spanning the end of the Bush era and the start of the Obama era.
Attitudes to military intervention abroad. When 80s Republican administrations were coddling Saddam Hussein as an ally, liberals were screaming about the injustices in that country and calling for intervention; when conservatives flipped to wanting to oust Saddam Hussein, liberals were largely against it, though little had changed for the better internally in Iraq. The flip on Libya when it was the Obama admin working towards the overthrow of Gaddafi, etc. The Republican history of isolationism flipped to a bold militarism. Bush running against nation building/intervention, then building a Presidency on it.
The free trade/human rights dichotomy re: China - where the parties swap positions from agitation to accommodation depending on who's in power. Similarly with NAFTA.
I suspect that if you don't see policy continuation between administrations in the US, despite the extreme polarization of politics where most R voters will argue against ANYTHING done by a D administration and vice versa (which is exactly the point made), then it may be that you are seeing what you want to see.
As one of a handful of liberals who protested Saddam alongside Iranian expats (many of whom then went back and died in that war), there was no liberal or democratic opposition to that support, and much of it was secret at the time. Nobody flipped. Not one. When they became aware of it later (Iran/Contra) and the first Iraq war ramped up, they were consistently against it.
Republicans haven't been isolationist since the 40s.
There is little, if any extreme change in D voters over the last several decades. There is no vice-versa, it's false equivalence nonsense.
Where are all these 'Democrats' that hated the Afghan war and now love it? You do know that Obama campaigned on doubling down in Afghanistan? So did those Democrats flip-flop while he was a candidate or after?
Or are there 'Republicans' protesting against the Afghan war now?
As for bailouts, can you name a Democrat who was against them before they were for them? Because I don't know a Congressperson with a 'D' who voted against the bailouts.
And yet most of the Democrats in Congress voted for the surge. Schumer did not switch to a withdrawal position until June of 2011. Is that a flip-flop, or something more nuanced?
IMO, he had just decided the 'Decent Interval' strategy was decent enough.
TY for the link; certainly more Congressional dissatisfaction than I indicated. I will note that both Administrations supported the bill, so I'm still not sure who flip-flopped except for the House Republicans (it had failed an earlier vote, and Pelosi publicly said she'd just wait for the Republicans to cough up enough votes, which they did).
The original point wasn't a party political one, but I think your reaction demonstrates it nicely. What the administrations do isn't the question; it's that their tribes convince themselves that their team has been consistent and that they haven't changed their minds... which is what the study was trying to show as well.
That's the best description I've ever seen of the two party system. People will defend their fox news or cnn to the DEATH, no matter how much it goes against their actual self interest. Politics/tribes do crazy things to people...
You're assuming that one's political views should reflect one's self-interest. I think it's morally consistent to hold to principles even when they don't benefit you; it's not necessarily evidence of political trickery at work.
I think too many people think there is a difference between self-interest and morality. They are one and the same. For example, you could say helping spread wealth to the poor is not in your self-interest, but it is morally appealing to you. But that is wrong: if it appeals to your morals for whatever reason (you like helping people, you think it will create a better world, you think everyone deserves more opportunity), then it is clearly also in your self-interest (although not immediately apparent, as it means less wealth for you).
The issue here is that the political parties seem to have a "tribal" divide, where parties on both ends refuse to acknowledge that the other side may be doing things according to their ideology, simply because it is the other side...
This is different from the classic phenomenon explored in Yes, Prime Minister[1].
It's not that they manipulated the questions in advance -- it's that they reversed the meaning of the questions after the respondent had answered them. Even under these conditions, the majority of respondents were then prepared to support the position which they had, ostensibly, only a few minutes ago opposed and vice versa.
But ... this feels like the behavioural economist trick of getting people to behave morally or not when dividing up a pot of money. It turns out that people have comfort zones, areas of expertise where they are good at applying their skills and moral compasses. So, as in the economists' trick, if you take people out of the lab and have them make moral decisions at work, they become more comfortable and less liable to change their minds.
I suspect that if you asked a Palestinian aid worker just back from Gaza their views and then bait-and-switched them, you might find they spot the move! I also suspect the same person could be tricked over the morality of sugar tariff duties in the US and Iowa's farmers.
nothing remarkable. if any one of us were presented with a fabricated "forwarded" quote attributed to us,
> in this format...
and we didn't really care, we would probably defend the misquotation (unless we explicitly remember what we actually said.)
when was the last time you said: "If I said that, I was completely wrong: I feel the opposite on this matter."
that would leave someone looking like someone completely unreliable whose opinion shouldn't matter in the slightest. no shit people don't want to be that person and will protect what they think people believe they have said.
you could show this with a second study, in which people are asked to defend a statement others heard them make, but where the defender KNOWS that what the other HEARD is not what they SAID. it's easier, and probably in many cases will happen, that you defend what you know the other person thought you had said.
THIS IS WHY "DISRUPTIVE LISTENING" (sorry, I don't remember the correct word at the moment; the meaning is "actively pretending to hear something else") WORKS SO WELL.
In other words. If you are told, "There is no way we can ever be interested", and you repeat to the caller, "so if I understand you you are saying that there is no way you can put it in this month's budget"
then the person will quite likely agree with that statement. You just changed their meaning and yet they will not contradict you.
I say "I was wrong" quite a lot in fact, and I think it's an important ability. Of course, it's easier when you've been called out on some trivia or an educated guess, but I'm not often confronted with something I said that is diametrically opposed to what I currently believe. Is anyone?
but we're talking about a few moments ago. it is a conversational thing - it doesn't make much sense to argue against what you've just been misquoted as saying a moment ago.
to justify the title, "how to confuse a moral compass" the methodology would have to remove the justification or public discourse aspect, and show that someone's internal or private belief actually changed due to the conversational obligation to defend what they are misquoted as saying. i.e. you would have to find out what they thought to themselves privately before and after.
maybe a way to do this would be to have an "outside friend" who is more trusted than any of the people involved in this fake context, and see if they would report a different opinion to this friend that they think has no connection or knowledge of the immediate social context, due to the misquotation in the immediate social context.
As it is, the methodology does not justify the conclusion (as summarized for us; I didn't click through to the paper).
1. How many people might notice, but think that the mistake was theirs, and even argue for the opposite of their position in order to save face?
2. How many of these people hold staunch views on the questions asked? E.g. would this have the same effect on a extremist skinhead as it would on the average college student?
> 1. How many people might notice, but think that the mistake was theirs, and even argue for the opposite of their position in order to save face?
Probably most of them. This mechanism has been described by Cialdini as "Commitment and Consistency". Marketers abuse this in a clever way by ramping up commitment. If you answer a bunch of seemingly innocent questions to make you look good ("I often go see opera"), you have made a small commitment. Then it can be leveraged to sell you an expensive opera discount card ("For a person who visits opera so often, it would be irrational to turn the offer down".)
In what way does it show ineffectiveness of surveys? Not challenging the assertion, just trying to understand why you think so.
To me it seemed to be an indication of what was stated in the article, that people are less committed to their views than they imagine.
I would go one step further and conjecture that we have a tendency to defend a position, when confronted with "evidence" that we have endorsed that view before. I expect this to work on moral positions on which our convictions are not solid.
@pyre My confusion was not about whether surveys can be manipulated. I can well imagine that they can be. My question was more along the lines of: where in the linked post is that demonstrated? Sometimes I don't read between the lines enough and thought I must have missed something. And yes, who in their right mind would want to leave any children behind! It's mostly semantic gymnastics.
> In what way does it show ineffectiveness of surveys?
Surveys are often manipulated. IIRC, there was a 'survey' a while back with regards to whether or not people wanted Congress to force a la carte pricing on cable providers. The question was something like "Would you like your cable provider to provide you with a better, wider selection of channels?" The majority answered yes. This was touted as "Survey says that taxpayers don't want a la carte cable" by vested interests.
Take this with a grain of salt, because my memory on this is a little fuzzy, but even as a parable, I think it conveys the point I'm trying to make.
I'd agree with your conjecture, and could totally imagine myself reversing opinions on past decisions depending on the degree of interest I have in the matter.
The 53% number in the article was about reversing position on at least one question. That should mean most of the subjects retracted their alleged answers on the other questions; probably the one or two where the faked answer was embraced were questions with no strong positioning.
The article doesn't give details on the strength of the reversed opinions, although the survey sheet seems to have a 7 or 8 point scale for each question. I'd guess this detail was brushed out for dramatic effect?
>In what way does it show ineffectiveness of surveys ? not challennging the assertion, trying to understand why you think so.
>To me it seemed to be an indication of what was stated in the article, that people are less committed to their views than they imagine.
The problem is that a survey result is a pretty meaningless number if it can be changed by anyone you put on television with a fancy suit and a pie chart. What you really want to know from a survey is a) what the people who feel strongly think about it, i.e. the people who can't be swayed by an ad buy or a newscast, and b) what portion of the population are those people and what portion are the ones who can be easily swayed. But if people are overestimating how strongly they're committed to their answers then they're contaminating both results: The second directly and the first indirectly by inserting the uninformed answers of the disinterested into the sample that ostensibly measures those with strong feelings.
I agree with everything you're saying, except for the fact that you've got the assumption in there that the purpose of a survey is to genuinely understand people's opinions on a subject rather than to claim support for your own position or even to influence the views of the people you are surveying.
This may be a semantics thing, but surveys are extremely effective - just not as a tool for understanding opinions.
Slightly off topic, but when I scrolled to the bottom of the article, I was really excited to see they had proper citations for the research the article was talking about.
I actually thought this was a BBC article, hadn't been paying much attention to the design and URL. Then I realized that this was on Nature, so I suppose it is to be expected here.
I really wish more science journalism did this, even on mainstream outlets.
Did attitudes reverse, or did students that had no real opinion on some topics make up answers to them, and when they re-read the questions and they were different, accepted that because they didn't feel strongly either way?
Or, did students have some opinion on the topic, but when faced with a reworded question, feel embarrassed that they must have misread it initially, which means they had not done what they were asked to do, namely answer the survey properly? There can be a lot of pressure to participate correctly in psychological experiments. I know that in my undergrad you had a requirement to do some amount of experiments to graduate (and non-psych students would be paid, which also puts pressure on you to do your role correctly). To say "sorry, I wasted your time - I misread these questions" would not be easy; much simpler to just make up new answers and leave.
I find it hard to accept their interpretation that attitudes actually reversed. Yes, some experiments - like Milgram's - show that people can do surprising things, that we would not expect in advance, and would even say "I would never do that", but likely we would. But this isn't the same - in Milgram's study, behavior was all that mattered. In this one, it is the interpretation of changing attitudes. I agree it is likely people will voice different attitudes, but I doubt it is because they actually changed them, instead it is either social pressure to not admit you failed at your task to read the questions and answer them honestly, and/or that the student doesn't care either way about the topic.
Of course, it could also be the case that survey participants just do not care that much about the responses they give, i.e. they are not very interested in arguing about their answers but instead want to get out of the survey situation.
These must have been questions on which the participants did not feel particularly strongly one way or the other. I'm pretty sure I would notice a 100% reversal in my stance on something I have a prior opinion on.
Having a prior opinion is about having some previous understanding of the subject being discussed. If more data were to be provided to you then you might reason differently. Not because you are changing your mind or flip-flopping (like politicians say), but because you are a thinker and coherent individual. You will analyse the new data with fresh eyes and challenge the old data with it. Its all part of the scientific approach. Don't you agree?
Being given new information on a subject I know about which contradicts my understanding might make me change my mind. Just showing me that I apparently answered the question differently wouldn't, for the obvious reason that I know about the subject.
If I don't have any previous understanding of the subject then I wouldn't be able to give an immediate opinion on it. If pressed, I might choose to answer a question on the spur of the moment, but my choice would be mainly random. If it was mainly random then the magic trick described here would probably work, but it wouldn't really mean anything as it was a mainly random choice in the first place.
“I don't feel we have exposed people or fooled them,” says Hall. “Rather this shows something otherwise very difficult to show, [which is] how open and flexible people can actually be.”
--There is likely a "social" stigma against reversal at play here.[1] Checking in isolation, without time constraints, would confirm the result.
_______________________
[1] "Participants were then asked to [read aloud] three of the statements, including the two that had been altered, and discuss their responses."
Our minds are much less logical, consistent, or rational than we like to imagine. The concept of cognitive shortcuts is not new: analytical thinking requires effort and energy, so our minds tend to filter out the majority of input and process it subconsciously instead of expending energy on consciously analyzing and processing every little decision. Unfortunately, the filtering mechanisms are automatic and kick in even for things that should be considered thoughtfully.
Some further reading on cognitive biases and shortcuts:
- Motivated Tactician model tries to explain why people use stereotyping, biases, and categorization in some situations and more analytical thinking in others [http://en.wikipedia.org/wiki/Motivated_tactician]
- Availability heuristic is the "if you can think of it, it must be important" heuristic, which leads people to fear flying more than driving and terrorism more than flying, even though their chances of dying from a car accident are far higher than ever being involved in a plane crash or a terrorist attack [http://en.wikipedia.org/wiki/Availability_heuristic]
In addition to "anecdotes are not data", everybody should learn that "self reported data is useless". Don't rely on it when making any kind of important decision. Surveys are just not a reliable way of learning anything about anything.
These findings suggest that if I'm fooled into thinking that I endorse a view, I'll do the work myself to come up with my own reasons [for endorsing it].
In the last paragraph, is the critic asking me to believe that the ethics of meat consumption are somehow more important to me than policy in Palestine? Really?
Why does that seem surprising? The vast majority of people will perceive that policy in Palestine neither directly affects them nor can be directly affected by them, while the ethics of meat consumption affects everyone who eats meat and everyone who chooses not to eat meat for ethical reasons.
I didn't really get the glue part, but the results of this study point to some really interesting conclusions that could be huge for copy in the future.