A few posts like this have been written already. There was this post by Greg Yudin, and then this one from Alexei Miniailo's Do Russians Want War project.
A major misunderstanding appears to be that people used to dealing with public opinion surveys fall into thinking in terms of 'majorities', when a careful examination reveals big holes in that framing.
Greg Yudin has consistently attacked polling, on philosophical grounds (that polling narrowly frames questions that are very complex, and that in reality 'opinions' are never fixed or coherent) and on technical ones – that Russian polling in particular is deprofessionalized and politically sabotaged. Yudin points out that the very narrow pool of people willing to answer polls in Russia could indicate a core of conservative-irredentist-chauvinist support for the regime and its aims of around 15%. (These are my terms, by the way, not his.) His second point concerns the meaning of polling in authoritarian states as 'feedback' to the regime. People can't easily voice fears and complaints in other ways, so they do it in polls. This means we should treat polls not as 'preference' sorting, but as a very partial and narrow form of communication. I would add that this also exacerbates a tendency for only those who think they will be heard to participate, because the regime has long been effective at closing its ears to the genuine majority. Yudin makes a broader point too: that meaningful political subjectivity is not really possible, and a wave of privatisation of expression occurs whereby people retreat into private and personal matters. Maksim Kats supplements Yudin's point about non-response to polling, having apparently got only a tiny response rate from 31,000 calls just the other day. His story is one of refusal to respond to polling.
Alexei Miniailo ran some polling experiments two weeks ago. His main point is one that Samuel Greene has made previously: people in a state like Russia tend to give what they think is a consensus, agreeable answer and, according to Miniailo, clump out of fear around what they perceive as the politically mainstream opinion – that the 'special operation' is limited, is intended to denazify Ukraine or defend the Donbas, and so on. This means the 40% who follow prompts can be called a 'group', but what they actually believe is not clear. Similarly, maybe 7% lie about their opinion. What cuts through? Conversations between relatives and close friends.
My own contacts, in both private and 'state'-sponsored market research, are more critical than Yudin of political and social polling. Below I collate the thoughts of four people with specialist marketing or survey expertise.
- Boris Sanich (former big-polling company exec)
Yes, there's a Hobbesian morass in society and it's detectable. But that's true of any society. And of course TV 'pumping' has an effect – but it's more emotional than rational, or even measurable – one week something is in the news and 'relevant', the next it isn't. Does that make it opinion? No. It's like a balloon you have to keep inflating. But Yudin, I think, underplays how much metastasised resentment there is internally as a result of the terrible economy, and this bleeds out in all polls – against 'Gayropa', against Ukrainians now, but tomorrow it could be against Martians. So Putin in a sense rides this dangerous, cliff-like wave. Hence the person who says 'I'm with our boys and our cause is just!' in the next sentence says 'but this is like Pinochet, and that concert with Putin was just downright scary!'.
Since 2000 there's been a veneer of professionalization in polling – the use of software, training in Western techniques, statistical 'education'. But this masks a stronger countercurrent: faked results, 'corrected' results, and extremely poor field practices in reality. It's like those badly maintained trucks in Ukraine. They look nice from a distance on parade, but really the oil is old and the tyres are about to fall off. People don't actually hit the numbers, because they pocket the money and can use their expertise to simulate the 'correct' results that in any case they know the client wants. And right at the top there's corruption in polling. Field research is expensive; calls are expensive. Yes, of course there are reputable firms that will take your money, but there is always a weak link with people – the human factor. Also, any political or even social polling is so politicized that even when you do a genuinely good job, you end up binning it or changing it. So if you're still in the market doing this research you are compromised down to your toes. And you end up lying to yourself. Remember also that Levada, VTsiom and FOM don't do any of this work themselves; it's almost all contracted out. Of course a person with expertise can tell when a result has been fixed or fiddled with, or corrected for a weird standard deviation. We call it, ambiguously, 'cleaning the database'. But like me, a lot of principled people left the business because anything political ends up as 'mut' i pizdezh' [murk and BS].
- Gena Maximovich (owner of private market research firm since 90s)
Private-sector market research and specialized social and political polling have parted ways. I agree with your previous commentator 100%. The biggest problems are poor fieldwork, poorly trained polling staff, and cleaning the data to suit the client. For my own part, this is why my firm is so strong: we don't do any of that and focus – until now – on private companies, particularly in the transnational sector. The elephant in the room that these political pollsters can't openly discuss is that their samples are shit – old and poor people answer, and, especially if there are inducements, they spout all kinds of shit. 'Bad field', we call it. A lumpen audience is all that remains. That's why my firm only does scoping polling, and any results we publish are based almost entirely on focus groups where I, or my subordinates, have personally conducted the work and then signed off on it. Even there, I don't trust the main pollsters' focus groups because of the 'bad field' effect. Low professionalism is a problem with these mainstream pollsters, who depend on the state. My customers need predictive power, and if my focus groups are not corroborated by sales results, I lose customers. The political pollsters, by contrast, are focused on supplying a self-fulfilling prophecy. Sure, there is an impression of professionalism – auditing of results, recordings of call-centre polls – but the results are still massively manipulated.
- Galina Vasilievna (survey conductor)
I did phone surveying, before call centres became the main method, and I also did a lot of street and door-to-door work. I can summarise by saying that the supervisors knew we made up results. They taught us how to do it so it would not be obvious, although sometimes we got caught and our data was thrown out or 'cleaned'. We would never get enough respondents, so some of us would just go home early and fill out the rest by hand. We would adopt the mentality of the kind of people we'd got answers from and invent variants based on them. [Note to readers: this person was also a fieldworker for one of the most cherished databases at the heart of Western research on Russia.]
- Jeremy Morris (former focus group manager in ex-Soviet states)
My own experience is limited but, I think, revealing. I consulted for a Western media company that wanted market research on internet and TV consumption. They contracted the work to a Belarusian company; I was the 'quality control' and intermediary. The contractor continuously cut corners, basically bullied the focus groups into a particular line, and oversimplified the results to an egregious degree. When I tried to intervene, the client was not really interested. There was a marked difference in the quality of the facilitator's work between when I was physically present in the groups and when I was absent (reviewing a recording). Just getting the contractor to hand over the tapes was hard. Of course, if I hadn't cared, I would not have reviewed the tapes at all. Indeed, I was paid to review only 5% of the actual recordings. As I indicated with Galina's story, this affects Western research too. The elephant in the room is how many scholars rely on 'off-the-shelf', contracted-out data collection where more than one intermediary has a financial incentive to make the results 'look good'.
Does public opinion exist? Does the majority support the war? Opinion is heavily managed via polling to produce clear results, legible to the political technologists in the Kremlin. Even now, Ukraine is not highly relevant to most Russians' everyday lives, and so any polling about the war is suspect. In the most recent results we see the clear effect of fear. Are people ignorant? No, not at all. Some avoid the news, but that just delays the inevitable. The war is not 'salient' yet, but soon it will be. Lumping people into a 'war camp' or 'opposition' is very problematic, as these are not static, demarcated categories, even now. As Aleksei Miniailo proposes, perhaps there are 40% who are agreeable – but is that 'opinion'? Who are they agreeing with? Perhaps they align with the roughly 20% (?) within that cohort who are genuinely enthusiastic for elite messaging around great-power status and a general conservatism. But regardless, we cannot talk of a pro-war majority.
If you want more detail on the polling debates and the background to Yudin's interventions, check out this post from 2019 on a similar topic.
Would that be the RLMS then?
I couldn’t possibly comment
lol that’s the basis for my whole phd….
Ironically, my project started out w the intention of using a mixed methods design, but a combination of factors (mostly covid-related) has whittled away at that until it's almost entirely quantitative. I am still trying to triangulate w other sources so it has some connection to reality and isn't just statistical castles in the air
It's hard to comment without knowing more. I think a key challenge – one that is not impossible to overcome – is maintaining the tools and kit to do qualitative work even in the worst-case scenarios. Feel free to mail me.