Research results can be biased when people are afraid to give honest answers. Scientists can also hope for “politically correct” results. Both types of bias were likely responsible for the failure to predict Trump’s victory.

KNOWLEDGE @ BI: Research biases

When all the ‘experts’ were predicting a Hillary Clinton presidency in 2016, I publicly predicted a Donald Trump victory because of my research on green marketing and research biases.

The surprise of that election result demonstrates how difficult it is to make accurate predictions when respondents are afraid to give honest answers, or when scientists warp the research process to support their own point of view.

Both types of bias were likely responsible for the widespread failure to predict Trump’s victory, but they also offer important lessons for market researchers.

Predicting the Trump victory

Just before the US presidential election in November 2016, only 6 of my 150 MSc students raised their hands to predict a Donald Trump victory in response to my question: ‘who do you think will win?’.

After my classroom ‘vote’ I shared with students my explanation for predicting a Trump presidency, which has its roots in the overwhelmingly negative media coverage of the Trump campaign and his ‘deplorable’ followers.

Due to this unbalanced election coverage, I believed the polls would suffer from social desirability bias, which occurs when respondents worry that a researcher will have a negative opinion of them if they provide honest answers.

As it turned out, one of the few polls to correctly predict Trump’s win used a projective technique, asking respondents who they expected their neighbors would vote for. This indirect question revealed considerably higher Trump support and allowed the pollster to correct for the social desirability bias that afflicted most conventional polls.
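To see why the projective question can help, consider a minimal sketch with made-up numbers (the shares and misreporting rate below are hypothetical, not real polling data). If some Trump voters misreport their own vote but answer the neighbor question honestly, the direct poll understates his support while the projective poll tracks the true split:

```python
# Hypothetical numbers for illustration only -- not real polling data.
true_share = {"Trump": 0.50, "Clinton": 0.50}

# Assume 20% of Trump voters misreport their own vote out of
# social desirability, but everyone answers the "who will your
# neighbors vote for?" question honestly (the projective assumption).
hide_rate = 0.20

def direct_poll(true_share, hide_rate):
    """Stated own-vote shares when some Trump voters misreport."""
    trump = true_share["Trump"] * (1 - hide_rate)
    clinton = true_share["Clinton"] + true_share["Trump"] * hide_rate
    return {"Trump": trump, "Clinton": clinton}

def neighbor_poll(true_share):
    """Projective question: answers track the true shares."""
    return dict(true_share)

print(direct_poll(true_share, hide_rate))  # Trump understated at 40%
print(neighbor_poll(true_share))           # recovers the true 50/50 split
```

The gap between the two poll readings is exactly the misreported share, which is why an indirect question can serve as a correction for the direct one.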

Social desirability bias

Social desirability bias is also a problem for many research topics involving ‘politically correct’ issues.

For example, much of the research on green marketing topics finds a huge gap between the vast majority of consumers who say they plan to buy green products and the very small percentage that actually purchase them.

After all, honestly admitting in a survey that you do not buy green products might indicate that you are a terrible person who doesn’t care about the environment.

This value-action gap can therefore lead to expensive mistakes whenever people are afraid to answer honestly about their preferences and habits: market demand is over-estimated for virtue products such as green products, healthy foods and exercise, and under-estimated for vice products such as alcohol and gambling.
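The cost of that gap is easy to quantify with a back-of-the-envelope sketch (all figures below are hypothetical, chosen only to mirror the "vast majority say, few buy" pattern described above):

```python
# Hypothetical figures for illustration only.
market_size = 1_000_000   # addressable consumers
stated_intent = 0.60      # share who say they plan to buy the green product
actual_rate = 0.10        # share who actually purchase (the value-action gap)

forecast_demand = market_size * stated_intent
realised_demand = market_size * actual_rate
overestimate = forecast_demand / realised_demand

print(f"Forecast {forecast_demand:,.0f} units vs {realised_demand:,.0f} sold "
      f"-> demand overestimated {overestimate:.0f}x")
```

A forecast built on stated intentions alone would be off by a factor of six in this toy case, which is the kind of error that sinks a product launch.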

Advocacy bias of the researcher

Although research results can be biased when consumers are afraid to give honest answers, recent research suggests that bias is also a problem for many of the scientists who conduct the research.

For example, a recent paper finds widespread advocacy biases caused by the leftist/liberal viewpoints held by the vast majority of social psychologists, which has resulted in research designs and results that favor the leftist/liberal position on issues involving gender, sexual orientation, race, or political affiliations (Duarte et al. 2015).

Similarly, prior to the 2016 election most political experts and poll takers had a personal hope for a Hillary Clinton victory, and hence wanted to have poll results that would support their preferred candidate.

This preference likely meant they subtly changed question wording or sampling frames to generate pro-Hillary results, or perhaps their Clinton preference blinded them to possible social desirability biases in their polling results.

These types of advocacy biases have also been the inspiration for several of my work-in-progress projects addressing potential researcher bias in the study of environmental issues.

The preliminary results suggest that ‘green’ research in marketing does suffer from a ‘pro-green’ advocacy bias that likely leads to an over-estimation of public support for green causes and products, which can consequently result in the implementation of ineffective green public policies (Olson and Biong 2015; Olson 2015) and the launch of unpopular and unprofitable green products (Olson 2013b).

Ways to reduce research bias

The Trump victory is a reminder that market researchers working with any ‘politically correct’ or controversial topic or product need to be extra vigilant about minimizing advocacy and social desirability biases if they want research results that accurately predict market preferences.

One technique that has proven helpful is the use of content analysis on user- or customer-generated Internet or social media content, where comments, videos, etc. have been provided voluntarily and anonymously and hence should be less affected by social desirability bias (Olson 2017).

The use of projective techniques, ‘ignorant’ researchers (i.e. the researcher does not know the question the research is designed to answer), and multiple research methods (for example, comparing survey results with actual sales) can also be effective in minimizing social desirability and advocacy biases, hopefully providing more accurate results and better data-driven decision making.

• Duarte, J. L., Crawford, J. T., Stern, C., Haidt, J., Jussim, L., & Tetlock, P. E. (2015). Political diversity will improve social psychological science. Behavioral and Brain Sciences, 38, 1-13.
• Olson, Erik L. (2013a). It’s Not Easy Being Green: The Effects of Attribute Tradeoffs on Green Product Preference and Choice. Journal of the Academy of Marketing Science, 41 (2), 171-84.
• Olson, Erik L. (2013b). Perspective: The Green Innovation Value Chain: A Tool for Evaluating the Diffusion Prospects of Green Products. Journal of Product Innovation Management, 30 (4), 782-93.
• Olson, Erik L. (2015). The financial and environmental costs and benefits for Norwegian electric car subsidies: Are they good public policy? International Journal of Technology, Policy, and Management, 15 (3), 277-296.
• Olson, Erik L. & Biong, Harald (2015). The Solyndra case: An Institutional Economics perspective on the optimal role of government support for green technology development. International Journal of Business Continuity and Risk Management, 6 (1), 36-47.
• Olson, Erik L. (2017). The rationalization and persistence of organic food beliefs in the face of contrary evidence. Journal of Cleaner Production, 140, 1007-1013.

This article is published in BI Marketing Magazine 2018. BI Marketing Magazine is a Science Communication Magazine published by the Department of Marketing at BI Norwegian Business School.
