Market Research: How Cultural Differences Impact Survey Responses and Results

You might think that conducting an online survey in multiple countries is as easy as pie, but there is more to it than meets the eye! The outcome hinges on the responses participants give, and culture, it seems, has an impact on how people understand and respond to questions.

One of the many benefits of the world wide web is that it has made the lives of researchers who wish to conduct surveys on a multi-country scale a whole lot easier. However, it has also created a new problem.

In an article on Research, Melanie Courtright, a researcher for Research Now, says that conducting multi-country surveys might be easier now, but researchers cannot always rely on the answers that are given: her research for Research Now revealed that not all cultures respond to survey questions in the same way.

Courtright believes global studies can be very useful – they can be used as a tool to compare countries to one another. However, she says research teams are often unsure whether the answers from different nationalities indeed stem from different opinions, or whether external factors also play a part in the outcome of the study.

According to Courtright, Research Now is very interested in all the factors that have an effect on the responses given in a survey. This is why the company recently investigated response styles and the factors that can influence them.

When speaking of response styles, Courtright refers to three that are commonly found in market research:

•    Extreme Response Style or ERS: respondents' answers fall on the extreme points of a scale.
•    Midpoint Response Style or MRS: respondents' answers fall on the middle points of a scale.
•    Acquiescence Response Style or ARS: respondents agree with whatever is asked.

Previously, Courtright says, researchers thought the socio-demographics of the individual, the cultural characteristics of their country and the characteristics of the scales themselves were the factors that influenced a participant's responses.

As Research Now wanted to know whether this was true, the company conducted the following study: participants in ten different countries were asked to fill out a seven-minute attitudinal survey consisting of eight multi-item scales on a number of different subjects.

Participants answered the questions using a 4-, 5-, 7- or 10-point scale. Each scale version was filled out by 500 people per country, which meant 20,000 people took part in the study (500 respondents × 4 scale types × 10 countries). The returned surveys were then assessed to see whether they should receive an ERS, MRS or ARS label.
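The article does not spell out how each completed survey was assessed, so the sketch below is only an illustration of the idea: it labels a respondent ERS, MRS or ARS when at least half of their answers fit the corresponding pattern. The label_response_style function, the 50% cutoff and the agreement rule are all assumptions for illustration, not Research Now's actual classification method.

```python
# Minimal sketch of response-style labelling, using assumed frequency rules.
# The thresholds are illustrative; the article does not give the real criteria.

def label_response_style(answers, scale_points, share=0.5):
    """Label one respondent's answers on a single scale type (4, 5, 7 or 10 points)."""
    n = len(answers)
    extreme = sum(a in (1, scale_points) for a in answers)                    # scale endpoints
    midpoint = sum(abs(a - (scale_points + 1) / 2) <= 0.5 for a in answers)   # middle point(s)
    agree = sum(a > (scale_points + 1) / 2 for a in answers)                  # above the midpoint

    if extreme / n >= share:
        return "ERS"   # Extreme Response Style
    if midpoint / n >= share:
        return "MRS"   # Midpoint Response Style
    if agree / n >= share:
        return "ARS"   # Acquiescence Response Style
    return "none"

# A respondent who mostly picks the endpoints of a 5-point scale is labelled ERS.
print(label_response_style([1, 5, 5, 1, 3, 5], scale_points=5))  # -> ERS
```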

The study yielded a number of interesting results, Courtright says. According to her, response styles among the participating countries were very different. People from Brazil and China, for example, often gave extreme responses, while the Japanese leaned toward midpoint answers. Courtright believes this shows that data must be studied per country and that researchers must not assume that countries in the same region respond in the same way.

According to Courtright, the study also showed that there are no differences in response styles when it comes to gender. Age, however, did seem to influence the answers that were given, although these differences did not follow a specific pattern. Courtright gives the example of extreme responding, which tends to increase with age; people over 65, however, were less extreme in their answers.
    
Research Now also conducted a follow-up study with participants from the USA. In this study, researchers investigated the difference between surveys that were filled out on computers and laptops and those that were taken on mobile phones. The device on which participants took the survey did influence the outcome, but no clear pattern could be found. This is why no solid conclusions could be drawn, Courtright says. She notes, however, that mobile sliders did produce more ERS. As the collected data did not produce clear results, Courtright believes the decision to collect data via mobile phones should be based on external factors such as the objective of the survey.

Courtright concludes by saying that researchers must be aware that the data they collect through multi-country surveys might be influenced by the response styles of the participating countries. She says this might mean researchers have to correct the data accordingly; if this is the case, they will need to use calibrated scales.
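Courtright does not describe what such a correction looks like in practice. One commonly used option, shown here purely as an assumed illustration rather than Research Now's method, is to standardize each respondent's answers against their own mean and spread, so that systematic extremity or midpointing is factored out before countries are compared.

```python
# Illustrative only: within-respondent standardization is one common way to
# dampen response-style effects. This is an assumed technique, not the
# specific calibration Courtright refers to.

from statistics import mean, stdev

def standardize_respondent(answers):
    """Convert one respondent's raw scale answers into z-scores relative to
    their own answering behaviour."""
    m, s = mean(answers), stdev(answers)
    if s == 0:                      # straight-liner: no variation to rescale
        return [0.0 for _ in answers]
    return [(a - m) / s for a in answers]

# An extreme responder and a midpoint responder show the same relative
# pattern once their answers are standardized.
extreme_responder = [1, 5, 5, 1, 5]
midpoint_responder = [2, 4, 4, 2, 4]
print([round(z, 2) for z in standardize_respondent(extreme_responder)])   # [-1.1, 0.73, 0.73, -1.1, 0.73]
print([round(z, 2) for z in standardize_respondent(midpoint_responder)])  # [-1.1, 0.73, 0.73, -1.1, 0.73]
```

Because the two response vectors collapse onto the same standardized pattern, differences that were purely stylistic disappear, which is the effect this kind of calibration aims for.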

According to Courtright, response styles might sound very uninteresting, but they are of vital importance to researchers who wish to gather high-quality data. Knowing how each country responds means researchers can distinguish between actual differences of opinion and differences that simply arise from cultural response styles.

Katia Reed