Online surveys have become very popular, both in professional research and among amateurs conducting informal research or simply looking for entertainment. While free online survey platforms such as SurveyMonkey and QuestionPro make it easy for the average person to conduct a survey, designing, delivering, and interpreting the results still require an understanding of best practices to ensure that the results are valid and add value to the project.
Pros and cons of conducting online surveys
The advantages of conducting online surveys include the convenience of collecting information for both the researcher and the respondents, automatic data collection that eliminates transcription errors, and automated tools that assist with analyzing the data. In the days of paper-based surveys, the process was long: answers were physically written down on paper, transcribed into a database, and then analyzed using knowledge of statistics to design and run the calculations. With an online platform, once the work of designing the survey is done and the link has been distributed, the data collates itself automatically into a CSV, Excel, or SQL-ready file for use in analysis.
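To illustrate how little manual work that automated collation leaves, the following minimal sketch reads a CSV export of the kind these platforms produce and summarizes it with a few lines of standard-library Python. The column names and ratings here are hypothetical, not taken from any actual platform's export format.

```python
import csv
import io
import statistics

# Hypothetical CSV export; real platforms name columns differently.
csv_export = io.StringIO(
    "respondent_id,age,satisfaction\n"
    "1,22,4\n"
    "2,35,5\n"
    "3,29,3\n"
)

rows = list(csv.DictReader(csv_export))
ratings = [int(r["satisfaction"]) for r in rows]

print("responses:", len(rows))                   # 3
print("mean rating:", statistics.mean(ratings))  # 4
```

With a paper survey, arriving at the same summary would have required hand transcription before any calculation could begin; here the export is analysis-ready the moment responses come in.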
The disadvantages of conducting online surveys include the need to navigate unwieldy and unintuitive survey-creation software. In the case of QuestionPro, several attempts ended in notifications that I could launch my survey only if I paid to upgrade. For example, I began by using a template, deleting the existing questions and replacing them with my own. As it turned out, QuestionPro allowed me to create that survey but would not let me launch it without upgrading to a paid account, so I had to start over from scratch. Despite advising me several times that my questions had been saved, the questions I created kept disappearing. There is no save button; the notification that questions were saved was the only indicator, and it did not work as hoped. The system was not as intuitive or easy to use as I had expected.
Validity and reliability
There are several reasons to be suspicious of the validity and reliability of data collected using an online platform. First, anyone can respond to the survey. While this is random in one sense, it is not random in the sense needed for a survey to reflect a broader subpopulation. People self-select to respond voluntarily, which may skew the results. The marketing of the survey, or the recruitment of respondents, is therefore a very important aspect of any research that uses an online platform to collect data about people's experiences and opinions.

People may also not take answering the questions seriously, and may simply click answers to get through to the end of the survey rather than truly reflecting on what they think. Another possible issue is respondents filling out the survey more than once. For example, if there is a promotion or incentive, such as a draw for a gift card, someone might fill out the survey for that purpose rather than the intended one.

Typically, respondents for a survey should meet certain criteria. When a survey is conducted in person or over the phone, these criteria can be validated by asking the respondent screening questions. In the example given, the organization is seeking feedback from people under 18 years old, and this could be a recruitment and selection question. In a phone interview, the interviewer would ask potential respondents how old they are; if they are older than 18, the interviewer would thank them for their time and terminate the survey. In an online survey, these questions to validate that respondents meet the selection criteria must also be included. Online completion also changes the social dynamics of the exchange, and people may be less likely to give accurate answers (Smieszek et al., 2014).
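Since no interviewer is present to screen respondents or notice repeat submissions, an online survey has to build those checks into its data handling. The sketch below shows one way the two checks described above might be applied to collected responses; the field names, the under-18 criterion, and the use of email addresses to detect duplicates are all illustrative assumptions rather than features of any particular platform.

```python
# Hypothetical raw submissions from an online survey.
raw_responses = [
    {"email": "a@example.com", "age": 16, "answer": "yes"},
    {"email": "b@example.com", "age": 25, "answer": "no"},   # fails age screen
    {"email": "a@example.com", "age": 16, "answer": "yes"},  # repeat submission
]

seen_emails = set()
valid = []
for resp in raw_responses:
    if resp["age"] >= 18:             # screening criterion: under-18s only
        continue
    if resp["email"] in seen_emails:  # reject duplicate submissions
        continue
    seen_emails.add(resp["email"])
    valid.append(resp)

print("usable responses:", len(valid))  # 1
```

In practice such screening is better applied up front, as a qualifying question that terminates the survey, since filtering after the fact cannot recover respondents who were never eligible.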
Considerable research has been conducted on the use of online surveys in order to validate their use (Sauermann & Roach, 2013). Researchers have even found that certain groups that take part in online surveys in exchange for payment, such as users of Amazon's MTurk, were more attentive in answering questions than live survey participants (Hauser & Schwarz, 2016).
Validating the data
Two ways to validate data collected through an online questionnaire are to complement the survey with more in-depth semi-structured interviews, and to find other research with similar research questions to see whether the results are aligned.
Semi-structured interviews, which allow open answers, let an investigator explore how respondents interpreted the survey questions as well as the reasons behind their answers. This helps the researcher better understand the responses, including whether any have been taken out of context in the results.
By using other, similar research, it is possible to see if the results are aligned and applicable to a larger population. This is important since the whole purpose of doing a survey is to gain insight into what most people in each demographic category or situation are thinking, rather than just the people who are answering the survey.
Challenges of using an online questionnaire website
There are many challenges to using an online questionnaire website, and care must be taken if the purpose of the research is academic or commercial. The websites facilitate conducting surveys, but they will not by themselves ensure that the recruitment of respondents is appropriate, or that the collection of data results in usable information. Some issues are general challenges of conducting surveys, such as finding enough interested volunteers to provide data. Others relate to the platforms themselves and are outside the researcher's control, such as being asked for payment to launch the survey, ensuring that the survey is completed accurately, and ensuring that only eligible respondents take it.
While designing and implementing the survey was somewhat painful because of QuestionPro's poor navigation, the most difficult part was actually finding respondents, even though completing the survey took only a few minutes. Although the survey link was distributed to ten friends and family members, only five completed it, and even then it took several reminders and resending the email to get those responses. I could have tried posting the link to my Facebook page, but if I were conducting a real survey that required usable results, that approach would raise questions about the validity of the responses.
Online survey platforms can save a lot of time and make delivering and taking the survey more convenient, but all of the thought that goes into a traditional survey is still required. Setting up the survey online also takes more effort than typing up the questions in a Word document and distributing them by email. On the other hand, there is far less work when it comes time to collate and analyze the data.
References
- Hauser, D. J., & Schwarz, N. (2016). Attentive Turkers: MTurk participants perform better on online attention checks than do subject pool participants. Behavior Research Methods, 48(1), 400-407.
- Sauermann, H., & Roach, M. (2013). Increasing web survey response rates in innovation research: An experimental study of static and dynamic contact design features. Research Policy, 42(1), 273-286.
- Smieszek, T., Barclay, V. C., Seeni, I., Rainey, J. J., Gao, H., Uzicanin, A., & Salathé, M. (2014). How should social mixing be measured: Comparing web-based survey and sensor-based methods. BMC Infectious Diseases, 14(1), 136.