Survey Research and Administration
Survey research is a method in which data are collected from a subset of a target population, called the sample, through personal interviews, online surveys, telephone interviews, or paper questionnaires. Some forms of survey research, such as online surveys, can be administered in an automated fashion. The professionals at Statistics Solutions provide survey administration help to master’s and doctoral candidates during the survey administration phase of their research.
The choice of survey instrument(s) used to gather data for your thesis or dissertation is critical. If you are planning to create your own survey instrument and administer it online (e.g., SurveyMonkey, QuestionPro, PsychData, or Zoomerang), Statistics Solutions can help you create the survey questions and any subscales so that responses can be easily analyzed and your research questions answered. Our consultants can then help you validate your instrument and expedite the IRB approval process by helping you avoid the typical university and committee pitfalls.
If you are using an established instrument, our statistical consultants will help you understand its validity and reliability information and the statistical analysis appropriate for the instrument’s constructs. They will then help you integrate this information into your dissertation.
Key Terms and Concepts:
Survey instrument: The survey instrument is the set of questions or response items posed to respondents. Depending on the study, it may take the form of a questionnaire or an interview.
Interviews and questionnaires: An interview collects responses through face-to-face interaction, whereas a questionnaire collects them indirectly, for example by mail.
Response structure: In survey research, the response structure is the format of the item. Structures may be open-ended, closed-ended, multi-response, dichotomous, a ranking system, or a variety of other formats.
Survey error: In survey research, survey error includes factors such as selecting the wrong sample, coding errors in a questionnaire, tabulation and data processing errors, interviewer bias, researcher bias, and misinterpretation of data.
Pretesting: Pretesting refers to trying out the survey instrument on a small group of respondents before selecting the final sample, so that problem items can be identified and revised. Converse and Presser (1986, p. 65) recommend conducting two pretests before selecting the final sample.
Analysis of non-response: In survey research, some respondents do not complete the entire questionnaire, and the unanswered items become missing values. These values can either be excluded from the analysis or filled in using a missing value analysis technique such as imputation.
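As a minimal sketch of these two options (plain Python, with hypothetical response data), excluding missing values and filling them by simple mean imputation might look like:

```python
from statistics import mean

# Hypothetical responses to one survey item; None marks a non-response.
responses = [4, 5, None, 3, None, 4, 2]

# Option 1: exclusion -- drop the missing values before analysis.
complete = [r for r in responses if r is not None]

# Option 2: simple mean imputation -- replace each missing value with the
# mean of the observed responses (one of many missing-value strategies).
imputed = [r if r is not None else mean(complete) for r in responses]

print(complete)  # [4, 5, 3, 4, 2]
print(imputed)   # [4, 5, 3.6, 3, 3.6, 4, 2]
```

Mean imputation is shown only because it is the simplest strategy; in practice the appropriate technique depends on why the data are missing.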
Data Collection Methods:
Face-to-face interview: This is the most expensive but most reliable method of data collection in survey research; most respondents give complete and accurate answers in person. It is used when the research requires deep exploration of opinion.
Mail survey: This method delivers the questionnaire to respondents by post. It eliminates interviewer bias, but the researcher has no control over who completes the questionnaire or how.
Telephone: This is a fast method of data collection that supports open-ended responses and allows moderate control over interviewer bias.
Web survey: This is the least expensive and fastest method of data collection. It is appropriate when data are needed from a large population or from international respondents, and it is most suitable when quick responses matter more than a rigorous probability sample.
Survey Design Considerations:
Survey layout: For Internet or mail surveys, the layout should be attractive and easy to use: avoid multiple fonts, place the response area on the right side, separate questions clearly, and present the survey in an attractive color.
Survey length: The survey should be only as long as needed, within the constraint of the respondent’s attention span. A minimum of three items is needed to test a particular hypothesis.
Item bias in survey research:
Ambiguity: Questions should be specific and admit only one interpretation. Avoid wording that leaves respondents unsure of, or uncomfortable with, how to answer.
Rank lists: Respondents should not be asked to rank more than four or five items. Beyond that, respondents may give an arbitrary ranking just to get past the item.
Unfamiliar terms and jargon: Avoid unfamiliar words and jargon; respondents cannot answer easily if they do not understand the question.
Poor grammatical format: Weak grammar can introduce bias, so items should be grammatically clean and well formed.
Hypothetical items: Avoid hypothetical items; they are difficult for respondents to answer reliably.
Language differences: Items must have the same meaning when the questionnaire is given to populations speaking different languages.
Types of items:
Model items: In survey research, model items are those that measure the variables in the survey model.
Filter items: In survey research, filter items are those used to eliminate unqualified respondents during post-processing.
Cross-check items: In survey research, cross-check items are used to check a respondent’s consistency. For example, the questionnaire can ask for the respondent’s age in one place and for the respondent’s date of birth in another; comparing the two answers checks the consistency of the data.
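The age/birth-date cross-check above can be sketched in plain Python. The record, field names, and one-year tolerance here are hypothetical illustrations, not part of any standard:

```python
from datetime import date

# Hypothetical respondent record: reported age and reported date of birth.
record = {"reported_age": 34, "birth_date": date(1990, 6, 15)}

def age_on(birth: date, today: date) -> int:
    """Age in whole years as of `today`."""
    return today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))

# Cross-check: flag the record if the two answers disagree by more than a year
# (a tolerance chosen here only for illustration).
today = date(2024, 9, 1)
computed = age_on(record["birth_date"], today)
consistent = abs(computed - record["reported_age"]) <= 1

print(computed, consistent)  # 34 True
```

Records that fail the cross-check can then be reviewed by hand or excluded, just as filter items remove unqualified respondents during post-processing.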
Survey Administration Help Resources
Behling, O., & Law, K. S. (2000). Translating questionnaires and other research instruments: Problems and solutions. Thousand Oaks, CA: Sage Publications.
Bourque, L. B., & Clark, V. A. (1992). Processing data: The survey example. Newbury Park, CA: Sage Publications.
Bourque, L. B., & Fielder, E. P. (1995). How to conduct self-administered and mail surveys. Thousand Oaks, CA: Sage Publications.
Bourque, L. B., Fielder, E. P., & Fink, A. (2003). How to conduct telephone surveys. Thousand Oaks, CA: Sage Publications.
Converse, J. M., & Presser, S. (1986). Survey questions: Handcrafting the standardized questionnaire. Thousand Oaks, CA: Sage Publications.
Couper, M. P. (2005). Technology trends in survey data collection. Social Science Computer Review, 23(4), 486-501.
Czaja, R., & Blair, J. (2005). Designing surveys: A guide to decisions and procedures (2nd ed.). Thousand Oaks, CA: Pine Forge Press.
Denscombe, M. (2006). Web-based questionnaires and the mode effect: An evaluation based on completion rates and data contents of near-identical questionnaires delivered in different modes. Social Science Computer Review, 24(2), 246-254.
Dillman, D. A. (1999). Mail and internet surveys: The tailored design method. New York: John Wiley & Sons.
Diment, K., & Garrett-Jones, S. (2007). How demographic characteristics affect mode preference in a postal/web mixed-mode survey of Australian researchers. Social Science Computer Review, 25(3), 410-417.
Ehrlich, H. J. (1969). Attitudes, behavior, and the intervening variables. American Sociologist, 4(1), 29-34.
Fink, A. (1995a). The survey handbook. Thousand Oaks, CA: Sage Publications.
Fink, A. (1995b). How to ask survey questions. Thousand Oaks, CA: Sage Publications.
Fink, A. (2003). How to design survey studies. Thousand Oaks, CA: Sage Publications.
Fink, A. (2008). How to conduct surveys: A step-by-step guide (4th ed.). Thousand Oaks, CA: Sage Publications.
Fink, A., & Kosecoff, J. B. (1998). How to conduct surveys (2nd ed.). Thousand Oaks, CA: Sage Publications.
Fowler, F. J., Jr. (2008). Survey research methods (4th ed.). Thousand Oaks, CA: Sage Publications.
Fox, J. A., & Tracy, P. E. (1986). Randomized response: A method for sensitive surveys. Thousand Oaks, CA: Sage Publications.
Göritz, A. S. (2006). Cash lotteries as incentives in online panels. Social Science Computer Review, 24(4), 445-459.
Göritz, A. S., & Wolff, H.-G. (2007). Lotteries as incentives in longitudinal web studies. Social Science Computer Review, 25(1), 99-110.
Groves, R. M., Cialdini, R. B., & Couper, M. P. (1992). Understanding the decision to participate in a survey. Public Opinion Quarterly, 56(4), 475-495.
Healey, B. (2007). Drop downs and scroll mice: The effect of response option format and input mechanism employed on data quality in web surveys. Social Science Computer Review, 25(1), 111-128.
LaPiere, R. T. (1934). Attitudes vs. actions. Social Forces, 13(2), 230-237.
Lee, E. S., & Forthofer, R. N. (2006). Analyzing complex survey data. Thousand Oaks, CA: Sage Publications.
Lee, S. (2006). An evaluation of nonresponse and coverage errors in a prerecruited probability web panel survey. Social Science Computer Review, 24(4), 460-475.
Nesbary, D. (1999). Survey research and the World Wide Web. Needham Heights, MA: Allyn & Bacon.
Oishi, S. M. (2003). How to conduct in-person interviews for surveys (2nd ed.). Thousand Oaks, CA: Sage Publications.
Peterson, R. A. (2000). Constructing effective questionnaires. Thousand Oaks, CA: Sage Publications.
Rea, L. A., & Parker, R. A. (2005). Designing and conducting survey research: A comprehensive guide (3rd ed.). New York: John Wiley & Sons.
Rubin, H. J., & Rubin, I. S. (2005). Qualitative interviewing: The art of hearing data (2nd ed.). Thousand Oaks, CA: Sage Publications.
Salant, P., & Dillman, D. A. (1994). How to conduct your own survey. New York: John Wiley & Sons.
Tourangeau, R., Rips, L. J., & Rasinski, K. (2000). The psychology of survey response. New York: Cambridge University Press.
Willis, G. B. (2005). Cognitive interviewing: A tool for improving questionnaire design. Thousand Oaks, CA: Sage Publications.