Survey data collection

With the application of probability sampling in the 1930s, surveys became a standard tool for empirical research in social sciences, marketing, and official statistics.[1] Survey data collection methods are the ways in which data can be collected for a statistical survey; they are used to gather information from a sample of individuals in a systematic way. First came the change from traditional paper-and-pencil interviewing (PAPI) to computer-assisted interviewing (CAI). Now, face-to-face surveys (CAPI), telephone surveys (CATI), and mail surveys (CASI, CSAQ) are increasingly being replaced by web surveys.[2] In addition, remote interviewers may keep respondents engaged while costing less than in-person interviewers.[3]

Modes of data collection

The choice between administration modes is influenced by several factors, including 1) costs, 2) coverage of the target population (including group-specific preferences for certain modes[4]), 3) flexibility of asking questions, 4) respondents’ willingness to participate and 5) response accuracy. Different methods create mode effects that change how respondents answer. The most common modes of administration are listed under the following headings.[5]

Mobile surveys

Mobile data collection, or mobile surveying, is an increasingly popular method of data collection. Over 50% of surveys today are opened on mobile devices.[6] The survey, form, app or collection tool runs on a mobile device such as a smartphone or tablet. These devices offer innovative ways to gather data and eliminate the laborious data entry of paper forms into a computer, which delays data analysis and understanding. By eliminating paper, mobile data collection can also dramatically reduce costs: one World Bank study in Guatemala found a 71% decrease in cost when using mobile data collection compared to the previous paper-based approach.[7]

Apart from high mobile phone penetration,[8][9] further advantages are quicker response times and the ability to reach previously hard-to-reach target groups. In this way, mobile technology allows marketers, researchers and employers to create real and meaningful mobile engagement in environments different from the traditional one in front of a desktop computer.[10][11] However, even when using mobile devices to answer web surveys, most respondents still answer from home.[12][13]

SMS/IM surveys

SMS surveys can reach any handset, in any language and in any country. Because they do not depend on internet access and answers can be sent whenever it is convenient, they are a suitable mobile data collection channel for many situations that require fast, high-volume responses. As a result, SMS surveys can deliver 80% of responses in less than 2 hours[14] and often at much lower cost than face-to-face surveys, due to the elimination of travel and personnel costs.[15] IM is similar to SMS, except that a mobile number is not required. IM functions are available in standalone software, such as Skype, or embedded in websites such as Facebook and Google.[3]

Online surveys

Online (Internet) surveys are becoming an essential research tool for a variety of research fields, including marketing, social and official statistics research. According to ESOMAR, online survey research accounted for 20% of global data-collection expenditure in 2006.[1] They offer capabilities beyond those available for any other type of self-administered questionnaire.[16] Online consumer panels are also used extensively for carrying out surveys, but their quality is considered inferior because the panelists are regular contributors and tend to be fatigued. However, when estimating the measurement quality (defined as the product of reliability and validity) using a multitrait-multimethod (MTMM) approach, some studies found quite reasonable quality,[17][18] and even that the quality of a series of questions in an online opt-in panel (Netquest) was very similar to the measurement quality of the same questions asked in the European Social Survey (ESS), which is a face-to-face survey.[19]
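
For reference, the measurement quality estimated in these MTMM studies is the product of a reliability coefficient and a validity coefficient. Writing the reliability coefficient as r and the validity coefficient as v (notation chosen here for illustration rather than taken from the cited papers), the quality coefficient q can be written as

    q = r \times v

so that a weakness in either reliability or validity lowers the overall measurement quality.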

 
[Image: A U.S. Navy information technician fills out the BUPERS Online Uniform Survey Questionnaire (US Navy photo 030618-N-2893B-001).]

Some studies have compared the quality of face-to-face and/or telephone surveys with that of online surveys, for single questions but also for more complex concepts measured with more than one question (also called composite scores or indices).[20][21][22] Focusing only on probability-based surveys (including the online ones), they found overall that face-to-face (using show cards) and web surveys have quite similar levels of measurement quality, whereas telephone surveys performed worse. Other studies comparing paper-and-pencil questionnaires with web-based questionnaires showed that employees preferred online survey approaches to the paper-and-pencil format. There are also concerns about what has been called "ballot stuffing", in which employees make repeated responses to the same survey. Some employees are also concerned about privacy: even if they do not provide their names when responding to a company survey, can they be certain that their anonymity is protected? Such fears prevent some employees from expressing an opinion.[23]

Advantages of online surveys

  • Web surveys are faster, simpler, and cheaper.[2] However, lower costs are not so straightforward in practice, as they are strongly interconnected to errors. Because response rate comparisons to other survey modes are usually not favourable for online surveys, efforts to achieve a higher response rate (e.g., with traditional solicitation methods) may substantially increase costs.[1]
  • The entire data collection period is significantly shortened, as all data can be collected and processed in little more than a month.[2]
  • Interaction between the respondent and the questionnaire is more dynamic compared to e-mail or paper surveys.[16] Online surveys are also less intrusive, and they suffer less from social desirability effects.[2]
  • Complex skip patterns can be implemented in ways that are mostly invisible to the respondent (see the routing sketch after this list).[16]
  • Pop-up instructions can be provided for individual questions, offering help exactly where assistance is required.[16]
  • Questions with long lists of answer choices can be used to provide immediate coding of answers to certain questions that are usually asked in an open-ended fashion in paper questionnaires.[16]
  • Online surveys can be tailored to the situation (e.g., respondents may be allowed to save a partially completed form, the questionnaire may be preloaded with already available information, etc.).[2]
  • Online questionnaires may be improved by applying usability testing, where usability is measured with reference to the speed with which a task can be performed, the frequency of errors and user satisfaction with the interface.[2]
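
To illustrate the skip-pattern point above, the following minimal routing sketch (in Python) shows one way a web questionnaire can route respondents past questions that do not apply to them. The question IDs, wording, and dictionary-based structure are invented for illustration and do not reflect any particular survey platform; the point is simply that respondents only ever see the questions that apply to them.

# Minimal sketch of web-survey skip logic: a respondent who answers "no" to the
# screener question never sees the follow-up question, so the skip is invisible.
# Question IDs and wording are illustrative, not from any specific product.

QUESTIONS = {
    "q1_owns_car": {
        "text": "Do you own a car?",
        "next": lambda answer: "q2_car_brand" if answer == "yes" else "q3_commute",
    },
    "q2_car_brand": {
        "text": "What brand is your main car?",
        "next": lambda answer: "q3_commute",
    },
    "q3_commute": {
        "text": "How do you usually commute to work?",
        "next": lambda answer: None,  # end of questionnaire
    },
}

def run_survey(get_answer):
    """Walk through the questionnaire, applying the skip logic after each answer."""
    responses = {}
    current = "q1_owns_car"
    while current is not None:
        answer = get_answer(QUESTIONS[current]["text"])
        responses[current] = answer
        current = QUESTIONS[current]["next"](answer)
    return responses

if __name__ == "__main__":
    # Simulated respondent who does not own a car: q2_car_brand is skipped.
    scripted = iter(["no", "bus"])
    print(run_survey(lambda _text: next(scripted)))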

Key methodological issues of online surveys

  • Sampling. The difference between probability samples (where the inclusion probabilities for all units of the target population are known in advance) and non-probability samples (which often require less time and effort but generally do not support statistical inference) is crucial. Probability samples are highly affected by problems of non-coverage (not all members of the general population have Internet access) and frame problems (online survey invitations are most conveniently distributed by e-mail, but there are no e-mail directories of the general population that might be used as a sampling frame). Because coverage and frame problems can significantly impact data quality, they should be adequately reported when disseminating the research results.[1][24]
  • Invitations to online surveys. Due to the lack of sampling frames, many online survey invitations are published in the form of a URL link on web sites or in other media, which leads to sample selection bias that is outside the researcher's control and to non-probability samples. Traditional solicitation modes, such as telephone or mail invitations to web surveys, can help overcome probability sampling issues in online surveys. However, such approaches face problems of dramatically higher costs and questionable effectiveness.[1]
  • Non-response. Online survey response rates are generally low and also vary widely, from less than 1% in enterprise surveys with e-mail invitations to almost 100% in specific membership surveys. In addition to refusing participation, terminating the survey partway through or not answering certain questions, several other non-response patterns can be observed in online surveys, such as lurking respondents and combinations of partial and item non-response. Response rates can be increased by offering a monetary or other incentive to respondents, by contacting respondents several times (follow-up), and by keeping the questionnaire difficulty as low as possible.[1] There are drawbacks to using an incentive to garner a response, since responses obtained this way may be biased. The most concrete way to encourage feedback is to publicize what is done with the results; taking concrete actions based on feedback, and showing this to the customer base, motivates customers to continue to let their voices be heard.
  • Acquiescence bias. Many people have acquiescent personalities and are more likely to agree with statements than to disagree, regardless of the content. Often, such respondents see the question-asker as an expert in the field, which makes them more likely to react positively to the question asked.
  • Platform issues. Lack of familiarity with the platform used can confuse participants and clients, or limit who is willing and able to navigate surveys on digital platforms.[25]
  • Questionnaire design. While modern web questionnaires offer a range of design features (different question types, images, multimedia), the use of such elements should be limited to what is necessary for respondents to understand questions or to stimulate the response. It should not affect their responses, because that would mean lower validity and reliability of the data. Appropriate questionnaire design can help lower the measurement error that can also arise from the respondents or the survey mode itself (respondents' motivation, computer literacy, abilities, privacy concerns, etc.).[1]
  • Post-survey adjustments. Various robust procedures have been developed for situations where sampling deviates from probability selection or where non-coverage and non-response problems arise (a simple weighting sketch follows this list). The standard statistical inference procedures (e.g., confidence interval calculations and hypothesis testing) still require a probability sample. Actual survey practice, particularly in marketing research and in public opinion polling, massively neglects the principles of probability sampling and increasingly requires the statistical profession to specify the conditions under which non-probability samples may work.[1]
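
As a concrete illustration of one such post-survey adjustment, the sketch below applies simple post-stratification weighting: respondents are reweighted so that the weighted sample matches known population shares for an auxiliary variable. The age groups, shares, and records are invented for illustration; real adjustments typically use more variables and more elaborate procedures (e.g., raking or calibration).

# Minimal post-stratification sketch: each respondent receives a weight equal to
# (population share of their group) / (sample share of their group), so the
# weighted sample matches the population distribution of the auxiliary variable.
# Population shares and respondent records below are illustrative only.

from collections import Counter

POPULATION_SHARES = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

RESPONDENTS = [
    {"id": 1, "age_group": "18-34"},
    {"id": 2, "age_group": "18-34"},
    {"id": 3, "age_group": "18-34"},
    {"id": 4, "age_group": "35-54"},
    {"id": 5, "age_group": "55+"},
]

def post_stratify(records, population_shares):
    """Return the records with a post-stratification weight attached."""
    n = len(records)
    sample_counts = Counter(r["age_group"] for r in records)
    weighted = []
    for r in records:
        sample_share = sample_counts[r["age_group"]] / n
        weight = population_shares[r["age_group"]] / sample_share
        weighted.append({**r, "weight": weight})
    return weighted

if __name__ == "__main__":
    for row in post_stratify(RESPONDENTS, POPULATION_SHARES):
        print(row)  # over-represented groups get weights below 1, and vice versa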

These issues, and potential remedies, are discussed in a number of sources.[26][27]

Telephone

Telephone surveys use interviewers to encourage the sample persons to respond, which leads to higher response rates.[28] There is some potential for interviewer bias (e.g., some people may be more willing to discuss a sensitive issue with a female interviewer than with a male one). Depending on local call charge structure and coverage, this method can be cost efficient and may be appropriate for large national (or international) sampling frames, using traditional phones or computer-assisted telephone interviewing (CATI). Because it is audio-based, this mode cannot be used for non-audio information such as graphics, demonstrations, or taste/smell samples.

Mail

Depending on local bulk mail postage, mail surveys may be relatively low cost compared to other modes. The fielding period tends to be long, often several months, before the surveys are returned and statistical analysis can begin. The questionnaire may be handed to the respondents or mailed to them, but in all cases it is returned to the researcher by mail. Because there is no interviewer presence, the mail mode is not suitable for issues that may require clarification. However, there is no interviewer bias, and respondents can answer at their own convenience (allowing them to break up long surveys; this is also useful if they need to check records to answer a question). To correct for nonresponse bias, extrapolation across waves can be used.[29] Response rates can be improved by using mail panels (members of the panel must agree to participate) and prepaid monetary incentives,[30] but response rates are also affected by the class of mail through which the survey was sent.[31] Panels can be used in longitudinal designs where the same respondents are surveyed several times.
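
The idea behind extrapolation across waves is that respondents who answer only after repeated mailings tend to resemble non-respondents, so the trend in a key estimate across successive response waves can be projected one step further to gauge what non-respondents might have answered. The linear projection and figures below are a rough illustration of this idea, not the exact procedure of the cited reference.

# Rough illustration of extrapolation across response waves: project the trend in a
# per-wave estimate one wave beyond the last observed wave as a proxy for
# non-respondents. Figures are invented; the cited literature describes the method
# in more detail.

def extrapolate_across_waves(wave_means):
    """Fit a simple linear trend to per-wave means and project it one wave further."""
    n = len(wave_means)
    xs = list(range(1, n + 1))
    x_bar = sum(xs) / n
    y_bar = sum(wave_means) / n
    numerator = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, wave_means))
    denominator = sum((x - x_bar) ** 2 for x in xs)
    slope = numerator / denominator
    intercept = y_bar - slope * x_bar
    return intercept + slope * (n + 1)  # estimate for the "non-respondent" wave

if __name__ == "__main__":
    # Mean satisfaction score among respondents to the first, second, and third mailings.
    wave_means = [7.8, 7.2, 6.9]
    print(extrapolate_across_waves(wave_means))  # projected value for non-respondents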

The visual presentation of survey questions makes a difference in how respondents answer them, with four primary design elements: words (meaning), numbers (sequencing), symbols (e.g., an arrow), and graphics (e.g., text boxes).[16] In translated surveys, writing conventions (e.g., Spanish words are lengthier and require more printing space) and text orientation (e.g., Arabic is read from right to left) must be considered in the visual design of the questionnaire to minimize data missingness.[32][33]

Face-to-face

The face-to-face mode is suitable for locations where telephone or mail infrastructure is not well developed. As with the telephone mode, the interviewer's presence carries a risk of interviewer bias.

Video interviewing

Video interviewing is similar to face-to-face interviewing except that the interviewer and respondent are not physically in the same location, but are communicating via video conferencing such as Zoom or Teams.[3]

Virtual worlds

Virtual-world interviews take place online in a space created for virtual interaction with other users or players, such as Second Life. Both the respondent and interviewer choose avatars to represent themselves and interact by a chat feature or by real voice audio.[3]

Chatbots

Chatbots are used regularly in marketing and sales to gather experience feedback. When used for collecting survey responses, chatbot surveys should be kept short, trained to speak in a friendly, human tone, and given an easy-to-navigate interface backed by more advanced artificial intelligence.[34]
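
As a rough sketch of what a short, friendly chatbot survey flow can look like, the scripted exchange below collects one rating and one open comment. The wording and structure are invented, and no particular chatbot platform or AI component is assumed.

# Tiny sketch of a chatbot-style survey: a short, scripted exchange in a friendly
# tone that collects one rating and one open comment. No specific chatbot platform
# or AI model is assumed.

SCRIPT = [
    ("greeting", "Hi! Thanks for shopping with us. Two quick questions coming up."),
    ("rating", "On a scale of 1 to 5, how was your experience today?"),
    ("comment", "Thanks! Anything we could do better next time?"),
    ("goodbye", "Appreciate the feedback. Have a lovely day!"),
]

def run_chatbot(send, receive):
    """Play the scripted conversation and collect answers to the question turns."""
    answers = {}
    for key, message in SCRIPT:
        send(message)
        if key in ("rating", "comment"):
            answers[key] = receive()
    return answers

if __name__ == "__main__":
    replies = iter(["4", "Faster checkout would be nice."])
    print(run_chatbot(print, lambda: next(replies)))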

Mixed-mode surveys

Researchers can combine several of the above methods for data collection. For example, researchers can invite shoppers at malls and send willing participants questionnaires by email. With the introduction of computers to the survey process, survey modes now include combinations of different approaches, or mixed-mode designs. Some of the most common methods are:[35][16]

  • Computer-assisted personal interviewing (CAPI): The computer displays the questions on screen, the interviewer reads them to the respondent, and then enters the respondent's answers.
  • Audio computer-assisted self-interviewing (audio CASI): The respondent operates the computer; the computer displays the question on the screen and plays recordings of the questions, and the respondent then enters their answers.
  • Computer-assisted telephone interviewing (CATI)
  • Interactive voice response (IVR): The computer plays recordings of the questions to respondents over the telephone, and respondents then answer by using the telephone keypad or by speaking their answers aloud.
  • Web surveys: The computer administers the questions online. See computer-assisted web interviewing (CAWI).

See also

References

  1. ^ a b c d e f g h Vehovar, V.; Lozar Manfreda, K. (2008). "Overview: Online Surveys". In Fielding, N.; Lee, R. M.; Blank, G. (eds.). The SAGE Handbook of Online Research Methods. London: SAGE. pp. 177–194. ISBN 978-1-4129-2293-7.
  2. ^ a b c d e f Bethlehem, J.; Biffignandi, S. (2012). Handbook of Web Surveys. Wiley Handbooks in Survey Methodology. Vol. 567. New Jersey: John Wiley & Sons. ISBN 978-1-118-12172-6.
  3. ^ a b c d Cook, Sarah; Sha, Mandy (2016-03-15). "Technology options for engaging respondents in self-administered questionnaires and remote interviewing". RTI Press. doi:10.3768/rtipress.2016.op.0026.1603.
  4. ^ Agley, Jon; Meyerson, Beth; Eldridge, Lori; Smith, Carriann; Arora, Prachi; Richardson, Chanel; Miller, Tara (February 2019). "Just the fax, please: Updating electronic/hybrid methods for surveying pharmacists". Research in Social and Administrative Pharmacy. 15 (2): 226–227. doi:10.1016/j.sapharm.2018.10.028. PMID 30416040. S2CID 53281364.
  5. ^ Mellenbergh, G.J. (2008). "Surveys". In Adèr, H.J.; Mellenbergh, G.J. (eds.). Advising on Research Methods: A consultant's companion. Huizen, The Netherlands: Johannes van Kessel Publishing. pp. 183–209. ISBN 978-90-79418-01-5.
  6. ^ "Mobile-ready. Event driven. Feature rich. Online customer surveys". QuestBack. Archived from the original on 23 October 2015.
  7. ^ Schuster, Christian; Perez Brito, Carlos. "Evaluating Cash Transfers in Guatemala". Magpi. Retrieved 27 November 2016.
  8. ^ Revilla, M., Toninelli, D., Ochoa, C., and G. Loewe (2015). “Who has access to mobile devices in an online opt-in panel? An analysis of potential respondents for mobile surveys”. In D. Toninelli, R. Pinter, and P. de Pedraza (eds), Mobile Research Methods: Opportunities and challenges of mobile research methodologies, pp. 119-139 (Chapter 8). London: Ubiquity Press. ISBN 978-1-909188-53-2. DOI: https://dx.doi.org/10.5334/bar.h. License: CC-BY 4.0.
  9. ^ Callegaro, Mario (3 October 2013). "Do You Know Which Device Your Respondent Has Used to Take Your Online Survey?". Survey Practice. 3 (6) – via www.surveypractice.org.
  10. ^ "Mobile engagement becomes standard operating procedure". Survey Anyplace. Archived from the original on 2014-02-08.
  11. ^ Burger, Christoph; Riemer, Valentin; Grafeneder, Jürgen; Woisetschläger, Bianca; Vidovic, Dragana; Hergovich, Andreas (2010). "Reaching the Mobile Respondent: Determinants of High-Level Mobile Phone Use Among a High-Coverage Group" (PDF). Social Science Computer Review. 28 (3): 336–349. doi:10.1177/0894439309353099. S2CID 61640965.
  12. ^ Mavletova, Aigul; Couper, Mick P. (22 November 2013). "Sensitive Topics in PC Web and Mobile Web Surveys: Is There a Difference?". Survey Research Methods. 7 (3): 191–205. doi:10.18148/srm/2013.v7i3.5458.
  13. ^ Toninelli, D.; Revilla, M. (2016). "Smartphones vs PCs: Does the Device Affect the Web Survey Experience and the Measurement Error for Sensitive Topics? A Replication of the Mavletova & Couper's 2013 Experiment". Survey Research Methods. 10 (2): 153–169. doi:10.18148/srm/2016.v10i2.6274.
  14. ^ Global, OnePoint. "SMS surveys". OnePoint Global. Retrieved 27 June 2016.
  15. ^ Selanikio, Joel. "Getting More Data for Less Money". Magpi. Retrieved 9 November 2016.
  16. ^ a b c d e f g Dillman, D.A. (2006). Mail and Internet Surveys: The Tailored Design Method (2nd ed.). New Jersey: John Wiley & Sons. ISBN 978-0-470-03856-7.
  17. ^ Revilla, Melanie; Ochoa, Carlos (14 December 2015). "Quality of Different Scales in an Online Survey in Mexico and Colombia". Journal of Politics in Latin America. 7 (3): 157–177. doi:10.1177/1866802X1500700305. hdl:10230/28347. S2CID 56357343 – via journals.sub.uni-hamburg.de.
  18. ^ Revilla, M., and W.E. Saris (2015). "Estimating and comparing the quality of different scales of an online survey using an MTMM approach". In Engel, U. (Ed), Survey Measurements: Techniques, Data Quality and sources of Error. Chapter 5, pp. 53-74. Campus. Frankfurt. New York. ISBN 9783593502809. Available at press.uchicago.edu.
  19. ^ Revilla, Melanie; Saris, Willem; Loewe, Germán; Ochoa, Carlos (26 May 2015). "Can a non-probabilistic online panel achieve question quality similar to that of the European Social Survey?". International Journal of Market Research. 57 (3): 395–412. doi:10.2501/IJMR-2015-034. S2CID 167732979.
  20. ^ Revilla, M. (2015). “Comparison of the quality estimates in a mixed-mode and a unimode design: an experiment from the European Social Survey”, Quality and Quantity. 2015, 49(3): 1219-1238. Published online first 13 of June 2014. DOI: 10.1007/s11135-014-0044-5
  21. ^ Revilla, Melanie A. (30 December 2012). "Measurement invariance and quality of composite scores in a face-to-face and a web survey". Survey Research Methods. 7 (1): 17–28. doi:10.18148/srm/2013.v7i1.5098.
  22. ^ Revilla, Melanie (31 December 2010). "Quality in Unimode and Mixed-Mode designs: A Multitrait-Multimethod approach". Survey Research Methods. 4 (3): 151–164. doi:10.18148/srm/2010.v4i3.4278.
  23. ^ Schultz & Schultz, Duane (2010). Psychology and work today. New York: Prentice Hall. p. 40. ISBN 978-0-205-68358-1.
  24. ^ Wright, Kevin (1 April 2005). "Researching Internet-Based Populations: Advantages and Disadvantages of Online Survey Research, Online Questionnaire Authoring Software Packages, and Web Survey Services". Journal of Computer-Mediated Communication. 10 (3): 1034. Retrieved 6 March 2018.
  25. ^ Dwivedi, Yogesh K.; Ismagilova, Elvira; Hughes, D. Laurie; Carlson, Jamie; Filieri, Raffaele; Jacobson, Jenna; Jain, Varsha; Karjaluoto, Heikki; Kefi, Hajer; Krishen, Anjala S.; Kumar, Vikram; Rahman, Mohammad M.; Raman, Ramakrishnan; Rauschnabel, Philipp A.; Rowley, Jennifer (2021-08-01). "Setting the future of digital and social media marketing research: Perspectives and research propositions". International Journal of Information Management. 59: 102168. doi:10.1016/j.ijinfomgt.2020.102168. hdl:10454/18041. ISSN 0268-4012.
  26. ^ Salant, Priscilla, and Don A. Dillman. "How to Conduct your own Survey: Leading professional give you proven techniques for getting reliable results." (1995).
  27. ^ Kalton, Graham. Introduction to survey sampling. Vol. 35. Sage, 1983.
  28. ^ Groves, R.M. (1989). Survey Costs and Survey Errors. New York: Wiley. ISBN 978-0-471-67851-9.
  29. ^ J. Scott Armstrong and Terry S. Overton (1977). "Estimating Nonresponse Bias in Mail Surveys" (PDF). Journal of Marketing Research. 14 (3): 396–402. CiteSeerX 10.1.1.36.7783. doi:10.2307/3150783. JSTOR 3150783. Archived from the original (PDF) on 2010-06-20.
  30. ^ J. Scott Armstrong (1975). "Monetary Incentives in Mail Surveys" (PDF). Public Opinion Quarterly. 39: 111–116. doi:10.1086/268203. S2CID 146397107.
  31. ^ J. Scott Armstrong (1990). "Class of Mail Does Affect Response Rates to Mailed Questionnaires: Evidence from Meta-Analysis (with a Reply by Lee Harvey)" (PDF). Journal of the Market Research Society. 32: 469–472.
  32. ^ Wang, Kevin; Sha, M. Mandy (2013-03-01). "A Comparison of Results from a Spanish and English Mail Survey: Effects of Instruction Placement on Item Missingness". Survey Methods: Insights from the Field (SMIF). doi:10.13094/SMIF-2013-00006. ISSN 2296-4754.
  33. ^ Pan, Yuling; Sha, Mandy (2019-07-09). The Sociolinguistics of Survey Translation. London: Routledge. doi:10.4324/9780429294914/sociolinguistics-survey-translation-yuling-pan-mandy-sha-hyunjoo-park. ISBN 978-0-429-29491-4.
  34. ^ Dandapani, Arundati (2020-04-30). "Redesigning Conversations with Artificial Intelligence (Chapter 11)". In Sha, Mandy (ed.). The Essential Role of Language in Survey Research. RTI Press. pp. 221–230. doi:10.3768/rtipress.bk.0023.2004. ISBN 978-1-934831-24-3.
  35. ^ Groves, R.M.; Fowler, F. J.; Couper, M.P.; Lepkowski, J.M.; Singer, E.; Tourangeau, R. (2009). Survey Methodology. New Jersey: John Wiley & Sons. ISBN 978-1-118-21134-2.
