EXAMPLE 2.2 The ICT survey pilot
Statistics Netherlands carried out a pilot with the ICT survey to find out whether it was possible to use the web for data collection. This survey collects information on the use of computers and the Internet in households and by individuals. The regular ICT survey was a CATI survey. It was rather expensive. It also suffered from under-coverage because the sample was selected from the telephone directory: households with unlisted numbers and mobile-only households could not be selected.
The sample for the pilot was selected from the population register, so there was no under-coverage. All persons in the sample received an invitation letter by mail. The letter contained the Internet address of the survey and a unique login code. Respondents could also complete the questionnaire on paper. To discourage those with Internet access from responding on paper, the paper questionnaire was not included in the invitation letter; people had to request the paper form by returning a stamped return postcard.
After one week, a postcard was sent to all nonrespondents with a reminder to complete the survey questionnaire, either on the web or on paper. Two weeks after receipt of the invitation letter, the remaining nonrespondents were approached again. Some of them received a reminder letter; others were called by telephone (if a telephone number was available). The telephone call served only as a reminder and did not replace the web or paper questionnaire.
The postcard reminders turned out to work well: each time they were sent, there was a substantial increase in response. The telephone reminder did not work as well. Of the people who promised by telephone to fill in the form, only 40% actually did so.
For the survey administrator, a favorable environment for running web surveys is a closed population. In businesses or institutions, for example, the administrative database contains individual data, including a corporate e-mail address, for each employee.
The situation is similar for universities. The university database records each student's e-mail address (the institutional one and, sometimes, a private one).
Likewise, business customer databases record detailed information on each customer, including contact details such as e-mail addresses.
Other survey administrators, such as academic researchers, market research companies, or private businesses, may not have proper sampling frames available. One solution to this problem could be to let an NSI select the sample for them. Another could be to obtain a copy of the sampling frame after privacy-related information has been removed. Nevertheless, privacy laws may prevent NSIs from making sampling frame information available to third parties.
With respect to the topic of the survey, it should be borne in mind that NSIs and other government statistical bodies collect data primarily for policy decisions. There may be different surveys for different social and economic indicators. Many surveys are compulsory, which means that the contacted elements are obliged to respond; if they do not, they may be fined. Questionnaires are sometimes rather complex because many topics are covered. Surveys conducted by academic researchers, market research organizations, and other companies tend to be more heterogeneous, covering a number of different issues: product characteristics, customer satisfaction with products and services, employee satisfaction, trends in consumer preferences or behavior, health, use of technological products, and so on. Generally speaking, the topics dealt with by this type of survey administrator often concern a less traditionally defined target population, and therefore defining an appropriate sampling frame becomes more difficult. On the other hand, surveys carried out by this type of survey administrator can often make use of a simpler and shorter questionnaire.
With respect to the distinction between cross-sectional and panel data collection, it should be noted that cross-sectional surveys gather data about one moment in time, whereas panel surveys collect information at successive points in time, with the focus on investigating changes over a period of time. Panel surveys are discussed in Chapter 14. The main problem with panel surveys is the lack of representativity of the panel and of the samples selected from it. In practice, many panels consist of panelists recruited by means of self-selection and are therefore not representative of the population. Such a panel may be a very large group of elements, but its large size does not imply representativeness. Large probability-based panels are representative, but more difficult to recruit.
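The point that a large panel is not automatically representative can be illustrated with a small simulation sketch (the population size, participation rates, and target variable below are invented purely for illustration): if the inclination to join a self-selection panel is related to the variable of interest, the bias does not disappear as the panel grows.

# Invented illustration: a large self-selected sample can be far more biased
# than a much smaller probability sample when the inclination to participate
# is related to the target variable (here: whether a person uses the Internet).
import random

random.seed(1)
population = [random.random() < 0.8 for _ in range(1_000_000)]  # 80% Internet users

# Probability sample: every element has the same chance of selection.
prob_sample = random.sample(population, 1_000)

# Self-selection: Internet users are far more likely to end up in the panel.
self_selected = [y for y in population
                 if random.random() < (0.30 if y else 0.02)]

print(f"True proportion:             {sum(population) / len(population):.3f}")
print(f"Probability sample (n=1000): {sum(prob_sample) / len(prob_sample):.3f}")
print(f"Self-selection panel (n={len(self_selected)}): "
      f"{sum(self_selected) / len(self_selected):.3f}")
# Despite its much larger size, the self-selection panel overestimates
# Internet use, because users and non-users join at different rates.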
If a panel is used for longitudinal studies, each respondent can be traced back to the moment he or she entered the panel. Therefore, when doing a longitudinal analysis of survey results, DiSogra and Callegaro (2009) recommend computing cumulative standardized response rates (taking the different recruitment waves into account), i.e., rates based on a multiple recruitment approach. This approach captures the dynamics of a panel member's history with regard to nonresponse and attrition, i.e., the gradual loss of respondents from the recruited panel.
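To give an idea of how such a rate is built up (see DiSogra and Callegaro, 2009, for their exact definitions; the notation here is introduced purely for illustration), a cumulative response rate for an online panel is typically the product of the response rates of the successive stages a panel member passes through, for example

CUMRR = RECR × PROR × RETR × COMR,

where RECR is the recruitment rate, PROR the profile (enrollment) rate, RETR the retention rate, and COMR the completion rate of the particular survey. When the panel has been built up in several recruitment waves, one plausible way to combine them is a weighted average of the wave-specific rates,

CUMRR = Σ_w n_w CUMRR_w / Σ_w n_w,

where n_w is the number of persons invited in recruitment wave w.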
With regard to the technical implementation of the questionnaire, two approaches are possible: online data collection and offline data collection.
Online data collection is a mode of data collection in which respondents have to remain online while answering the questions. The questionnaire is implemented as one or more web pages, and the respondent has to go to the survey website to start it. The questionnaire can be question based or form based. Question based means that every web page contains a single question; after answering a question, the respondent proceeds to the next question on the next page. If the questionnaire contains routing instructions and consistency checks, we recommend the question-based (one question per page) approach; a minimal sketch of such routing logic is given after the description of the two approaches. Form based means that there is a single web page containing all questions. This page looks like a form, and usually there are no routing instructions and no consistency checks. The questionnaire can also be optimized for mobile devices, in which case questions are presented in a format that is easy to read and complete on smartphones and other devices.
Offline data collection. The electronic questionnaire form (an HTML page, an Excel spreadsheet, or a form produced by other interviewing software) is sent to the respondent by e-mail, or the respondent can download it from the Internet. The respondent fills in the form or spreadsheet offline. After completion, the questionnaire is returned (uploaded or sent by e-mail) to the survey agency. Statistics Netherlands, for example, used this approach for a number of business surveys. A computer-assisted interviewing program was sent to the selected businesses. The businesses ran this program offline and answered the questions. After completion, an Internet connection was established again, and the data were uploaded to the survey agency.
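As a minimal sketch of what the question-based approach with routing instructions and consistency checks amounts to (the questions, answer options, and routing rules below are invented for illustration and do not come from any particular survey system), the server-side logic can be reduced to a routing function that maps the answers given so far to the next question, plus a check that is applied before the respondent may proceed to the next page:

# Minimal sketch of question-based (one-question-per-page) routing logic.
# The questions and routing rules are invented for illustration only.

QUESTIONS = {
    "Q1": "Does your household have access to the Internet? (yes/no)",
    "Q2": "How many hours per week do you use the Internet?",
    "Q3": "What is the main reason you do not use the Internet?",
}

def next_question(answers):
    """Routing instructions: decide which question the next page shows."""
    if "Q1" not in answers:
        return "Q1"
    if answers["Q1"] == "yes" and "Q2" not in answers:
        return "Q2"                      # Internet users get the usage question
    if answers["Q1"] == "no" and "Q3" not in answers:
        return "Q3"                      # non-users get the follow-up question
    return None                          # end of questionnaire

def check(question, answer):
    """Consistency check run before the respondent may proceed."""
    if question == "Q1":
        return answer in ("yes", "no")
    if question == "Q2":
        return answer.isdigit() and 0 <= int(answer) <= 168   # hours in a week
    return bool(answer.strip())

# Simple offline walk-through of the flow.
answers = {}
for q, a in [("Q1", "yes"), ("Q2", "10")]:
    assert next_question(answers) == q and check(q, a)
    answers[q] = a
assert next_question(answers) is None

A web survey system would render one page per call to next_question and re-ask the current question whenever check fails; in the form-based approach, all questions would simply be placed on a single page without this logic.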
In the case of an electronic form, the advantages of a printable questionnaire can be combined with those of computer-assisted interviewing (routing and consistency checking). Note that it is also possible not to bother respondents with consistency checking, which means that no errors will be detected during form completion. However, errors may be detected afterward by the survey agency, possibly resulting in the form being returned to the respondent for correction. This is much less efficient.
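To make this trade-off concrete, a check along the following lines (the field names and the rule are invented for illustration) could be built into an electronic business questionnaire, so that an inconsistency is flagged while the respondent is still completing the form rather than after it has reached the survey agency:

# Minimal sketch of a consistency check on a completed offline form before
# upload. The field names and the rule are invented for illustration only.

def consistency_errors(form):
    """Return a list of error messages; an empty list means the form can be uploaded."""
    errors = []
    # Rule: the number of employees using a computer cannot exceed the total
    # number of employees reported elsewhere on the form.
    if form["employees_with_computer"] > form["employees_total"]:
        errors.append("More employees with a computer than employees in total.")
    return errors

form = {"employees_total": 25, "employees_with_computer": 30}
print(consistency_errors(form))
# Without such a check at completion time, the survey agency would only detect
# this error after receipt and might have to return the form for correction.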