In this section we describe the major steps in carrying out a research study using a questionnaire: (1) defining research objectives, (2) selecting a sample, (3) designing the questionnaire format, (4) pretesting the questionnaire, (5) precontacting the sample, (6) writing a cover letter and distributing the questionnaire, (7) following up with nonrespondents, and (8) analyzing questionnaire data.
Step 1: Defining Research Objectives
Some researchers develop a questionnaire before they have thoroughly considered what they hope to obtain from the results. It is important that you define your research problem and list the specific objectives to be achieved, or hypotheses to be tested, by the questionnaire. You might start with a broad topic (e.g., teacher involvement in staff development), but you should sharpen its focus before beginning the questionnaire design.
D. A. de Vaus suggests five types of questions that you can ask yourself for this purpose. They are stated below in relation to the above-mentioned topic, teacher involvement in staff development:
- What is the time frame of your interest? Are you interested in teachers' current involvement in staff development, or do you want to study trends in their involvement over a period of years?
- What is the geographical location of your interest? Do you want to study teachers in a particular state or region, or do you want to compare teachers in different locations?
- Are you interested in a broad descriptive study, or do you want to specify and compare different subgroups? For example, will you compare elementary, middle school, and high school teachers, or will you study teachers in general?
- What aspect of the topic do you want to study? Are you interested in teachers' involvement in particular types of staff development activities, whether their involvement is mandatory or voluntary, or the amount of involvement over a given time period?
- How abstract is your interest?
For example, are you interested in reporting facts, or do you want to
interpret the information, relate it to a broad social context, or develop
theory from the findings?
In describing the steps involved in conducting a questionnaire study, we shall refer to a study by Corrine Glesne and Rodman Webb. These researchers were interested in tracking the growing emphasis on qualitative research in higher education in the United States. They wanted to determine who teaches qualitative research methods courses, the content of their courses, and their teaching methods. Their questionnaire was designed to obtain this information:
The survey (questionnaire) asked about the training and academic background of qualitative research professors, course content, program requirements, and faculty perceptions of and interaction with students pursuing qualitative research dissertations.
Glesne and Webb noted the irony of basing a study about the teaching of qualitative research methods courses on a quantitatively oriented questionnaire survey. They chose to use questionnaires anyway because of their usefulness in collecting both closed-form and open-ended information from a widespread sample.
Step 2: Selecting a Sample
Once your research objectives or hypotheses are clearly stated, you should identify the target population from which your sample will be selected. (This and other sampling techniques are described in Chapter 6.) If you do not have thorough knowledge of the situation, you might make the mistake of sending your questionnaire to a group that does not have the desired information. For example, a graduate student seeking data on school financial policies sent questionnaires to principals of elementary and secondary schools. Many of the returned questionnaires were incomplete, and few specific facts of the sort wanted were obtained. This questionnaire failed because at that time the school superintendent and district specialists handled most matters concerning school finance. Because the principals who received the questionnaire had little specific knowledge about the topic, they were unable to supply the information requested.
The salience of the questionnaire content to the respondents (i.e., how important or prominent a concern it is to them) affects both the accuracy of the information received and the rate of response. A review of 181 studies using questionnaires judged to be "salient," "possibly salient," or "nonsalient" to the respondents revealed that the return rate averaged 77 percent for the salient studies, 66 percent for those judged possibly salient, and only 42 percent for those judged nonsalient. These findings suggest the need to select a sample for whom your questionnaire items will be highly salient.
In the study by Glesne and Webb, the researchers gained access to a mailing list for the International Journal of Qualitative Studies in Education, which is a major journal publishing qualitative research studies. They then sent a copy of their questionnaire to 360 professors whose names were on the journal's mailing list. The researchers commented:
This was, admittedly, a fishing-net approach. Our assumption was that this readership would include people who teach qualitative research methods courses, although not everyone on the list taught such courses.
Using this admittedly biased sampling approach, they received usable questionnaires from 73 respondents in 37 different states. The sample included 40 men and 33 women. Twenty-five held the title of professor, 28 the title of associate professor, 18 the title of assistant professor, and 2 the title of lecturer.
Step 3: Designing the Questionnaire
Some research questionnaires appear to have been thrown together in an hour or two. The experience of receiving these haphazard questionnaires has led many educators to develop negative attitudes toward the questionnaire as a research approach, and so they deposit them in the recycling box with little more than a quick glance. You will need to overcome these negative attitudes by careful construction and administration of your questionnaire. Figure 8.1 summarizes guidelines for designing questionnaires. These guidelines are based on research findings about factors that influence questionnaire return rate.
Anonymity
In most educational studies, respondents are asked to identify themselves, but anonymity might be necessary if highly personal or threatening information is requested. A questionnaire dealing with sexual behavior, for example, might receive more honest responses if the respondents remain anonymous.
The major problem with anonymous questionnaires is that follow-ups to improve the return rate are impossible. There are several solutions to this problem. One is to create a master code sheet that contains a code for each individual in the sample. The codes are put on the questionnaires. When an individual returns the questionnaire, the researcher can check off that person's name on the master code sheet. After a designated period of time, the researcher can determine which individuals have not returned their questionnaires and send them a new questionnaire.
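As a rough sketch (not part of the text), the master code sheet amounts to simple record keeping; the Python below uses hypothetical names and codes to show how nonrespondents might be flagged for a follow-up mailing.

```python
# Hypothetical master code sheet: each person in the sample is assigned a code,
# and that code is printed on his or her copy of the questionnaire.
master_code_sheet = {"T01": "A. Rivera", "T02": "B. Chen", "T03": "C. Osei"}

# Codes read off the questionnaires that have come back so far.
returned_codes = {"T02"}

# After the designated waiting period, list everyone who still owes a questionnaire.
nonrespondents = [name for code, name in master_code_sheet.items()
                  if code not in returned_codes]
print("Send a new questionnaire to:", nonrespondents)
```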
The method is not completely anonymous, because the researcher can link the questionnaire (which has the code on it) to the individual's name by referring to the master code sheet. For complete anonymity, a variation of this approach can be used. The researcher sends each individual a prepaid postcard along with the questionnaire, to be mailed back separately. The postcard tells the researcher that this individual has completed the questionnaire, but the researcher does not know which of the returned questionnaires belongs to that individual.
Item Form
Writing items for questionnaires (and for interviews, too) may seem straightforward, but it is actually an art form. You need to be able to write succinctly and clearly. This is no easy matter. More importantly, you need to have a good understanding of your respondents so that you can use language that they understand, so that you can obtain all the information you need without exhausting their patience, and so that the items engage their interest and willingness to respond honestly.
A major difficulty in constructing questionnaire items is that educational terms often have multiple meanings. For example, the terms charter school, standards-based education, and teacher empowerment may mean different things depending on the individual educator and the region in which she works. If you use such a term in a questionnaire item, it is highly advisable to include a definition that corresponds to your research objectives. For example, suppose a researcher is interested in educator responses to the charter school movement, not as it is occurring nationally but within the state being studied. Given this objective, the item might read: "The state department of education adopted a statute in 2001 that allows school districts to start charter schools, which are defined as schools that receive district funding but are administered independently, albeit with mandatory conformance to standards of the state department of education. What is the current status of charter schools of this type in your district?"
A questionnaire item can be either closed form, meaning that the question permits only prespecified responses (similar to a multiple-choice question), or open form, meaning that respondents can make any response they wish (similar to an essay question). Which form to use is determined by the objective of the particular question. Evidence on the relative merits of closed and open questions, however, suggests that the two formats produce similar information.
The advantage of designing questions in closed form is that it makes quantification and analysis of results easier. For example, suppose you wish to know the size of each teacher's home town. Probably the least useful way to ask the question is: What is your home town? This question requires that you be able to read each teacher's response and then look it up in an atlas to determine the population. A somewhat better question would be: What is the population of your home town? In this case you could classify the responses into population categories such as those used by the U.S. Census Bureau. A still better approach would be to ask: What is the population of your home town? (Check one), and provide the following response choices:
- rural, unincorporated
- incorporated, under 1,000
- 1,000 to 2,500
- 2,500 to 5,000
- 5,000 to 10,000
- 10,000 to 50,000
- 50,000 to 250,000
- over 250,000
- don't know
This
item requires little effort on your part to analyze the data, and also minimal
effort from the respondents.
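As a brief illustration (not part of the text), closed-form answers map directly onto categories, so analysis reduces to a simple tally; the responses below are hypothetical.

```python
from collections import Counter

# Hypothetical checked responses to the closed-form home-town item.
responses = [
    "rural, unincorporated", "5,000 to 10,000", "10,000 to 50,000",
    "5,000 to 10,000", "over 250,000", "don't know",
]

# Each answer is already a category, so no atlas lookups or recoding are needed.
tally = Counter(responses)
for category, count in tally.most_common():
    print(f"{category}: {count}")
```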
To determine the multiple-choice categories to use in closed-form questions, you can pilot-test the question by asking it in open form of a small number of respondents. Their answers can be used to develop categories for the closed-form item. To allow for unusual responses, an "other" option can be provided.
In the questionnaire study on the teaching of qualitative research, Glesne and Webb began by interviewing several qualitative researchers about their training, teaching, and research. They used the interview information to develop an open-ended pilot questionnaire and sent it to six professors of qualitative research. Feedback indicated that the open-ended questions were interesting but time-consuming. There was a concern that few professors would take the hour or more needed to complete the questionnaire. Based on this feedback, the researchers redesigned the questionnaire into a closed-form format, with open-ended options attached to most items.
Measuring Attitudes
Questionnaires typically contain items, each of which elicits a different bit of information. In effect, each item is a one-item test. This approach is quite satisfactory when you are seeking a specific fact, such as the number of years of full-time teaching experience, the number of wins and losses during a particular football coach's tenure, or the proportion of students failing intermediate algebra. When questions assess attitudes, however, the one-item approach is questionable with respect to both validity and reliability. A questionnaire that measures attitudes generally must be constructed as an attitude scale and must use a substantial number of items (usually at least 10) in order to obtain a reliable assessment of an individual's attitude.
If you are planning to collect information about attitudes, you should first do a search of the research literature to determine whether a scale suitable for your purposes has already been constructed. If a suitable scale is not available, you will need to develop one. Likert scales, which typically ask for the extent of agreement with an attitude item (for example, a five-point scale ranging from "strongly agree" to "strongly disagree"), are a common type of attitude scale.
If you develop an attitude scale for your questionnaire study, you should pilot-test it in order to check its reliability and validity. Also, the pilot test should determine whether individuals in the sample have sufficient knowledge and understanding to express a meaningful opinion about the topic. Otherwise, their responses to the attitude scale will be of questionable value.
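As one way to carry out such a reliability check (a sketch, not a procedure from the text), the pilot responses can be summarized with Cronbach's alpha, a common index of internal consistency; the ratings below are hypothetical.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of Likert ratings."""
    k = item_scores.shape[1]                         # number of attitude items
    item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of total scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: 8 respondents rating 10 items on a 1-5 scale.
# In practice you would substitute the actual pilot-test responses.
rng = np.random.default_rng(0)
pilot_ratings = rng.integers(1, 6, size=(8, 10))
print(f"Cronbach's alpha: {cronbach_alpha(pilot_ratings):.2f}")
```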
One method of dealing with respondents who lack familiarity with a topic is to include a "no opinion" option as one of the response alternatives for each attitude item. Even so, individuals with little or no information about the topic might express an opinion in order to conceal their ignorance, or because they feel social pressure to express a particular opinion. For example, Irving Allen conducted a questionnaire study of respondents' attitudes toward individuals and organizations that were the subject of considerable media attention at the time. The respondents could express a favorable or unfavorable attitude using six Likert-type categories, or they could use a seventh category to express no knowledge of a particular individual or organization. Ten percent of the sample expressed a favorable or unfavorable attitude toward a fictitious organization, about which it was impossible for them to have any knowledge! The subjects responding to the fictitious item were more likely to express attitudes toward the other organizations and individuals listed on the questionnaire than to check the "don't know" category, and to express more favorable attitudes.
As
we stated above, a “no opinion” option for each attitude item might alleviate
the problem identified in Allen’s study. Another strategy is to include several
information questions at the beginning that can be used to screen out
respondents who display little or no knowledge of the topics being studied.