SOURCES OF KNOWLEDGE
- Beliefs
- Tradition
- Personal experience
- Logic
- Intuition
- RESEARCH
§
Process for developing knowledge:
-Identify problem
-Conduct empirical studies
-Replicate studies
-Synthesize research
-Adoption and evaluation
§
Process
-Select problem
-Review literature
-Select specific
hypothesis
-Collect data
-Analyze data
-Interpret findings
-State conclusions
§
Characteristics
-Objective
-Precise
-Verifiable
-Explanatory
-Empirical
-Logical
-Probabilistic
§
Limitations
-Human subjects
-Public institutions
-Complexity of
research problem
-Methodological difficulties
§
Functions of Basic
Research
-Concerned with
knowing, explaining,
and predicting
natural and social
phenomena
-Starts with theory,
principle or
generalization
-Tests theories
§
Functions of Applied
Research
-Conducted in the
field
-Deals with
practical problems
§
Functions of
Evaluation Research
-Assesses merit and
worth of particular
practices
RESEARCH DESIGNS
QUANTITATIVE
Experimental:
Researcher
manipulates
independent variable
to investigate
cause-and-effect
relationship between
independent and
dependent variable.
-True experimental
-Quasi-experimental
-Single-subject
Nonexperimental:
Researcher describes
things that have
occurred, examines
relationships
without suggesting
causation, or
explores causal
relationships among
variables that
cannot be
manipulated.
-Descriptive
-Correlational
-Survey
-Ex post facto
QUALITATIVE
Researcher describes
behaviors as they
occur in the natural
environment.
-Concept
-Historical
-Legal
Data Collection
Techniques
Quantitative techniques use numbers to describe or measure the results.
-Structured
observations
-Standardized
interviews
-Tests
-Questionnaires
-Unobtrusive
measures
Qualitative techniques use words to collect the data.
-Ethnographic
observations and
interviews
-Documents
RESEARCH REPORTS
QUANTITATIVE
Abstract
Introduction
Statement of
Research Problem
Review of Literature
Statement of
Research
Hypotheses/Questions
Methodology
Results
Discussion,
Implications,
Conclusions
References
QUALITATIVE
Introduction
Methodology
Findings and
Interpretation
Conclusions
RESEARCH PROBLEMS
SOURCES
-Casual observation
-Deductions from
theory
-Related literature
-Current social and
political issues
-Practical
situations
-Personal experience
SIGNIFICANCE
Determined by whether they:
-Provide knowledge
-Test theories
-Increase
generalizability
-Extend empirical
understandings
-Advance methodology
-Focus on a current issue
-Evaluate specific
practice or policy
-Are exploratory
studies
PROBLEM STATEMENT
Specifies the focus, educational context, importance, and the frameworks for reporting the findings.
Quantitative:
-Use deductive logic
-Identify
population,
variables, and logic
of the problem
-Write statement
clearly and
concisely
-Write statement as
research purpose,
questions, or
hypotheses before
data is collected
-Suggests the design
of the study
-Descriptive
-Relationship
-Difference
Hypotheses should:
-State expected
relationship or
difference between
two or more
variables
-Be testable
-Offer tentative
explanation
Qualitative:
-Use inductive logic
-State problem
initially in
planning for the
study
-Write statement as
research purpose or
questions
-Reformulate problem
statement during
data collection
-Ethnographic
-Historical
-Legal
Evaluate in terms of
specific criteria
related to:
-General research
problem
-Significance of the
problem
-Research questions
and hypotheses in
quantitative
research
-Research questions
in qualitative
research
LITERATURE REVIEW
FUNCTIONS
-Defines and limits
problem
-Places study in
perspective
-Avoids replication
-Selects methods and
measures
-Relates findings to
previous research
-Suggests further research
STANDARDS OF
ADEQUACY
Judged adequate by 3
criteria:
-Selection of
literature
-Criticism of
literature
-Summary and
Interpretation
META-ANALYSIS
Uses statistical
techniques to
synthesize results
of prior
independently
conducted studies
Steps:
-Formulate research
synthesis problem
-Collect data
-Evaluate data
-Analyze and
interpret data
-Public presentation
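The analysis step can be illustrated with a simple fixed-effect model, where each prior study contributes a standardized effect size weighted by the inverse of its variance; the sketch below is a minimal illustration and the study values are hypothetical:

import math

# Fixed-effect pooling of standardized effect sizes (hypothetical studies).
studies = [            # (effect size d, variance of d)
    (0.40, 0.04),
    (0.25, 0.09),
    (0.55, 0.02),
]

weights = [1.0 / var for _, var in studies]   # inverse-variance weights
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))            # standard error of the pooled effect

print(f"pooled effect = {pooled:.2f}, "
      f"95% CI = ({pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f})")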
STEPS IN LITERATURE
REVIEW
-Analyze problem statement
-Search and read secondary literature
-Select appropriate index
-Identify descriptors
-Conduct manual/computer search:
  -analyze research problem
  -determine type of search
  -select database
  -select descriptors
  -conduct literature search
  -analyze printout
-Read relevant primary literature
-Organize notes:
  -abstract articles on index cards
  -organize literature by developing an appropriate classification system
-Write review
Quantitative
Research:
-Organize by
sections
(introduction,
critical review,
summary)
-Organize criticism
by dates,
variables/treatments,
research designs and
methods, general to
closely related
literature, or
combination of these
Qualitative
Research:
-Conduct preliminary
literature review
-Continually review
literature during
data collection and
analysis
-Alternative presentations of literature: (a) separate discussions, (b) integration within the text
DESIGNING
QUANTITATIVE
RESEARCH
PURPOSE OF RESEARCH
DESIGN
To provide a
credible answer to a
research question.
PROCEDURES
Must be presented in
detail and specify:
-when, where, and
how data will be
collected
-experimental
treatment (where
applicable)
-procedures used to
control bias
DATA COLLECTION
TECHNIQUES
Questionnaires
Standardized
Interviews
Tests
Standardized
Observations
Inventories
Rating Scales
Unobtrusive Measures
Basic
Principles Common to
All Methods:
§
Test Validity
Inferences made on
the basis of scores
from an instrument
must be appropriate,
meaningful, and
useful
§
Test Reliability
Refers to
consistency of
measurement
VALIDITY OF DESIGN
§
Internal Validity
Refers to extent of
control over
extraneous variables
§
External Validity
Refers to
generalizability of
results
Two general
categories:
Population external validity
Ecological
external validity
SUBJECTS
Subjects are:
-
individuals who
participate in the
study
-
referred to as the sample
-
selected from a
larger group called
the population
§
Sample Size
Determined by the
type of research,
research hypotheses,
financial
constraints,
importance of
results, number of
variables studied,
methods of data
collection, and
degree of accuracy
needed.
§
Methods of Selection
-
Nonprobability
sampling
-using available
subjects
-
Probability
sampling
-using the following procedures to select an unbiased sample (see the sketch after this list):
-simple random
sampling
-systematic sampling
-stratified random
sampling
-cluster sampling
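A minimal sketch of three of these probability-sampling procedures, assuming a hypothetical numbered sampling frame of 1,000 members:

import random

population = list(range(1, 1001))   # hypothetical sampling frame of 1,000 member IDs
n = 50                              # desired sample size

# Simple random sampling: every member has an equal chance of selection.
simple = random.sample(population, n)

# Systematic sampling: every k-th member after a random start.
k = len(population) // n
start = random.randrange(k)
systematic = population[start::k][:n]

# Stratified random sampling: sample proportionally within each stratum
# (two illustrative strata, e.g., elementary vs. secondary teachers).
strata = {"elementary": population[:600], "secondary": population[600:]}
stratified = []
for name, members in strata.items():
    share = round(n * len(members) / len(population))
    stratified.extend(random.sample(members, share))

# Cluster sampling (not shown) would instead randomly select whole groups,
# such as intact classrooms or schools, and include all of their members.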
DESCRIPTIVE
STATISTICS
Indices that
summarize or
characterize a
larger number of
observations
APPROPRIATE
STATISTICS
determined by:
-
Purpose of the
research
-
Measurement Scale:
-Nominal: numbers represent categories
-Ordinal: numbers indicate rank
-Interval: numbers represent equal intervals
-Ratio: numbers represent equal units from zero
TYPES
-
Measures of Central
Tendency
Each provides a
numerical index of
the typical score in
the distribution
Mean: average of all scores
Median: point that divides the distribution in half
Mode: score that occurs most frequently
Relationship among
mean, median, and
mode:
a) Normal distribution: all three indices are the same
b)
Skewed
distributions: mean
lies closest to
tail, mode lies
furthest from tail,
median lies between
mean and mode
- Measures of Variability
Indicates spread of scores from the mean of the distribution
Range: difference between highest and lowest score
Standard deviation: indicates average variability of scores
Standard scores: have constant normative or relative meaning
- Measures of Relationship
Indicates the relationship between variables
Scatter plot: graphic representation; correlation coefficient: numerical index (see the sketch after this list)
-
Graphic Portrayal:
provides pictorial
representation of
group data
-Frequency distribution: indicates number of times each score was attained
-Histogram & frequency polygon: pictorial display of frequency data
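A minimal sketch of these descriptive indices using Python's standard statistics module (the score and study-hour lists are hypothetical; statistics.correlation needs Python 3.10 or later):

import statistics

scores = [72, 85, 85, 90, 78, 88, 95, 70, 85, 82]   # hypothetical test scores
hours  = [2, 5, 5, 7, 3, 6, 9, 1, 6, 4]             # hypothetical study hours

mean   = statistics.mean(scores)             # average of all scores
median = statistics.median(scores)           # point dividing the distribution in half
mode   = statistics.mode(scores)             # most frequently occurring score

score_range = max(scores) - min(scores)      # range
sd = statistics.stdev(scores)                # standard deviation
z  = [(x - mean) / sd for x in scores]       # standard (z) scores

r = statistics.correlation(scores, hours)    # correlation coefficient
print(mean, median, mode, score_range, round(sd, 2), round(r, 2))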
DATA COLLECTION
TECHNIQUES
TECHNICAL CHARACTERISTICS OF MEASURES USED TO JUDGE OVERALL QUALITY AND APPROPRIATENESS
Validity
Refers to the extent to which inferences made from the results are appropriate and meaningful
Four Components:
-Content-related
-Concurrent
criterion-related
-Predictive
criterion-related
-Construct related
Reliability
Refers to the consistency of measurement
Types:
-Stability
-Equivalence
-Equivalence and
Stability
-Internal
Consistency
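Internal consistency is often estimated with Cronbach's coefficient alpha (a specific index not named in the outline above); a minimal sketch with a hypothetical item-score matrix:

from statistics import pvariance

# Hypothetical responses: 5 respondents x 4 items rated on a 1-5 scale.
items = [
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
]

k = len(items[0])                                              # number of items
item_vars = [pvariance([row[j] for row in items]) for j in range(k)]
total_var = pvariance([sum(row) for row in items])             # variance of total scores

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance)
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(round(alpha, 2))   # values near 1.0 indicate high internal consistency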
TESTS
1. Standardized: provide uniform procedures
2. Norm-referenced: compare individuals to a norming group
3. Aptitude: predict future performance
4. Achievement: measure prior learning
5. Performance assessment: measures proficiency by observing the student perform skills of interest
INVENTORIES
Measure traits such as interests, attitudes, self-concept, values, personality, and beliefs
QUESTIONNAIRES
Are economical, can
assure anonymity,
and permit use of
standardized
questions
-justify use
-define objectives
-write questions and
statements
(items can be scaled, ranked, or have open or closed form)
-decide on general
and item format
-pretest
questionnaire
INTERVIEW SCHEDULES
Oral questions and
answers
-construct interview schedule (questions may be structured, semi-structured, or unstructured)
-pretest questions
-remove or rephrase
leading questions
-consider
characteristics of
interviewer that may
influence responses
-decide on how
responses will be
recorded
UNOBTRUSIVE MEASURES
Provide data that are uninfluenced by subjects' awareness that they are participants
-physical traces
-archives
-simple observation
-contrived
observation
OBSERVATION
SCHEDULES
Recording of
naturally occurring
behavior
-justify
observational method
-define precisely
what will be
observed
-decide how
behaviors will be
recorded
(duration, frequency count, interval recording, continuous
observation, time
sampling)
-train observers
NONEXPERIMENTAL
RESEARCH DESIGNS
DESCRIPTIVE RESEARCH
Concerned with the current state of something.
Investigates changes in subjects over time.
Can be longitudinal or cross-sectional.
CORRELATIONAL
RESEARCH
-
Simple Relationship
Studies:
Correlation
coefficient
calculated from
scores on two
variables.
- Prediction Studies:
-The criterion variable is predicted by a prior behavior.
-Several predictor variables are used to make a more accurate prediction.
-
Interpreting
Correlational
Research:
-Correlation does not imply causation.
-Spurious correlations over- or under-represent the actual relationship between two variables.
-Correlation
coefficient
expresses degree of
covariance between
variables.
-Coefficient of determination expresses common variance between variables (a brief worked example follows).
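A brief worked example of the last two points, using a hypothetical correlation:

# Correlation coefficient (degree of covariation) vs. coefficient of
# determination (proportion of shared variance); the value of r is hypothetical.
r = 0.70            # e.g., correlation between aptitude and achievement scores
r_squared = r ** 2  # coefficient of determination

print(f"r = {r}, r squared = {r_squared:.2f}")
# About 49% of the variance is common to the two variables; the remaining
# 51% is unexplained, and no causal direction is implied either way.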
SURVEY RESEARCH
Uses questionnaires
or interviews to
describe the
characteristics of
populations.
1.
Define purpose
and objectives
2.
Select
resources and target
population
3.
Choose and
develop techniques
for gathering data
4.
Determine
method of sampling
5.
Write letter of
transmittal
6.
Send follow-up
letters to subjects
who have not
responded
7.
Check
nonrespondents
EX POST FACTO
RESEARCH
Investigates whether
pre-existing
conditions caused
differences in
groups.
1.
Formulate
research problem
2.
Identify
plausible rival
hypotheses
3.
Find and select
groups that will be
compared
4.
Collect and
analyze data
including data on
factors that may
constitute rival
hypotheses
EXPERIMENTAL AND
SINGLE-SUBJECT
DESIGNS
CHARACTERISTICS OF
EXPERIMENTAL
RESEARCH
-Statistical
equivalence of
subjects in
different groups.
-Two groups or
conditions that can
be compared are
needed.
-Manipulation
of independent
variable.
-Measurement of
dependent variables
in numerical terms.
-Use of
inferential
statistics.
-Control of
extraneous
variables.
SINGLE-SUBJECT
DESIGNS
-
a) A-B: target
behavior observed
during baseline (A)
and treatment (B)
phases to determine
effect of treatment.
-
b) A-B-A: same as
(a) with addition of
second baseline (A)
phase.
-
c)
Multiple-baseline:
treatment replicated
across two or more
students, behaviors,
or settings.
PRE-EXPERIMENTAL
DESIGNS
-
a. One-group
posttest only:
effect of treatment
given to one group
is observed.
-
b. One-group
pretest-posttest:
group is observed
before and after
implementing
treatment.
-
c. Posttest only
with nonequivalent
groups: similar to
(a) with one
addition. A control
group receives no
treatment or a
different one.
TRUE EXPERIMENTAL
DESIGNS
Subjects are
randomly assigned to
experimental and
control groups.
§
a. Pretest-posttest
control group:
experimental
group(s) receive(s)
pretest, treatment,
posttest; control
group receives
pre-and posttest.
§
b. Posttest only
control group:
experimental
group(s) receive(s)
treatment, posttest;
control group
receives posttest
only.
QUASI-EXPERIMENTAL
DESIGNS
No random assignment
of subjects
§
a. Nonequivalent
pretest-posttest
control group:
experimental group
receives pretest,
treatment, posttest;
control group
receives pre- and
posttest.
§
b. Time-series: one
group of subjects is
measured repeatedly
before and after
treatment.
THREATS TO VALIDITY
-
Threats to Internal
Validity
May include:
history, selection,
statistical
regression,
pretesting,
instrumentation,
subject attrition,
maturation,
diffusion of
treatment,
experimenter
effects, treatment
replications,
subject effects,
statistical
conclusion.
-
Threats to External
Validity
May include two
general categories:
population and
ecological.
STATISTICS
The researcher employs an inferential statistical test to determine the probability of obtaining the observed results if the null hypothesis were true.
The level of significance indicates the maximum acceptable risk of rejecting a true null hypothesis.
-Inferential
Statistics: Are
used to make
inferences about
populations based on
data from samples.
-Probability:
A scientific way of
stating the degree
of confidence in
predicting
something.
-Null Hypothesis:
A statement of no
relationship between
two or more
variables.
-Level of
Confidence:
Expressed as a
decimal e.g., .01,
.05.
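A minimal sketch of the decision rule described above, assuming a conventional .05 level of significance and hypothetical p values:

def decide(p_value: float, alpha: float = 0.05) -> str:
    """Compare a test's p value to the preset level of significance."""
    if p_value <= alpha:
        # The observed result would be unlikely if the null were true,
        # so the null hypothesis of no relationship is rejected.
        return "reject the null hypothesis"
    return "fail to reject the null hypothesis"

print(decide(0.03))   # hypothetical p value: reject
print(decide(0.20))   # hypothetical p value: fail to reject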
STATISTICAL TESTS
NONPARAMETRIC
Statistical
procedures used when
the assumptions
necessary to use
parametric tests are
violated.
-
Chi-Square: Used
with nominal data to
test relationships
between frequency of
observations in
categories of
independent
variables.
-
Median Test
-
Mann-Whitney U Test
-
Sign Test
-
Wilcoxon
matched-pairs
signed-ranks test
-
Kruskal-Wallis one-way ANOVA of ranks
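As one worked nonparametric example, a chi-square test of independence can be run with scipy.stats; the frequency table below is hypothetical:

from scipy.stats import chi2_contingency

# Hypothetical 2 x 2 table of observed frequencies (nominal data):
# rows = program A / program B, columns = pass / fail.
observed = [[30, 10],
            [18, 22]]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")
# p <= .05 would suggest that pass/fail frequencies depend on program type.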
PARAMETRIC
Statistical tests that assume:
-a normally distributed population
-homogeneity of variance
-interval or ratio scale data
- T-Test
Used to compare means of 2 groups to determine the probability that the corresponding population means are different.
-
Independent
Samples T-Test
Used to compare
means of 2 groups
that have no
relationship to each
other.
- Dependent (Paired) Samples T-Test
Used to compare means of 2 groups in which subjects are paired or matched in some way (both forms are sketched below).
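Both forms of the t-test are available in scipy.stats; a minimal sketch with hypothetical score lists:

from scipy.stats import ttest_ind, ttest_rel

group_a = [78, 85, 90, 72, 88, 95, 81]   # hypothetical treatment-group scores
group_b = [70, 80, 75, 68, 82, 77, 74]   # hypothetical control-group scores

# Independent-samples t-test: the two groups are unrelated.
t_ind, p_ind = ttest_ind(group_a, group_b)

pre  = [60, 72, 65, 70, 58, 75, 66]      # hypothetical pretest scores
post = [68, 75, 70, 78, 63, 80, 71]      # posttest scores for the same subjects

# Dependent (paired) samples t-test: subjects are matched or measured twice.
t_rel, p_rel = ttest_rel(pre, post)

print(f"independent: t = {t_ind:.2f}, p = {p_ind:.3f}")
print(f"paired:      t = {t_rel:.2f}, p = {p_rel:.3f}")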
-
Analysis of Variance
(ANOVA)
-
One-way ANOVA:
used to compare
2 or more sample
means on one
independent
variable.
-
Factorial ANOVA:
used to compare
2 or more sample
means on 2 or
more independent
variables.
Two-way or three-way
ANOVA denotes the
exact number of
independent
variables.
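A minimal one-way ANOVA sketch using scipy.stats, comparing three hypothetical groups on a single independent variable (a factorial ANOVA would need a different tool, such as a statsmodels linear model, and is not shown):

from scipy.stats import f_oneway

# Hypothetical scores for three levels of one independent variable
# (e.g., three instructional methods).
method_1 = [82, 75, 90, 85, 78]
method_2 = [70, 68, 74, 72, 65]
method_3 = [88, 92, 85, 95, 90]

f_stat, p = f_oneway(method_1, method_2, method_3)
print(f"F = {f_stat:.2f}, p = {p:.3f}")
# A significant F indicates that at least one group mean differs, but not
# which one; that is the job of the post hoc comparisons described below.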
-
Analysis of
Covariance (ANCOVA)
Two major purposes:
1.
To adjust
initial group
differences
statistically on one
or more variables
that are related to
the dependent
variable but
uncontrolled.
2.
To increase the
likelihood of
finding a
significant
difference between
group means.
- Multivariate Analyses
A family of statistics used when there is more than one independent variable, more than one dependent variable, or both.
- Post Hoc Comparisons
Statistical tests (e.g., Fisher's LSD, Tukey's HSD, Scheffé's test) that are used with pairs of means. Usually conducted after a test of all means together (a worked sketch follows).
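One common post hoc procedure, Tukey's HSD, is available in statsmodels; a hedged sketch reusing the hypothetical groups from the ANOVA example above:

from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical scores and their group labels (same layout as a one-way ANOVA).
scores = [82, 75, 90, 85, 78,
          70, 68, 74, 72, 65,
          88, 92, 85, 95, 90]
groups = ["method_1"] * 5 + ["method_2"] * 5 + ["method_3"] * 5

# Compares every pair of group means while controlling the familywise error rate.
result = pairwise_tukeyhsd(endog=scores, groups=groups, alpha=0.05)
print(result.summary())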
DESIGNING
QUALITATIVE RESEARCH
ETHICS
Ethical principles
are similar to those
of quantitative
research.
PURPOSEFUL SAMPLING
STRATEGIES
-
Site selection
-
Comprehensive
sampling
-
Maximum variation
sampling
-
Network sampling
-
Sampling by case
type
PHASES OF DATA COLLECTION AND ANALYSIS
1.
Planning
2.
Beginning data
collection
3.
Basic data
collection
4.
Closing data
collection
5.
Completion
CASE STUDY DESIGN
Researcher selects
one phenomenon to
understand in depth
-
Purposes:
-
To develop
concept or model
-
To describe and
analyze a
situation,
event, or
process
-
To evaluate a
program
-
To identify
policy issues
-
To contribute to
large scale
research
projects
-
Used as a
precursor to
quantitative
research
INTERNAL VALIDITY
-
Threats include:
-
history
-
maturation
-
observer /
researcher
effects
-
selection
-
attrition
-
alternative
explanations
-
Strategies to
Enhance Internal
Validity
-
lengthy data
collection
period
-
participant language
-
field research
-
disciplined
subjectivity
EXTERNAL VALIDITY
-
Threats are effects
which limit
comparability and
translatability and
include:
-
selection
-
setting
-
history
-
theoretical
RELIABILITY
-
In Design:
Reliability is
enhanced by making
explicit 6 aspects:
-
researcher role
-
informant
selection
-
social context
-
data collection strategies
-
data analysis strategies
-
analytical
premises
-
In Data Collection:
Strategies used to
reduce threats to
reliability:
-
verbatim
accounts
-
low inference
descriptors
-
multiple
researchers
-
mechanically
recorded data
-
participant
researcher
-
member checking
-
participant
review
-
negative cases
ETHNOGRAPHIC
RESEARCH
FORESHADOWED
PROBLEMS
-
Anticipated research
problems which will
be reformulated
during data
collection
-
Reflect the naturalistic discovery orientation and the initial conceptual framework
-
Indicate focus of
data collection
strategies
ENTRY INTO THE FIELD
Involves the
following
1.
Site selection
2.
Mapping the field: social, spatial, and temporal maps
3.
Selection of
interviewers
4.
Choosing the
research role
a.
observer-participant
b.
participant-observer
c.
interviewer
DATA COLLECTION
STRATEGIES
-
PARTICIPANT
OBSERVATION
-
On-site
observation:
researcher is
present in the
field or site
for an extensive
time.
-
Prolonged Data
Collection: data
is collected
until
naturalistic
event ends or is
no longer
relevant.
-
Obtaining people's perceptions of reality expressed in their actions, feelings, thoughts, and beliefs.
-
Corroborating
field
observations.
-
Observing and
recording
phenomena
salient to the
foreshadowed
problems. Use of
field notes and
summary
observations.
-
INTERVIEWING
-
Selecting type
of interview
-
informal
conversational
-
interview guide
approach
-
standardized
open-ended
-
key-informant
-
career and life
history
-
Determining
content of
questions,
writing quality
questions, and
deciding their
sequence
-
Taking into account factors that influence an interview session: duration, number of interviews, settings, identity of the individuals, and informant style.
-
Deciding how responses will be recorded: handwritten, tape recorded, or both.
-
Typing
handwritten
records, or
transcribing
tapes.
-
DOCUMENT AND
ARTIFACT COLLECTION
-
Selecting type
of document or
artifact
-
personal
documents
-
official
documents
-
objects
-
erosion measures
-
Analyzing and
interpreting
documents and
artifact
collection
ANALYTICAL RESEARCH
CHARACTERISTICS
1.
Topics of
analysis:
historical, legal,
policy
2.
Types of
sources: documents,
oral testimonies,
and relics
3.
Search for
facts: requires
locating primary and
secondary sources
4.
Analytical
generalizations and
explanations:
inductive logic
applied to
generalizations to
suggest causal
explanations
5.
Kinds of
analysis:
conceptual,
interpretative,
comparative, and
universal analyses,
edition, descriptive
narration.
USES OF ANALYTICAL RESEARCH
1.
Provides knowledge about and explanation of the past
2.
Clarifies
present legal and
policy discussions
3.
Creates a sense
of common purpose
about education in
the society
TYPES OF ANALYTICAL
RESEARCH
- Concept Analysis
Focuses on the meaning of a concept (e.g., education, literacy, knowledge) by describing the generic meaning, the essential meanings, and the appropriate usage of the concept.
Researcher uses 3
types of analysis:
-generic
-differential
-conditions
-
Educational
Historical and
Policy Events
Focuses on
biographies,
movements,
institutions,
practices, analysis
and distribution of
power, policy-making
processes, and
policy content
changes.
Researcher:
1.
Identifies
topic and develops
problem statement.
2.
Locates primary
and secondary
sources in documents
and oral
testimonies.
3.
Looks at the
relationship between
facts and interprets
them as
generalizations.
4.
Synthesizes generalizations and provides causal explanations or conclusions.
- Legal Research
Focuses on legal issues to discover what the law is in specific situations
Researcher:
1.
Selects a problem in terms of the party/parties, subject matter or property involved, nature of claim, and object or remedy sought.
2.
Locates primary
sources (federal,
state, and local
statutes, and court
decisions), and
secondary sources
(e.g., legal
periodicals,
yearbooks, casebooks
and others).
3.
Uses the case
study design to
analyze statutes and
court decisions,
synthesize primary
and secondary
sources, and to
state a definitive
position on a legal
issue.
QUALITATIVE DATA
ANALYSIS
An inductive process
of organizing data
into categories and
identifying patterns
(relationships)
among categories.
Data analysis
entails several
cyclical phases.
§
Analysis that occurs
during data
collection:
- Discovery Analysis:
Strategies include:
-writing observer
comments and
summaries
-playing with ideas
-exploring the
literature
-using metaphors and
analogies
- Interim Analysis:
Assists in making
data collection
decisions and
identifying emerging
topics and recurring
meanings.
CODING TOPICS AND
CATEGORIES
Typically occurs
after data
collection.
Developing an
organizing system to
divide data into
segments.
§
Steps:
1.
Get a sense of
the whole
2.
Generate topics
from the data
3.
Compare topics for duplication
4.
Try out
provisional
classification
system
5.
Refine the organizing system
Developing topics
into discrete
categories.
§
Predetermined
categories: derived
from research
problem, interview
guide, literature,
and researcher’s
prior knowledge.
§
Emic categories:
represent insider’s
view i.e. terms,
actions, and
explanations that
are distinctive to
the settings or
people.
§
Etic categories:
represent outsider’s
views i.e.
researcher’s
concepts and
scientific
explanations.
PATTERNS
Finding
relationships among
categories.
Techniques for
pattern-seeking:
§
gauging data
trustworthiness
§
using triangulation
§
evaluating
discrepant or
negative evidence
§
ordering categories
for patterns
§
sorting categories
for patterns
§
constructing
integrative diagrams
§
doing logical
cross-analyses
A pattern becomes an
explanation only
when alternative
patterns do not
offer reasonable
explanations central
to the research
problem.
PRESENTATION OF
QUALITATIVE RESULTS
Qualitative studies:
§
Present context and
quotations of
participant language
as data.
§
Are written in a
variety of formats:
detailed reporting,
descriptive-analytical
interpretations, and
abstract theoretical
discussions.
DATA MANAGEMENT
§
Develop data filing
system.
§
Manage data manually (cut-and-file and file-card techniques) or by computer (word processing or text-analysis programs), as sketched below.
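A minimal sketch of the computer-based option, assuming each coded segment is stored as a simple (category, excerpt) record; the codes and excerpts below are hypothetical:

from collections import Counter

# Each record pairs a category code with an interview or field-note excerpt.
coded_segments = [
    ("peer support", "Students said they relied on classmates for help..."),
    ("teacher role", "The teacher circulated and prompted the groups..."),
    ("peer support", "One informant described study groups forming after class..."),
    ("setting",      "The lab was rearranged into clusters of four desks..."),
]

# Tally how often each category appears, and pull every excerpt for one category.
frequencies = Counter(code for code, _ in coded_segments)
peer_support = [text for code, text in coded_segments if code == "peer support"]

print(frequencies.most_common())
print(len(peer_support), "segments filed under 'peer support'")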
EVALUATION RESEARCH
PURPOSES OF
EVALUATION
-
Formative:
evaluation designed
and used to improve
a practice in the
early stages of
development.
-
Summative:
evaluation designed
to determine the
merit, worth, or
both of a developed
practice, and to
make recommendations
regarding its use.
EVALUATION
APPROACHES
-
Objectives-oriented:
determines degree to
which objectives of
a practice are
attained by a target
group.
-
Decision-oriented:
supplies information
for needs
assessment, program
planning, program
implementation or
outcomes.
-
Naturalistic and
participant-oriented:
uses multimethods to
provide an
understanding of the
divergent values of
a practice from the
participants’
perspectives.
CRITERIA USED TO
JUDGE QUALITY
-
Utility: does the
evaluation serve the
needs of a given
audience?
-
Feasibility: is the
evaluation
realistic, frugal,
and diplomatic?
-
Propriety: has the
evaluation been
conducted legally
and ethically?
-
Accuracy: does the
evaluation provide
accurate information
about the practices
studied?
POTENTIAL BENEFITS
-
Systematic
implementation of
school improvements
-
Cost analyses of
large expenditures
-
Assessment of
educational effects
on students
-
Appraisal of the
quality of education
-
Reduction of
uncertainty in
innovative practices
-
Legitimization of
decisions
-
Enlightenment of
influentials in
decision and policy
arenas to better
anticipate program
and policy issues
LIMITATIONS
-
Failure of studies
to improve
educational
practices and
educational policy
formulation.
-
Failure to
appreciate that
research is only one
of many influences
on educational
policies, practices,
and decisions.
POLICY ANALYSIS
PERSPECTIVE
-
Central concept is
choice
-
Uses two approaches:
-
Macro: based on economic and system models
-
Micro: incremental, activist, field oriented, and eclectic
CHARACTERISTICS
-
Multidimensional in
focus
-
Uses an
empirico-inductive
research orientation
-
Incorporates past
and future
-
Responds to study
users
-
Incorporates values
METHODS
-
Focused synthesis
-
Secondary analysis
-
Field experiments
-
Qualitative
interviews
-
Surveys
-
Case studies
TYPES OF POLICY ANALYSIS STUDIES
-
Cost analysis:
-
cost benefit
-
cost
effectiveness
-
cost utility
-
cost feasibility
-
Indicator systems.
Functions:
-
provides
information
about the
operation of a
program
-
determines
success of a
program
-
suggest areas of
further study
-
accountability
-
Case Studies
-
multisite
studies
-
critical
ethnography
-
eclectic case
studies
GUIDELINES FOR
RESEARCH PROPOSALS
FORMS OF RESEARCH
COMMUNICATION
-
Research proposal
-
Thesis or
dissertation
-
Journal article
-
Evaluation and
technical report
-
Paper presentations
QUANTITATIVE
RESEARCH PROPOSALS
I.
Introduction
a.
General
statement of the
problem
b.
Review of the
literature
c.
Specific
research question
and/or hypotheses
d.
Significance of
the proposed study
II.
Design and
Methodology
a.
Subjects
b.
Instrumentation
c.
Procedures
d.
Data Analysis
and Presentation
e.
Limitations of
the Design
III.
References
IV.
Appendices
QUALITATIVE RESEARCH
PROPOSALS
Ethnographic
I. Introduction
a.
General
statement of the
problem
b.
Preliminary
literature review
c.
Foreshadowed
Problems
d.
Significance of
the proposed study
II. Design and
Methodology
a.
Site or social
network selection
b.
Research role
c.
Purposeful sampling strategies
III. References or
Bibliography
IV. Appendices
Historical and Legal
I. Introduction
a.
General
statement of the
problem
b.
Preliminary
literature review
c.
Specific historical research questions or legal issues
d.
Significance of
the proposed study
II. Design and
Methodology
a.
Case study
design
b.
Sources:
search, selection
and criticism
c.
Inductive data
analysis
d.
Limitations of
design
III. References or
Bibliography
IV. Appendices
COMMON WEAKNESSES
-
Problem is trivial
and not delimited.
-
Objectives of the
study are too
general.
-
Methodology is
lacking in detail
appropriate for the
study.