Basic
That's Valid
Thank you, I designed it
Here's the plan
Askin all them questions
100

Name 3 types of Evaluation

Formative, Process, Outcome, Economic, Impact, Summative


100

Provide an example of History Bias

An event, not related to the intervention, occurs and influences the outcomes.

The event may be:

Exposure to competing intervention 

Exposure to similar information provided through media 

A political/social/community event

A natural disaster

Selection differences between participants in the intervention and comparison groups can also lead to differences in exposure to, or impact of, historical events

100

Define the term "counterfactual"


What would have happened to intervention group participants in the absence of the intervention; in practice, it is estimated by the control group.

100

Give an example of convenience sampling

Approaching students in Tidewater to complete a survey, interviewing public health professionals during a break at a workshop, or asking clients seeking services at your law firm to answer questions about their experience with the criminal justice system.

100

What are the components of Reach?

Recruitment, Refusal, Retention, Attrition

200

List 3 strategies to keep stakeholders involved

Keep a line of communication open

Clearly define tasks and timelines

Engage them in tasks they are qualified for

Offer incentives when stakeholders are not professionals


200

Provide 2 examples of measurement bias

Instrumentation: reliability of the instrument changes; bias associated with data collection methods (type of methods used, social desirability, recall bias)

Testing: when the same instrument is used multiple times, respondents' scores improve because they are able to recall information from the pretest

Researcher effects: quality of data collection changes due to interviewer fatigue/experience; interviewer age/style influence responses; different interviewers/observers for the control and experimental groups

200

Provide a definition or an example of a switching replication randomized control trial.

Consists of two groups (experimental & control) and three waves of measurement.

In the first phase, both groups are pre-tested; the experimental group then receives the intervention, and both groups are post-tested.

In the second phase, the control group receives the intervention and is post-tested.
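The layout above can be sketched with the X/O notation used elsewhere in this game (O = observation/measurement, X = intervention); the data structure below is purely illustrative:

```python
# Switching replication design: two groups, three waves of measurement.
# O = observation/measurement, X = intervention; time runs left to right.
design = {
    "experimental": ["O", "X", "O", "O"],  # pre-test, intervention, post-test 1, post-test 2
    "control":      ["O", "O", "X", "O"],  # receives the intervention in the second phase
}

for group, timeline in design.items():
    print(f"{group:>12}: {' '.join(timeline)}")
```

Note that both groups end up with three O's (the three waves); only the timing of the X differs.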

200

Define the term "study population"

The group of people that meets the eligibility criteria and to whom the results apply 

200

List one advantage and one disadvantage of using an existing standardized survey

Advantage: already piloted, used, and analyzed; reliability and validity established; allows for comparison with other studies

Disadvantage: may not be culturally appropriate; may include many questions that are not useful for the purpose of your study

300

List 2 pros and 2 cons of hiring an Internal Evaluator

Pros:

more knowledgeable about program and context

more program staff and community trust

cheaper

Cons:

less objectivity

potentially less expertise with evaluation methods

potential conflicts of interest

300

What should you do to prevent Selection Bias?

Use a control group; match study and control people as closely as possible

Random assignment; do not allow self-assignment

Large sample

Ensure a complete sampling frame

300

What is a characteristic of true experiments that is lacking in quasi-experiments? 

A high degree of control; the ability to assign participants randomly to conditions

300

Give an example of a question that is addressed through process evaluation

Is the program being implemented the way it was planned? Did we do what we said we were going to do/planned to do?


300

Give an example of an indicator for dose delivered, include the numerator and denominator that you would use

% of training sessions delivered over the total number of training sessions expected to be delivered

% of demonstration exercises implemented over the total number of exercises expected to be delivered

% of flyers/handouts distributed over the total number expected to be distributed
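As a quick arithmetic sketch of one such indicator (the counts here are made up for illustration; only the indicator definition comes from the answer above):

```python
# Hypothetical counts: 9 of 12 planned training sessions were actually held.
sessions_delivered = 9
sessions_expected = 12

# Dose delivered = delivered count over expected count, as a percentage.
dose_delivered_pct = 100 * sessions_delivered / sessions_expected
print(f"Dose delivered: {dose_delivered_pct:.0f}%")  # 9/12 -> 75%
```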

400

What is the purpose of the Evaluability Assessment?

Determine whether a program evaluation is plausible

Assess the views of the stakeholders regarding the program goals, objectives, & performance criteria

Prevent evaluation resources from being spent when it is premature or unfeasible

Determine evaluation readiness and decide what type of evaluation is needed

Assess if the resources are sufficient

400

Provide an example of maturation bias

A change occurs in participants as a result of maturation or the passage of time (e.g., teens aging and becoming better educated)

400

In some circumstances, elements of an intervention may need to be adapted. Provide 1 example of an acceptable adaptation and 1 example of a risky adaptation.

Acceptable: Changing language, translating and/or modifying vocabulary, replacing illustrations to make them more culturally appropriate, modifying some aspects of activities such as physical contact

Risky: reducing the number/length of sessions or how long participants are involved, lowering the level of participants’ engagement, eliminating key messages or skills, changing or not complying with the theoretical approach, using staff insufficiently trained or unqualified, using fewer staff members than recommended

400

What is the purpose of conducting a formative evaluation?

Evaluate the nature of the problem

Help determine the target population and their specific needs

Ensure intervention is culturally appropriate

Pre test instruments and materials

Determine the measurement procedure

400

Provide an example of a health outcome for the scenario below:

fitLIFE is an obesity prevention intervention program that specifically addresses childhood obesity in NOLA.

The intervention includes group workout sessions, tailored nutrition education, and development of a personal readiness to change plan. Participants' progress is assessed through a weekly zoom call throughout the school year.

Reduce the rate of adverse health outcomes associated with obesity among youth in New Orleans.

500

Why is it important to include Stakeholders?

Increase buy-in

Foster trust and facilitate the evaluation process

Foster strategic thinking

Impact the quality and use of findings

Increase evaluation capacity

Bring diverse perspectives and expertise

500

Provide an example of a design issue that may threaten internal validity 

Altering the theory of change, altering the quantity or quality of activities, and/or altering the program length

500

Describe a context where RCT is not a desirable design

The intervention has been previously tested and shown to be effective

The sample size is too small to randomize efficiently

The program was not planned as an RCT

People were not randomly assigned to the experimental/control groups

The program was altered during the course of the experiment

500

In outcome evaluation, what symbols are used to represent the intervention and an observation/measurement?

Intervention = X

Observation/measurement = O
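These symbols are conventionally written in uppercase, one line per group with time running left to right. As an illustrative sketch (hypothetical design, not from the answer above), a classic pretest-posttest control-group design looks like this:

```python
# X = intervention, O = observation/measurement; "R" is often added
# to mark random assignment. One string per group, time left to right.
pretest_posttest_design = [
    "R O X O",  # experimental group: pre-test, intervention, post-test
    "R O   O",  # control group: pre-test and post-test only
]

for line in pretest_posttest_design:
    print(line)
```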

500

Provide an example of a process indicator used to assess Reach, include the numerator and denominator.

Recruitment: # of eligible participants recruited / total eligible expected

Refusal: # of eligible participants who refused to participate / total asked to participate

Attrition: # of recruited participants who dropped out of the program / total recruited
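A minimal arithmetic sketch of those three rates, with made-up counts (and assuming, for simplicity, that everyone asked to participate either enrolled or refused):

```python
# Hypothetical counts for illustration only.
eligible_expected = 200
recruited = 150
refused = 30
dropped_out = 20

recruitment_rate = 100 * recruited / eligible_expected  # of those expected, how many enrolled
refusal_rate = 100 * refused / (recruited + refused)    # denominator: total asked to participate
attrition_rate = 100 * dropped_out / recruited          # of those recruited, how many dropped out

print(f"Recruitment: {recruitment_rate:.1f}%")
print(f"Refusal: {refusal_rate:.1f}%")
print(f"Attrition: {attrition_rate:.1f}%")
```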
