Why It Matters
Common Pitfalls
Core Principles
Case Scenarios
Other
100

Why are surveys important?

  • Gather data directly from your audience  

  • Identify trends, preferences, and needs  

  • Support evidence-based decisions  

  • Measure satisfaction or performance  

  • Cost-effective  

  • Scalable (can reach many people)  

  • Quantifiable results  

  • Anonymous feedback encourages honesty 

100

What is a double-barreled question and what is our recommendation with them?

  • Double-barreled questions ask about more than one thing at once, making it difficult to discern which part the person is answering 

  • We recommend avoiding them 

100

What is one of the first things you should do after a survey is finalized and has IRB approval?

  • Test it out 

  • Conduct a cognitive interview 

100

Your PI has a survey with over 50 questions. What are some recommendations?

  • Revisit the grant or service proposal to remind yourself of the survey focus 

  • Try to stay within scope 

  • Remind the PI about survey fatigue 

  • See which questions you can cut out 

100

What does Table 1 of a report usually depict?

Table 1 in a manuscript (and a report) is typically a table of descriptive statistics describing your sample.
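
A minimal sketch of how such a summary might be pulled together with pandas; the column names (age, gender, satisfaction) are made-up placeholders, not from an actual survey:

```python
# Sketch of a Table 1-style descriptive summary with pandas.
# Column names and values are hypothetical example data.
import pandas as pd

df = pd.DataFrame({
    "age": [34, 41, 29, 53, 38],
    "gender": ["F", "M", "F", "F", "M"],
    "satisfaction": [4, 5, 3, 4, 2],
})

# Continuous variables: mean and standard deviation
print(df[["age", "satisfaction"]].agg(["mean", "std"]).round(2))

# Categorical variables: counts and percentages
print(df["gender"].value_counts())
print(df["gender"].value_counts(normalize=True).mul(100).round(1))
```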

200

Why is data integrity important?

Data-driven decisions are only as strong as the data they are based on

200

What are consequences of poor data integrity?

  • Biased results  

  • Invalid conclusions  

  • Wasted resources 

200

What are some ways to improve User Experience In Surveys?

  • Cognitive interviews 

  • Soft launching 

  • Look and feel 

  • Navigation and logic flow 

200

Describe the difference between qualitative and quantitative tools in research.

  • Surveys – can get general attitudes; breadth, not depth

  • Focus groups & qualitative interviews – better when you want to understand people's perspectives in depth

200

Name a few ways to customize a survey to the participant

Piping

Embedded data

300

What is data integrity?

Definition: Data integrity = accuracy, completeness, consistency, and reliability of data

300

What are common internal threats to survey data integrity? Give two examples.

  • Poor survey design  
      ◦ Leading or confusing questions, inadequate response options, ambiguous language or poor translations  

  • Sampling bias  
      ◦ Using only online surveys, excluding those without internet access, recruiting from clinics that don't serve the full intended population, not translating surveys, only inviting participants from a limited list  

  • Inconsistent data handling  
      ◦ Manual entry errors, improper data coding or cleaning, inconsistent protocols across sites or researchers  

  • Bias from survey administrators or AI  
      ◦ Interviewer influence (in in-person or phone surveys), inconsistent instructions given to participants  

  • Non-standardized procedures  
      ◦ Variability in survey delivery across devices or environments, inconsistent participant recruitment methods  

  • Data fabrication or falsification  
      ◦ Intentional altering or creating of data by researchers or staff  

300

Name a few Programming Techniques That Improve UX

  • Skip and display logic 

  • Piping and embedded data 

  • Real-time validation 

  • Coding 

300

You run a statistical test, and your p-value is 0.03. Are your results statistically significant?

Yes, a p-value of 0.03 is generally considered statistically significant, especially if the chosen significance level (alpha) is 0.05. Statistical significance means the result is unlikely to have occurred by random chance alone, suggesting a real effect or relationship.  
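
As a quick illustration, here is a minimal sketch of comparing a test's p-value against a pre-chosen alpha of 0.05 using scipy; the two groups are invented example data:

```python
# Sketch: run an independent-samples t-test and compare p to alpha.
# group_a and group_b are made-up example scores.
from scipy import stats

group_a = [72, 68, 75, 80, 77, 71]
group_b = [65, 63, 70, 66, 69, 64]

res = stats.ttest_ind(group_a, group_b)
alpha = 0.05
print(f"p = {res.pvalue:.3f}; significant at alpha = {alpha}? {res.pvalue < alpha}")
```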

300

What are examples of multivariate analyses? Please give an example.

  • Looking at 3+ variables together  

  • Regression, logistic regression  

  • For instance: when you have several predictor variables in one regression equation (see the sketch below) 
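
A minimal sketch of such a regression in Python, using statsmodels on simulated data; the variable names (age, visits, satisfaction) are made up for illustration:

```python
# Sketch: multiple regression with several predictors of one outcome.
# All data here are simulated purely for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
age = rng.normal(45, 12, n)
visits = rng.poisson(3, n)
satisfaction = 2.0 + 0.03 * age + 0.4 * visits + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([age, visits]))  # intercept + 2 predictors
model = sm.OLS(satisfaction, X).fit()
print(model.summary())  # coefficients, p-values, R-squared
```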

400

Why does survey user experience matter?

  • Improves completion rates

  • Enhances data quality

  • Reduces survey fatigue and drop-off

  • Can improve efficiency of analysis process

400

What are common external threats to survey data integrity?

  • Nonresponse bias  
      ◦ Systematic differences between those who respond and those who do not; may lead to underrepresentation of key groups  

  • Careless participant responses  
      ◦ Straight-lining or selecting random answers, skimming or rushing through the survey  

  • Fraudulent responses or bots  
      ◦ Automated bots completing surveys, individuals submitting multiple responses to gain incentives  

  • Participant misunderstanding or misreporting  
      ◦ Misinterpretation of survey questions, deliberately providing false information  

  • Link sharing beyond the intended population  
      ◦ Survey link shared publicly or on social media, responses collected from people outside the target sample 

400

What are some preventative strategies to maintain survey data integrity?

  • Internal: soft launch, data checks, validation and logic rules, document and train (see the data-check sketch below) 

  • External: verify participant identity early and often, include a commitment question, consider the best survey platform for the study 

  • External survey fraud: platform security features, required eligibility questions, challenge questions, survey response limits, personalized links, and creating/documenting data set qualifications 
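
A minimal sketch of what automated data checks could look like in pandas; the column names (respondent_id, duration_sec, q1-q5) and the 60-second speed threshold are assumptions for illustration only:

```python
# Sketch: flag duplicate submissions, straight-lining, and very fast completions.
# Column names and the speed cutoff are hypothetical assumptions.
import pandas as pd

df = pd.DataFrame({
    "respondent_id": [101, 102, 102, 104, 105],
    "duration_sec":  [420, 55, 380, 610, 35],
    "q1": [4, 3, 3, 5, 2],
    "q2": [4, 3, 3, 2, 2],
    "q3": [4, 3, 3, 4, 2],
    "q4": [4, 3, 3, 1, 2],
    "q5": [4, 3, 3, 5, 2],
})

scale_items = ["q1", "q2", "q3", "q4", "q5"]
flags = pd.DataFrame(index=df.index)
flags["duplicate_id"] = df["respondent_id"].duplicated(keep=False)
flags["straight_lined"] = df[scale_items].nunique(axis=1) == 1
flags["too_fast"] = df["duration_sec"] < 60  # cutoff chosen for illustration

# Review any row that tripped at least one flag before analysis
print(df.join(flags)[flags.any(axis=1)])
```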

400

What are common pitfalls to avoid during the survey design phase?

  • Poor response options

  • Giving two items the same variable name (a nightmare)

  • Asking more questions than you will have time to analyze

  • Surveys that are too long (high drop-off rates)

  • Skipping pre-testing of the survey items for clarity, or of the online tool to check skip logic

  • Too many complicated questions (e.g., matrix-style questions)

  • Too many free-text questions

400

What are common pitfalls to avoid during the survey build?

  • Default scores/codes for scaled questionnaires

  • Default variable names 

  • Overuse of matrices (5-7 matrix questions on one page)

  • All questions in the survey on the same page (use page breaks/Blocks in Qualtrics and Sections in REDCap)

  • Not providing participants with an estimated time to complete the survey 

  • Excessive use of dropdown menus 

  • Not including an "other" option when necessary, especially for demographic questions 

500

Why is the limitations section of a report important?

  • Acknowledges factors that may limit how findings can be interpreted or generalized, for example: 

  • Generalizability to other populations 

  • Social desirability bias

  • E.g., A pause in survey administration

  • E.g., Fewer fathers than mothers in the sample

  • E.g., No pre-pandemic data with which to compare

500

What should be included in the methods section of the report. Name 5 elements.

  • Overall study plan

  • Study design – what tools or instruments did you use? Describe the information you collected (e.g., "We used the NIH PROMIS measure for Family Relationship, which consisted of four items...." "Participants also provided the following demographic information about their families:…")

  • Recruitment – how did you recruit participants? Eligibility criteria? 

  • Survey administration - How was the survey administered? Platform? Could participants skip questions? 

  • Analysis plan – what analyses will you use? Describe any coding of variables, dichotomizing, etc. 

  • (IRB/ethics statement if manuscript)


500

What are examples of univariate analyses? Please give an example.

  • Looking at only one variable  

  • Means, standard deviations (see the sketch below) 
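
A minimal sketch of these univariate summaries in numpy, using made-up scores:

```python
# Sketch: summarize one variable at a time (made-up example scores).
import numpy as np

scores = np.array([3, 4, 4, 5, 2, 4, 3, 5])
print(f"mean: {scores.mean():.2f}")
print(f"standard deviation: {scores.std(ddof=1):.2f}")  # sample SD
```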

500

What is the survey response rate?

The survey response rate refers to the proportion of participants who completed the survey out of all potential participants who were invited to complete the survey
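
A quick sketch of the arithmetic, with made-up counts:

```python
# Sketch: response rate = completed surveys / invited participants.
# The counts below are invented example numbers.
invited = 500
completed = 185

response_rate = completed / invited
print(f"Response rate: {response_rate:.1%}")  # -> 37.0%
```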

500

What is a chi-square test?

A statistical hypothesis test used to determine if there's a significant difference between the observed and expected frequencies in one or more categories
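
A minimal sketch using scipy's chi2_contingency on a made-up 2x2 table of observed counts; the resulting p-value is then compared to the chosen alpha, as in the p-value question above:

```python
# Sketch: chi-square test of independence on a 2x2 table of observed counts.
# The counts are invented example data.
from scipy.stats import chi2_contingency

#            satisfied  not satisfied
observed = [[45,        15],          # group A
            [30,        30]]          # group B

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```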