Introduction to Assessment
Assessment Design
Developing Outcomes
Data Collection Methods
Odds and Ends
100
What online tool does our division use to track and manage department-level assessment plans?
• TaskStream
100
Statement that defines what the program or process should do, accomplish, or achieve for its own improvement.
• Operational Outcome
100
Indicates what a participant will know, think, or be able to do as a result of an event, activity, program, etc.
• Learning Outcome
100
Complete Annemieke’s favorite quote: “_______ with the end in mind”
• Design (or Begin)
100
Connecting outcomes to the bigger picture, complete the phrase: “Develop from the bottom to the top, deliver from __________”
• Top to bottom
200
State two reasons why we conduct assessment
• Demonstration of effectiveness
• Understand if we are meeting goals
• Accreditation requirements
• Validating, understanding, improving our work
• Rationale for decision making
• Support of a division, university, or department strategic plan
200
Name two reasons why comprehensive assessment plans are important to a department.
• Shared departmental vision
• Institutional memory
• Demonstrates the assessment process to external audiences
• Roadmap for the completion of assessment activities
200
You added a customer service training for staff so that 90% of parents will be satisfied with their interaction with Financial Services during Welcome Week. This is an example of what type of outcome?
• Operational (or Program) Outcome
200
This type of data analysis focuses on numbers and answers the questions who, what, where, and when.
• Quantitative
200
This type of data analysis focuses on text/narrative and answers the questions why and how.
• Qualitative
300
Collecting and interpreting information to be used in the planning and program improvement process: is this assessment or research?
• Assessment
300
What are the two primary types of outcomes?
• Operational Outcomes • Student Learning Outcomes
300
State three common challenges in developing outcomes
• Too vast/complex
• Too wordy (must be specific and clear)
• Multiple outcomes/audiences/verbs in one statement (very difficult)
• Not specific enough
• Not measurable
300
Define the difference between a sample and a population
• Population: the whole group, when the survey goes to the entire group
• Sample: a subsection of that group, when the survey goes out to 20% of campus
300
Chart of action verbs helpful in constructing outcomes, with categories including knowledge, comprehension, application, analysis, synthesis, and evaluation.
• Bloom’s Taxonomy
400
**Image Question** What step is missing from the assessment cycle?
• Use findings/results to plan for improvement
400
**Image Question** What outcome set is missing from the chart of Division/RIT Outcomes Sets?
• Student Affairs Strategic Plan
400
What are the 3 M’s of outcomes?
• Meaningful: how does the outcome support the departmental mission or goal?
• Manageable: what is needed to foster the achievement of the outcome? Is it realistic?
• Measurable: how will you know if the outcome has been achieved? What will be the assessment method? Is it most appropriate?
400
State the difference between a direct and indirect method
• Direct: any process employed to gather data which requires subjects to display their knowledge, behavior, or thought process
• Indirect: any process employed to gather data which asks subjects to reflect upon their knowledge, behaviors, or thought processes
400
Name three cons of a focus group
• Facilitation requires skill/practice
• Results are not generalizable
• Time needed for training and analysis
• Lack of control over discussion
• Groups can influence individual responses
• Challenge to get people to attend
500
Satisfaction, needs, and benchmarking are three kinds of assessment projects. What is one more?
• Outcomes • Utilization/Tracking
500
**Image Question** What step is missing from this process of outcomes assessment planning?
• Systematically gather, analyze, and interpret evidence
500
What do A, B, C, & D stand for in the ABCD Structure of a Learning Outcome?
• Audience/Who • Behavior/What • Condition/What • Degree/How Much
500
State 5 of the 10 most common methods discussed in the workshop
• Existing data
• Survey
• Rubric
• Focus Group/Interview
• Portfolio
• Observation
• Document analysis
• One-minute/quick assessment
• Visual methods
• Case study
500
What is the value of examining past assessment data?
• Beginning with the end in mind
• Gain an understanding of what data was used and not used, and why
• Discover what data was useful
• Find problems or things you would change (wording, order of questions, missing questions, results that didn’t make sense or were difficult to analyze)
• Prevent survey fatigue (don’t survey for information you already have)
• Revisit feedback