Evaluating Teaching Project


From which voice are you collecting your feedback?  

  • Beliefs about and dispositions towards disciplines
  • Understandings of the nature of what it means to learn/know content in those disciplines
  • Student self-efficacy in a given discipline or in general
  • How students feel/what they believe about the discipline.
    • People are either born with the ability to write novels or not.
    • Remembering dates and names is one of the most important parts of studying history.
  • How students see themselves in relation to the discipline (ability to do the work, motivation to learn, curiosity about a topic, etc.).
    • I can figure out most math problems.
    • Psychology will help me understand others.
  • Students with positive dispositions toward a subject and/or high sense of ability to do well in a subject will persist at higher rates, which can lead to improved performance.
  • Scores on surveys.
  • Scores disaggregated by student characteristics such as gender, major, race, etc.
  • Aggregate scores on related questions such as “beliefs about own ability” or “beliefs about how people view me”, etc. (a brief analysis sketch follows this list)
  • Synopsis of students’ survey results with implications for student learning.
  • Take the survey along with your students and reflect on your results (e.g., describe how your beliefs might affect your instructional approach). (This would then serve as a SELF data source as well as a STUDENT one.)
  • Summary and analysis*
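If the survey responses are exported to a spreadsheet, the evidence above (per-question averages, disaggregated scores, and aggregate scale scores) can be produced with a few lines of Python/pandas. This is a minimal sketch only; the file name and column names (beliefs_survey.csv, q1–q3, gender) are hypothetical placeholders for your own Qualtrics or Canvas export.

```python
import pandas as pd

# Hypothetical export of Likert-style responses (1-5), one row per student.
# The file and column names are placeholders; match them to your own export.
df = pd.read_csv("beliefs_survey.csv")

items = ["q1", "q2", "q3"]  # related items, e.g. "beliefs about own ability"

print(df[items].mean())                    # average score per question
print(df.groupby("gender")[items].mean())  # disaggregated by a characteristic

# Aggregate scale score per student, then its overall summary statistics.
df["ability_beliefs"] = df[items].mean(axis=1)
print(df["ability_beliefs"].describe())
```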
  • Allow instructors to collect idiosyncratic data about a class
  • Is targeted and can be as short as a few questions
  • Can be given at any point in the semester
  • Can be used multiple times in a semester to monitor changes
  • How engaged/interested are my students during class, such as:
    • Comfort level asking questions in class
    • Interest in/curiosity about a topic
    • Sense of belonging
  • Questions about students' lives and study habits:
    • Reason for pursuing degree/taking course
    • Number of hours working per week
    • Time set aside for coursework per week
  • Students who feel a sense of identity and/or belonging in a class are more likely to engage with the faculty/other students and are less likely to drop out. Knowing students’ expectations of time commitment and/or what learning looks like in a class can help faculty address those expectations.
  • Average scores for individual questions.
  • Average aggregate scores for group(s) of related questions.
  • Collections of free-response answers organized or coded using themes.
  • Outline results that you will use to guide changes in your instruction.
  • Interpretation of your ratings, reflecting on both your strengths and items that you feel you can improve.
  • Graphical representation of increases in and/or types of positive responses.
  • Summary and analysis*
  • An instructor teaching Introductory Biology is curious why students are not participating in class discussions or asking questions.
    • To probe this issue, she develops a short, 2-item survey on Qualtrics
    • The survey consists of one closed-ended item that lists various reasons why she thinks students might not be participating and another item that lets students enter their reason(s) in their own words
    • Based on responses, she adjusts her instructional design.
    • Several weeks later, she gives the survey again.
  • Convene a small group of students (ideally 5-8)
  • Ask questions specific to instructional design and/or assignment efficacy
  • Ask open questions about class experience
  • How do students feel about the progress they have made in the course?
  • What do students like best about the way the course is designed?
  • How would students change the course design and/or assignments to improve their work?
  • Feedback on aspects of the course that are not included in SPOTs (Student Perceptions of Teaching surveys).
  • Focus groups that include questions that are directly relevant to a course can yield course and discipline-specific insights that go beyond the scope of standardized evaluation questionnaires (Fife, 2007).
  • Notes/quotes from the focus group session(s) with names and identifying features deleted.
  • Analysis and/or report from the person guiding the focus groups.
  • Video of the focus group (if permission has been granted by all of the participants).
  • Transcript analysis for themes.
  • Excerpts from the group conversation and the facilitator’s notes/report.
  • Written synopsis of themes and/or areas for improvement along with plans to adjust instructional design.
  • Summary and analysis*
  • An example of the type of data a focus group might yield is available from the University of Illinois at Urbana-Champaign.
  • Can be administered by CAT or a peer
  • Questions generally focus on what is working and what is not.
  • Can include questions about content such as “how would you rate your understanding of….”
  • Feedback from students is anonymous but guided to get useful information.
  • Person conducting the mid-semester feedback (MSF) can synthesize the results for the faculty member. (This would then serve as a PEER data source as well.)
  • How do students perceive various aspects of the course and instruction, such as:
    • Comfort with the pace or workload
    • Utility of course resources (e.g., syllabi, Canvas module)
    • Their progress in mastering course content
    • Clarity when introducing new concepts
    • Relating new material to existing knowledge and real-world issues
  • Instructors can use the mid-semester feedback to make meaningful adjustments to course content and/or instructional design that semester. This can also increase students’ sense of belonging to a learning community.
  • Written synopsis of themes and/or areas for change.
  • Brief descriptions of student conceptions of key curricular ideas.
  • Synopsis of student perceptions of what is working to help them learn and what is not.
  • Outline findings of student difficulties and explain what changes were/will be made and why they will facilitate understanding.
  • Discuss student suggestions for improving the class and address whether and how they will be implemented.
  • If you reject suggestions, discuss why, along with an outline for improving students’ understanding of the rationale behind your instructional practices or design.
  • Summary and analysis*
  • Student comments
  • Let students know what useful feedback looks like (to elicit more and better responses)
  • Can be used to identify effective practices in the class.
  • Note: Sharing qualitative SPOTs responses with a peer may help faculty look at the overall picture rather than focusing on the more extreme comments. (This would then serve as a PEER data source as well.)
  • How do students feel about the class in their own words?
  • Are there common elements of the class / across classes that show up in student comments?
  • Insight can be gained on whether students are able to follow the flow of the class, find quizzes/drafts helpful for larger assignments, or whether students can keep up with class notes.
  • Themes that surface through coding of student responses. (Although coding qualitative responses is a research tool, it can be done less formally to provide faculty useful information.)
  • Excerpts of comments along the same theme (even across sections/courses).
  • Counts of responses that fall into categories that you find valuable. For example, bin comments on homework into “too much”, “just right”, “too hard”, “essential to passing the class”, etc. (a brief counting sketch follows this list).
  • Share excerpts or themes along with plans to expand on aspects of your teaching that are effective, and ideas to explore areas that can be improved.
  • Explore connections to student responses to the Likert-scale questions.
  • Summary and analysis*
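As a rough illustration of the binning idea above, a short Python script can count comments that mention phrases you associate with each category. This is a sketch only; it assumes one comment per line in a hypothetical export file (spots_comments.txt), the categories and keywords are invented examples, and keyword matching is no substitute for actually reading the comments.

```python
# Hypothetical categories and keywords for binning open-ended comments.
bins = {
    "too much": ["too much", "overwhelming", "excessive"],
    "just right": ["just right", "fair", "reasonable"],
    "essential": ["essential", "helped me pass", "necessary"],
}

counts = {label: 0 for label in bins}
with open("spots_comments.txt", encoding="utf-8") as f:
    for comment in f:  # assumes one comment per line
        text = comment.lower()
        for label, keywords in bins.items():
            if any(kw in text for kw in keywords):
                counts[label] += 1

print(counts)
```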
  • Increase response rates by telling students about changes you have made based on student input.
  • Take the SPOTs yourself (prior to seeing your students’ responses). Reconcile your responses with those of your students. (This would then serve as a SELF data source as well.)
  • Did changes to the instructional design of the class work?
  • How does the new curriculum (course materials, textbook, etc.) affect student perceptions of the class?
  • Are there discrepancies between students’ perceptions of the class and my own?
  • Knowing how students perceive their class experience can guide faculty to strengths they want to emphasize.
  • SPOTs results by individual question or groups of connected questions.
  • Changes in response rates.
  • Changes in responses to a particular question, or averages for a category of questions (a brief tracking sketch follows this list).
  • Annotated SPOTs results with thoughts about the information they provide. Instead of defending against criticism, try to explain why students may have responded as they did.
  • Share some of the responses (or excerpts) on the item(s) that you focused on as well as changes that you may have or plan to undertake.
  • Summary and analysis*
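To track the semester-to-semester changes described above, per-question averages can be keyed into a small table and compared. A minimal Python/pandas sketch; the question labels and scores are made up purely for illustration.

```python
import pandas as pd

# Hypothetical per-question SPOTs averages (5-point scale) for two semesters.
results = pd.DataFrame(
    {"fall": [4.1, 3.2, 3.8], "spring": [4.3, 3.9, 3.7]},
    index=["clarity", "feedback", "workload"],
)

# Change per question after adjustments made between semesters.
results["change"] = results["spring"] - results["fall"]
print(results.sort_values("change", ascending=False))
```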
  • Self-regulated learning (SRL) is an individual’s influence, orientation, and control over his/her own learning behaviors.
  • It has been correlated with academic success.
  • Helping students refine their SRL skills can lead to considerable achievement gains (Hudesman et al., 2013).
  • Are students confident in the knowledge that they gained in this class?
  • Do my students know when and/or how to acquire knowledge that they need to be successful in this class?
  • Identifying discrepancies between students’ perceptions of their learning and their actual learning, as measured by exams, papers, etc., can help instructors incorporate learning skills or clarify expectations.
  • Results of the self-regulation survey/scale.
  • Patterns in the results or especially surprising outcomes.
  • Synthesis of the results.
  • Salient changes from one administration to the next.
  • Changes you plan to make in light of the findings to enhance students’ self-regulation in future terms.
  • Summary and analysis*
  • Culminating assessments such as comprehensive exams, final papers, research projects, and performances
  • In order to make decisions based on culminating assessments, faculty can confirm the efficacy of the assignment design by reconciling results of the assessment with student focus groups and/or peer feedback. (This would then serve as a PEER or another STUDENT data source as well.)
  • Have my students met the learning outcomes that I have determined are most important for this course?
  • Do the instructional design and/or course materials do what I want them to?
  • How has an instructional / curricular change impacted student success in the course?
  • Students’ performance on assessments can serve as an evidence-based guide for modifying instructional/materials design, clarifying expectations, or even adjusting the scope and sequencing of course content for subsequent courses.
  • Distribution of scores on a specific question or part of a rubric along with notes on student work.
  • Class average by semester on question(s) measuring a course learning goal.
  • Annotated samples of exemplary student work.
  • Rubrics used to assess student work.
  • Summary of student attainment/progress toward the identified key learning goal(s).
  • Graphical representation of student scores for a given skill/topic over several assessments (a brief plotting sketch follows the examples below).
  • Excerpts from student work and associated rubric.
  • Summary and analysis*
  • Final Exam
  • Capstone Paper
  • End-of-semester performance
  • Project
  • Presentation
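For the graphical representation mentioned above, a simple line plot of class averages for one targeted skill across assessments is often enough. A minimal Python/matplotlib sketch; the assessment labels and scores are hypothetical placeholders for your own rubric data.

```python
import matplotlib.pyplot as plt

# Hypothetical class-average rubric scores for one skill across assessments.
assessments = ["Quiz 1", "Midterm", "Quiz 2", "Final"]
avg_scores = [62, 70, 74, 81]  # percent of rubric points earned

plt.plot(assessments, avg_scores, marker="o")
plt.ylim(0, 100)
plt.ylabel("Class average (% of rubric points)")
plt.title("Progress on one targeted skill")
plt.savefig("skill_progress.png", dpi=150)
```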
  • At the start and end of the semester for major course learning outcomes.
  • Before and after a module to check student progress and/or identify content that needs more work.
  • Ask pre- and post-questions about how students think about gaining knowledge in the discipline, e.g., “what does it mean to think like a _______”
  • Use concept inventories to measure whether students have an operational understanding of foundational concepts. These are standardized so you can compare student groups. STEM disciplines are further along with widely used concept inventories.
  • What are my students' learning gains throughout the semester?
  • Have my students become better learners in the discipline?
  • How do my students’ learning gains compare to other student populations?
  • Students' performance on pre-tests can be used to gauge students' incoming level of knowledge or competency to plan for instruction. Students’ learning gains can also be used to gauge how instruction impacts students with different starting points.
  • Distribution and averages of class pre- and post-test scores.
  • Gains on pre-/post-assessments (a normalized-gain sketch follows this list).
  • Isolate scores on questions intended to measure important learning goals or concepts.
  • Comparison of gains by semester after adjustments to course design and/or instructional methods.
  • Present results at a workshop or conference.
  • Summarize effectiveness of course design and instruction using student learning gains.
  • Summary and analysis*
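One common way to summarize pre/post gains, particularly with concept inventories, is the normalized gain, g = (post − pre) / (100 − pre), i.e., the fraction of the possible improvement that was achieved (Hake, 1998). A minimal Python sketch, assuming percentage scores; the score pairs are hypothetical.

```python
# Normalized gain: fraction of the possible improvement achieved.
# Assumes percentage scores (0-100); undefined when pre == 100.
def normalized_gain(pre: float, post: float) -> float:
    return (post - pre) / (100.0 - pre)

pairs = [(45.0, 70.0), (60.0, 85.0), (30.0, 55.0)]  # hypothetical (pre, post)

# Average of per-student gains; Hake's original <g> uses class-average scores.
gains = [normalized_gain(pre, post) for pre, post in pairs]
print(f"average normalized gain: {sum(gains) / len(gains):.2f}")
```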
  • Brief, non-graded or graded, in-class/online activities give faculty and students real-time feedback on the teaching-learning process.
  • Anonymous CATs put the focus on learning the content.
  • Are my students understanding the material being presented during class?
  • Is this new instructional practice or activity effective?
  • Did my students understand a particularly nuanced or commonly misunderstood concept?
  • Formative assessments like CATs are more like a check-in to help faculty and students find out if/how much progress is being made toward learning goals.
  • Screenshots of results from iClicker questions.
  • Notes on what the CAT results tell you about your students’ learning, and what changes you decided to make, if any.
  • Synthesis of the results and your response.
  • Summary and analysis*
  • Transparency of purpose, task and criteria help students know what they are learning, why, and how they can demonstrate it.
  • Peers can provide feedback on the level of transparency of an assignment’s purpose, tasks, and evaluation criteria.
  • See Winkelmes’ (2013) Transparency in Learning and Teaching (TILT) website for in-depth transparent assignment guidelines and a template.
  • Are connections between a significant assignment and intended career apparent in the assignment design/instructions?
  • Are connections between current course materials and work in subsequent courses clear?
  • Are the expectations of the type of work / level of rigor clear in an assignment?
  • Research on the effectiveness of transparency in assignments indicates a correlation with better performance for first-generation, low-income, and underrepresented college students.
  • Peer’s written version of what they think the assignment is asking for and why.
  • Peer’s attempt at the assignment
  • Brief discussion on the clarity of purpose, tasks, and evaluation criteria
  • Motivation or affect survey
  • Notes from the review process
  • Describe how your peer reviewed the assignment (attempted to do it themselves, peer reviewed the directions for clarity and purpose, etc.). Report what changes were made and the impact on student performance the following semester.
  • Before and after version of part of the modified assignment using the Transparency Template with explanation of improvements.
  • Summary and analysis*
  • Read your colleague’s syllabus carefully and note what you would conclude about the course and the instructor if it were your first introduction to both:
    • “If I were taking this course, here are the questions I’d have.”
    • “After looking at this, here’s what I’d think about the instructor and how he/she will be conducting the course.”
  • How effectively does my course syllabus communicate the course learning goals and expectations, including:
    • What students will learn and why
    • The personal relevance of course topics to my students’ lives
    • The overall “tone” of my class
  • An effective syllabus can enhance students’ learning experience in your course by communicating personal relevance to students’ lives, which has been shown to be particularly beneficial for enhancing self-efficacy in students from marginalized groups.
  • Completed and/or annotated checklist
  • A table that delineates the alignment between goals, assessments, and activities, with comments suggesting places with strong alignment and others where the alignment is not clear
  • Peer-provided list matching best practices to parts of the syllabus with suggestions for refinement
  • Peer’s friendly critique of the course description in the syllabus, focusing on one or two elements such as student-friendly language or relevance to future coursework
  • Notes from the review process
  • List of areas on the syllabus that were modified, how they were modified, and rationale for modification.
  • A brief description of the rationale for getting peer feedback on the syllabus (students do not read it, desire for it to be a learning tool, etc.), a general statement of the peer’s recommendations, and changes made.
  • Summary and analysis*
  • Add your colleague(s) as guests to your Canvas course to examine the course content, instructional design, student responses to discussion questions, etc.
  • Due to the lack of clear boundaries on a class period and so many possible components to review, it is crucial to discuss the instructor’s desired scope and aims.
  • How do my students perceive various aspects of my course’s online portal, including:
    • Ease of navigation
    • Clarity
    • Availability of resources
    • Tone
    • Transparency
  • Do the existing discussion questions elicit high quality responses that encourage my students to challenge their own ideas, as well as others’?
  • Many of the obstacles associated with teaching online courses, such as those related to student motivation and engagement, can be mitigated with a well-designed Canvas module.
  • Annotated syllabus checklist
  • Peer evaluation of student responses to discussion board
  • Annotated Quality Matters (QM) rubric
  • Recorded think-aloud with a peer while they navigate the Canvas site and/or describe their understanding of an assigned task (à la the Assignment Transparency Check)
  • Before and after screenshot of Canvas homepage with description and brief rationale for changes.
  • Summary of how your discussion board questions evolved after peer feedback along with samples of improved discussion threads. This may include changes in how boards were managed.
  • Summary and analysis*
  • Review a set of your most recent student ratings and write down the three general conclusions you’ve drawn from them and a couple of questions raised by them.
  • Share the ratings with your colleague but do not reveal your conclusions or questions.
  • Review each other’s results and write down three general conclusions and some follow-up questions.
    • Do you think the conclusions you have drawn will be the same ones your colleague arrived at after looking at your results?
  • How would a peer or colleague interpret my students’ perceptions of my teaching?
  • Among other things, SPOTs questions measure student perceptions of the quality and effectiveness of the student-instructor relationship, and how well the instructor helped them learn, which are all vital for cultivating a supportive classroom environment for all students.
  • The three take-away conclusions, follow-up questions, and suggestions from peer
  • Reflection from both peer and self on how the take-away conclusions were different with an exploration of why that may be.
  • Plan for adjusting in-class practices with reference to outcomes of the SPOTs Sharing activity (rather than to SPOTs scores themselves).
  • Statement of how you will reinforce and/or change practice(s) based on reflection on different conclusions during SPOTs Sharing activity.
  • Summary and analysis*
  • An instructor who taught Introductory Biology for the first time last semester received her SPOTs results and wants the perspective of an instructor who has successfully taught the class several times.
    • She wants her peer to focus on interpreting the open-ended comments she received.
    • Her peer notes that several students mention she often mixes up due dates and deadlines, which confuses them; the instructor had missed this in her own review of the comments.
    • The instructor uses this feedback to make changes in how she discusses deadlines and due dates in class.
  • The strategy does not have to be a highly innovative approach or something that requires lots of extra preparation (e.g., the two of you may decide you would like to try a different approach to quizzing)
  • Pay attention to what happened and then get together to talk about the results and their implications.
  • How can I gather additional evidence to support the effectiveness of a new instructional strategy?
  • Trying a new instructional strategy with a peer can provide additional evidence of its effectiveness, offer some clarity if you obtain different outcomes, and be helpful for discussing obstacles.
  • Work together with a peer to produce any/all of the following:
    • A rationale for testing a practice along with annotated citations
    • Reviewer notes from education research or content area expert
    • Observations of each other’s classrooms while implementing new practice.
    • Assessment of learning instrument validity
    • Evidence of learning gains: data, analysis and conclusions
  • A report of the innovation project with outcomes and implications for future iterations
  • Presentation of the innovation, outcomes, and implications. Can be at FISSS, DBER, department meeting, or local conference
  • A paired reflection on the innovation process with focus on the collaboration
  • Summary and analysis*
  • Two Biology instructors want to implement clicker quizzes as a Classroom Assessment Technique in their Introductory Biology courses, a strategy that has been tried in their department before with little success
    • The instructors read up on best practices in using clickers and implement the strategy together so they can jointly troubleshoot obstacles and gather evidence regarding effectiveness.
    • Throughout the semester, they exchange several email updates and meet at the end to discuss the process and their findings.
    • Students report that the quizzes are good practice for exams and final grades show an overall class average increase over the previous semester’s cohort.
    • Both instructors reflect on troubleshooting through challenges together and agree it was very helpful.
  • Observe and experience what it is like to be in one another’s classroom and then have follow-up conversations after each visit.
  • What are my students doing in class while I am focused on teaching?
  • Which students seem engaged, and which ones do not?
  • What is it like to sit through one of my classes?
  • How is my presentation of course material with regard to level of rigor, accuracy, flow, accessibility, speed, and representation?
  • Focused classroom visits can help you gather targeted feedback on your instruction. Further, if you ask a peer to come in more than once during the semester, you can also gather feedback on how you’re improving in areas you targeted based on the original observation.
  • Field observation protocols for student engagement (i.e., seating charts with codes for what students are doing, nationally published protocols with rubrics, etc.). Note: observations with rubrics require careful selection of an instrument and training for the observer.
  • Observation notes on a specific element of the class such as how much wait time is allotted for a question or at what point in the class do students appear most engaged.
  • Class notes annotated for content delivery.
  • If you engage in peer observations more than once, you can report improvements in areas you targeted based on the original observation.
  • General description of findings/ observations from peer with explanation of new practices you want to try
  • Summary and analysis*
  • After a teaching session or online course module, take 5 minutes or so to jot down thoughts on:
    • What went well?
    • What could I have done differently?
    • How will I modify my instruction in the future?
  • In general, how am I feeling after implementing a new instructional technique? How is my confidence?
  • Consistently conducting post-class self-checks can yield insightful longitudinal data on practically any aspect of teaching that is of interest to the instructor.
  • Keep a log (text, video, or audio) to track your progress and improvement over time.
  • Quotes or excerpts from your log
  • Summary and analysis*
  • Can be broad (e.g., what was my overall impression of how my class went today?).
  • Can be narrowed down to a specific topic/issue that you would like to focus on in your teaching (e.g., how engaged were my students during class today?)
  • How can I track progress or change in specific aspects of my teaching over time? For example, related to:
    • Enhancing student participation
    • Planning meaningful learning experiences
    • Providing useful and timely feedback
  • Reflective journaling can help an instructor articulate rationale for specific teaching strategies and reflect on strengths and weaknesses of those approaches before, during, and/or after they are implemented.
  • The journal itself is a form of record keeping.
  • Excerpts from your journal
  • Responses to select guiding questions
  • Summary and analysis*
  • Completing the same course evaluation form your students complete can highlight discrepancies and prompt further reflective thinking
  • Are there any discrepancies between my students’ perceptions of my teaching practices and my own?
  • Identifying specific discrepancies can help you develop potential explanations for the misalignment and identify possible strategies to address issues moving forward.
  • Record discrepancies between your students’ ratings and your own
  • Summarize your interpretations of these differences
  • Describe potential strategies to align your students’ perceptions of your teaching with your own
  • Report any discrepancies between students’ ratings and your own that you find particularly meaningful, including what these discrepancies may imply, and potential next steps
  • Report your progress in aligning your students’ perceptions with your own
  • Summary and analysis*
  • An instructor wants to identify potential discrepancies between their students’ perceptions of their teaching and their own beliefs about their instruction, so they complete the SPOTs survey before seeing their students’ responses and then conduct a comparative review of the results.
    • The instructor then reaches out to 5 students, asks if they would be willing to participate in a focus group to discuss the course resources, and uses this feedback to revise the materials and sources for the next semester.
  • A brief survey that helps instructors assess their teaching approaches.
  • Often consist of multiple-choice, Likert-scale questions and typically take 10-15 minutes or less to complete.
  • To what extent do my instructional practices:
    • Reflect a learning-centered approach?
    • Encourage student engagement and inclusivity?
    • Promote an active learning environment for my students?
  • Teaching Inventories are useful for:
    • Highlighting strengths in faculty’s instructional approach
    • Identifying specific areas/aspects of instruction that need refining but may be otherwise difficult to identify
    • Providing formative feedback or further developing specific aspects of instruction
  • Inventory results
  • Completed inclusion worksheet
  • Brief description of the inventory, your results, and steps taken (or to be taken) based on results
  • Summary and analysis*