
Qualitative and Quantitative Modes: Overview and Features 

SeeMeTeach offers a tool that reduces subjectivity by providing a wealth of data and evidence, increasing confidence and trust in the process. Below are some key features of the SMT tool, followed by an example showing the contrast between qualitative and quantitative feedback.

Qualitative Comments Mode - The upper half of the graphic below shows how an observer can leave comments and recommendations and, after the observation, easily locate them for feedback and coaching.
Video Window - Observations can be completed from recorded video, live virtually, live in person, or from audio.
Timeline of Comments - Shows where in the lesson an observer placed a comment or reacted to another team member's comment; click to jump to the comment and the linked video segment.
Comment Buttons - Observe, click to label a comment, and type the comment or recommendation. Multiple sets of comment buttons are available to choose from.
[Graphic: the SeeMeTeach interface, with the Qualitative Comments Mode in the upper half and the Quantitative Data and Analysis Mode in the lower half]
Comments - Shows a single observer's or the team's comments. Scroll, or use Find by comment category, to pinpoint specific feedback.
Respond to a Comment - The user can react, respond, or add on to a comment.
Quantitative Data and Analysis Mode - The lower half of the graphic above shows the robust post-observation data analysis options, with data for each individual student and for the whole group regarding engagement and classroom management.
Seating Chart Heat Map - Shows the extent of student engagement or misbehaviors for each student or group of students.
Data Analysis Options - Multiple ways the robust data can be analyzed and displayed via charts, tables, graphs, and the student seating chart heat map.
Data Timeline - Black bars indicate individual student events and red bars indicate group events, all linked to video segments on a backdrop of the type of lesson.
For a complete guide to the extensive features and options available when using this teacher observation tool, download the Overview and Features Guide.

A Common Teacher Observation Scenario - Why the profession needs data and evidence infused into observations, feedback, and coaching of teachers.

Joe’s goal was to get his students highly engaged in the lesson and thinking deeply about the science concept of momentum. Joe taught a lesson that was observed by his supervisor and cooperating teacher, and he was then provided the following feedback.

Professional impressions are useful!

Feedback to Joe - "For the most part, your students were on-task in this activity. Your questioning was good, and the students responded to your questions. Your wait-time seemed a bit short, so work on using more wait-time. Students were engaged in the lesson and responded to your questions. Your explanations were clear, and there was a logical flow to how you presented momentum."

• Feedback from the 2nd observation - “Your wait-time seemed longer compared to last time. You asked some good questions and students seemed engaged in the lesson.”

• Joe’s self-reflection - “I think I improved my questioning and asked the students to think on a deep level. My wait-time was better this time, and, like during the first lesson, students were engaged and responding to the questions.”

General Qualitative Statements and Lack of Data - Here is what we know from practice and from research: professional and novice impressions might be the same or might differ, and data might or might not support the statements. How will Joe, his supervisor, his mentor teacher, or an administrator know, beyond the level of impression, whether or not Joe is improving core and critical aspects of teaching such as questioning, responding, and use of wait-time?

 

When observations and feedback consist only of general qualitative statements, two vital components of teacher observation are missing: baseline data and objective evidence. These consist of discrete, identifiable teacher and student actions that 1) clearly define current practice and guide changes to instruction that have a greater impact on the learner and are more effective in the classroom, and 2) form the data- and evidence-based indicators that are markers of growth and plainly show whether or not changes were made to instruction.

Let's compare this low-resolution qualitative feedback with the higher-resolution, data-based feedback shown below.

But professional impressions based on data are 9.5 times more powerful!

Feedback to Joe Using Data - A teaching episode contains an immense amount of data that could and should be used for feedback, for coaching, and as indicators of teaching growth.

• Teacher Actions (questions, responses, wait-time)

"Joe, you asked 56 questions during class today. 38 were yes/no type questions, 15 were recall type questions, and 3 questions required students to think at a higher level. You stated you wanted students to think deeply about momentum. The data from the first lesson to the second were quite similar. How does your questioning promote higher-level thinking? What would the questioning data look like after the next observation that would be more consistent with your goals for instruction?"

"Joe, after getting a student's response, the data for interaction patterns clearly shows that 45 times you clarified the student's answer vs. asking the student to tell you more about their thinking. So, following a short student response, you then talk and often launch off into a mini-lecture and add a whole lot of meaning and interpret that response with your words and your thinking. When you clarify for the student, you are doing the thinking, and we learn what you know, not what the student knows or doesn't know. How does that fit your goal of getting students to think deeply, and what might you do instead? What predominant pattern should we see at the next lesson observation that will be more in line with promoting student thinking?"

"Joe, we know that even for the average student to come up with a response, and especially to think deeply, they have to be offered the time to think, and few students think super-fast, especially the students with some learning challenges. Let's look at the data on your use of wait-time. For the first lesson, your wait-time one average was 1 second, and your wait-time two average was .5 seconds. For the second lesson, your wait-time one average was 1.2 seconds, and your wait-time two average was .7 seconds. Yes, you are showing a tiny bit of improvement, but a long way from the 3.5-second average that research indicates is the tipping point for seeing some of the positive benefits of using wait-time. By the way, the wait-time tables show that your wait-time for the questions requiring students to think more deeply was about the same as for yes/no questions. Let's talk about a specific strategy for you to employ to increase your wait-time and see if the data from observation #3 shows a substantive increase in wait-time, or not."

• Student Actions and Engagement

"Joe, the observer, and you both felt like students were engaged in the lesson. But looking at the seating chart heat map showing which students responded to questions, it shows quite clearly that five of the 30 students answered most of the questions, and a couple more students answered a few questions. We talked about some teaching strategies that get all students to respond so you can find out what most students are thinking and whether they are learning or developing more solid thinking about momentum. Which of those strategies could you employ here so that for the next observation, the heat map has color and multiple responses for all the students?"

• Misbehaviors and Teacher Reactions

"Joe, you said that 4th period was a class with many misbehavior, and it was a class full of misfits. Here is the data I gathered about the misbehaviors and your reactions. The seating chart heat map shows that all the misbehaviors came from five students - just five students. Look at the map, which shows that those students are all at tables away from the front of the room where you stand. The problem lies with only those five individual students causing problems, so let's talk about how you can minimize their misbehaviors and de-escalate the commotion vs. the interactions you have with them that escalate in nature. Let's pull up the misbehavior events data linked to the video, watch how you handled the problems, and then talk about what you could do instead. Let's connect this data with the seating chart heat map, showing only five students answered questions. What do you see as a possible connection between not being engaged in the lesson and, instead, finding other non-productive things to do in class?"

Data and Evidence at our Fingertips

Teaching episodes contain a digital fingerprint of teacher and student actions that is evidence of current practice and can also be used as an indicator of change in practice. What data are now readily available and easily collected for use in feedback and coaching of future or current teachers? Below are some brief descriptions, and the Overview and Features document provides an in-depth review of such data, along with the analysis that can be used to deconstruct a teaching episode and then develop a plan for improving teaching based on data and objective indicators of teaching effectiveness.
