
DATA ANALYSIS

Climate Survey

      The climate survey provided me with an overall picture of my students' feelings and attitudes toward our classroom. The survey was given to students individually by my CADRE mentor because I wanted my students to feel completely comfortable giving answers they truly felt. The graphs provided show student responses to four of the survey questions. Three of the multiple choice questions are depicted in pie charts, with colors aligned to the answers: blue (strongly agree), red (agree), yellow (disagree), and green (strongly disagree). Each slice of a pie chart is labeled with the percentage of students who picked that answer for the question. The short answer question is shown with student responses written out; these answers were gathered and listed anonymously.
When I got the results, I was happy to see that everyone agreed they felt safe in our room and that I believed they could learn. These were two questions that were important to me, and I felt they impacted my students' growth throughout my research. One short answer question asked students, "What do you like about being in this class?" and the only responses relating to subject matter were about math. This showed that math groups were seen as engaging and helped get students excited about math.
As far as changing and adapting things, one student expressed a dislike for math packets, which were one of the stations students would visit daily. Packets were chosen because they were a good way for me to see students practicing skills; they ensured I had documentation of student understanding, misconceptions, and accomplishment during independent work time. The packets were intentionally utilized because, even though they were not as interactive or engaging as games, I needed something to assess in order to determine student understanding. I could then use that demonstration of understanding to alter my daily instruction in real time.
I still feel some form of documentation is needed, but a different approach may be necessary to support student attitudes. Student G responded that they disliked school and strongly disliked learning. With this data, I was able to be more intentional with this student, and during my differentiated math instruction I kept a closer eye on their progress. As I looked at their scores prior to implementing the differentiated groups, it was clear their positive feelings toward the new math instruction format impacted their improvement from pre- to post-test.

[Figures: pie charts of the "safe," "learn," and "fun" survey responses; chart of short answer responses]

Topic Pre and Post Tests


The topic pre- and post-tests were prepared by the curriculum and aligned directly with the instruction given. The graph shows an increase in the class average from each pre-test to post-test given during the 6-week research period. The blue bars show that for topic 10 the average increased from 82% on the pre-test to 99% on the post-test, with 17 out of 18 students receiving 100%. The pink bars represent topic 11, where the average increased from 88% to 99%; 15 out of 18 students received 100%, with the remaining 3 missing one question. Finally, the green bars depict the growth for topic 12, where the average increased from 76% to 97%, the largest growth of the three topics. This supports that differentiated math instruction, through small groups, improves student math achievement. Creating games that built upon prior math concepts, documenting understanding through packets, and engaging with IXL proved to positively impact students' math success.
It was good to see that the topic 11 pre-test average was higher, because topic 11 was a continuation of the addition covered during topic 10. It was still important to see improvement from the topic 11 pre-test to post-test. Topic 12 covered measurement, a complete shift from what we had previously been doing in math, so it was validating to see students succeed with non-computational math. The topic 12 pre-test allowed me to identify student strengths and needs in order to create groups that would better their math understanding. For this topic, the pre-tests were used to pair students with partners who had different strengths, whereas in topics 10 and 11 students were paired with similar-level students. This pairing may have affected their results, as the topic 12 post-test average (97%) ended slightly lower than the 99% reached in the previous two topics.
Topics 10 and 11 both covered addition, while topic 12 covered measurement, which assesses a completely different kind of understanding; this difference makes it hard to compare and connect the growth across topics. Prior to beginning research, topics 7 and 9 introduced addition, which made me worry that major growth would not take place for topics 10 and 11 because they extended what we had already practiced. It was good to see that growth still took place through instruction and practice; however, I feel more growth would have been showcased from pre-test to post-test had these skills not been previously introduced.
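As a check on the growth figures above, the per-topic gains can be computed directly from the reported class averages (a short sketch; the dictionary names are my own, and only the averages stated above are used):

```python
# Class pre- and post-test averages (percent) reported above.
averages = {
    "Topic 10": (82, 99),
    "Topic 11": (88, 99),
    "Topic 12": (76, 97),
}

# Growth for each topic is the post-test average minus the pre-test average.
growth = {topic: post - pre for topic, (pre, post) in averages.items()}
print(growth)  # {'Topic 10': 17, 'Topic 11': 11, 'Topic 12': 21}

# Topic 12 shows the largest pre-to-post gain of the three topics.
largest = max(growth, key=growth.get)
print(largest)  # Topic 12
```

This confirms the observation that topic 12, despite starting from the lowest pre-test average, produced the largest gain.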

T-Test Pre and Post Test

       The pre- and post-test used for the t-test focused on the main ideas from topics 10, 11, and 12. The radial graph shows data for all 18 students and the outcome of their growth after completing the post-test. Comparing each section of the graph clearly shows that the students who increased take up a majority of the chart: 77.8%, or 14 of the 18 students, improved their math achievement through the implementation of math differentiation. Three students, or 16.7%, showed no improvement, but of those three, two received a 100% on the pre-test, making it impossible to improve. With students who received 100% on the pre-test, it became difficult to determine how to further their learning. Through differentiated math instruction, higher-level thinking questions and activities were created to push these students so they would not reach a level of disinterest. Their improvement was shown through daily observations and discussion during small group time, as well as a more difficult packet to challenge their thinking.
Student A, shown on the bar graph, is hard to explain: they went from a perfect score on the pre-test to missing a point on the post-test. This data brought many thoughts and confusion, as this student is one who performs well. When looking at the assessment, the student missed the questions surrounding weight and comparing two objects. I feel this mistake could be explained by a lack of interest in staying in pace with the class while the test was being given and by speeding ahead through the questions without hearing the exact expectations for each one. Tests are given whole group and questions are read aloud; as a class, we take each question one at a time so students can hear the specific directions, because although some kindergartners may be able to read the questions, this keeps students on track and creates less room for error. This student often read instructions and worked ahead during coursework, but this question had no written directions tied to it, which may explain the error.
It was exciting to witness so much improvement from Student O, who has an IEP and worked with the resource teacher to improve their math skills. They were able to work in differentiated math groups with me and receive more time outside of this block to practice skills. Small group instruction allowed this student to receive more one-on-one attention, which improved their math achievement overall.
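The percentages in the radial graph follow directly from the counts of students in each outcome group (a sketch using only the counts discussed above: 14 improved, 3 unchanged, and 1 decreased, out of 18; the variable names are my own):

```python
# Outcome counts from the pre/post t-test results: 18 students total.
total = 18
counts = {"improved": 14, "no change": 3, "decreased": 1}

# Each section of the radial graph is its count as a percentage of the class.
percentages = {outcome: round(n / total * 100, 1) for outcome, n in counts.items()}
print(percentages)  # {'improved': 77.8, 'no change': 16.7, 'decreased': 5.6}
```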

[Figures: radial graph of student growth outcomes; bar graph of individual pre- and post-test scores]

Triangulation

      Each piece of data supported that the implementation of differentiated math instruction was effective, as students' math achievement improved overall. Connections between all the data were made through analysis. The climate survey explains some confusing results that arose from the study. Student G was the student who reflected on the climate survey that they disliked school and learning, which I feel made an impact on their math growth. While looking at the pre- and post-t-test, this student showed no growth throughout the course of the study. They tested well to begin with but did not make any observed growth. This student showed excitement for math small groups but became hard to engage, which is shown in their data. A question that arose from this connection is how I could better my relationship with this student to help them be open to school and engaged in learning.
Many students responded on the climate survey that they feel safe in the classroom, which I believe is reflected in students' ability to collaborate during small group time and learn from one another. Through daily observation and discussion, students showed a comfort with sharing ideas and working through problems. Feeling safe in a classroom allows you to open up and let your guard down so you are able to grow, which was happening during small group time.
The growth shown through the topic tests and the t-test aligned well and showed a similar impact on student achievement. The averages for both assessments increased across time, even within the varying mathematical concepts. The topic pre- and post-tests enrich and support the data found from the t-test: both showed growth, which means that students were growing on specific math concepts throughout the study and increased their comprehensive math knowledge.
