Feedback Loop Client Case: Notre Dame Engineering

Jan 20, 2025 | Blog, Client Case Study

Notre Dame College of Engineering uses Feedback Loop in its First-Year Engineering Program. The course enrolls over 500 students who collaborate in teams on a high-impact project: designing an air quality sensor housing. Below, Dr. Andrew Bartolini, Director of Notre Dame’s First-Year Engineering Program, details his experience using Feedback Loop.

Beyond engineering, Feedback Loop is used across all disciplines that involve student teamwork. Click here to view our library of customer case studies.

Client Overview

In 2025, Notre Dame Engineering adopted Feedback Loop for its 515-student First-Year Engineering Program. Dr. Andrew Bartolini, the program’s director, needed a peer feedback solution that was easy to use, worked well even in large courses, and could readily share performance insights with students so they could learn from their teammates’ evaluations.

Challenges

Before adopting Feedback Loop, Notre Dame used a less user-friendly peer feedback tool, which they stopped using after reducing their volume of group work during the pandemic. In the interim, they relied on a document-driven approach with one-page reflection papers.

What problem was Feedback Loop implemented to address?

Professor Bartolini: The reflection papers were helpful, but they weren’t in a format that could be shared with other team members (i.e., team members weren’t learning what their teammates thought of their performance). Additionally, the memos lacked quantifiable data that faculty members could review quickly.

Solution

What was the outcome of implementing Feedback Loop?

Professor Bartolini: With Feedback Loop, we conducted peer feedback one additional time (two rounds with the memo format versus three with Feedback Loop) because it was so quick and easy to run. In our first year of using Feedback Loop, we didn’t observe a change in the percentage of students who agreed that their team worked effectively; however, approximately 85% of students already said their teams worked effectively, so there isn’t much room to increase that percentage. My plan for 2026 is to incorporate peer feedback more into class discussions (this year, we simply replaced the assignments without adding any lecture content); I would be curious to see whether that increases the number of students who think their team worked effectively. Some of our faculty members used the results to identify team conflicts early, and the feedback helped in conversations with those teams. Additionally, when writing letters of recommendation, I’ve drawn on general trends in how individuals worked within a team setting.

Are there any key features of Feedback Loop that were especially helpful for your course?

Professor Bartolini: I enjoyed the ability for students to rate themselves and then also rate others. It was helpful for students to see how they rated themselves versus how others rated them; this gave students a chance to calibrate their interpretation of their own performance against their peers’ observations. In a similar way, I enjoyed the question where students had to distribute 100 points per student across the team; this gave a great indication of how they thought effort and work were distributed. Finally, I also appreciated the ability to pull teams directly from Canvas, as that was a significant time-saver.

Did you use any optional features to customize your peer feedback experience?

Professor Bartolini: We released the results to the students (only the quantifiable results). We also used auto-grading (set simply for completion). Students were worried that their feedback would impact the grades of their peers (a fairly typical concern for first-year students), and so we graded solely on completion. In an upper-division course, I would consider grading based on peer feedback scores.

Did you use a Feedback Loop survey template or build your own?

Professor Bartolini: We used one of the Feedback Loop survey templates; this was quick, and I thought the wording of the questions was really strong. The wording made it possible to say someone was a good teammate while still giving them a rating of 4 out of 5 and identifying an area for improvement. I also liked how the questions were themed around competency, leadership, and teamwork.

Key Features for Notre Dame First-Year Engineering Program

Seamless Team Sync with Canvas



Zero-Sum Performance Ratings: Students Can’t “Give Everyone 5/5 Scores”



Self vs Peer Performance Insights for Students

Dr. Andrew Bartolini

Director, First-Year Engineering Program

University of Notre Dame, College of Engineering

Feedback Loop allowed the First-Year Engineering Program at the University of Notre Dame to seamlessly collect peer feedback on a team project through multiple iterations. The platform integrates with Canvas, allowing teams to be transferred without any additional administrative effort. The provided survey templates consisted of high-quality, thought-provoking questions that we used without customization. As part of the course, we administered three surveys after critical milestones in the project. The aggregated numerical results were shared with students and faculty, enabling students to individually assess how their perceived contributions compared to those of their team members, and enabling faculty to identify potential team conflicts. Overall, Feedback Loop was an easy and effective means of collecting peer feedback!

Outcomes

1. Conducting Teamwork Peer Feedback More Often

With Feedback Loop, Notre Dame’s First-Year Engineering course conducted peer feedback more often because it was so easy and efficient to use. This study, conducted at the University of Maine, demonstrates that conducting peer feedback more often (and sharing the results back with students) improves student performance within teams.

2. Active Learning: Student Performance Insights

Course-wide results in Feedback Loop

Notre Dame’s previous solution did not allow results to be shared back with students so they could learn from their teammates’ ratings. Feedback Loop includes an option to release results to students in an easy-to-read performance report that provides insights on student performance while protecting student privacy. Notre Dame chose to release quantitative feedback to students while withholding written comments, tailoring the transparency of shared feedback to their needs.

3. Accurate View of Student Performance on Teams

Notre Dame used the points allocation question, which requires each student to distribute points among their teammates. This provided a clear view of how students felt effort and work were distributed across their teams.

Want to learn more? Schedule a Feedback Loop demo below and see if it is a fit for your school, department, or course!
