Executive Summary
Team-Based Learning (TBL) is a structured teaching approach in which students engage with core content independently prior to class, followed by collaborative work in stable, small groups. The TBL framework typically involves individual and team readiness assurance tests, as well as application exercises that require teams to apply key concepts to complex or authentic problems. This model is designed to promote deeper conceptual understanding, individual and collective accountability, and the development of higher-order cognitive skills. TBL is also well suited to large cohort teaching, as it enables meaningful group interaction at scale and shifts classroom activity from passive content delivery to active, application-focused learning.
There has been no single solution for facilitating Team-Based Learning (TBL) at Imperial, and FoNS has been using a combination of tools and technologies, e.g. Qualtrics, Blackboard and Excel, for over 5 years. The InteDashboard tool was brought in by the University to improve support during and after TBL sessions. These insights relate to a pilot use of the InteDashboard platform in the Department of Life Sciences during AY 24/25. The report includes perspectives from academic staff, students, and the EdTech and ICT teams. With its recommendations, the report can support decision making and highlight considerations to be made when rolling out new EdTech. We hope that insights derived during the pilot year can be useful to the wider university community, e.g. the Tools subgroup, and in our vendor relations.
Findings
- InteDashboard fared well amongst students in terms of ease of use, meeting needs for most TBL activities. Of a total of 120 respondents, 55% Strongly Agreed and 39% Agreed that the tool was easy to use.
- 93% of the student respondents Strongly Agreed/Agreed that sufficient technical support had been provided for them.
- InteDashboard was preferred by students and staff when compared to Blackboard (used previously to deliver TBL).
- Some features meet academic curriculum requirements better than those offered by Feedback Fruits, which was previously used for the peer review component of TBL, for example the ability to require students to give different scores to each of their peers.
- Module leads suggested that the level of EdTech support required could be reduced with clearer documentation and additional time to become familiar with the tools.
Introduction
InteDashboard was introduced to streamline all stages of the TBL process, including pre-work, individual and team readiness assurance tests (iRAT and tRAT), application exercises (tAPP), and peer evaluation. The platform integrates with the existing learning management system, Blackboard Learn, allowing students to access activities directly and enabling staff to automate grading and monitor real-time performance data. The Department of Life Sciences (DoLS) has strong experience of running TBL sessions, embedding it as part of curriculum delivery since AY 2019/2020 across several modules. However, not all modules implement all stages of TBL.
Aims
The evaluation aims to capture key user perspectives and any challenges, and to make recommendations based on recent sessions where InteDashboard was piloted as the main tool for TBL activities. The tool, and its impact on teaching practice, are also compared against other tools, Blackboard and Feedback Fruits. We hope to answer these questions:
- Do students find InteDashboard easy to use?
- Are academics able to use the TBL methodology effectively?
- Are EdTech staff able to provide support with ease?
- What are the benefits and drawbacks of using InteDashboard compared to Feedback Fruits?
The objectives, and how they are measured, are listed below:
| Objective | Action | Success Indicators | Metrics/methods of assessment |
|---|---|---|---|
| 1. Students to be able to use InteDashboard easily (where tech doesn’t interfere with their learning) | Use tool designed for peer evaluation of contribution | Few/no student requests for help; reports on ease of use | Number of students asking for help; responses to survey regarding ease of use |
| 2. Academics able to run TBL activities with intended learning outcomes | Use of tool for TBL activities compared to bespoke solution | Adaptations required are minimal; tool enables TBL facilitation; fewer issues/errors | Feedback from interviews |
| 3. EdTech staff are able to set up activities with ease and efficiency | Level of support required for tool in TBL activities compared to bespoke solution | Work process streamlined with tool; fewer issues/errors | Feedback from interviews; support requests |
Table 1. Objectives and success indicators
This report evaluates the bespoke uses of the tool, InteDashboard, for TBL across six Year 1 and Year 2 modules, across terms, in AY 24/25:
| Modules | Title | Participants | Module lead | Date(s) | Previous platforms used |
|---|---|---|---|---|---|
| LIFE50016 | Applied Molecular Biology (AMB) | 136 | Pietro Spanu | 9th Oct 2024 and 15th Oct 2024 | Blackboard; Feedback Fruits |
| LIFE50025 | Bioinformatics, Programming and Statistics (Biological Sciences) (BPS) | 138 | Josh Hodge | 2nd Nov 2024 and 13th Nov 2024 | Blackboard |
| LIFE50026 | Bioinformatics, Programming and Statistics (Biochemistry) (BPS) | 154 | Josh Hodge | 12th Nov 2024 and 13th Nov 2024 | Blackboard |
| LIFE40002 | Cell Biology (CB) | 158 | Josh Hodge | 21st, 22nd May and 3rd June | Blackboard; Feedback Fruits |
| LIFE40007 | Ecology and Evolution (EE) | 174 | Josh Hodge | 21st, 22nd May and 3rd June | Blackboard |
| LIFE40006 | Cell Biology & Genetics (CBG) | 160 | Ste Cook | 20th May | Blackboard |
Table 2. Pilot Modules information table
Methodology
Mixed methods were used for this evaluation: a student survey was used to collate feedback, and interviews were used for teaching and EdTech staff (see Appendix A).
There were no exclusion criteria, and our sample was the total student population across the modules (~900 students). All students in the modules were invited to submit feedback via an MS Forms survey link, available either via email or by scanning a QR code in the cluster room. The form included a separate section of questions on the peer review aspects. Students were asked to rate their experience using a 5-point Likert scale. The survey was released after the sessions had been completed, and reminders were sent out by education office staff and academics. Support requests from academics were extracted and filtered from the EdTech mailbox.
The academic staff were under considerable pressure, so to minimise the interruption to their work we decided to carry out short structured interviews (see Appendix B). EdTech staff were asked to note their observations and experiences and to submit written responses addressing some prompt questions.
The service requests mailbox data was filtered for the keyword “InteDashboard” using Power Query and visualised using Power BI.
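As an illustration, the keyword filter applied in Power Query can be sketched in a few lines of Python. The column names and sample subjects below are hypothetical; only the “InteDashboard” keyword comes from our actual process.

```python
import csv
import io

def filter_requests(csv_text, keyword="intedashboard"):
    """Return mailbox rows whose Subject contains the keyword (case-insensitive)."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [row for row in rows if keyword in row["Subject"].lower()]

# Hypothetical mailbox export with a "Subject" column.
sample = """Subject,Date
InteDashboard tRAT not loading,2024-10-09
Panopto recording missing,2024-10-10
Extra time setup in InteDashboard,2024-11-12
"""

matches = filter_requests(sample)
print(len(matches))  # 2
```

The real filtering was done in Power Query on the Exchange mailbox export; this sketch only shows the logic of the keyword match.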
InteDashboard deployment
InteDashboard was selected as the preferred tool by the University Tools subgroup for supporting Team‑Based Learning (TBL) because it met all key pedagogical and technical requirements and offered a modern, intuitive interface familiar to learning technologists and teaching staff comfortable with educational technology. A central aspect of its intended use was seamless integration with the institution’s VLE. Through its LTI integration with Blackboard Learn, the tool enabled:
- Automatic creation of an InteDashboard course when a teacher accessed it via the LTI link
- Synchronisation of enrolment data between Blackboard and InteDashboard
- Alignment of teaching/admin roles across both systems for course setup and collaboration
- Automatic creation of Grade Centre columns for each TBL component, allowing grades to be passed back with minimal manual input
To support implementation at scale, FoNS ETL delivered a structured programme of training, onboarding and session support. This included:
- Demonstrations of InteDashboard for existing TBL practitioners
- Hands‑on training for new users adopting the tool
- In‑session support for live TBL activities to ensure smooth delivery
- Development of written documentation and user guidance tailored to Imperial workflows
FoNS ETL also revised an existing teaching session on delivering TBL with technology to reflect the workflows introduced by InteDashboard and delivered this updated training to multiple teams across the institution who were adopting or supporting TBL for the first time.
Results & Findings: Objective 1 Student Perspective
Students were provided with documentation, and some module leads also provided a familiarisation session. Amongst the students surveyed there is a mix of those with experience of previous TBL sessions using other tools and complete beginners to TBL itself as a learning method, i.e. first-year undergraduates.
The total number of students who completed the survey was 120, giving a total response rate of 13%.
| Module | Number of participants | Number of responses | % of responses |
|---|---|---|---|
| AMB | 136 | 30 | 22 |
| CBG | 160 | 78 | 49 |
| BPS | 292 | 6 | 2 |
| EE/CB | 332 | 6 | 1.8 |
| Totals | 920 | 120 | 13 |
Table 3. Response rate per module
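The figures in Table 3 are simple proportions; a quick sketch of the calculation, using the numbers from the table, reproduces the per-module and overall rates:

```python
# Participants and responses per module, as reported in Table 3.
modules = {
    "AMB": (136, 30),
    "CBG": (160, 78),
    "BPS": (292, 6),
    "EE/CB": (332, 6),
}

for name, (participants, responses) in modules.items():
    rate = 100 * responses / participants
    print(f"{name}: {rate:.1f}%")

total_participants = sum(p for p, _ in modules.values())  # 920
total_responses = sum(r for _, r in modules.values())     # 120
print(f"Overall: {100 * total_responses / total_participants:.0f}%")  # 13%
```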
Figure 1. Response rates across modules
There were 30 respondents (22% response rate) to the student survey in the Applied Molecular Biology (AMB) module, 78 respondents (49%) for the Cell Biology and Genetics (CBG) module, 6 respondents (2%) for the Bioinformatics, Programming and Statistics (BPS) modules and 6 respondents (1.8%) for the Ecology & Evolution (EE) and Cell Biology (CB) modules. A plausible reason for the low number of respondents in BPS and EE/CB may be that the surveys clashed with exams, and there was less involvement of EdTech staff, hence fewer points of contact to encourage students to complete the survey via a QR code or through discussion boards. This highlights that student feedback needs to be collated in a timely manner that works for students. Having said that, due to the uneven response rates we must be cautious about how we generalise the findings and recommendations across the faculty.
Clarity and Ease of Access
Across all modules, of the 120 students who responded to Question 1, which refers to the ease with which TBL activities could be accessed, 55% strongly agreed and 39% agreed.
Provision of Technical Support
93% of the respondents Strongly Agreed/Agreed that sufficient technical support had been provided for them.
Figure 2. Responses to Q2: Sufficient technical support was provided for using InteDashboard for the TBL activities (iRAT, tRAT and tAPP)
Comparison between InteDashboard and Blackboard
The AMB module students, who had experience of TBL in Blackboard from previous years, were asked if they preferred InteDashboard; 63% of respondents selected Strongly Agree or Agree.
One reason some students didn't favour InteDashboard was the pedagogic decisions of the teacher, i.e. having the team members work through the tAPP questions together, and giving different scores to team members to encourage deep thinking about contributions.
Students were frustrated by not being able to see all the questions for tAPP at once to enable them to share the workload and save time. However, this approach would defeat the purpose of everyone contributing to all questions and undermine consensus building and teamwork. While the tool doesn't allow all questions to be available concurrently, this could be worked around by handing out a hard copy of the questions.
Figure 3. Responses to Q4: For delivery of the iRAT, tRAT and tAPP elements of TBL, InteDashboard was better than Blackboard
Comparison between InteDashboard and FeedbackFruits system
In previous years, FeedbackFruits was used to capture peer evaluation of contributions. This activity accounts for 10% of the overall TBL mark. The evaluation report on FeedbackFruits by Helen Walkey reviewed the use of this tool for the LIFE50016 module (AMB).
The module lead wanted to use a particular scoring method: requiring the students to provide different scores for each of their peers. FeedbackFruits does not have the capability for this type of scoring method. This meant that the scores needed to be manually calculated based on the data downloaded from FeedbackFruits. In InteDashboard, this scoring method is possible. This prevents students from giving their peers the same scores and forces them to think carefully about the contribution made by each of their peers.
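The distinct-scores rule is simple to express. This hypothetical check (not the vendor's implementation, nor the module's exact scoring scheme) illustrates the constraint that previously had to be enforced manually from the FeedbackFruits export:

```python
def scores_are_distinct(scores):
    """Return True if every peer has received a different score."""
    scores = list(scores)
    # A set drops duplicates, so equal lengths mean no repeated score.
    return len(set(scores)) == len(scores)

# A submission with a repeated score would be rejected under this rule.
print(scores_are_distinct([9, 7, 8]))  # True
print(scores_are_distinct([8, 8, 6]))  # False
```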
Only the AMB students answered the peer review comparison question in the survey, as they had used FeedbackFruits previously for peer evaluation of contribution. The response was less positive when it came to comparing the peer feedback element, with only around 50% of students preferring InteDashboard.
Dissatisfied comments reported by students reflected the teaching practice and method of scoring used, not the overall ease of use of the tool.
| AMB: Responses | Count |
|---|---|
| Strongly agree | 3 |
| Agree | 11 |
| Neither agree nor disagree | 14 |
| Disagree | 1 |
| (blank) | 1 |
| Grand Total | 30 |
Table 4. AMB responses to Q9: For peer assessment, InteDashboard was better than FeedbackFruits.
Open text comments
Some suggestions were made in open text comments on how these sessions could be improved, with multiple students mentioning that they would prefer to have automatic saving of answers and the option to delegate tAPP answering within the team. This feedback will be passed on to vendors for further exploration.
“I would have liked that the control of viewing questions was [not] reliant only on the reporter. This did not allow us to allocate work to different people and we all had to work on 1 question at a time.”
“so there's a "next" button to move onto the next question, there should also be a "back" button to go to the previous question. the current "back" button should be called "exit", because i accidentally pressed it and left the test, misleading!”
“Easy to use, liked that someone got appointed as reporter automatically”
“In team tests, you can only view the question that the reporter is looking at. This makes it hard to delegate questions and divide labor.”
“You have to save answers. I prefer that Blackboard automatically saves answers”
“You don't have personal timers. As someone with extra time on the iRAT, I prefer having a personal timer, as I feel that if something goes wrong- I'm more likely to still be able to use my extra time that way.”
Results & Findings: Objective 2 Academic Staff Perspective
Following the launch of InteDashboard at Imperial in September 2024, the EdTech team provided comprehensive support to academic staff, including demonstrations, user training, session support, and tailored documentation. There was also collaboration between EdTech labs, especially with the Faculty of Engineering (FoE), to adapt existing guidance and run updated training sessions. All Life Sciences staff who engaged with TBL were invited to test the tool through demos. Three module leads agreed to participate in the pilot roll-out year and its evaluation.
InteDashboard as a tool to facilitate TBL
All three academics strongly agreed that InteDashboard effectively supported the delivery of TBL components (iRAT, tRAT, tAPP). Not all used the last stage of TBL (peer evaluation).
The preferences of the teaching staff led to pedagogical variations, and to differences from an administrative and management perspective. The academics are all very experienced at delivering TBL. They chose differing levels of support from the EdTech team: one academic received full assistance, another had partial support, while a third managed independently, using mainly guidance documentation which the Faculty of Engineering EdTech Lab had drafted and wanted feedback on.
Level of support provided
All academics appreciated the support and guidance provided by EdTech, though some identified areas for improvement.
“Having an individual walkthrough and the FoE documentation were great; also receiving ongoing updates was really useful.”
“Understanding the specific wording about submissions is important… Guidance around this terminology would be helpful.”
Ease of administration setting
When asked about ease of use from an administrative and management perspective, not all participants were able to comment due to differing levels of support mentioned above. One academic highlighted aspects of facilitating extra time:
“Managing extra-time requirements and different start times is not as easy as it could be… The twelve-hour clock… is very annoying.”
Despite varied initial experiences, all academics expressed strong confidence about future use when asked if they would be willing to set up sessions themselves.
“Yes – no problem… but better functionality to import questions would help.”
“Yes – but with availability of help and support if needed.”
In addition, all three academics showed appreciation for the real-time reporting and the Clarifications feature. The question bank feature, which allows questions to be copied seamlessly across the modules created in InteDashboard, was also appreciated. Academics also provided some valuable feedback for the vendor on how to enhance the usability and functionality of InteDashboard:
- Make the Clarifications function available for both the tRAT and tAPP. (The academic could not find it for the tAPP.)
- The iRAT and tRAT exports provide individual student grades, whereas the tAPP export only provides team-level grades. It would be beneficial if the grading outputs were more consistent across all components. The suggestion was that individual scores should be available for the tAPP too.
“InteDashboard really needs to either allow you to export some of these data from the peer review into Bb directly via LTI, or provide a mechanism through the LTI where actual, meaningful, unique primary keys (CID or username) are being used to keep data associated with students, and that those appear in the export files.”
Results & Findings: Objective 3 EdTech Perspective
Support provided
All of the Life Sciences academics who were going to use InteDashboard to deliver TBL were given a demonstration of how to use the tool.
Meetings were then scheduled with each academic to discuss the level of support needed. Where full support was required, this involved creating questions for each of the TBL activities and configuring the various settings (based on information supplied by the academic and the education office). Previously, proofs had to be returned in PDF format for the academic to review, with any amendments then corrected by us. With InteDashboard, the academic (with instructions provided) was able to make edits directly in the tool.
Documentation to help students was created and communicated via Blackboard and TBL familiarisation sessions. In-class support was provided on the day to support students and academics to manage any technical issues arising from the tool.
Those who required only partial or no support were provided with links to support documentation (vendor documentation and bespoke documentation), and were able to contact us directly with further queries. For those who had partial support, in-class support was also provided.
Figure 4. Support requests in the EdTech mailbox
Enhanced Workflow and Efficiency
The adoption of InteDashboard has significantly streamlined the workflow compared to the previous way of facilitating TBL, which relied on a combination of Blackboard tools such as quizzes, adaptive release, groups, and signup sheets. The integrated approach of InteDashboard reduces complexity, minimises the potential for errors, and saves considerable time in setup and execution. The table below provides an overview of required tasks to set up each TBL element.
| Tasks | Blackboard (BB) | InteDashboard |
|---|---|---|
| iRAT activity | Create questions in BB quiz | Create an iRAT activity and add questions |
| tRAT activity | Each question is set up as a quiz, with adaptive release to enable multiple attempts and scoring set for each attempt | The iRAT and tRAT can be set up as one activity, which means that questions are copied over to the tRAT and scoring only needs to be configured once, not for every attempt |
| tAPP | Create questions in BB quiz | Create a tAPP activity and add questions |
| Group and membership | Set up groups and add group memberships | Groups and membership are not pulled through via LTI and have to be set up manually |
| Team reporter | Create a self-enrolment group for each team and set adaptive release so that only members of a group can see the enrolment link. Changing the team reporter requires manual intervention from a learning technologist | No setup required, as the first team member who enters the activity is given the role of Team Reporter. Other members can take over as reporter. No intervention is required from the instructor |
| tRAT results | Downloading results and process in excel for tRAT discussion session | Real-time data presented on dashboard |
| tAPP results | Downloading results and process in excel for tAPP discussion session | Real-time data presented on dashboard |
| Peer evaluation | Use of external tool: FeedbackFruits | In built peer evaluation tool |
| Peer evaluation results | Download results and process in excel | Download results and process in excel |
| Grade column and publish scores | Prepare grade columns for tRAT and tAPP score import (because the scores for tRAT and tAPP are only displayed for the team reporter) | When scores are pushed through via LTI, grade columns are created and scores are pushed through automatically |
Table 5. Tasks involved in TBL setup in Blackboard and InteDashboard
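As an example of the per-attempt scoring that Table 5 notes had to be configured manually for every attempt in Blackboard, a common tRAT weighting awards decreasing marks on each successive attempt. The 4/2/1/0 values here are a widespread TBL convention, assumed for illustration; the pilot modules may have used different weightings.

```python
# Marks for a correct answer on the 1st/2nd/3rd/later attempt (assumed values).
ATTEMPT_SCORES = [4, 2, 1, 0]

def trat_score(correct_on_attempt):
    """Marks for a question answered correctly on the given attempt (1-based)."""
    index = min(correct_on_attempt, len(ATTEMPT_SCORES)) - 1
    return ATTEMPT_SCORES[index]

print(trat_score(1))  # 4
print(trat_score(2))  # 2
print(trat_score(5))  # 0
```

In InteDashboard this weighting is configured once per activity, whereas in Blackboard an equivalent rule had to be recreated for each attempt of each quiz.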
Since InteDashboard connects to Blackboard Learn via an LTI integration, student enrolments could be brought across from the VLE and grades posted back to Blackboard from InteDashboard. Some features meet teaching practice needs better, for example the option of forcing students to assign different marks to peers in the peer assessment. The reduced need for support was demonstrated by some of the academics successfully following guidance to set up sessions independently.
User-Friendly Interface
InteDashboard features an intuitive interface that is easy to navigate and requires minimal training. Both staff and students were observed engaging with the platform effectively, without a steep learning curve.
No major technical difficulties or queries arose while running the TBL sessions. One academic was observed using the various analytics dashboards effortlessly to facilitate discussion: for instance, they could easily identify not only the reporter from each team but also the other team members. Previously this involved asking “someone from Team X” to explain, or the facilitator referring to a list of team members to find a specific student and direct a question to them, which disrupted the flow of the session.
Whilst running sessions, notes were also made on potential improvements and on where the interface was letting the experience down, e.g. the student landing page.
The table below lists issues noted by EdTech staff in various sessions.
| Issue no. | Description |
|---|---|
| 1. | Currently users are taken to the InteDashboard landing page, “My Courses”, from any LTI link, which presents users with a list of all their courses. It would be a better user experience if we had deep linking between a Blackboard course and the corresponding course in InteDashboard. |
| 2. | Clarifications also to be available in tAPP activity as they are in the tRAT. |
| 3. | Announcement functionality to aid communication where TBL is being run in several rooms. |
| 4. | When managing extra time for the tRAT activity, a workaround is needed when we want students with extra time to finish at the same time as the rest of the cohort (i.e. they need to start early). Currently the best workaround is to set up the assessment as asynchronous with a password. |
Table 6. Issues noted in sessions
Real-Time Data and Facilitation Tools
The platform provides real-time analytics, a great benefit that allows educators to monitor team performance on a question-by-question basis. Previously this was facilitated by EdTech being hands-on in the room to download and process the data from the Blackboard Grade Centre and quickly turn it into a report for the teacher to facilitate discussion: a highly stressful process with room for error, which could distract from teaching and learning. Additionally, EdTech staff noted that the ability to view clarifications during tRAT sessions supported academics in real time, enhancing their ability to monitor progress. This feature also fosters stronger teamwork and leads to more meaningful in-class discussions.
Figure 5. Screengrab of analytics
Study limitations
We are aware that our study has limitations in the way it was conducted. As already mentioned, the timing of distributing the survey could have been better aligned with students’ academic schedules and workload. In a similar vein, the study could have been carried out in a less busy period for academics, so they could fully benefit from the EdTech support available to them by attending demos etc.
Conclusions & Recommendations
Our findings indicate that InteDashboard is an efficient tool for facilitating TBL sessions. Staff and students by and large prefer this tool over the methods and tools used previously. Since we have used descriptive analysis only, caution is needed when interpreting these results, and we want to emphasise that our findings are indicative due to the uneven response rates.
We recommend continuing use of InteDashboard, with a focus on providing additional guidance and training for staff and students to address any initial hurdles. Consideration must be given to monitoring demand for the tool to ensure that licensing arrangements are sufficient to cover increased adoption. We also suggest further evaluation of a wider range of modules and ongoing collection of feedback to refine the implementation and enhance the overall TBL experience.
Contact us
Get in touch with the EdTech Lab and AV Support Team: