Abstract
The purpose of this manuscript is to report our experience in the 2021 SIIM Virtual Hackathon, where we developed a proof-of-concept of a radiology training module with elements of gamification. In the 50 h allotted for the hackathon, we proposed an idea, connected with colleagues from five different countries, and completed an operational proof-of-concept, which was demonstrated live at the hackathon showcase as one of eight competing teams. Our prototype involved participants annotating publicly available chest radiographs of patients with tuberculosis. We showed how experience points could be awarded to trainees based on how closely their annotations matched a ground-truth radiologist's annotations, with scores ranked on a live leaderboard. We believe that gamification elements could provide an engaging solution for radiology education. Our project was awarded first place out of eight participating hackathon teams.
Keywords: Medical Education, Internship and residency, Informatics, Radiology
Background
Variability in trainees' exposure to different diseases, examination types, and patient populations during residency or fellowship results in inconsistent training within and across institutions. To address this issue, we developed a proof-of-concept electronic training module with gamification incentives during a medical informatics hackathon.
The Society for Imaging Informatics in Medicine (SIIM) began hosting hackathon events in 2014 to help solve problems through informatics solutions. Hackathons, a portmanteau of the words “hacking” and “marathon”, are events in which professionals involved in software development collaborate intensively on software projects, with the intent of producing a functioning computer program by the end of the event. These professionals can be computer programmers, graphic designers, interface designers, project managers, or domain experts. The approach SIIM supports is notably collaborative and includes clinicians who can frame technical requirements in terms of “what can this application do for my patients?”
Hackathons are popular outside of medicine and academia, especially in the software development and video game industries. One of the first was the OpenBSD (Open Berkeley Software Distribution) hackathon (June 4–6, 1999), an informal event in Calgary, Alberta, where ten participants gathered over a weekend to fix a piece of software [1]. Since then, hackathons have been held around the world.
In healthcare, hackathons started in 2011, when the Massachusetts Institute of Technology (MIT) launched MIT Hacking Medicine [2, 3]. Hacking events in medicine are poised to make a significant impact by inspiring change and innovation in healthcare [4, 5], encouraging interprofessional health education [6], and promoting diversity across individuals, ideas, and projects to address clinical challenges [7]. Many such experiences have been published. As Walker and Ko note, the participation of healthcare providers and medical researchers is paramount for the careful development, implementation, and evaluation of any technological intervention in medicine, alongside the usual hackathon participants such as programmers and entrepreneurs [4].
Since 2014, there have been eight hackathons hosted by SIIM [8]. The SIIM Hackathon is a collaborative event welcoming anyone to solve problems in healthcare, not just medical imaging professionals. It also features domain experts, code heroes, a judging panel, and prizes. Because of the COVID-19 global pandemic, the SIIM Hackathon transitioned to an online-only format in 2020 to keep the event alive. Over the years, SIIM has partnered with other societies such as the American College of Radiology (ACR), the Radiological Society of North America (RSNA), the Medical Imaging & Technology Alliance (MITA), and HL7 International’s FHIR DevDays. These partnerships offer training and access to the hackathon infrastructure and continue to expand thanks to dedicated resources such as a cloud-based server that is accessible 24/7/365 from anywhere, the SIIM Hackathon Dataset, and easy-to-follow documentation suitable for beginners.
The SIIM hackathon welcomes participants from all disciplines, regardless of their experience in programming, facilitating the interaction of high school and college students, residents, fellows, professors, coders, and software engineers, to name a few. The primary objective of this kind of event is to foster collaboration towards a common goal, to learn, and to meet new people. Participants enjoy the interactions and helping one another. The main idea is not to build a perfect, launch-ready project but rather to be creative and learn alongside experts in other areas. Partial solutions that break or barely work are welcome! The competition is merely a byproduct, though it adds incentive. Because the event was fully virtual, travel was not a factor in the last two years, inviting a wider international audience via audio–video teleconferencing.
In 2019, one of the authors (L.R.F.) mentored a winning hackathon team applying gamification as a crowdsourcing incentive [9]. Gamification is defined as the application of game-design elements to other activities to encourage engagement with a product or service and make using it more fun [10]. Common game design elements include points (such as experience points, commonly known as XPs, and levels), badges and achievements, leaderboards, and performance graphs. Gamification is utilized substantially in online marketing, but uses in radiology education have also been proposed [10, 11]. Winkel et al. explored the potential of gamification in radiology education. They developed an e-learning concept capable of improving diagnostic confidence and reducing error rates in the diagnosis of pneumothorax while offering a fun, interactive platform [12]. Their work featured health points, experience points, a leaderboard, and a game-over status, akin to many video games. See Appendix for more specifics about the SIIM hackathon.
We believe that gamification incentives encourage participation and a natural inclination to improve based on peers’ achievements, making training modules more engaging. The purpose of this manuscript is to report our experience in the 2021 SIIM Virtual Hackathon, where, within 50 h, we developed a proof-of-concept for a radiology training module (a tuberculosis diagnosis training module) with gamification elements that could add value to residency training.
Methods
The 2021 SIIM Hackathon was 100% online for the second year in a row due to the global pandemic. Meeting registration and hackathon participation were free for students, residents, and fellows. The event occurred May 17 to 21, 2021, with the hacking portion from May 19 at 12 pm EST (Eastern Standard Time) to May 21 at 2 pm EST, spanning approximately 50 h (Fig. 1). Anyone registered for the meeting could participate, irrespective of whether they worked in radiology or other clinical disciplines or had any experience with programming or coding. As a result, participants came from several industries and professional backgrounds, including radiology residents, high school, college, and medical students, and software engineers. In addition, mentorship was provided by a variety of professionals: domain experts and code heroes for technical assistance, and physicians involved in imaging-centric specialties (mostly radiologists; pathologists and dermatologists plan to participate next year).
Our team met for the first time during the hackathon. There were 14 participants on our team, led by P.V.S. and mentored by L.R.F. In addition, code heroes included J.A.A.S. and D.A. Other participants are listed in the Acknowledgements section of this manuscript. The participants were from five different countries (the USA, Brazil, Portugal, Colombia, and Canada).
During the hackathon brainstorming session, we proposed creating a training module that automatically uses performance metrics to grade trainees and award experience points (XPs), which feed into a video game-like leveling system. Performance metrics could include, for example, time to diagnosis, diagnostic accuracy, the number of typos in a report, a Sørensen–Dice similarity coefficient (otherwise known simply as the Dice score) for an annotation task, or others.
The initial idea involved creating a training module that could train a resident or fellow in a specific task, such as Lung-RADS (Lung Imaging Reporting and Data System) categorization or AAST (American Association for the Surgery of Trauma) trauma grading, or that could give exposure to various cases within a single category, for example, a call-preparation module with emergency radiology cases or a cancer staging module.
A set of training modules (similar to a sequence of different dungeons in a video game, Fig. 2A) would be used to improve a specific skill. Each module within the set (Fig. 2B) would comprise a series of imaging cases (Fig. 2C, D) for the trainee to annotate or choose an answer for in some way. Every subsequent module (dungeon) within a set of modules would be more challenging than the previous. Answer correctness would be compared to the ground truth specific for that task. Each correct answer would grant the trainee a certain amount of XP, with each module having a limited maximum number of XPs (Fig. 2). At the end of each dungeon, the trainee would amass XPs, which would move into the trainee’s total pool of XPs. Those XPs would amount to levels, and a trainee would keep increasing their individual level, just like a video game character. In addition, a resident would earn badges for accomplishing streaks and other achievements. Figure 2 shows a Lung-RADS training module as an example.
The benefit of a leveling system is that it provides an objective number representing milestones. These levels could demonstrate that a trainee performs well and could unlock access to other activities or privileges during residency, such as taking call shifts or presenting at multidisciplinary tumor board conferences. In addition, each trainee would have a profile page displaying their level, badges, and trophies (Fig. 3).
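To make the envisioned leveling mechanic concrete, the short Python sketch below tracks a trainee's XP pool, derives a level, and awards a badge. The class name, threshold values, and badge rule are purely illustrative assumptions; they were not defined in the hackathon prototype.

```python
# Minimal sketch of the envisioned XP pool and leveling mechanic.
# The level thresholds and the "perfect module" badge rule are hypothetical.

LEVEL_THRESHOLDS = [0, 100, 250, 500, 1000, 2000]  # cumulative XP required per level


def level_for_xp(total_xp: int) -> int:
    """Return the highest level whose XP threshold the trainee has reached."""
    return sum(1 for threshold in LEVEL_THRESHOLDS if total_xp >= threshold)


class Trainee:
    def __init__(self, name: str):
        self.name = name
        self.total_xp = 0
        self.badges = []

    def complete_module(self, module_name: str, xp_earned: int, xp_max: int):
        # XP earned in a module ("dungeon") moves into the trainee's total pool.
        self.total_xp += xp_earned
        if xp_earned == xp_max:  # hypothetical badge for a perfect module
            self.badges.append(f"Perfect: {module_name}")


trainee = Trainee("resident_01")
trainee.complete_module("Lung-RADS basics", xp_earned=80, xp_max=100)
trainee.complete_module("Tuberculosis CXR", xp_earned=680, xp_max=680)
print(trainee.total_xp, level_for_xp(trainee.total_xp), trainee.badges)
# 760 XP -> level 4 under these illustrative thresholds, with one badge earned
```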
After the brainstorming session, several participants were interested in the idea and decided to help. Given the availability of a public dataset of tuberculosis chest radiographs, we decided to build a proof-of-concept of the idea: a tuberculosis diagnosis training module with gamification elements. The module contained 68 chest radiographs showing different findings of tuberculosis for a trainee to work through, with a maximum of 680 XP for the module (10 XP for each perfectly annotated case).
In this training module, a trainee annotated tuberculosis cases on the computer. For image annotation, we used the VGG Image Annotator (VIA), an open-source, browser-based tool developed by the Visual Geometry Group at the University of Oxford [13] (Fig. 4). Annotators marked findings on the radiograph with simple bounding boxes (Fig. 4, bounding boxes #1 in the left upper and #2 in the right lower lung zones). The ground truth for our proof-of-concept was the set of annotations by L.R.F., a radiologist with over 30 years of experience.
Educational resources were made available, including a radiology preprocessor curriculum https://folio47.wixsite.com/rp-course, a slide presentation on annotating, a comprehensive chest radiograph resource https://robochest.niaid.nih.gov/, and the publicly available chest radiograph collection and VIA described above.
The Dice score was chosen as the primary gamification metric to translate into XPs, since the bounding boxes drawn on the tuberculosis radiographs lend themselves to a direct similarity comparison between the trainee's annotation and the ground truth (the experienced radiologist's annotation). A Dice score, ranging from 0 to 1, was calculated for each case, with 0 indicating no overlap with the ground truth and 1 indicating a perfect match. For the XP calculation in the tuberculosis example, the trainee score is obtained as shown in Eq. (1):
$$\mathrm{XP}_{\mathrm{module}} = 10 \times \sum_{i=1}^{N} \mathrm{Dice}_{i} \qquad (1)$$
where the XP for a module is the sum of the Dice scores obtained on all N images in that module, multiplied by 10 so that the final result is always an integer (assuming each Dice score is rounded to one decimal place). These XPs were used to rank trainees on a leaderboard. Therefore, the game design elements in our project included points (XPs calculated from Dice scores) and a leaderboard.
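Because the annotations are axis-aligned bounding boxes, the Dice score can be computed directly from the box coordinates. The following Python sketch illustrates that calculation and Eq. (1); it is not the code used at the hackathon, it compares a single box per image for simplicity, and the example coordinates are made up.

```python
def dice_for_boxes(box_a, box_b):
    """Dice coefficient between two axis-aligned boxes given as (x, y, width, height)."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    # Overlap along each axis (zero if the boxes do not intersect).
    overlap_x = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    overlap_y = max(0, min(ay + ah, by + bh) - max(ay, by))
    intersection = overlap_x * overlap_y
    return 2 * intersection / (aw * ah + bw * bh)


def module_xp(dice_scores):
    """Eq. (1): sum the per-image Dice scores (each rounded to one decimal) and multiply by 10."""
    return round(10 * sum(round(d, 1) for d in dice_scores))


# Hypothetical example: a trainee's box vs. the ground-truth box on one image.
trainee_box = (100, 120, 80, 60)        # (x, y, width, height) in pixels
ground_truth_box = (110, 125, 80, 60)
print(dice_for_boxes(trainee_box, ground_truth_box))  # ~0.80
print(module_xp([0.8, 1.0, 0.5]))                     # 23 XP for a three-image module
```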
To develop this prototype, the code heroes on our team helped us further. From VIA, a trainee can export their annotations of the tuberculosis images to a JavaScript Object Notation (JSON) file. JSON is a lightweight data-interchange format that is easy for humans to read and write and for machines to parse; it stores simple data structures and objects, such as image references and bounding box locations and sizes, which are necessary to compute the Dice scores. To compute the XPs and create a leaderboard, J.A.A.S. developed a JSON parser that retrieves this information, calculates the Dice score between the trainee’s annotations and the ground truth, converts it to XPs using Eq. (1), and saves the result for the leaderboard. Subsequently, to improve usability, an online platform with these functionalities was developed using Streamlit (Streamlit Inc., https://streamlit.io), an open-source Python library for creating and sharing custom web apps [14]; the platform allows uploading the annotation JSON files, automatic computation of XPs, and real-time ranking of trainees on the leaderboard. The project source code is available at https://github.com/JoaoSantinha/streamlit-leaderboard.
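As a rough illustration of how these pieces fit together (the actual implementation is in the repository linked above), a minimal Streamlit sketch along these lines could parse a VIA-style JSON export, score it against a ground-truth export, and keep a leaderboard. The field names assume VIA's rectangular-region JSON export format, the file name ground_truth.json is a hypothetical placeholder, and a single bounding box per image is assumed; the leaderboard here is session-scoped for simplicity, whereas a real deployment would persist scores.

```python
import json
from typing import Dict, List, Tuple

import streamlit as st

Box = Tuple[int, int, int, int]  # (x, y, width, height)


def dice_for_boxes(a: Box, b: Box) -> float:
    """Dice coefficient between two axis-aligned rectangles."""
    ix = max(0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    return 2 * ix * iy / (a[2] * a[3] + b[2] * b[3])


def extract_boxes(via_export: dict) -> Dict[str, List[Box]]:
    """Map each image filename to its rectangles from a VIA-style JSON export."""
    boxes = {}
    for entry in via_export.values():
        boxes[entry["filename"]] = [
            (r["shape_attributes"]["x"], r["shape_attributes"]["y"],
             r["shape_attributes"]["width"], r["shape_attributes"]["height"])
            for r in entry.get("regions", [])
            if r["shape_attributes"].get("name") == "rect"
        ]
    return boxes


st.title("Tuberculosis annotation leaderboard")
name = st.text_input("Trainee name")
uploaded = st.file_uploader("Upload your VIA annotation JSON", type="json")

if uploaded and name:
    trainee_boxes = extract_boxes(json.load(uploaded))
    with open("ground_truth.json") as f:  # hypothetical ground-truth VIA export
        gt_boxes = extract_boxes(json.load(f))
    # Compare one box per image (simplification) and apply Eq. (1).
    dice_scores = [
        dice_for_boxes(trainee_boxes[img][0], gt_boxes[img][0])
        for img in gt_boxes
        if trainee_boxes.get(img) and gt_boxes[img]
    ]
    xp = round(10 * sum(round(d, 1) for d in dice_scores))
    if "leaderboard" not in st.session_state:
        st.session_state["leaderboard"] = {}
    st.session_state["leaderboard"][name] = xp
    ranking = sorted(st.session_state["leaderboard"].items(),
                     key=lambda kv: kv[1], reverse=True)
    st.table(ranking)
```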
In summary, participants ranging from college students to radiology residents annotated the images following the instructions, with very few questions. They then exported their annotations to a JSON file and uploaded it to the online platform (https://pokerad.herokuapp.com), which compared them with the ground truth using Dice scores, calculated XPs, and listed the results on the leaderboard.
Results
Eight projects competed in the Project Showcase on Friday, May 21, 2021. The presentations can be found on the SIIM YouTube channel (YouTube, Google LLC, San Bruno, CA) in the video “SIIM21 Session 6013 Hackathon Project Showcase,” starting at minute 46:00 [15]. Presentations ran for 10 min, with a few questions asked by the evaluating committee. Our presentation included a pre-recorded demonstration of dragging and dropping the JSON file into the bespoke website (Fig. 5) and of how the website automatically calculated a Dice score and XPs and compared the individual’s score with the ground truth to rank the annotators. Figure 6 shows a summary of our proof-of-concept. The following week, at the 2021 SIIM Award Ceremony at the conclusion of the conference, we were ecstatic to find out that our proof-of-concept received the 1st place award [16].
Our team had a very positive experience in the 2021 SIIM Virtual Hackathon. Our participation allowed us to meet each other and members of other teams. Every team member contributed in some way. People without any experience in programming helped by pitching ideas, annotating tuberculosis cases for the proof-of-concept, building the presentation, suggesting user interfaces, or brainstorming solutions. Even members of other competing teams gave ideas, as the event’s spirit was collaboration over competition. Moreover, our team included a range of professional training levels and experience, including senior radiologists, trainees, software and biomedical engineers, and medical students. This enabled proper identification of the problem to be tackled during the hackathon and definition of the idea and proposed solution, followed by implementation of a proof-of-concept that was presented live online as part of the virtual conference.
Discussion
In brief, we report our experience in the 2021 SIIM Virtual Hackathon, where we demonstrated, through a functioning prototype, how a radiology training module utilizing elements of gamification could work. We are grateful for the 1st place award and the help of all hackathon participants, mentors, and organizers. Developing a functioning prototype in such a short period was a challenging but gratifying experience. We believe that leveraging online communication and gathering a creative team of various backgrounds were crucial elements that contributed to our success.
Our proof-of-concept had several limitations, as might be expected with a time limit of 50 h to complete a project. First, this was just a prototype created in a brief period to illustrate how gamification could work in a radiology training setting. Second, we could not package the entire set of 68 tuberculosis chest radiograph cases into a module that contained all the gamification elements, namely an interface for annotation, XP calculation, and a leaderboard, in the same place. Also, we tested the leaderboard with only three annotators and therefore did not demonstrate the trainee leveling system we envisioned. Finally, the Dice score is just one metric to assess similarity. Other methods may be more appropriate than the Dice score depending on the annotation task; however, given the extreme time constraints, we chose the most straightforward approach that was realistic to achieve with the team that came together.
Future directions include automating the gaming process so that any trainee annotating an image, typing an answer, or choosing an option from a list of choices would not need to export any files, thereby creating leaderboards that update automatically. It would also be interesting to develop modules utilizing other inputs, such as dictating a report or filling out an online form. Also, a visually attractive user interface and profile pages would be necessary. For example, a user-friendly profile page could show leveling-up icons, badges, and trophies. In addition, a user interface could display a graphical representation of the difference between the ground-truth annotation and a designated annotator’s annotation, accompanied by the Dice score. Each trainee would then see how closely their annotation approximated the ideal and learn in the process.
Our successful hackathon project may help improve how we teach, train, and evaluate radiology residents, replacing multiple-choice questions (MCQs) and objective structured clinical examinations (OSCEs) with training modules that simulate real-life cases. A gamification approach could also make learning more engaging with anonymous leaderboards popularized by video games. Automated grading with Dice scores or other metrics could also provide trainees with immediate feedback. In addition, students of all levels can annotate imaging findings, with appropriate expectations based on the education phase, while helping identify students or trainees with talent in medical imaging. Finally, in a future iteration, there could be a system in which senior residents or attending radiologists submit interesting or rare cases to the platform. A committee could then evaluate these cases for approval, and they could be integrated into an existing subspecialty training module or into an interesting or rare cases module.
Conclusion
We described how an international team that had never previously met developed a prototype radiology training module that included gamification elements within 50 h in an annual conference hackathon. Our proof-of-concept of radiology training modules with gamification elements shows potential to improve how we teach, train, and even evaluate radiology trainees. The authors and participants had an excellent experience in the 2021 SIIM Virtual Hackathon.
Acknowledgements
The authors thank all 14 participants of our hackathon team. Participants: Pedro V. Staziaki, Stacy O'Connor, Jane Dimperio, Lillian Spear, Michael Do, Lucas Folio, Eduardo Farina; Mentor: Les R. Folio; Domain expert: Brad Genereaux; Code heroes: João A. A. Santinha, Diego Angulo, Marcelo O. Coelho, Mike Ciancibello, and Sivaramakrishnan Rajaraman.
Appendix
SIIM Hackathon
The annual SIIM Hackathon features servers running current open-source industry standards such as FHIR (the “Fast Healthcare Interoperability Resources” standard) and DICOMweb (the DICOM [Digital Imaging and Communications in Medicine] standard for web-based medical imaging), in addition to the SIIM dataset, a synthetic dataset that combines clinical data and images. This makes it easy for beginners and experienced informaticians alike to test their prototypes with life-like data (devoid of personal health information), in contrast to the typical datasets found online, which usually feature either clinical data or images but not both [1]. Additional resources include APIs for other international standards, such as IHE Standardized Operational Log of Events (SOLE) and IHE Artificial Intelligence Workflow in Imaging (AIW-I), as well as support staff to manage the otherwise complex systems. Lastly, the SIIM Hackathon Committee comprises a diverse team of coders, engineers, physicians, and young members who guide the development and support of the hackathon.
The SIIM Hackathon prides itself on reflecting SIIM’s values of collaboration, education, and advancement of patient care rather than the typical competitiveness of a hackathon. Additionally, participants are asked to consider open-sourcing their projects for the benefit of the wider community. In addition to providing servers and datasets, the hackathon offers easy-to-follow start guides for those new to FHIR and DICOMweb and encourages live dialogue among participants and mentors alike, helping each other overcome obstacles and forging friendships.
Finally, participants are invited to present their prototypes at the annual showcase during the SIIM annual meeting irrespective of how mature the prototypes are, as the focus is on the merit of the idea or problem being solved. The SIIM hackathon has collaborated with several societies such as the American College of Radiology (ACR), Radiological Society of North America (RSNA), Medical Imaging & Technology Alliance (MITA, the standard body behind DICOM), and HL7 International’s FHIR DevDays.
References (Appendix)
1. Kohli M, et al. Creation and Curation of the Society of Imaging Informatics in Medicine Hackathon Dataset. J Digit Imaging. 2018;31:9–12.
Author Contribution
P.V.S. conceived the idea and presented it at the SIIM event. J.A.A.S., M.O.C., and D.A. worked on programming the software. L.R.F. supervised the project, contributed ideas, and supplied the cases and ground-truth annotations. All authors helped write the manuscript and reviewed the final version.
Funding
Not applicable.
Declarations
Ethics Approval
Not applicable.
Consent to Participate
Not applicable.
Consent for Publication
Not applicable.
Competing Interests
Pedro V. Staziaki: Member of the SIIM Hackathon Committee. João A. A. Santinha: Member of the SIIM Clinical Data Informaticist Task Force. Marcelo O. Coelho: Member of the SIIM Hackathon Committee. Les Folio: Member at large, SIIM Board of Directors; Advisory Board, Carestream Health; Co-chair of the SIIM Hackathon Committee; Chair, Fellowship Committee, Society of Advanced Body Tomography; Radsite Standards Committee on Cone Beam Computed Tomography; Patent (no royalties): “Radiographic marker that displays upright angle on portable x-rays,” US Patent 9,541,822 B2; Patent (no royalties): “Multigrayscale Universal CT Window,” US Patent 8,406,493 B2; Research agreement, Philips Healthcare; Author royalties, Springer; Dr. Folio is supported in part by the NIH Clinical Center Intramural Research Program. Mohannad Hussain: Member of the SIIM Hackathon Committee; Principal Consultant, Techie Maestro Inc.; Technical Project Manager, SIIM Hackathon.
Footnotes
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
1. OpenBSD Hackathons. Available at https://www.openbsd.org/hackathons.html. Accessed June 8, 2021.
2. MIT Hackathon. Available at https://news.mit.edu/topic/hackathon. Accessed June 8, 2021.
3. DePasse JW, et al. Less noise, more hacking: how to deploy principles from MIT's hacking medicine to accelerate health care. Int J Technol Assess Health Care. 2014;30:260–264. doi: 10.1017/S0266462314000324.
4. Walker A, Ko N. Bringing Medicine to the Digital Age via Hackathons and Beyond. J Med Syst. 2016;40:98. doi: 10.1007/s10916-016-0461-1.
5. Poncette A-S, Rojas P-D, Hofferbert J, Valera Sosa A, Balzer F, Braune K. Hackathons as Stepping Stones in Health Care Innovation: Case Study With Systematic Recommendations. J Med Internet Res. 2020;22:e17004.
6. Aungst TD. Using a hackathon for interprofessional health education opportunities. J Med Syst. 2015;39:60. doi: 10.1007/s10916-015-0247-x.
7. Wang JK, Roy SK, Barry M, Chang RT, Bhatt AS. Institutionalizing healthcare hackathons to promote diversity in collaboration in medicine. BMC Med Educ. 2018;18:269. doi: 10.1186/s12909-018-1385-x.
8. SIIM Hackathon website. Available at https://siim.org/page/hacking_healthcare. Accessed June 8, 2021.
9. Castelblanco A, Semin A, Marques O, Folio LR, Do HM. Medical Imaging Annotation Game as a Dataset Crowdsourcing Incentive: Initial Development at the SIIM19 Hackathon. 2020.
10. Awan O, et al. Making Learning Fun: Gaming in Radiology Education. Acad Radiol. 2019;26:1127–1136. doi: 10.1016/j.acra.2019.02.020.
11. Reiner B, Siegel E. The potential for gaming techniques in radiology education and practice. J Am Coll Radiol. 2008;5:110–114. doi: 10.1016/j.jacr.2007.09.002.
12. Winkel DJ, Brantner P, Lutz J, Korkut S, Linxen S, Heye TJ. Gamification of Electronic Learning in Radiology Education to Improve Diagnostic Confidence and Reduce Error Rates. AJR Am J Roentgenol. 2020;214:618–623. doi: 10.2214/AJR.19.22087.
13. VGG Image Annotator (VIA). Available at https://annotate.officialstatistics.org. Accessed June 8, 2021.
14. Streamlit website. Available at https://docs.streamlit.io/en/stable/. Accessed June 8, 2021.
15. SIIM21 Session 6013 Hackathon Project Showcase. Available at https://www.youtube.com/watch?v=jGIMpbc8AwM&t=5101s. Accessed June 8, 2021.
16. 2021 Virtual Hackathon Awards. Available at https://siim.org/general/custom.asp?page=aw_hackathon. Accessed June 8, 2021.