Introduction
Learners in medical education are using e-learning products, such as podcasts, question banks, and videos, at an increasing rate. Question banks in particular are a popular way for learners to reinforce knowledge and assess their individual learning needs.1,2 However, question banks vary greatly, ranging from peer-created banks for formative feedback to commercially created banks for high-stakes board certification preparation. It is therefore not surprising that the available features also range from basic, such as standard feedback, to advanced, such as the Challenge Us feature in NEJM Knowledge+ Internal Medicine Board Review, which allows learners to contact the editorial team to challenge information contained in a specific question, answer, and/or the accompanying feedback. Each submitted challenge is then reviewed by the NEJM Knowledge+ editorial team to determine whether revisions are necessary. We wondered how NEJM Knowledge+ learners used this feature and whether they found it helpful or useful.
Methods
We collected NEJM Knowledge+ Challenge Us data from October 8, 2018, to October 8, 2020, and identified the learners who submitted the most challenges. A random sample of these learners was invited to participate in a 30-minute semistructured individual interview. The interviews were analyzed using an iterative coding process.
Results
Learners submitted a total of 3021 challenges during this time frame, excluding those resulting from the Pain Management and Opioids module. Of these challenges, 2209 (73.1%) resulted in no change, 788 (26.1%) resulted in a change, and 24 (0.8%) were still under review. Three physicians, representing the internal medicine learning continuum, participated in the semistructured interviews: a resident physician, a mid-career physician, and an experienced senior physician. The interviews focused on several topics, from which three themes emerged: reasons for submitting challenges, developing a challenge, and the effect of submitting a challenge. These themes are described below, paired with selected illustrative quotes (minor edits were made to some quotes for clarity).
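As a quick arithmetic check, the reported proportions follow directly from the raw counts (rounded to one decimal place), and the three categories account for all submitted challenges:

$$\frac{2209}{3021} \approx 73.1\%, \qquad \frac{788}{3021} \approx 26.1\%, \qquad \frac{24}{3021} \approx 0.8\%, \qquad 2209 + 788 + 24 = 3021.$$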
Reasons for Submitting Challenges
The most compelling reason participants submitted a challenge within NEJM Knowledge+ was that they perceived one or more errors in a question, answer, and/or the feedback provided (e.g., “I have been [submitting challenges] quite often and the reason I do it, sometimes, [is] I think there is a conflict in information [between] what I have [and what] is written by the author.”). Another compelling reason was that, although the participants agreed with the question, answer, and/or the feedback provided, they felt improvements could be made (e.g., “This question would be better if [NEJM Knowledge+] said this in the question because it’s implied in the answer.”). A third, less compelling reason was disagreement with the answer based on professional judgment (e.g., “If that’s my patient, I wouldn’t rush to put a $25,000 pacemaker in … until I’m sure that these other causes of syncope have been looked at.”).
Developing a Challenge
The participants described developing a challenge for submission as a detailed process that involved reviewing external resources, such as DynaMed, MKSAP, and UpToDate (e.g., “Almost all of my challenges quote UpToDate. Sometimes, I’ll go somewhere else too, but … if I challenge 10 things, I think probably nine of the 10, or 10 of the 10 include UpToDate, and only one or two included an additional other source.”). Participants reviewed these external resources to clarify, verify, and triangulate the evidence behind a developing challenge, which gave them added confidence in the challenge they were about to submit.
Effect of Submitting a Challenge
An important outcome of submitting a challenge was the preparation itself: the participants acknowledged gaining a deeper understanding of the topic at hand (e.g., “I found when I challenged, I would do a lot of digging to support my challenge … that then deepened my learning on that particular question.”). Although not all submitted challenges resulted in a change, the NEJM Knowledge+ editorial team provides most challengers with a response, sent on average within three days of a decision being made (e.g., “I get a thoughtful, polite, and prompt answer from them.”). For participants who saw a change made on the basis of their challenge, the ability to read the editors’ response and then see the change within NEJM Knowledge+, enabled by its adaptive learning algorithm, provided a satisfying experience (e.g., “On one of my [challenges], I actually saw my correction incorporated into the discussion. That was really gratifying.”). Additionally, changes can benefit both the individual (e.g., “It really made it better for me because … I was confused or uncertain [at first] because I was getting two slightly different answers from UpToDate versus [NEJM Knowledge+].”) and the broader learning community (e.g., “I think anytime [questions, answers, and/or feedback are] that current and … stay that current, I mean, those are all real positives … for the whole learning community.”).
Discussion
Many question banks for internal medicine physicians share standard features, including questions (e.g., multiple-choice questions), question feedback, practice exams, and various reports.3–8 Advanced features continue to be developed as well, such as adaptive learning and the ability to challenge questions, both of which are included, for example, in the Med-Challenger Internal Medicine Board Review and NEJM Knowledge+.3,9,10 While the utility of question banks is rooted in principles from the neuroscience of learning, the ability to challenge a question can promote active learning opportunities for learners using these often self-paced study tools.
According to our results, the participants were motivated to submit a challenge within NEJM Knowledge+ for several specific reasons. In developing a challenge, the participants reviewed external resources to clarify, verify, and triangulate evidence. Developing a challenge in this manner engaged the participants in identifying, interpreting, assessing, and analyzing various external resources, an activity that aligns with higher-order thinking, a characteristic associated with active learning.11
When submitting a challenge, learners first examine the content of the question, answer, and feedback to see how each component and subcomponent relate to and build upon one another, ideally providing a clear path to the underlying learning objective. Learners then assess this content to determine whether potential errors exist or where improvements can be made, which directs them in identifying and reviewing relevant external resources to pinpoint the precise inconsistencies or gaps. Finally, learners develop the content of their challenge by synthesizing all relevant external resources, which can be paired with their professional judgment, into a clearly articulated challenge. This process aligns with the analyze, evaluate, and create levels of the cognitive process dimension, one of the two dimensions presented in the revised version of Bloom’s Taxonomy.12 Importantly, while not all challenges are successful, the NEJM Knowledge+ editorial team’s responses can provide added content for the learner to review, which may reinforce, enhance, or correct what was learned during the active learning process.
Conclusion
Our findings indicate that the Challenge Us feature can be helpful and useful to learners using NEJM Knowledge+ and that it can promote the active learning process. Further research is needed, and we encourage others to examine this feature and other e-learning products.
References
1. Baños JH, Pepin ME, Van Wagoner N. Class-wide access to a commercial Step 1 question bank during preclinical organ-based modules: a pilot project. Acad Med 2018 Mar;93:486.
2. Imran JB, Madni TD, Taveras LR, et al. Assessment of general surgery resident study habits and use of the TrueLearn question bank for American Board of Surgery In-Training exam preparation. Am J Surg 2019 Sep;218:653.
3. Med-Challenger Corporation. Internal Medicine Board Review. https://www.challengercme.com/internal-medicine-board-review
4. American College of Physicians. ACP MKSAP 18. 2019. https://www.acponline.org/featured-products/mksap-18
5. BoardVitals. Internal Medicine board review questions and practice tests. 2020. http://www.boardvitals.com/internalmedicine-board-review
6. Exam Master Corporation. Focused review for Internal Medicine ABIM Certification Exam. 2020. https://exammaster.com/exam/internal-medicine
7. MedStudy Corporation. Internal Medicine Study Strong Digital System. 2020. https://medstudy.com/products/internal-medicine-study-strong-digital-system
8. NEJM Group. NEJM Knowledge+ Internal Medicine Board Review. 2020. https://knowledgeplus.nejm.org/board-review/internal-medicine-board-review
9. Med-Challenger Corporation. How do I report suspect Q&A content? https://www.challengercme.com/support/courses-report-content-send-feedback-suspect-possible-errata-for-editorial-review
10. NEJM Group. NEJM Knowledge+ Help — Internal Medicine Board Review. https://knowledgeplus.nejm.org/internal-medicine-board-review-help/
11. Bonwell CC, Eison JA. Active Learning: Creating Excitement in the Classroom. Washington, DC: The George Washington University, School of Education and Human Development; 1991.
12. Anderson LW, Krathwohl DR, Airasian PW, et al., eds. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives, Abridged Edition. 1st ed. New York, NY: Addison Wesley Longman, Inc.; 2001.