Education Research: Can my electronic health record teach me something?
A multi-institutional pilot study

On average, 4 clinical questions arise per patient encounter,1 and about half the time information needs are left unresolved.2 There is significant interest in capturing, sharing, and using knowledge within the daily work of health professionals in order to improve health outcomes. The 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act offers up to $27 billion over 10 years to providers demonstrating “meaningful use” of electronic health records (EHRs).3 “Meaningful use” implies more than just recording information; use of the EHR should improve patient care.
Just as the stethoscope did in the past, EHRs with knowledge management capacities represent a new tool for the clinician. Knowledge management tools integrate collective knowledge into a common space such as a repository, emphasize the user community as a working unit (shared spaces, recommendation systems, collaborative learning systems), and also emphasize knowledge structures (information mediator systems, digital libraries, and ontology-based systems).4 Several studies have shown the Infobutton, a context-sensitive knowledge retrieval link in some EHRs, to be effective in anticipating clinical questions, providing answers, and positively impacting patient care.5,6 However, as of 2010, only 3.6% of nonfederal acute care US hospitals had a comprehensive EHR.7
Physician attitude will play a critical role in the adoption of EHR technology.8 Moreover, it is unclear how, and to what extent, these tools can be integrated into the academic training setting. In this multi-institutional pilot simulation, we sought to assess the perceived utility, user preferences, and barriers to implementation of a knowledge management tool in neurology and internal medicine. We anticipated that clinicians would endorse its potential utility, offer important user feedback, and identify challenging yet surmountable barriers to implementation.
METHODS
Software design.
In conjunction with The University of Miami Miller School of Medicine (UMMSM) Center for Computational Sciences, we developed a prototype clinical knowledge management system called KNOW-ET-AL (Knowledge Needs Organizing Wizard and Event Tracker for Applied Learning). This software addresses the active learning needs of a physician by automatically generating enriched patient overviews (EPOs) based on real-time EHR information (figure). These personalized reports include information necessary for optimal continuity of care, enriched with educational resources. The system mines electronic fields within the health record for key words, such as diagnosis, ethnicity, and age, which are then used to perform automated searches for articles in PubMed. The software then prioritizes references from the peer-reviewed journals with the highest impact factors, favoring the most recently published articles and filtering for practice guidelines, studies in humans, English-language publications, and PubMed's core clinical journals subset. The system also uses the key words to search tagged content stored in a document management system, including a variety of educational materials, such as PowerPoint presentations, video content, and Web-based learning modules, as well as patient education material. Through “continuity-of-care alerts,” the software can notify a clinician when a patient returns to the emergency room or is readmitted to the hospital. “Live lecture alerts” notify users of on-campus didactics that may be relevant to their patients. Users are invited to rate the utility of the resources presented (figure) in order to help the software choose future references.
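To make the retrieval step above concrete, the sketch below shows one way such a filtered PubMed search could be assembled in Python against the public NCBI E-utilities esearch endpoint. The key words, the specific filter tags, and the sort choice are illustrative assumptions on our part; this is not the KNOW-ET-AL implementation, and the impact-factor ranking step is omitted.

```python
# Illustrative sketch only (not the KNOW-ET-AL code): build a filtered PubMed
# query from EHR-derived key words and fetch matching PubMed IDs through the
# public NCBI E-utilities "esearch" endpoint.
import json
import urllib.parse
import urllib.request

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"


def build_query(keywords):
    """Combine key words with the filters described in the text: practice
    guidelines, human studies, English language, core clinical journals."""
    topic = " AND ".join('"{}"[Title/Abstract]'.format(kw) for kw in keywords)
    filters = [
        "Practice Guideline[Publication Type]",
        "humans[MeSH Terms]",
        "english[Language]",
        "jsubsetaim",  # Abridged Index Medicus (core clinical journals) subset
    ]
    return " AND ".join([topic] + filters)


def search_pubmed(keywords, max_results=5):
    """Return PubMed IDs matching the key words, most recent first."""
    params = {
        "db": "pubmed",
        "term": build_query(keywords),
        "retmax": max_results,
        "sort": "pub_date",  # favor the most recent publications
        "retmode": "json",
    }
    url = ESEARCH_URL + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return data["esearchresult"]["idlist"]


if __name__ == "__main__":
    # Hypothetical key words mined from a patient record (diagnosis, age group)
    print(search_pubmed(["alzheimer disease", "aged"]))
```

A production system would layer journal-level ranking and the document-management search described above on top of this basic retrieval.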
This enriched patient overview (EPO) prototype was the first case simulation e-mailed to all potential survey respondents. EPOs consist of a brief summary of the hospitalization with links to additional information in the electronic health record, as well as a series of educational resources that can be rated by the user in order to adapt subsequent resource selection according to individual preferences.
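The article does not specify how user ratings feed back into resource selection, so the following Python sketch is purely hypothetical: one simple adaptation scheme would keep a running average rating per resource category and reorder future candidates accordingly.

```python
# Hypothetical sketch of rating-driven adaptation (the actual KNOW-ET-AL
# ranking logic is not described in the article): track a running average
# rating per resource category and order future resources by it.
from collections import defaultdict


class ResourcePreferences:
    """Track a user's average rating per resource category (1-5 stars)."""

    def __init__(self):
        self._totals = defaultdict(float)
        self._counts = defaultdict(int)

    def record_rating(self, category, stars):
        self._totals[category] += stars
        self._counts[category] += 1

    def score(self, category, default=3.0):
        """Average rating for a category, or a neutral default if unseen."""
        n = self._counts[category]
        return self._totals[category] / n if n else default

    def rank(self, resources):
        """Order candidate (category, title) pairs by the user's preferences."""
        return sorted(resources, key=lambda r: self.score(r[0]), reverse=True)


prefs = ResourcePreferences()
prefs.record_rating("clinical_guideline", 5)
prefs.record_rating("patient_education", 2)
print(prefs.rank([("patient_education", "Stroke handout"),
                  ("clinical_guideline", "Ischemic stroke guideline"),
                  ("video", "Carotid ultrasound tutorial")]))
```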
Study design.
After UMMSM institutional review board approval, this simulation was disseminated to neurology and internal medicine faculty, fellows, and residents of UMMSM, University of Rochester School of Medicine and Dentistry, University of Pennsylvania Perelman School of Medicine, Harvard Medical School, and Weill Cornell Medical College. E-mail reminders were sent out repeatedly to encourage participation.
Four software simulations, each including clinical scenarios of mutual interest to internal medicine and neurology, were designed using SurveyMonkey.com. Each simulation included the aforementioned EPO, followed by a few clinical questions and then by questions assessing clinician perspectives. Case topics included cognitive neurology (age-related cognitive impairment/Alzheimer disease), vascular neurology (cerebral ischemia/congestive heart failure), seizure (due to lung cancer with brain metastases), and neuroinfectious disease (headache/meningitis). Eligible participants were first e-mailed an informed consent and a single case (cognitive impairment). A month later, a second consent and invitation were sent, prompting participants to choose one of the 3 remaining cases.
Participants were invited to use the resources provided in the EPO to answer clinical questions posed about the patient. They then rated each link according to its likelihood of changing their clinical practice (Yes, No, or Unknown). Survey questions in 5-point scale, multiple-choice, and free-response formats also assessed perceived utility, user preferences, and barriers to implementation (table). We collected user feedback on the likelihood of using the aforementioned categories of resources. Additionally, all links included in the simulation were tracked using ClixTrac.com in order to determine the percentage of respondents who actually clicked each link presented.
Table. User perceptions and preferences
All data were stored on the secure, password-protected SurveyMonkey.com site and exported to Microsoft Excel and GraphPad Prism for descriptive statistical analysis. We performed χ2 tests to compare responses between neurology and internal medicine participants, and between faculty and residents/fellows.
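As an illustration of the comparisons described (with invented counts, not the study data), a χ2 test on a 2 × 2 table of responses by specialty could be run as follows:

```python
# Hypothetical example of the chi-square comparisons described above;
# the counts are invented for illustration and are not study data.
from scipy.stats import chi2_contingency

# Rows: neurology vs. internal medicine respondents.
# Columns: "Agree" vs. "Disagree" on a hypothetical survey item.
table = [[40, 10],
         [30, 15]]

chi2, p, dof, expected = chi2_contingency(table)
print("chi2 = {:.2f}, p = {:.3f}, dof = {}".format(chi2, p, dof))
```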
RESULTS
A total of 189 patient simulations were completed (134 for the cognitive neurology case, 21 vascular neurology, 15 seizure, 19 neuroinfectious disease). Responses were collected from 739 invitations (18% response rate overall for the cognitive impairment case [34.3% University of Pennsylvania, 32.4% UMMSM-Neurology, 22.7% Harvard, 14.3% Cornell, 13.8% University of Rochester, 11.6% UMMSM-Medicine] and 7.4% for the second simulation). A total of 60% of those participating in the second simulation had already participated in the first; therefore, there were 156 unique respondents (21% of those invited to participate in the study). The participation rate of internists (n = 414, all from UMMSM) was 14%. The response rate of neurologists (n = 325), drawn from all 5 participating institutions, was 27%. The participation rate varied from 14% to 38% across the 6 programs (p < 0.01).
Results are summarized in the table. Regarding perceived utility, more than 80% of unique respondents agreed that there is a need for EHR tools to improve medical education and continuing medical education, as well as to inform clinical decision-making. Of the educational resources provided, clinical guidelines (95%) and peer-reviewed publications (87%) were the most popular. The most commonly cited barriers included overly numerous e-mails (47%), lack of time to read a resource (46%), and overly numerous educational resources (45%). A majority (63%) stated that they would prefer to receive no more than 1–2 e-mails per week.
ClixTrac data confirmed that users who rated resources highly also clicked on the links. In the first survey, the highest rated resources were clinical guidelines (72% 4 or 5 stars) and one of the peer-reviewed articles (57%). A total of 72% and 84% of respondents clicked these links, respectively. The lowest click rates corresponded to links that were entirely simulated (2 continuity-of-care alerts, 20% and 12%; live lecture alert, 40%) or provided little educational value to the participant (patient education materials, 35% and 39%).
There was one statistically significant difference of opinion between faculty and residents: 38% of faculty, vs 7% of residents, responded that software such as this would be most valuable for clinical questions outside their own specialty (p < 0.01). Results were also compared between medicine and neurology participants using χ2 tests; no significant differences were found between the 2 groups.
DISCUSSION
Given the nascent technology, adoption of fully functional EHRs with integrated knowledge management systems is currently low.9,10 Therefore, little is known regarding how and to what extent these tools can be implemented in academic training settings. Perhaps most important are the time challenges in a real-world practice setting. We sought to assess clinicians' perceived utility, user preferences, and barriers to implementation regarding a novel knowledge management tool developed at UMMSM.
The major limitation of our study was the low response rate. We attempted to minimize this by sending several e-mails over several weeks. However, only 18% of those initially e-mailed actually participated in this study, and the response rate was even lower (7.4%) for the second simulation. While this response rate is similar to one published industry average for medical services, our goal was higher, and the rate might have been improved with a more effective “call-to-action” subject heading.11 Variability in recruitment style by institution likely also contributed. Time pressures may also have played a role. The most conservative interpretation would be that the nonresponders were not interested in this type of technology and would have rated its utility poorly. However, the response rate might have been higher if the software had sent e-mails about clinicians' real patients rather than mock patients. We postulate that the lower response rate for the second case represented responder fatigue.
The vast majority of respondents agreed that there is a need for better EHR-based tools to improve medical education and clinical decision-making. The software consistently identified resources that “could change practice,” demonstrating the ability of a basic search algorithm to identify valuable clinical resources. There was also clinician demand for innovative education approaches such as EHR-linked examination preparation and EHR-linked continuing medical education. Interestingly, faculty were more likely to express interest in using the technology for learning outside their own specialty. Also, despite the continuing trend toward handheld devices as medical schools adopt them in teaching and patient care, clinicians preferred desktop access to handheld access. In general, clinicians worried that too much information would be presented. Results were similar across specialty and institution, lending support to the generalizability of these findings.
Several prior studies have demonstrated the efficacy, and the potential impact on practice, of knowledge management technologies in health care. The KnowledgeLink Infobutton at Partners Healthcare answered 84% of clinical questions and altered 15% of patient care decisions.6 At a different institution, 74% of Infobutton users felt it had a positive impact on patient care, with 20% reporting a specific positive impact.5 In one study, Infobutton technology retrieved papers pertinent to over 55% of online discussion threads.6
Our study has other limitations. Our survey was not validated, but we used closed-ended questions, with simple wording and balanced rating scales, and attempted to provide logically ordered nonleading questions. With 154 unique respondents, our margin of error was approximately 8%. Stratified random sampling, which improves the representativeness of the sample population to the population as a whole, was not performed. Furthermore, when asked hypothetically outside of a busy clinical setting, clinicians might optimistically rate the utility of available resources. However, click tracking showed that those who rated a given resource highly (4 or 5 stars) also took the time to click on the resource. Finally, participation rates varied among programs and we did not collect information about nonresponders. This would contribute to a nonresponder bias if differences in recruitment approaches led to differences in characteristics of responders. Strengths of the study include the multi-institutional design and the inclusion of more than one medical specialty.
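For reference, the approximately 8% margin of error quoted above follows from the standard formula at a 95% confidence level, under our added assumptions of simple random sampling and maximal response variance (p = 0.5), using the n = 154 figure:

```latex
% Margin of error at 95% confidence, assuming simple random sampling
% and maximal variance (p = 0.5), with n = 154 respondents.
\[
\mathrm{ME} = z_{0.975}\sqrt{\frac{p(1-p)}{n}}
            = 1.96\sqrt{\frac{0.5 \times 0.5}{154}}
            \approx 0.079 \approx 8\%
\]
```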
Our findings suggest that information technology tools should be integrated, with protected time, into graduate and continuing medical education, yet wisely and sparingly. We identified time as the most important barrier to implementation. The low response rate may have reflected this time barrier, a lack of perceived utility, or a combination of the two. If this software were delivered during protected time and related to real cases with active knowledge gaps, even more favorable survey results might have ensued. Identifying the most important types of cases (e.g., those with common patient safety issues) or targeting clinical weaknesses might be another way to address time challenges.
There is value in having a venue, or a social space, that enables both explicit (e.g., research articles) and tacit (e.g., clinical experience) knowledge sharing to take place.12 Mobley and Rosenberg13 recently predicted that “trainees will benefit by search engines specifically targeted for the care of neurology patients.” In future steps, we hope to assess the effectiveness of this tool, vs comparable knowledge technologies and vs independent research, in the context of protected time. We hope to identify whether these tools can improve medical knowledge and clinical decision-making, as measured by the Internal Medicine and Neurology Script Concordance Tests.14 We might find that, indeed, these tools can achieve important quality-of-care goals by freeing clinicians from spending time looking for relevant patient information, and allowing them instead to spend time delivering optimal care.
AUTHOR CONTRIBUTIONS
Alon Seifan: study conceptualization and design, data analysis, manuscript preparation. Morgan Mandigo: study conceptualization and design, data analysis, manuscript preparation. Raymond Price: study design, study recruitment. Steven Galetta: study design, study recruitment, manuscript preparation. Ralph Jozefowicz: study design, study recruitment, manuscript preparation. Amir Jaffer: study design, study recruitment, manuscript preparation. Stephen Symes: study design, study recruitment. Joseph Safdieh: study design, study recruitment. Richard S. Isaacson: principal investigator, study conceptualization and design, manuscript preparation.
STUDY FUNDING
Study supported by the Evelyn F. McKnight Brain Research Foundation and the University of Miami Miller School of Medicine Department of Neurology.
DISCLOSURE
A. Seifan, M. Mandigo, and R. Price report no disclosures. S. Galetta has received consulting honorarium from BiogenIdec and Teva. R. Jozefowicz and A. Jaffer report no disclosures. S. Symes reports no financial disclosures for the current academic year and reports being a consultant on an advisory board for Gilead Pharmaceuticals for the prior academic year. J. Safdieh reports no disclosures. R. Isaacson has served as a consultant (advisory board) for Novartis and Accera, and received student loan payments from the NIH. Go to Neurology.org for full disclosures.
ACKNOWLEDGMENT
The authors thank Dr. Andrew Tarulli (Harvard Medical School) for assistance with study recruitment, Dr. Shara Brody (UMMSM) for advice and administrative support, Chris Mader, Harsha Venkatapuram, Luz Maristany, and Nick Tsinoremas at the Center for Computational Science (UMMSM) for developing the EHR software/prototype, and Dr. Ralph Sacco (UMMSM) for guidance on study design and support.
Footnotes
Go to Neurology.org for full disclosures. Funding information and disclosures deemed relevant by the authors, if any, are provided at the end of the article.
- © 2013 American Academy of Neurology
REFERENCES
- 1.
- 2.
- 3.
- 4. Cobos R, Esquivel J, Alamán X.
- 5. Cimino JJ.
- 6.
- 7.
- 8.
- 9.
- 10.
- 11. Average open, click-through, and bounce rates. Constant Contact Resource Center. Available at: http://constantcontact.custhelp.com/app/answers/detail/a_id/3194/∼/average-open,-click-through,-and-bounce-rates-of-constant-contact-customers. Accessed August 14, 2012.
- 12.
- 13.
- 14.