EHR Ease of Use - Does It Exist and How Do We Define It?

How many times have you heard the term "ease of use" when inquiring about the capabilities of an EHR system or vendor? Often, EHR usability is based on the subjective opinion of the vendor or its end users. However, there is typically no industry standard of measurement applied to substantiate these ease-of-use claims. Other industries use hard data to substantiate performance claims. For example, in the automobile industry, it is common to see miles per gallon, time from zero to sixty, horsepower, cost of ownership over time, and other markers. The personal computer industry uses terms like processing speed, memory size, and battery life. The airline industry measures on-time departures and arrivals, lost luggage rates, the number of destinations offered, and other standards. The examples are endless.

Why are there no similar metrics for measuring ease of use for EHRs? Setting this standard should not be difficult when you consider that most EHRs provide similar capabilities yet different methods of attaining the same results. For example, some systems include automatic appointment reminders built into the program, whereas others may require a third-party bolt-on solution to achieve the same functionality. Technically, both vendors can claim to offer automatic appointment reminders, but one may require additional interfacing and setup effort, while the other offers it as a built-in feature. This is like comparing two vehicles, one with an integrated GPS and the other requiring a third-party bolt-on device. While both vehicles can provide access to GPS, there is a distinction that will impact the driver operating the system.

Previously, the industry has mostly surveyed its users to measure their EHR's performance. Though surveys do provide some useful information, the data can be significantly skewed by the lack of segmentation of the variables, and it cannot distinguish between users who are properly trained and those who are not.

Example: Do you feel like the EHR has increased productivity?

A) Strongly positive, B) Positive, C) Neutral, D) Negative, E) Strongly negative.

It was baffling to see some vendors receive an equal number of positive and negative answers to the same question. How is this possible if the software is the same? Not surprisingly, when we drilled down into the data, we discovered that employed providers with income guarantees reported more positive feelings toward their EHR than providers in private practice, who would take a financial hit from any system-related slowdown. We also saw more positive responses from groups with certified in-house super users than from those who relied exclusively on an external helpdesk and/or a vendor call center for support. While this information is useful, it is subjective and can vary significantly based on who (or what type of practice) is responding. Further, in fairness to the vendor, there is no way to control whether the person responding to the survey was properly trained, has adequate in-house support, has a reliable computer, or is not still running Windows 95.

While Coker is in no position to mandate the measurement standards for defining ease of use, we have seen many EHR systems and participated in thousands of demos. We are often asked to provide unbiased assessments of vendor performance to help our clients make informed decisions. Instead of using subjective methods to rank and score vendors, we take a more objective approach by measuring factors such as the number of clicks, the time to close a note, how many patients are seen using the EHR, time spent charting, and other indicators. At a more granular level, we look at how easily the system supports compliance standards such as MIPS and MACRA; some vendors provide this reporting directly to CMS on behalf of their clients, while others leave it to the clients to report themselves. The time required to implement the system, write a template, and program new features can also be important to measure.

Here is a list of areas that are easily measured and tracked:

  • Time to add a new patient
  • Time to schedule a new appointment
  • Time to check in a patient
  • Compliance with MIPS/MACRA
  • Eligibility
  • Prior Authorizations
  • Average time spent with the patient

Estimated average time to complete the following:

  • Physician note
  • Rx
  • Order
  • Phone note
  • Referral
  • Payment posting
  • Denial work
  • Claim submission
  • Charge capture

If you would like more input on how your EHR compares with industry averages and/or best practices, feel free to reach out to Coker Group for a free consultation on what expectations should look like and ways to optimize performance.
