Anti-User Pixels

This entry is part 44 of 44 in the series Words

I have used speech recognition for my medical charting for decades. Not all that long ago, we switched from Dragon Network Enterprise to Dragon Medical One (DMO). Overall it has been a significant improvement. DMO integrates with electronic medical record systems such as Cerner or Epic at the server level. This brings better recognition and makes it easy to use speech recognition to complete or amend charts from home.

However, one lasting complaint from my partners was the sign-in for DMO. When you start up Cerner or Epic, a separate sign-in dialog pops up. You have to put in your username, and then pick a vocabulary; every time we logged in, we had to change this from “General Medicine” to “Emergency.” Couldn’t we set this as the default and not have to change it each time? Finally, after many months, we heard that it would default to General Medicine for everyone and we should leave it there.

After this, I spoke with someone from our Nuance/Dragon support team about it. He explained it this way. Those different choices we had been forced to pick from? They did nothing. Absolutely nothing. It was, as he said, “a placebo button.” The two vocabularies each of us was assigned were set by Nuance when they set up DMO for us. Neither we nor his support team could change them, so our dutiful selection of “Emergency” at every login had no effect whatsoever. The IT support person told me the button remains entirely nonfunctional, and that they had asked Nuance to remove it. Nuance said that, due to the structure of the system, they couldn’t.

We discussed data pixels in another post. We also discussed anti-data pixels: things on the screen that distract you from the real data. Well, what about those pixels where you can pretend to choose a Dragon vocabulary, making you do work that accomplishes nothing? Or pixels in some other program that look like they do something but don’t? Or pixels that entice you into doing the wrong thing, or into wandering off into dusty back hallways of the software, and are so enticing that you accidentally click them on a regular basis? Let’s call those anti-user pixels. A more formal definition might be: “pixels on the screen that not only are useless and distract from data pixels, but mislead and make a user’s job harder, and that have not been removed, whether because they’re so tightly tied to the underlying code structure that taking them out requires major effort, or for other reasons such as corporate requirements.” I challenge you to report other examples of anti-user pixels, or better definitions, in the comments section.

This relates to some other user interaction design and coding concepts that we can apply (with a little cutting and fitting) to this issue:

  • Information Hiding: in this context, if it’s not immediately relevant to the particular user or task, hide it, so people can better see the forest rather than the trees, and are less likely to click something that will lead them astray, perhaps lost amongst those trees.
  • Task/Work Process Analysis: in this context, analyzing users’ tasks and work processes and omitting needless pixels (like Strunk and White’s “omit needless words”): pixels that don’t contribute to the task at hand.
  • Discount Usability Testing: cheap and easy ways to test the usability of a task or work process using a particular software product. See: https://ed-informatics.org/2009/12/29/computers-in-the-ed-4/
  • Encapsulation: in this context, presenting the user with a screen offering the most common or best choices for the task at hand, and hiding rarely or never used choices behind one, or at most a few, less-enticing links: basically, keeping information related to the task together in one place and presenting the user with only that information. This requires knowledge and understanding of the various user tasks and work processes; see the two bullets above.
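The combination of information hiding and encapsulation above can be sketched in code. This is a minimal, hypothetical example (the menu items and task names are invented, not taken from DMO or any real EMR): each item declares which tasks it is relevant to, the screen shows only the relevant ones, and everything else collapses behind a single less-enticing “More…” link. An item relevant to no task at all, like the placebo vocabulary picker, never surfaces.

```python
# Hypothetical sketch: task-driven information hiding for a menu.
from dataclasses import dataclass

@dataclass
class MenuItem:
    label: str
    tasks: frozenset  # the tasks for which this item is actually relevant

ALL_ITEMS = [
    MenuItem("Order labs", frozenset({"emergency"})),
    MenuItem("Discharge summary", frozenset({"emergency", "clinic"})),
    MenuItem("Billing codes", frozenset({"admin"})),
    MenuItem("Vocabulary settings", frozenset()),  # relevant to no task: a placebo button
]

def menu_for_task(task, items=ALL_ITEMS):
    """Return (visible, overflow): relevant items up front, the rest behind one link."""
    visible = [item.label for item in items if task in item.tasks]
    hidden = [item.label for item in items if task not in item.tasks]
    overflow = ["More..."] if hidden else []
    return visible, overflow

visible, overflow = menu_for_task("emergency")
# visible == ["Order labs", "Discharge summary"]; overflow == ["More..."]
```

The point of the sketch is the design decision, not the code: deciding what goes in `tasks` for each item is exactly the task/work process analysis from the bullets above.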

This also relates to an idea introduced in the first edition of Alan Cooper’s groundbreaking 1995 book About Face: The Essentials of User Interface Design: a user interface should correspond with the user’s mental model rather than the coder’s/programmer’s implementation model.

User Mental Models vs. Implementation Models


Safety

I was recently invited to sit in on a meeting of the American College of Emergency Physicians (ACEP) ED Health Information Technology Safety Task Force. We were talking about medical error related to technology. (In your mind, substitute “the @#$@#$ computer” for “technology.”) We were discussing how to make it easy to report errors and near-misses, and who should receive these error reports. I opined that it depends on the seriousness, that there is a continuous spectrum from “kill-the-patient” errors and near-misses down to “annoying, slowed me down so I made a mistake,” and that usability contributes to “computer” errors. I will now demonstrate that this is true.

Emergency physicians are frequently interrupted.

Computer systems with high cognitive friction take longer to use.

Therefore, the longer a task takes on the computer, the more likely you are to be interrupted in the middle of it.

Interruptions cause errors. In multiple ways.

Therefore, poorly usable computer systems cause increased medical error.

Quod erat demonstrandum.


Point-and-Click

This entry is part 43 of 44 in the series Words

A point-and-click electronic medical record (EMR) can be very fast, at least for simple, uncomplicated cases. However, some point-and-click EMRs try to convert the information from clicked checkboxes into English. The results, just like the corrections applied by a word processor spellchecker, or the misrecognition when you’re using speech-to-text on your phone, can be amusing. My recent favorite text was when I was at a Chinese restaurant after work and my wife, who was on her way home from something, texted me to bring home some tungsten fried rice. (It was supposed to be young chow = combination fried rice.) But my all-time favorite is what I first heard of as Ode to a Spellchecker.

Candidate for a Pullet Surprise
by Mark Eckman and Jerrold H. Zar

I have a spelling checker,
It came with my PC.
It plane lee marks four my revue
Miss steaks aye can knot sea.



Bold

This entry is part 42 of 44 in the series Words

Sometimes usability is just typography. Or perhaps common sense. Look at the following demographic section at the top of a LabCorp lab report. (Yes, I like to name names. It’s OK: truth is an absolute defense against claims of libel.)

Imagine you’re working in a very busy ED and the follow-up nurse hands you a lab report with this at the top. Quick: How old is this patient? Male or female?

[Image: patient demographic details from the top of the lab report]

My reaction: AAAARGH! Yes, I picked this example because it really bothers me. And because it’s a really good bad example, which is excellent for teaching purposes.



Testing

This entry is part 41 of 44 in the series Words

The Federal government has warped the fabric of healthcare. By giving away money. They’ve done this for both doctors’ offices and hospitals, for “meaningful use” of healthcare information technology. You get the money only if you use software that the Feds have certified to meet their criteria. This is supposed to get us, rapidly, to interoperable, highly functional, and easy-to-use electronic health records. But… yes, there’s always a but.

One of the Federal electronic health record criteria is that the software has been tested for usability. And there’s the rub.

On February 21, 2012, the US National Institute of Standards and Technology (NIST) published NISTIR 7804, Technical Evaluation, Testing and Validation of the Usability of Electronic Health Records. This establishes a process for usability testing, the EHR Usability Protocol (EUP): “The EUP is a three-step process … (1) EHR Application Analysis, (2) EHR User Interface Expert Review, and (3) EHR User Interface Validation Testing.”

Federally certified vendors of electronic health records have to use a User-Centered Design (UCD) process for testing the usability of their software. Under the certification rules, they must test their software with a minimum of fifteen end users. Specifically, they must:

  1. Tell which user-centered design process they used, and if it is not a standard process such as the NIST EHR Usability Protocol above, provide a detailed explanation of how it works.
  2. Provide user-centered design test results for eight specific EHR capabilities.
  3. Once the product is certified, make the usability testing reports public, including how many and what kinds of users tested the software.

(Throughout this blog post and any discussion of the issues it raises, understand that the term “electronic health record” includes other functions of a hospital information system, such as computer-based practitioner order entry. Not my definition, but the Feds, and in particular NIST, seem to think that ordering medications is a function of a medical record system.)

