Flat

The Master said: “When the noble man eats he does not try to stuff himself; at rest he does not seek perfect comfort; he is diligent in his work and careful in speech. He avails himself to people of the Way and thereby corrects himself. This is the kind of person of whom you can say, ‘he loves learning.’” [Confucius: Analects 1:14]

User interaction design is now the victim of fashion, or perhaps worse, fads. First, we had what artists call trompe-l’œil, which is creating a 3-D impression by skillful painting in 2-D. A great recent example is a painted concrete bridge in Frederick, Maryland. The computer equivalent of the Frederick Community Bridge is skeuomorphism. Some say that computer interfaces took skeuomorphism to excess. Many cite recent versions of Apple’s iOS operating system; others cite Windows 7’s “Aero” interface.

Most obviously with the Windows 8 “Metro” interface, an anti-skeuomorphic design fad took off. Some tout the “flat design” in Apple’s iOS 7. Others say that Apple stole the idea from Android, from the highly-lauded Windows Phone, and from the Windows 8 “interface formerly known as Metro” (Microsoft dropped the “Metro” label and now just calls it the Windows 8 interface; most of the rest of us still call it Metro). UXmatters has an entire post about it.

Wireframes

A wireframe document for a person profile view

A common technique for prototyping computer screens is to use wireframes. A recent article in UXmatters discusses wireframes, and asks whether wireframe prototypes are used by program designers as a substitute for real collaboration.

That’s a good question. But I think this is a better one: is showing wireframes to people a poor substitute for figuring out what users need to do, and then designing and refining a workflow process that works for them?

Wireframing, also known as paper prototyping (because it can all be done on paper, without wasting a single electron), can be an effective tool during design. However, it is not a substitute for sitting down with some of each class of users, using anthropological techniques to document the tasks they accomplish as they work, using personas to guide the design of the computer-based work process for those classes of users, and then going back and using discount usability testing to refine the process.

Wireframes are good but not a substitute for either collaboration or task analysis.

Suicide

Text Mining

Data mining has been a topic of interest to businesses and researchers for many decades. For physicians and other clinicians, and those designing systems for clinicians, data mining has been of less interest. Yes, you can use data mining to predict the volume of patients in your ED by day and hour. Yes, you can use data mining to order supplies more intelligently. But to improve patient care? Not so much.
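
To make that concrete, here is a minimal sketch (mine, not from the post or any EHR vendor) of the simplest version of that kind of “data mining”: forecast ED arrivals for each day-of-week and hour bucket from the historical average. The data, the function name, and the averaging rule are all assumptions.

```python
from collections import defaultdict
from datetime import datetime

def forecast_by_day_hour(arrival_times):
    """arrival_times: a list of datetimes, one per patient arrival."""
    counts = defaultdict(lambda: defaultdict(int))  # (weekday, hour) -> {date: arrivals}
    for t in arrival_times:
        counts[(t.weekday(), t.hour)][t.date()] += 1
    # Forecast for each (weekday, hour) bucket = mean arrivals over the dates observed.
    # (A real model would also count days with zero arrivals; this sketch does not.)
    return {bucket: sum(per_day.values()) / len(per_day)
            for bucket, per_day in counts.items()}

# Two Mondays of made-up 9 a.m. arrivals
history = [datetime(2013, 6, 3, 9, 15), datetime(2013, 6, 3, 9, 40),
           datetime(2013, 6, 10, 9, 5)]
print(forecast_by_day_hour(history))  # {(0, 9): 1.5} -> expect ~1.5 patients on Mondays at 09:00
```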

Yes, research using data mining can provide us with some clinical information, but such retrospective studies, especially subgroup analyses, can lead to egregious error, as summarized in a recent article in JAMA. It’s not a replacement for prospective, blinded, randomized studies.

Anthropology

Lawrence of Arabia

When doing usability testing (see Discount Usability Testing) we tend to act like anthropologists, observing people using computers as if they were savages performing quaint native rituals.

In a post in UXmatters, Jim Ross argues that we should also use the anthropological technique of participant observation: basically, going native. Or, in other words, trying to accomplish the user’s tasks on the computer ourselves.

There are arguments against this approach.

One is: who is doing the “going native” to test the program? If it’s the coder who wrote the program in the first place, then it’s hard to argue that this is a legitimate test, since the coder already knows the program inside and out.

Or is that true? A guy I know coded a program and then, six months later, as a user, couldn’t get it to work right without a few tries. While you might think this is a rare occurrence, my personal usability analysis of many leading medical programs suggests it happens fairly often.

In fact, you can argue that you should take the coders or tech support people, give them tasks often performed by users, and make them use the program over and over until they can get it right each time. Time them.
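
If you wanted to actually run that exercise and keep score, a minimal sketch might look like this (my illustration, not the author’s; the task name, the three-in-a-row success criterion, and the manual y/n success check are all assumptions):

```python
import time

def run_trials(task_name, perform_task, required_streak=3, max_trials=20):
    """perform_task() returns True on success; every attempt is timed."""
    print(f"Task: {task_name}")
    results, streak = [], 0
    for trial in range(1, max_trials + 1):
        start = time.perf_counter()
        ok = perform_task()
        elapsed = time.perf_counter() - start
        results.append((trial, ok, elapsed))
        streak = streak + 1 if ok else 0
        if streak >= required_streak:  # stop once they get it right, say, three times running
            break
    return results

# Example: the observer records success or failure by hand after each attempt
trials = run_trials("order a chest x-ray in the EHR",
                    lambda: input("Task completed correctly? (y/n) ").strip().lower() == "y")
for trial, ok, secs in trials:
    print(f"trial {trial}: {'ok' if ok else 'failed'} in {secs:.1f} s")
```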

Carrying the argument further, the ideal person to “go native” is someone who is naïve to the program, yet has a background in usability: an independent usability analyst. I expect that usability consultants will recommend this highly (think: job security). That doesn’t mean it’s a bad idea.

iPhones

On May 3, Steve Stack, Chair of the American Medical Association (and an emergency physician from Lexington, KY), gave a presentation on electronic health records (EHRs) to the Centers for Medicare and Medicaid Services. The paper is worth a close read. He observes that physicians are technology early-adopters, but that it took Federal financial incentives to get physicians and hospitals to adopt an EHR. Why? EHRs suck. (I rephrase only slightly.) He points out that EHRs are immature products. If we judge by human development, and want to use a derogatory term, we might call them retarded, invoking the original meaning of the word: slowed development compared to their peers.

Though an 18-month-old child can operate an iPhone, physicians with 7 to 10 years of post-collegiate education are brought to their knees by their EHRs.

In 2010, a quarter of physicians said they would not recommend their EHR to others. Two years later, over a third were “very dissatisfied” with their EHR and would not recommend it.

When an EHR is deployed in a doctor’s office or hospital, physician productivity predictably, consistently and markedly declines. Even after months of use, many physicians are unable to return to their pre-EHR level of productivity – there is a sustained negative impact resulting in the physician spending more time on clerical tasks related to the EHR and less time directly caring for patients. In a way, it ensures the physician practices at the bottom of his degree.

He gives examples of what a physician’s medical note used to look like:

  • 24 y/o healthy male. Slipped on ice and landed on right hand. Closed, angulated distal radius fracture. No other injuries. Splint now and to O.R. in a.m. for ORIF.
  • 18 y/o healthy female. Fever and exudative pharyngitis for 2 days. Exam otherwise unremarkable. Strep test +. Rx. Amoxil

He goes on to talk about how malpractice litigation, billing and coding, and CMS and other insurance requirements for payment have bloated the medical record, and how EHR features such as templates, macros, and cut-and-paste have homogenized medical records while (given the difficulty of typing or dictation) reducing the visit-appropriate essential information.

Seems to me that over the past 30-40 years (yes, I’m that old) the medical-chart signal-to-noise ratio has gone from 0.99 (99% of the chart is signal, that is, clinically useful information) to, at least for EHR inpatient progress notes and ED notes, 0.1 (10% signal and 90% noise).
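
Spelling out the arithmetic (mine, reading the 0.99 and 0.1 figures as the fraction of the chart that is signal), the corresponding strict signal-to-noise ratios would be:

```latex
% SNR from a signal fraction p:  SNR = p / (1 - p)
\[
\mathrm{SNR}_{\text{old chart}} = \frac{0.99}{1 - 0.99} = 99,
\qquad
\mathrm{SNR}_{\text{EHR note}} = \frac{0.10}{1 - 0.10} \approx 0.11
\]
```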

One of his three conclusions was:

ONC [Office of the National Coordinator for Health IT] should immediately address EHR usability concerns raised by physicians and take prompt action to add usability criteria to the EHR certification process.

Bravo!
