Anthropology

This entry is part 33 of 44 in the series Words
Lawrence of Arabia

When doing usability testing (see Discount Usability Testing) we tend to act like anthropologists, observing people using computers as if they were savages performing quaint native rituals.

In a post in UXmatters, Jim Ross argues that we should also use the anthropological technique of participant observation: basically, going native. Or, in other words, trying to accomplish the user’s tasks on the computer ourselves.

There are arguments against this approach.

One is: Who does the “going native” to test the program? If it’s the coder who wrote the program in the first place, it’s hard to argue that this is a legitimate test, since the coder already knows the program inside and out.

Or is that true? A guy I know coded a program; six months later, using it as an ordinary user, he couldn’t get it to work right without a few tries. While you might think this is a rare occurrence, my personal usability analysis of many leading medical programs suggests it happens fairly often.

In fact, you can argue that you should take the coders or tech support people, give them tasks often performed by users, and make them use the program over and over until they can get it right each time. Time them.
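The protocol above — repeat the task until it goes right every time, and time each attempt — can be sketched in a few lines. This is a minimal illustration, not a tool the article describes; the stopping rule (a required streak of consecutive successes) and the attempt cap are my assumptions.

```python
import time

def time_task_attempts(run_task, required_streak=3, max_attempts=20):
    """Repeat a task until it succeeds `required_streak` times in a row.

    `run_task` performs one attempt and returns True on success.
    Returns a list of (duration_seconds, succeeded) tuples, one per attempt.
    """
    results = []
    streak = 0
    for _ in range(max_attempts):
        start = time.perf_counter()
        succeeded = run_task()
        results.append((time.perf_counter() - start, succeeded))
        streak = streak + 1 if succeeded else 0  # a failure resets the streak
        if streak >= required_streak:
            break
    return results
```

The timings themselves are the interesting output: if your own tech support staff need a dozen attempts before they can order a medication reliably, users certainly will too.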

Carrying the argument further, the ideal person to “go native” is someone who is naïve to the program, yet has a background in usability: an independent usability analyst. I expect that usability consultants will recommend this highly (think: job security). That doesn’t mean it’s a bad idea.


Wireframes

This entry is part 34 of 44 in the series Words
A wireframe document for a person profile view

A common technique for prototyping computer screens is to use wireframes. A recent article in UXmatters discusses wireframes, and asks whether wireframe prototypes are used by program designers as a substitute for real collaboration.

That’s a good question. But I think this is a better one: is showing wireframes to people a poor substitute for figuring out what users need to do, and then designing and refining a workflow process that works for them?

Wireframing, also known as paper prototyping (because it can all be done on paper, without wasting a single electron), can be an effective tool during design. However, it is not a substitute for sitting down with members of each class of users, using anthropological techniques to document the tasks they accomplish as they work, using personas to guide the design of the computer-based work process for those classes of users, and then going back and using discount usability testing to refine the process.

Wireframes are good but not a substitute for either collaboration or task analysis.


Fitts’s Law

This entry is part 35 of 44 in the series Words

Fitts’s Law has been known since Paul Fitts first proposed it in 1954. Wikipedia has a detailed exposition of Fitts’s Law. In essence, it says that “the time required to rapidly move to a target area is a function of the distance to the target and the size of the target.” “Targets that are smaller and/or farther away require more time to acquire.” While this has many applications in industry, we are particularly interested in computer applications, and, specifically, usability of medical software.
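The relationship can be made concrete. In the commonly used Shannon formulation, predicted movement time is MT = a + b · log₂(D/W + 1), where D is the distance to the target, W its width, and a and b are empirically fitted constants. A minimal sketch (the constant values here are illustrative placeholders, not measured data):

```python
import math

def movement_time(distance, width, a=0.1, b=0.15):
    """Predicted time in seconds to acquire a target of size `width`
    at distance `distance` (same units for both), per the Shannon
    formulation of Fitts's Law. `a` and `b` are fitted empirically;
    the defaults here are placeholders for illustration only.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty
```

As the quoted definition predicts, `movement_time(400, 20)` (a small, distant target) comes out larger than `movement_time(100, 80)` (a large, nearby one).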

Fitts’s Law diagram

We can expand this definition a bit, by being engineers and designers and critics rather than scientists. It is reasonable to assume that the harder something is to do, the more fatigue – mental, physical or both – it will entail.

We know from the Pen-Ivory experiments that paging is better than scrolling. Many vendors are tied to the idea of resizable windows, both out of laziness and because users demand to use the maximum space on their monitors. But as with lines of text, increasing the window size may decrease readability and usability.

Many medical applications present us with pages filled with a massive number of small targets. We know that the larger the number of choices on a screen, the more cognitively tiring it is to select among them. But there is another dimension to such cluttered pages: when clickable items are widely separated on the page, Fitts’s Law tells us that using the page could be made easier, in both physical and cognitive terms, by decreasing the number of clickable items on the page and increasing their clickable target size. As Strunk and White say: “omit needless clickable items.” (I paraphrase slightly.)
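That tradeoff can be quantified with the index of difficulty from the Shannon formulation of Fitts’s Law. The pixel values below are invented for illustration — a cluttered page of small, widely separated links versus a pruned page with fewer, larger targets:

```python
import math

def index_of_difficulty(distance, width):
    # Shannon formulation: ID = log2(D/W + 1), measured in bits.
    # Higher ID means a harder, slower target acquisition.
    return math.log2(distance / width + 1)

# Illustrative numbers (assumptions, in pixels):
cluttered = index_of_difficulty(distance=600, width=12)  # small, far link
pruned = index_of_difficulty(distance=250, width=48)     # large, near button
assert pruned < cluttered  # fewer, larger, nearer targets are easier to hit
```

Fewer targets also means each remaining one can be bigger and closer to where the pointer already is, so both terms of the law improve at once.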

Fitts’s Law is interesting. But for medical applications, where a wrong click may have consequences far beyond navigating to the wrong page, it’s something all developers should keep in the front of their minds. Wrong clicks can kill.


Kludge

This entry is part 36 of 44 in the series Words

On occasion, an academic paper is published, but one of the subsequent Letters to the Editor, or an accompanying editorial, turns out to be much more important, with a longer-lasting influence than the original article. An example is an editorial about sore throats/tonsillitis by Dr. Centor, of Centor Criteria fame. Well, now we have a similar situation in the field of medical software usability.

Kludge with duct tape

An article in the Annals of Emergency Medicine discusses a method to help prevent wrong-patient order entry: a popup showing name, age, sex, chief complaint, bed location, length of stay, and recent medication orders, along with a mandatory 2.5-second pause. If you’re interested in informatics, I don’t recommend the article, as it discusses an inelegant, clunky, duct-tape-type workaround that should never be emulated.



Ebola

This entry is part 37 of 44 in the series Words

Ebola virion

Let’s suppose it is 1980. Suppose someone shows up in your ED with a fever and a history of travel to an area with a new plague characterized by fever. The nurse has heard about this on the news, asks the patient about travel to the area, and gets a “yes.” The nurse not only writes this on the paper chart, but tells one of the ED doctors about it. The patient is correctly identified as a possible plague carrier and admitted into an isolation room.

Mid-20th-Century Emergency Room

Let’s now suppose it is 2014. There is a shortage of primary care physicians. Primary care physicians no longer see emergencies, even minor emergencies, in their offices. EDs are much, much busier, and overcrowded. As a way to make things better (and, let’s be honest, to make money), vendors have developed electronic medical record systems (EMRs). Physicians, nurses and other ED staff give these hospital-wide EMRs low grades for usability, but the Federal government has been dangling big bags of money in front of hospital administrators as an incentive to buy an EMR. The government succeeded in persuading hospitals to go ahead full-bore with hospital-wide EMRs irrespective of their poor usability.

