Monday, January 4, 2016

Achievement vs Activity

Ah, a new year, a fresh start, a clean slate! Welcome, 2016.

Now let the ranting begin, because I have observed something I don't like and reticence is so last year.

On New Year's Day I attended a party. At that party I met a number of fellow human beings, and I struck up a conversation with one of them in the line for chili. As is its wont, the conversation turned to our professions: he a clinician and I a medical information systems specialist. After the initial explanation of what I _actually_ do (all of it--design, project management, coding, deployment, documentation, whatever is required), we ended up at the usual place: what sucks about Medical IT.

(Does this happen to smart phone people? I bet lots of people tell them how awesome smart phones are. No one ever tells me how awesome clinical information systems are.)

For a refreshing change, this clinician does not really care about HIPAA (as an ophthalmologist he is not under the kind of time pressure that, say, ER folks are, so the additional security is not that big a deal), he finds the user interfaces adequate (although he could do without the mind-numbing repetition of drug warnings whenever he prescribes drugs), and he feels that the systems he uses are adequately powerful (however, he did note that he sees fewer patients than he did 10 years ago and that he is almost certain he is not delivering substantially better care to compensate for his lower productivity).

What did bother him was that his practice has two full-time IT people for about 15 providers, that these two full-time IT people are overwhelmed, and that they might need to hire a third at some point.

I have no idea how good these two IT people are, or how efficient, or how productive. They are probably perfectly lovely people. This rant is not about them. This rant is about the understandable but regrettable use of activity as the measure of an IT group, instead of achievement.

I can see why people who are not IT experts use activity as a measure: activity is something one can often see, even if one does not understand what one is seeing. If you do not understand what an IT group does, then watching how they spend their time is probably a reasonable short cut.

But managers of IT groups should not be doing this: managers of IT groups should be tracking what gets done (achievement) and how many person-hours it takes (productivity). This is why we have the concept of milestones, and why we have project plans.
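
In its crudest form, the arithmetic I am asking managers to do is trivial. A toy illustration in Python (every number invented):

    completed_milestones = 7      # what actually shipped this quarter (invented)
    person_hours_spent = 1240     # total hours the group logged (invented)

    achievement = completed_milestones
    productivity = completed_milestones / person_hours_spent

    print(f"Achievement: {achievement} milestones delivered")
    print(f"Productivity: {productivity:.4f} milestones per person-hour")

Activity, by contrast, is just the hours with nothing in the numerator.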

I suppose that a technical support group can be managed merely by monitoring its activity, but without a concept of outcomes, you end up here:

http://dilbert.com/strip/1999-08-04

Worse, if you judge your IT folks by their activity alone, then staff who take a long time will seem better than staff who take a short time. In other words, you will select for being slow without any particular reason for being good. You will drive away talent and attract dead wood. Then you will find that you need ever more IT people but that your IT is ever less effective.


A concrete example may help:

We saw an organization invest in a super-duper printing system, because their execs work in offices and offices like to print and offices hate it when their print jobs fail. Furious activity was needed to get this system rolled out, with no discernible benefit, since print jobs generally print even without massive and expensive printing systems.

Then the massive and expensive printing system's back end (printing) went down for six hours (which was theoretically impossible, but there you are) while the front end (accepting print jobs) stayed up. Clinical work had to go on, so paperwork and labels were done by hand, as people had been trained to do long ago. Apps kept automatically queuing print jobs--admission reports, lab reports, patient ID labels, specimen container labels, etc--and people kept writing things down by hand.

Then the massive and expensive printing system came back up and did what it was born to do: printed those out-of-date print jobs. This amounted to a denial of service attack, as labels and paperwork which did not match the patients in front of them spewed out in front of clerks and nurses and PAs and doctors and lab techs. As they often say in clinical work: better never than late.

So IT was scolded, and in response a fail-safe was developed and deployed: essentially a way to shut off the massive and expensive printing system. This was another burst of furious activity, the net effect of which was to make it possible to emulate never having installed the printing system at all.

This was a huge amount of activity, for which the participants were lauded and rewarded. I question, however, how much of an achievement it was.

Keep your eyes on the prize and constantly renew your commitment to your goals or you will end up with too much going on and too little to show for it.

Saturday, November 21, 2015

One Size Fits Few

Bless me, Father, for I have sinned. It has been about three weeks since my last Lab IT confession. I have harboured sympathy in my heart for my competitors.

How can this be? Allow me to complain. I mean, explain.

I am often asked why a given LIS is so [insert bad quality here]. Much as I often agree with specific complaints about specific systems, I usually feel a hidden stab of pity for the team which produced the system because that team was asked to provide a single solution (the software) for a large problem area ("the lab.")

The fact of the matter is that the typical "Lab" is not a monolith. It is not even a collection of more-or-less similar parts; rather, it is a patchwork of very different subgroups who happen to share a need for a particular kind of work area.

Specifically, the requirements of the different workflows vary over such a wide range that I am not surprised that a "one size fits all" approach usually fails: either the one size is the wrong size, or the one size is really a collection of loosely affiliated software modules which interoperate rather poorly.

Consider those requirements. At one end is clinical Chemistry:
  • high volume, low relative cost (although vital to more valuable services)
  • highly automated (very reliable auto-verification and analyzers)
  • significant time pressure
  • well-suited to an industrial approach such as Six Sigma
  • high throughput required; turnaround time is very important
At the other end is Microbiology or much Genetic and Molecular testing:
  • lower volume, higher relative value
  • difficult to automate
  • poorly suited to industrial process control
  • takes a long time--no way around it
  • yields an often complex result requiring expertise to characterize
Throw Haematology into the middle somewhere. Immunology is somewhere closer to Microbiology. Slot your favourite lab section into the appropriate place on the fast/simple vs slow/complex continuum.
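
To make the Chemistry end of that spectrum concrete, the "very reliable auto-verification" mentioned above boils down to rules like the following Python sketch (the analyte, reference range, and delta threshold are placeholder values for illustration, not clinical guidance):

    REFERENCE_RANGE = (135.0, 145.0)   # e.g. serum sodium in mmol/L; example values only
    MAX_DELTA = 8.0                    # allowed change from the patient's previous result

    def auto_verify(result, previous_result=None, analyzer_flags=()):
        """Return True if a chemistry result can be released with no human review."""
        if analyzer_flags:                       # any instrument flag forces manual review
            return False
        low, high = REFERENCE_RANGE
        if not (low <= result <= high):          # out-of-range results go to a tech
            return False
        if previous_result is not None and abs(result - previous_result) > MAX_DELTA:
            return False                         # delta check failed: needs a human
        return True

Try writing the equivalent function for a wound culture and you will see why one size fits few.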

So how does one provide a solution which is optimized for both of these extremes? Very carefully.

All too often vendors pick a section, optimize for that section, and then claim that all other sections are "easier" in some way, so most systems are not optimized for most of their users. Why is there so much complaining about lab systems? Because the situation makes it inevitable. Perhaps we should be surprised that there is not even more complaining about lab systems.

Monday, November 2, 2015

EMR Disappointment And Acceptance

This past summer I was on a train in the Paris area and happened to share a train car with someone who was also from the East Coast of the US. We chatted, and I found out that he was a doctor, which always makes me slightly tense. Sure enough, once I mentioned that I am a medical information systems specialist, we somehow ended up on the topic of how bad so many medical information systems are.

Why is that? I assume so many health care professionals have so little regard for their Electronic Medical Records (EMR) systems and the other e-tools of the trade because these tools are not very good.

At least, these medical information systems are not very good at supporting many of their users. They probably excel at supporting corporate goals or IT departmental goals.

The specific complaints by my companion on the train were painfully typical:
  • the screen layout is cramped and crowded;
  • the history available on-line is too short;
  • the specialized information (labs, x-rays, etc) is not well presented;
  • the data is biased toward general medicine and internal medicine.
But what struck me about our conversation was his resignation. While we rehashed the usual complaints and frustrations with large Commercial Off-the-Shelf (COTS) systems, he was more resigned than anything else. He just doesn't expect anything more from the software supporting his efforts to deliver clinical care.

We expect great things from our smart phones. We have high standards for our desktops and laptops and tablets. But somehow we have come to accept mediocrity in our systems supporting clinical care. And since we accept it, there is little chance of correcting it.

At least I will have lots to talk about with random clinicians I run into on trains.

Tuesday, August 4, 2015

Too Much Tech, Not Enough Ology

This post is a rant about how the following common ideas combine to make bad IT, especially clinical lab IT:
  • Everyone should express their special uniqueness at all times
  • Tech is hard, ology (knowledge) is easy, so go with the Tech
  • There is always a better way to do it
Alas! I believe that in many clinical lab IT environments, none of these ideas holds true and that the combination of them is toxic.

We Are All Individuals
You may be a special snowflake, all unique and brimming with insight, but engineering is often about doing the standard thing in the standard way, so those who come after you can figure out what you did. Yes, if you produce something truly revolutionary, it MIGHT be worth the overhead of figuring it out, but it might not.

Consider the mighty Knuth-Morris-Pratt algorithm, which was a real step forward in text string searching. However, sometimes we don't need a step forward: we need something we can understand and support. The mighty Knuth himself told a story of putting this algorithm into an operating system at Stanford and being proud, as a primarily theoretical guy, to have some working code in production. Except that when he went to see his code in action, someone had removed it and replaced it with a simple search. Because although the KMP search is demonstrably better in many situations, the strings being searched in most human/computer interactions are short, so who cares? Having to maintain code you don't understand, however, is also demonstrably bad. So Knuth had some real sympathy for the sysadmin who did this. Engineering is about accuracy and dependability; learn the standard approach and only get creative when creativity is called for (and worth the overhead).
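
For the curious, here is a rough Python sketch of the two approaches (my own illustration, not the Stanford code from the story). The simple scan is the sort of thing the sysadmin kept; the KMP version avoids re-reading the text but drags a failure table along with it, which is exactly the overhead I mean:

    def naive_search(text, pattern):
        """Straightforward scan: easy to read, easy to maintain, fine for short strings."""
        n, m = len(text), len(pattern)
        for i in range(n - m + 1):
            if text[i:i + m] == pattern:
                return i
        return -1

    def kmp_search(text, pattern):
        """Knuth-Morris-Pratt: never re-scans the text, at the price of a prefix table."""
        if not pattern:
            return 0
        # Build the failure table: longest proper prefix of pattern[:i+1] that is also a suffix.
        fail = [0] * len(pattern)
        k = 0
        for i in range(1, len(pattern)):
            while k > 0 and pattern[i] != pattern[k]:
                k = fail[k - 1]
            if pattern[i] == pattern[k]:
                k += 1
            fail[i] = k
        # Scan the text, never moving the text index backwards.
        k = 0
        for i, ch in enumerate(text):
            while k > 0 and ch != pattern[k]:
                k = fail[k - 1]
            if ch == pattern[k]:
                k += 1
            if k == len(pattern):
                return i - k + 1
        return -1

Both return the index of the first match, or -1 if there is none. For the short strings most people actually search for, the speed difference is invisible; the maintainability difference is not.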

Tech is Hard, Expertise is Easy
Too often I see clients going with a young programmer because (a) they cost less by the hour and (b) the assumption is that newer programmers are "up-to-date" and doing things "the modern way." I suppose this makes sense in contexts where the domain expertise is somehow supplied by someone else, but all too often I see bright-but-ignorant (BBI) programmers implementing in shiny new development environments with their shiny new skills, yet producing a dreadful user experience. The only goal in clinical lab IT is to support the lab: to reduce mistakes, to raise throughput, to shift drudgery away from humans and onto computers. If you produce a cool new app which does not do that, then you have failed, no matter how cheaply or quickly or slickly you did the work.

I Did It Myself!
Clinical lab IT deals with confidential information. We are supposed to take all reasonable and customary measures to guard the information in our care. Yet we still end up with security problems caused by programmers either "taking a shot" at security, as though it were an interesting hobby, or writing their own version of some standard tool which already exists, has already been validated, and is already being updated and tested by other people. Don't roll your own if you don't have to--security especially, but other important functions as well. You might enjoy the challenge, but this is a job, not a college course.
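
To be concrete, "use the standard tool" usually means a handful of library calls rather than anything clever. A minimal Python sketch, using only the standard library (the function names are mine, and the iteration count is just a plausible placeholder):

    import hashlib
    import hmac
    import secrets

    def new_session_token() -> str:
        # Vetted CSPRNG from the standard library, not a home-grown random string.
        return secrets.token_urlsafe(32)

    def verify_token(presented: str, stored: str) -> bool:
        # Constant-time comparison; a plain == invites timing attacks.
        return hmac.compare_digest(presented, stored)

    def hash_password(password: str, salt: bytes) -> bytes:
        # A standard, well-reviewed key-derivation function instead of a hand-rolled scheme.
        return hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, 200_000)

None of it is exciting, which is rather the point.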

Conclusion
Pay for experience. Demand that all novelties be justified. Use what is freely and securely available before rolling your own. Stop the current trend toward "upgrades" which make things worse and endless new systems which do not advance the state of the art. Have the tech and the ology.

Tuesday, July 21, 2015

System Interfaces Are Not Kitchen Renovations

Recently I had to write my part of an RFP. The topic was a system-to-system interface between two health care information systems. I went through the usual stages:
  1. Nail down my assumptions and their requirements
  2. Come up with a design to meet those requirements
  3. Translate the design into an implementation plan
  4. Put the implementation plan into a spreadsheet
  5. Make my best guess as to level of effort
  6. Have our project manager review my egotistical/optimistic assumptions
  7. Plug the estimated numbers into the spreadsheet
  8. Shrug and use the resulting dates and cost estimates
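
Steps 5 through 7 in toy form, for anyone who has not lived them (Python; every task name and number is invented):

    tasks = {                       # step 5: my best guess at level of effort, in hours
        "nail down assumptions and requirements": 16,
        "design the interface": 24,
        "build and unit-test the message translation": 80,
        "integration testing with both vendors": 60,
        "documentation and go-live support": 20,
    }

    pm_contingency = 1.4            # step 6: the PM's correction for my optimism
    hourly_rate = 150               # assumed billing rate, entirely illustrative

    raw_hours = sum(tasks.values())
    adjusted_hours = raw_hours * pm_contingency
    estimated_cost = adjusted_hours * hourly_rate   # step 7: the numbers for the spreadsheet

    print(f"Raw estimate: {raw_hours} hours")
    print(f"After PM review: {adjusted_hours:.0f} hours, roughly ${estimated_cost:,.0f}")
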
The result was all too predictable: push-back from the customer about the time and cost. In our amicable back-and-forth, which seemed to be driven on her side by a blind directive to challenge all prices of all kinds, I had an epiphany: software development in general, and interfacing in particular, is not a kitchen renovation, so why do customers act as if it were?

I have been on both sides of kitchen renovation and there are some similarities:
  • the customer is always impatient
  • the cost is hard to contain
  • accurately imagining the outcome of decisions is an uncommon skill
But there are some crucial differences:
  • the concept of kitchen is well-known and well-understood by most people
  • the elements of a kitchen are similarly familiar: sinks, cabinets, etc
  • examples of kitchens one likes can be found
  • in general the main user is also the main contact with the contractor
Why do I get huffy when people tell me I am padding my estimates? Because writing interfaces between complex systems is like building on sand which shifts beneath your feet. Sure, it is just another HL7 interface between an industry-standard source system and our intermediary system, which in turn has to export its data to a completely different industry-standard target system.

Thus we are linking industry standard system A (ISSA) to industry standard system B (ISSB): a piece of cake! Except....
  • ISSA has over 1,500 configurable parameters (literally).
  • ISSA was deployed almost five years ago and no one from that team is around.
  • ISSA's configuration was in flux those first few years.
These factors complicate my job because the source HL7 does not exactly match the spec. Further complications arise from the fact that the users have developed some local idioms, so the data elements are not being used in exactly the standard way.
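
Much of the actual work ends up being small translation shims for exactly those idioms. Here is a minimal Python sketch of the kind of thing I mean; the field chosen (OBR-15, specimen source) is plausible, but the local codes and the mapping are entirely invented for illustration:

    # Hypothetical local codes that the source system's users bent to their own purposes.
    LOCAL_TO_STANDARD_SPECIMEN_SOURCE = {
        "BLD-X": "BLD",   # site-specific blood source code
        "UR-CL": "UR",    # "clean catch" urine folded into the source field
    }

    def remap_specimen_source(message: str) -> str:
        """Rewrite OBR-15 so the target system sees standard specimen source codes.

        HL7 v2 segments are separated by carriage returns and fields by '|'.
        """
        fixed_segments = []
        for segment in message.split("\r"):
            fields = segment.split("|")
            if fields[0] == "OBR" and len(fields) > 15:
                fields[15] = LOCAL_TO_STANDARD_SPECIMEN_SOURCE.get(fields[15], fields[15])
            fixed_segments.append("|".join(fields))
        return "\r".join(fixed_segments)

Multiply that by 1,500 configurable parameters and a target system whose configuration is still in flux, and the estimate stops looking padded.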

On the target side, ISSB is still being configured, so I am trying to hit a moving target. Which of the local idioms serve a higher purpose (so they will stay) and which of them exist only to compensate for issues with ISSA? No one knows. What new idioms will evolve to compensate for issues with ISSB? No one can even guess.

So this is like remodelling a kitchen if the counters used to be suspended from the ceiling but now might be cantilevered from the walls and the water might be replaced by steam.

How long will it take to keep rewriting the interface so that the new system's evolving configuration and the customer's evolving needs are all met? I don't know; I wish I did. In the meantime, my good faith best guess will have to do.

Tuesday, July 14, 2015

Lab Automation vs IT Centralization

Over the past decade I have witnessed two trends in clinical lab computing which I think are two sides of the same coin:
  • Lab process automation is declining
  • IT is centralizing, becoming more consolidated and less lab-specific
By "Lab process automation" I mean identifying repetitive tasks performed by humans and transferring those tasks to a computer or computers.

By centralized, I mean that the IT which serves the lab is now generally the same IT which serves all other parts of the organization.

I can see the appeal, especially to bean counters, of centralization: more control by execs, economy of scale, etc. But most of the IT groups I encounter are really all about office automation:
  • email
  • shared files
  • shared printers
  • remote access
These are all great if you are running a typical office, which is how everything seems to look from most C Suites.

Alas, the clinical lab is far closer in basic structure to a manufacturing plant than to a law office. Typical corporate IT is not good at process automation:
  • receiving orders
  • receiving specimens (raw material)
  • matching up specimens and orders
  • doing the assays (processing raw material)
  • serving up results (delivering finished product)
At the bench, email support and file-sharing are not very helpful; powerful and speedy instrument interfaces, audit support and throughput analysis are all much more helpful.
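
To make "matching up specimens and orders" from the list above concrete, here is a toy Python sketch (every identifier invented) of the kind of check a lab runs all day and an office-automation stack never thinks about:

    pending_orders = {
        "ORD-1001": {"patient_id": "P-42", "test": "CBC"},
        "ORD-1002": {"patient_id": "P-43", "test": "BMP"},
    }

    received_specimens = [
        {"barcode": "SPC-9001", "order_id": "ORD-1001", "patient_id": "P-42"},
        {"barcode": "SPC-9002", "order_id": "ORD-1002", "patient_id": "P-99"},  # mismatch
    ]

    def match_specimens(orders, specimens):
        """Pair scanned specimens with pending orders; flag anything that does not fit."""
        matched, problems = [], []
        for spec in specimens:
            order = orders.get(spec["order_id"])
            if order is None:
                problems.append((spec["barcode"], "no such order"))
            elif order["patient_id"] != spec["patient_id"]:
                problems.append((spec["barcode"], "patient mismatch"))
            else:
                matched.append((spec["barcode"], spec["order_id"]))
        return matched, problems

    matched, problems = match_specimens(pending_orders, received_specimens)
    print("matched:", matched)
    print("needs human review:", problems)

The mismatched specimen is exactly the sort of thing software should catch before a human has to.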

But centralized IT is not only oriented away from this kind of business process automation, it is punished for it: why are you spending time and money on lab-specific projects? If you get email to work on your boss's boss's boss's iPhone, you are a hero. If you figure out how to alert bench techs to specimens which need a smear, you are spending too much time in the lab.

Worse, as corporate IT completes its transition into being mostly about vendor management, the idea of doing what the vendors cannot or will not do--plugging the gaps between off-the-shelf products, gaps which cause so much of the needless drudgery in lab work--becomes first unthinkable and then impossible.

Farewell, lab process automation: perhaps vendors will some day decide that interoperability is a goal, and then you will live again. But I am not betting on it.

Tuesday, June 23, 2015

Better Never Than Late

I often complain that the typical organizational IT infrastructure is too complicated and often not well-suited to the clinical lab. I am often asked to give an example, but so far my examples of complexity have been, themselves, overly complicated and apparently quite boring.

Well, all that changed recently: now I have a nifty example of what I mean.

One of our customers has a fabulous print spooling system: many pieces of hardware, many lines of code, all intended to ensure that your precious print job eventually emerges from a printer, no matter what issues may arise with your network. Best of all, you route all printing through it and it works automagically.

The fancy print job spooler is so smart, it can reroute print jobs from off-line printers to equivalent on-line printers. It is so smart it can hold onto a print job for you, "buffer" it, until an appropriate printer comes on-line.

Alas, neither of these features is a good fit for the clinical lab, at least for specimen labels. The ideal specimen label prints promptly and right beside the person who needs it. If the label cannot be printed, then the print job should disappear: the user will have had to use a hand-written label or some other downtime alternative. Printing the label later is, at best, annoying. At worst, printing the label later is confusing and leads to mis-identified specimens. For this job, better never than late.

With effort, our client disabled the roaming print job feature, so the labels (almost) always print where they are needed. But the buffer cannot be turned off--it is the whole point of the spooler, after all--and so after downtime the now-unwanted labels suddenly come pumping out of the printers. If the draw station happens to be busy, the old labels mingle with the current labels, and opportunities for serious errors abound.
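
What the lab actually wants from a label queue is closer to the following Python sketch. This is my own illustration, not the vendor's API, and the cutoff is a policy knob, not a recommendation: any job that cannot print promptly is simply dropped.

    import time

    LABEL_MAX_AGE_SECONDS = 120   # assumed cutoff; the right value is a lab policy decision

    def release_queued_labels(queue, printer_is_online, print_job, now=time.time):
        """Send only fresh label jobs to the printer; silently discard stale ones.

        `queue` is a list of (submitted_at, payload) tuples; `printer_is_online`
        and `print_job` are callables standing in for whatever spooler is in use.
        Returns the jobs that are still fresh but could not yet be printed.
        """
        still_waiting = []
        for submitted_at, payload in queue:
            if now() - submitted_at > LABEL_MAX_AGE_SECONDS:
                continue   # stale label: the phlebotomist has long since hand-written one
            if printer_is_online():
                print_job(payload)
            else:
                still_waiting.append((submitted_at, payload))
        return still_waiting

Better never than late, in a dozen lines of code.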

Print spoolers are nifty. They serve office workers well. They are a standard part of today's smart IT infrastructure. But they don't serve the clinical lab in any real sense. The clinical lab is not a typical office environment: don't treat it like one.