Category Archives: General

Who let Ebola Out? The Computer did it!

Published by:

Ebola made a sensational entry into the cognitive radar of Americans when an infected patient was allowed to go home even after having been to a hospital. If the hospital is to be believed, it is the EHR’s fault. The Politico report Did a computer raise Ebola spread risk? is one of several that finger the EHR. Was it really the EHR’s fault, or is it just a convenient scapegoat?

The good questions to ask at this stage are:

  1. Was it really the EHR’s fault?
  2. Could the result have been different with a different EHR?
  3. Could the EHR have been configured differently to prevent this blunder?
  4. Should the EHRs be updated each time an epidemic threat is on the horizon?
  5. Could the result have been different with paper-based charting?
  6. Should travel history be part of the routine history questions for all patients, and then be added to the “always visible” sections (e.g., the Problem List) of the chart if they have traveled from a region with certain infectious diseases?
The truth is that the EHRs, as they exist today, reduce the clinician’s vision to a telescopic one.
Clinicians could be scrutinizing one part of the data with great intensity while ignoring a much bigger elephant sitting right next to them. There is an urgent need for an alternative way for clinicians to size up the data as a whole, or at least to automatically flag any piece of it that is suspicious or an outlier.

Imagining Healthcare–Some Secret Desires

Published by:

I challenge you to look at the financial world and not come away depressed. Or take a look at the happenings in the political arena and tell me if it doesn’t leave a bitter taste in your mouth. And then I invite you to turn your attention to what is happening in the technological and scientific world. I bet it will turn even the most despondent among us a little bit optimistic.

Then there are those times when you encounter something which just puts a smile on your face. Witnessing IBM Watson’s virtuoso performance was one such moment. Today, there is this news on TechCrunch of a new way of interacting with computers that a group of researchers from Microsoft and Carnegie Mellon University have come up with. Take a look:

Here is how Computing Now describes it:

A wearable projection system that Microsoft Research and Carnegie Mellon University (CMU) developed lets users create graphical-input interfaces on any surface. OmniTouch has a pico-projector that can display images of keyboards, keypads, or other traditional input devices onto any surface, even a human hand. It uses a depth-sensing camera—like that used in Microsoft’s Kinect motion-sensing input device for the Xbox 360 video game console—to track a user’s finger movements on any surface. The system is mounted on a user’s shoulder, but the researchers say it eventually could be reduced to the size of a deck of cards. Chris Harrison, Microsoft Research PhD Fellow and CMU doctoral student, said OmniTouch lets users have a wide range of input because it is optically based. OmniTouch does not require calibration; according to the researchers, a user can begin utilizing the device without having to calibrate it. Work on the project will be presented 19 October at the Association for Computing Machinery Symposium on User Interface Software and Technology in Santa Barbara, California. (Harrison website) (Carnegie Mellon University)


Combine this with Watson-like intelligence (or even something like Siri) and you have a powerful system. Shrink it down to a head-mountable size, small enough to fit on the front of a baseball cap, improve the computer vision algorithms it uses, and you have a technology that is, in terms of features, already ahead of HAL 9000 or the on-board computer of the Enterprise of Star Trek fame. I believe this could be made available in roughly this configuration in 2 to 3 years.

Now, why do I believe this has the potential of revolutionizing healthcare? It addresses several prickly challenges peculiar to the doctors’ needs.

  • Clinicians almost always need both their hands (and sometimes their minds) for the procedure they are performing (measuring BP, performing a clinical examination, operating, etc.). Leaving the patient’s side to access a computer (or even a tablet) is not convenient. Using a tablet-like device raises the issue of sterilization and whether such devices can tolerate the sterilization procedures.
  • Tablet computers show their outputs only in a limited area – a tiny screen bounded by its bezel. A representation of the real world has to be recreated within these confines (something like augmented reality). With the OmniTouch approach, any surface, be it the wrist of the surgeon, the abdomen of the patient, or her pelvic cavity, not only becomes the input device; it is also transformed into a screen. Never before has computing been this close to the real world.
  • Most computers require data to be entered explicitly. But what if we could go about our business while the computers captured the data, without intruding? I think we are on the cusp of a technology convergence where this would not seem so far-fetched. OmniTouch (or more specifically Kinect) technology will have to be combined with some nifty activity-recognition capabilities to achieve just that. Imagine: a nurse administers an antibiotic injection while the head-mounted device recognizes the drug, the patient, the nurse, and the act of administering the drug, using a combination of bar-code/QR-code recognition, facial recognition, and activity recognition. All data recorded, no keys pressed, no notes dictated!

Many possible uses come to mind.

Scenario 1

Just imagine a newbie surgeon getting guidance from this device. I can visualize him taking a peek at the textbook, for the next step in the surgery, projected next to the incision site on the sterile drapes covering the patient. Combined with a Siri-like interaction capability, you could very well imagine a scenario like this:

(For our little fantasy’s purposes, let us call the smart tool made by melding these cool technologies Annika*, and let the rookie surgeon’s appellation be Dr. Greenstick.)

Dr. Greenstick (muttering to self): This looks like the internal iliac artery.
Annika: Dr. Greenstick, I think it is the ureter. I would think twice before ligating it. Why don’t you clear away some of the fascia so that it is more visible?

(Dr. Greenstick teases off some of the fascia in the pelvic fossa)

Annika: I can see it is the ureter. Do you want me to point it out for you?

Dr. Greenstick: Yes, please.

(Annika projects a fluorescent green line curving along the course of the ureter in the pelvis, making it obvious)

– – – – end scene – – – –

Scenario 2

As the nurse adjusts the Oxytocin infusion pump for a patient in labor with tardy uterine contractions, Annika projects the recommended infusion rate right on the surface of the infusion pump console, calculated based upon an assessment of contractions and the fetal heart rate.

Scenario 3

The neonatologist examining a baby indicates to Annika, by placing both his index fingers on the opposite sides of the baby’s head, the level he wants the head-circumference to be measured. Annika obliges by displaying the circumference (and a graph to show if the circumference deviates from the normal) right on the forehead of the baby.

Scenario 4

A diabetologist is monitoring the progress of a slow-healing foot ulcer of a patient on his return visit. Annika quietly displays, next to the ulcer, its image from the patient’s previous visit, allowing the diabetologist to compare it with its present state.

I could go on, but I am sure you get the picture.

Oh, OmniTouch, how you have spurred the imagination.

Somewhat incongruously, it reminds me of a couplet from a ghazal (a form of music and poetry popular in the Indian subcontinent). It goes like this:

Agar sharar hain to bhadkein, jo phool hain to khilein
Tarah-tarah ki talab tere rang-e-lab se hai

My rough translation (with apologies to the great poet, Faiz):

If they be embers let them burst into flames, if they be flowers let them blossom
So many be the desires that the color of your lips inspires

Yet another secret desire – IBM Watson and OmniTouch teams, please get together to bring these fantasies to life, for the larger good, eh? And come on now, what’s with the name OmniTouch!? Could you not think up a name that befits such a cool piece of technology?

*Annika – a human female who is ‘assimilated’ by the Borg, rendered into one of their own, enhanced by the technology and knowledge of all the civilizations previously assimilated, and given the designation Seven of Nine. She was later rescued by the Voyager team and inducted into the Starship Voyager’s staff – ruthlessly efficient, emotionally distant, and yet very sexy. She goes through an agonizing process of rediscovering her humanity, but the vestiges of the Borg still remain a part of her.


Dazzling, Dr. Watson!

Published by:

So, IBM’s Watson won Jeopardy hands down, playing against Jennings and Rutter.

I am sure many NLP researchers will say there’s nothing new in what IBM’s Watson project achieved – all this has been done before. It has also not received the kind of attention that it deserves in the tech and mainstream media. The hoopla surrounding Kasparov’s loss to the chess playing machine Deep Blue was much bigger.

Incremental advancement in the eyes of many, to me it is a momentous event. Watson is as much a research feat as an engineering marvel. It brought together many of the advancements in software and hardware to notch this singular, elegant success. I get a sense that IBM’s Watson research team understands the import of this achievement but is downplaying its implications a little. By all indications they did treat it as their own little Manhattan Project.

Let me explain why Watson is a daunting technology for me – an informatics researcher’s view, if you will:

Search will never be the same again

Google’s search delivers dumb web pages, but Watson, without being connected to the Web, delivers answers. This is what the many search engines that keep cropping up have been promising but failing to deliver.

It learns

Watson learns the real connections between facts, and that too from the undisciplined way we humans have been documenting them. If in a matter of two years it could beat the Jeopardy champions, imagine what it will be able to do down the line. It can certainly learn from its own mistakes (and successes), as well as from yours and mine. Sure, Google also learns, but it learns precious little from the same resources.

We do not need Structured Data anymore

One major challenge, in particular for healthcare, has been getting data into a form that supports the existing powerful database operations – finding the right information, cross-referencing different items, and so on. Working around this failing of computers – the inability to work with unstructured data – has always required humans disciplined enough to enter structured data. Humans have to adapt to computers if computers are to be fully exploited. This is why most EMRs are such inadequate tools for clinicians’ primary tasks; EMRs are willing to accept unstructured data but have little capability to do much with it. With Watson-like technology under the hood, you, as a clinician, can keep merrily describing the patient as you would to a resident or a colleague; the data will be ‘understood’ and stored the proper way in its memory, ready for all sorts of interesting questions about the patient at a later point.
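A toy sketch may make the gap concrete (Python, with invented patients, fields, and note text – not real clinical data or any actual EMR schema): a structured query answers a clinical question directly, while a naive free-text search stumbles over negation.

```python
# Contrast querying structured records with searching free-text notes.
# All patients, fields, and note text below are invented for illustration.

structured = [
    {"patient": "A", "dx": "diabetes", "hba1c": 8.2},
    {"patient": "B", "dx": "hypertension", "hba1c": None},
]

notes = {
    "A": "Known type 2 diabetes, last HbA1c 8.2, counseled on diet.",
    "B": "BP still elevated; no diabetes history.",
}

# Structured data answers the question directly...
poor_control = [r["patient"] for r in structured
                if r["dx"] == "diabetes" and (r["hba1c"] or 0) > 7.0]

# ...while a naive free-text search wrongly matches patient B, whose
# note mentions "diabetes" only to negate it.
naive_hits = [p for p, text in notes.items() if "diabetes" in text.lower()]
```

The false positive on patient B is exactly the kind of language understanding a Watson-like system supplies and a plain database query cannot.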

In fact, if the technology is adapted to include action recognition from other cues (videos, bar codes, sensors, etc.), even documentation by narration will become increasingly redundant.

We do not need to author Rules

Many business and clinical solutions get their smarts from rules engines, but the rules which provide the actual logic in them are authored by some human expert. Watson-like technology will make that redundant. If you tell it that the patient is a male presenting with an acute abdomen, right iliac fossa tenderness, and fever, it will, with a moderate level of confidence, tell you that it is acute appendicitis, and that you had better get blood counts and sonography to clinch the diagnosis irrevocably. The thing is, no one would have fed the rules for the differential diagnosis of the acute abdomen anywhere into the system – it would have learned that from reading the surgery textbooks it was provided beforehand. For a while, I think, there will be back and forth between the doctor and Watson to reach the one diagnosis their combined understanding can settle upon, much like discussing a case with a colleague. But Watson would learn and remember much more from those interactions than the doctor would, progressively diminishing the need for the latter.
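For contrast, here is roughly what a single hand-authored rule looks like (a minimal Python sketch; the findings, wording, and function name are hypothetical illustrations, not clinical guidance and not any real rules-engine syntax):

```python
def acute_abdomen_rule(findings):
    """One hand-authored rule: explicit, human-written logic of the kind
    a Watson-like system would instead learn from textbooks.
    Hypothetical sketch only -- not clinical guidance."""
    if {"male", "acute abdomen", "right iliac fossa tenderness", "fever"} <= findings:
        return ("acute appendicitis (moderate confidence)",
                "order blood counts and sonography to confirm")
    return (None, None)

diagnosis, work_up = acute_abdomen_rule(
    {"male", "acute abdomen", "right iliac fossa tenderness", "fever"})
```

Every such rule has to be written, reviewed, and maintained by an expert; the promise above is that the machine derives the equivalent logic from the literature on its own.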

Information Retrieval Researchers can go home

The information retrieval researchers can also start packing up, or quickly find some other problems to solve. For nearly a decade, medical informatics researchers have been trying to develop ways of providing clinicians with adventitious information that is highly pertinent to the patient at hand and his/her current problem – something like context-sensitive help. There have been several ideas, but all of them focused on tagging the data and resources themselves in a particular way to make this possible. These approaches will become redundant, since Watson-like tech will not just identify the context much better but also dip into its learning to deliver the right resources. And all of this without any new XML tags having to be created.

Knowledge Discovery will take a leap forward

Since it discovers patterns, parts of Watson’s technology suite can help pull out nuggets of unsuspected connections between facts. This will allow identifying things like new causes of diseases and unsuspected benefits and side effects of drugs, diets, and interventions.

High level professionals should start feeling the heat

First they came for the typists, and I didn’t say anything because I was not a typist, then they came for the clerks, and I didn’t say anything because I was not a clerk ….

In fact, I even caught myself smiling smugly because I was a highly educated medical specialist. I knew the computers posed no risk to me. Suddenly, I am not so sure. Tally up all that I have written above and it will be obvious to you that we are about to cede much of the intellectual ground to the computers as well. What will remain is the contact-based part of healthcare. Well, at least until the robots achieve a little more dexterity and are able to feign a better smile when they say, “And how are you today, Mrs. Patterson?” in a calm, reassuring, and friendly voice (think HAL 9000).

Get a glimpse of the DeepQA project, which resulted in Watson, from Dave Ferrucci, the PI of the project.

I look forward to the day when we will be able to deploy Watson as an inference tool for Proteus.

Now, if only they could bring Watson’s size down to fit into my smartphone, and teach it to understand the Indian accent.


Healthcare Informatics Services via Cloud – IEEE Workshop

Published by:

IEEE’s annual International Conference on Web Services and Cloud this year is featuring a special health informatics workshop.

Find more about the workshop and its call for papers here.

If you are interested in the use of Web Services or Cloud Computing to make a difference in healthcare, this will be an event to keep an eye on.

This is a great opportunity to present your ideas and experiences or demo some of the work you have already done.


SDCI Presents at Henry Ford Quality Expo

Published by:

An annual fixture for Henry Ford Health System is its Quality Expo. This is an opportunity for the hospitals, clinics, and many other departments of Henry Ford to present their efforts at improving the quality of our services. This year our Semantic Data Capture Initiative project was also on display. Team members Teresa Hantz and Patti Williams of the CSRI Department created the flash video that was displayed next to the poster. We were all impressed by how quickly Teresa mastered the applications – Protean and the video-capturing tool. It is also noteworthy that she managed to highlight the essence of the tools in less than three minutes of video.

This introductory video provides you with a quick overview of knowledge editing in the Proteus environment, as well as how easy it is to edit a rule in GreEd.

You can check out the video here: quality_expo2009.swf. We suggest running the video in your browser’s full-screen mode (press F11).


Proteus Open Source Now!

Published by:

This is to announce the availability of the source code for tools related to the clinical decision support guideline model, Proteus, under an open source license (EPL). Open source development will proceed under the new Proteus Intelligent Processes (PIP) project.

With this announcement, we are also opening up the project for general participation. The code and related information can be found at  The home for Proteus will remain at Introductory information about the rule authoring system GreEd is available at

This also coincides with the release of version 2.7 (beta), which has several new features to make knowledge authoring easier and more exciting. Take the new application for a spin by downloading it from

What’s New

I list some of the new features in Version 2.7 below:

Protean (Clinical Workflow Authoring Tool)

  • Sharing executable knowledge
  • Unlimited undo and redo
  • Promotion and demotion
  • Move an item from one location to another
  • Search your library of components

GreEd (Rule Authoring Tool)

  • Undo and Redo
  • Default Inference
  • Semantic Guidance and constraints
  • New operators for your expressions, like [N of M] and [Between]
  • Date Fields and Operations
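The [N of M] and [Between] operators listed above can be read roughly as follows (a hypothetical Python rendering of their logic; GreEd itself is a graphical tool and its actual semantics may differ):

```python
def n_of_m(n, conditions):
    """[N of M]: true when at least n of the listed conditions hold."""
    return sum(bool(c) for c in conditions) >= n

def between(value, low, high):
    """[Between]: true when value lies in the inclusive range [low, high]."""
    return low <= value <= high

# e.g., fire a rule when at least 2 of 3 criteria are met
fire = n_of_m(2, [between(101.5, 100.4, 106.0),  # temperature in febrile range
                  True,                          # second criterion met
                  False])                        # third criterion not met
```

Operators like these spare the author from writing out every combination of conditions by hand.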

Read more about the new features here:

This is a major milestone for Proteus, made possible by contributions from many wonderful people. Much of the development for this version was done in the Semantic Data Capture Initiative project of Henry Ford Health System, my employer. Besides Henry Ford, the Lister Hill Center of the National Library of Medicine played a critical role at the nascent stage of Proteus. Several ideas related to metadata usage and rule authoring were developed at the City of Hope National Medical Center.

We will be scheduling a web seminar to provide a quick introduction to Proteus, GreEd and the PIP project and demonstrate the tools. Please let me know if you are interested in participating.

I will be at the upcoming AMIA Annual Symposium in San Francisco and will be happy to meet you if you are planning to attend.

We welcome your participation and feedback.

Feel free to contact me.


Get to know GreEd Better

Published by:

In one of my previous posts, I promised that I would share more information about GreEd. I did one better; I have posted several pages about GreEd. You can find them here:

The same pages can also be accessed from the top menu of this blog.

Stay tuned; we will be adding some flash demos and tutorials for GreEd in the near future. I will also keep you informed about the development of GreEd.

P.S. Do not worry about mispronouncing GreEd; it is pronounced the same as the good old human foible – greed. Either way, we wouldn’t be too offended.


A ‘Houston’ for Clinicians

Published by:

Imagine an astronaut hurtling through the immense void on his journey towards Mars in a space probe. It is a lonely journey and a scary adventure. There are untold unknowns, and even the known variables are so many that keeping a constant eye on each one of them is not possible for a single human. The only chance he has of making the adventure a success is the constant support of the proverbial ‘Houston’. ‘Houston’ translates into a large team of scientists and engineers manning an array of sophisticated equipment in constant radio contact with the Mars probe. So not only can our astronaut flip a switch and say, “Houston, we have a problem”, when he senses something out of the ordinary; ‘Houston’ can also proactively inform the space traveler of any important issues that he needs to be aware of. ‘Houston’ might even address some of the issues remotely, without distracting the astronaut from whatever else he might be doing.

Turn your gaze earthward now and look at a clinician entering an exam room to see her patient. Continue reading


A Clinician’s Angst with Computers

Published by:

All of us who are in the field of informatics are well advised to pay attention to voices like these: Op-Ed Contributor – The Computer Will See You Now –

So many times have I heard the whines of those who develop applications for clinicians. The refrain goes something like this: "The doctors are technology averse", "The clinicians are not interested in learning new ways of doing things".


There are innumerable examples of doctors embracing technology when it helped them achieve the goal of delivering better or more efficient care. I remember the days when ultrasound first arrived on the scene. The ultrasound machines in those days were kludgy pieces of equipment. The screens were tiny, with no shades of gray; the text messages displayed were cryptic; the buttons on the console were ill-organized. In short, they were the epitome of user-unfriendly technology. Yet the clinicians took to them with gusto. Why? Because they gave them that extra edge in looking at soft organs. Obstetricians, whose notoriety as technology-averse is legendary, took the lead in adopting the technology. So, let’s just agree that “clinicians do not like technology” is a myth promoted to conceal the failure of the discipline of Information Technology in meeting the needs of the clinicians.

A question that is relevant for those of us working with Proteus in the Semantic Data Capture Initiative (SDCI) is: how shackled will the clinician feel by the pre-defined processes created from guidelines? Does the clinician really need to enter all the data in exactly the order prescribed by the process or the guideline? We have taken an interesting approach to address this. In the web interface we are developing to provide clinical decision support from the Proteus Engine, we will allow clinicians to enter any data that any of the templates for the guideline/process require, at any time. However, they will be able to submit the data of each individual clinical context (defined by the knowledge components) only when the executing process reaches that particular point. This gives us the best of both worlds: constraining data submission based on the needs of the decision support system, while allowing the clinician the freedom to type ahead if they feel like it.
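The type-ahead idea can be sketched as follows (a simplified Python sketch with invented class and field names; the actual Proteus web interface is not implemented this way): input is buffered per clinical context, and a context’s data reaches the engine only when the executing process arrives at it.

```python
class TypeAheadBuffer:
    """Buffer clinician input per clinical context; release a context's
    data only when the executing process reaches it.
    Hypothetical sketch -- not the actual Proteus implementation."""

    def __init__(self):
        self.buffered = {}    # context -> {field: value}, typed ahead
        self.submitted = {}   # context -> {field: value}, released to engine

    def enter(self, context, field, value):
        # The clinician may type into any context at any time.
        self.buffered.setdefault(context, {})[field] = value

    def reach(self, context):
        # The executing process has arrived at this context: submit its data.
        self.submitted[context] = self.buffered.pop(context, {})
        return self.submitted[context]

form = TypeAheadBuffer()
form.enter("triage", "temperature_f", 101.2)
form.enter("history", "travel", "recent travel noted")  # typed ahead
released = form.reach("triage")  # only the triage context is submitted
```

The design choice is that buffering decouples the order of data entry from the order of data submission, which is what lets the clinician work ahead of the guideline.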


What Rules In the World of Clinical Decisions?

Published by:

Recently one of the NY Times blogs carried this news about Vanderbilt University Medical Center going in for the ILOG business rules management system: The Doctor Will B.R.M.S. You Now – Bits Blog. Other health IT sites and the blogosphere picked up this news and were abuzz for a while with the potential of such systems.

One has to stop and ask: what is so complex about rules for healthcare that expensive systems are needed to support them? Do we really need them? Continue reading