Showing posts with label HIPAA. Show all posts

Monday, January 20, 2014

Privacy Rules vs Care Delivery

In medical IT, we are often asked about HIPAA compliance much in the way the Brothers Grimm probably asked little children about being alone in the woods: to scare us into doing the "right" (i.e., legal-liability-lowering) thing.

When people say "HIPAA," I generally assume they mean the Privacy Rule of HIPAA specifically, which Wikipedia summarizes as follows (go here for the full text):


The HIPAA Privacy Rule regulates the use and disclosure of Protected Health Information (PHI) held by "covered entities" (generally, health care clearinghouses, employer sponsored health plans, health insurers, and medical service providers that engage in certain transactions.)[17] By regulation, the Department of Health and Human Services extended the HIPAA privacy rule to independent contractors of covered entities who fit within the definition of "business associates".[18] PHI is any information held by a covered entity which concerns health status, provision of health care, or payment for health care that can be linked to an individual.[19] This is interpreted rather broadly and includes any part of an individual's medical record or payment history. Covered entities must disclose PHI to the individual within 30 days upon request.[20] They also must disclose PHI when required to do so by law such as reporting suspected child abuse to state child welfare agencies.[21]

So is the primary goal to maintain privacy or to deliver effective health care? If you said "both, of course!" then I must respectfully say "balderdash!" I am well aware of the standard privacy-advocate claim that one can easily do both at the same time, with no loss of effectiveness. In my experience, not only do the two goals not coexist, they actively work against each other in many instances.

In lab IT, the tension most often looks like this: deliver lab results quickly to whoever might need them (nurses, PAs, MDs, NPs) versus ensure that every access is by a caregiver, specifically a caregiver on this patient's team. Even if the IT system user is a nurse who was doing something else when a code team member asked her to look something up on behalf of a caregiver who is not currently in a position to authenticate.

When I ask privacy advocates how to balance these concerns the most common response is the claim that there is no problem: if IT does its job, then all required data will always be disclosed to the correct parties, but not the incorrect parties, in a timely manner. As someone who actually deploys systems in the real world, I find this answer supremely unhelpful.

When I ask security professionals how to balance these concerns, they ask me to restate my question as a risk:benefit statement, at which point they will help me figure out how much security to combat which risk. But when I respond that the risk is that security will interfere with the delivery of healthcare, I am referred to the standard menu of risks from which I may pick:
  • leaking information to blackmailers
  • leaking information to underwriters (insurers)
  • leaking information to the public

This company has a nice way to frame a conversation with a CISO, assuming that the organization is not a health care provider. You can find that conversation starter here: http://www.ey.com/GL/en/Services/Advisory/Cyber-security---Steps-you-should-take-now?utm_source=Outbrain&utm_medium=TextLink&utm_content=steps_ceo_ciso_Outbrain&utm_campaign=GISS_OLA_nov_dec_2013

But working in medical IT, I feel that I need a solution that takes into account some other considerations:
  • NOT disclosing information may harm someone, so I do not want to use solutions which assume that all disclosure is bad
  • disclosing information to unauthorized health care providers is often covered by other legal means, e.g., medical licensing, so isn't that "breach" of rather low significance?
  • the information does not belong to the parent organization in the first place so taking steps to protect it must include ways to share it on demand
If anyone knows of a privacy policy that is implementable for an actual lab information system, please let me know. I would love to stop trying to meet privacy rules in an environment where failure to disclose in a timely manner could kill someone.
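One pattern that at least tries to reconcile the two goals is "break-the-glass" access: care-team members get results normally, but any clinician can override in an emergency, provided the override is recorded for after-the-fact review. Here is a minimal sketch in Python; the function names, the care-team table, and the in-memory audit log are all illustrative, not a real LIS API:

```python
from datetime import datetime, timezone

audit_log = []  # in a real system this would be a tamper-evident store

def can_view_results(user, patient, care_team, break_glass_reason=None):
    """Allow care-team access normally; let anyone break the glass
    in an emergency, but log the override for later review."""
    if user in care_team.get(patient, set()):
        return True
    if break_glass_reason:
        audit_log.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "patient": patient,
            "reason": break_glass_reason,
        })
        return True  # disclose now, review later
    return False

care_team = {"patient-42": {"dr_adams", "rn_baker"}}
assert can_view_results("rn_baker", "patient-42", care_team)
assert not can_view_results("rn_clark", "patient-42", care_team)
assert can_view_results("rn_clark", "patient-42", care_team,
                        break_glass_reason="code blue, K+ lookup for team")
assert len(audit_log) == 1
```

The design choice is the point: it accepts that disclosure cannot always wait for perfect authorization, and trades prevention for accountability.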

Wednesday, January 8, 2014

Why Can You / Can't You Use The Cloud?

FAQ Wednesday is here again. Today's question: what about the Cloud and clinical labs?

This question has two variants:

  1. You can't use the Cloud for health care data, can you--HIPAA, etc.?
  2. Why can't you use the Cloud for my clinical lab interface?
Can You Use the Cloud?
The first question, which I take to mean, "is it within law and regulation to use the Cloud for PHI," is actually pretty easy to answer: yes. Does HIPAA restrict the options? Yes. Does HIPAA prohibit use of the Cloud? No.

We currently use Amazon Web Services as our Cloud vendor and they claim to be certified and everything. From http://aws.amazon.com/compliance/#case-studies:

HIPAA

AWS enables covered entities and their business associates subject to the U.S. Health Insurance Portability and Accountability Act (HIPAA) to leverage the secure AWS environment to process, maintain, and store protected health information and AWS will be signing business associate agreements with such customers. AWS also offers a HIPAA-focused whitepaper for customers interested in learning more about how they can leverage AWS for the processing and storage of health information. The Creating HIPAA-Compliant Medical Data Applications with AWS whitepaper outlines how companies can use AWS to process systems that facilitate HIPAA and HITECH compliance. For more information on the AWS HIPAA compliance program please contact AWS Sales and Business Development.

But I expect Google to keep up, and this reference implies that they are:

http://www.healthcareinfosecurity.com/google-amazon-adjust-to-hipaa-demands-a-6133

In fact, we are counting on growing acceptance of Cloud implementations in health care, which is why we are currently developing Direct Interfaces.

Why Can't You Use the Cloud?
This is a slightly different question, which I take to mean "in practical terms, what are the obstacles to Cloud-based interfacing?" The short answer is "the conservative nature of hospital and clinical lab IT culture." This is closely linked to why lab interfacing in general is so hard: our industry punishes mistakes and does not reward innovation. Doing nothing is often rewarded, and so fighting innovation tooth and nail is the norm.

(Since this is legal and low overhead and effective, we plan to step around the hospital and clinical lab IT organizations with our new Cloud-based lab connectivity venture, but that is another story.)

Wednesday, December 25, 2013

Cross-organization Patient Identification

A colleague asked me what I thought of this:

http://www.healthcareitnews.com/news/himss-hhs-join-forces-patient-id

"To improve the quality and safety of patient care, we must develop a nationwide strategy to match the right patient to the right record every time," said Lisa Gallagher, HIMSS vice president of technology solutions, in a statement.

The innovator in residence, she said, "will create a framework for innovative technology and policy solutions to help provide consistent matching of patient health records and patient identification."

I had two reactions, one after the other:
  1. That would be awesome (a nationwide strategy to match the right patient to the right record every time).
  2. Good luck balancing privacy and accuracy.
I have been dealing with this issue for just about 29.5 years. That is a loooooong time. I have seen lots of ideas come and go. Alas, I have no great solution, but I do have a firm grasp on potential issues:
  • Very similar demographics: (two easily confused patients)
    • identical twin boys named John and James, for instance (yes, people do that)
    • father and son, same name, unlucky birth dates such as 11/1/61 and 1/16/11. It happens, and MANY clerks are so pleased to spot the "typo"
    • cousins born on the same day or with unlucky birth dates with the same name
    • mother and daughter share the same name until the daughter marries; updating the daughter's record can obscure the mother, making the mother look like the maiden-name version of the daughter
  • Very dissimilar demographics: (one patient looks like two)
    • maiden name to married name: add a birth date correction and all bets are off
    • legal name change, sometimes to deliberately leave behind the past--prison term, bad marriage, etc
    • heavy use of a nickname: "Steve Jones" finally decides to go by "Steven Jones" because his dad, "Steven Jones," just died. Yikes.
  • Privacy nut / identity theft: the patient deliberately gives false demographics or those of someone else.
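A toy score makes the pitfalls above concrete. The sketch below (thresholds and weights are illustrative, not a production matcher) blends name similarity with date-of-birth agreement: the identical-twins case scores as a near-certain match even though they are two different people, while a maiden-to-married name change plus a birth-date correction makes one person look like two:

```python
from difflib import SequenceMatcher

def match_score(a, b):
    """Crude demographic match: weighted name similarity plus
    date-of-birth agreement. Weights 0.6/0.4 are arbitrary."""
    name_sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    dob_match = 1.0 if a["dob"] == b["dob"] else 0.0
    return 0.6 * name_sim + 0.4 * dob_match

# Two patients who look like one record: identical twins.
john = {"name": "John Smith", "dob": "2001-03-04"}
james = {"name": "James Smith", "dob": "2001-03-04"}
assert match_score(john, james) > 0.75  # dangerously high for two people

# One patient who looks like two: married name plus a DOB "correction".
mary_maiden = {"name": "Mary Smith", "dob": "1961-11-01"}
mary_married = {"name": "Mary Jones", "dob": "1961-01-11"}
assert match_score(mary_maiden, mary_married) < 0.5  # same person, low score
```

No tuning of those weights fixes both failure modes at once, which is the whole problem.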
I cannot imagine, in the absence of a national identity card, how a private American effort could even span different organizations within a given state, let alone across state boundaries.

Insurance companies could help, given their efforts to get bills paid across institutions, but I cannot see why they would and I can see why they wouldn't.

Man, I hope that I am wrong about this.

Friday, December 20, 2013

Physical Security Dimension to Cybersecurity

A friend who is interested in cybersecurity drew my attention to the following item:

https://www.schneier.com/blog/archives/2013/12/attacking_onlin.html

December 16, 2013

Attacking Online Poker Players

This story is about how at least two professional online poker players had their hotel rooms broken into and their computers infected with malware.
I agree with the conclusion:
So, what's the moral of the story? If you have a laptop that is used to move large amounts of money, take good care of it. Lock the keyboard when you step away. Put it in a safe when you're not around it, and encrypt the disk to prevent off-line access. Don't surf the web with it (use another laptop/device for that, they're relatively cheap). This advice is true whether you're a poker pro using a laptop for gaming or a business controller in a large company using the computer for wiring a large amount of funds.
The friend asked the following question: don't these same issues arise with medical records, e.g., the lab results I so often handle? Specifically, isn't physical security of personal devices a real issue?

The short answer is yes: God, yes! Yes. Yes.

The long answer is that I see two common issues:

Benign Neglect

In this scenario, doctors and other professionals forget that their spouses and kids and kid's friends may end up borrowing computers, laptops, smart phones or tablets which are also used by clinicians to review sensitive medical information.

We all know that traces left by legitimate access to this information can often be found, either by accident or by intent, if you let other people fiddle with your device. We all know that we should take basic precautions:
  • Clean your browser cache.
  • Use passcodes and automatic inactivity locking and make the timeout period short.
  • Don't lend your devices if you can avoid it, and never lose physical control of them.
  • Be aware of how your data is backed up.
But we don't all follow these guidelines and we don't follow them all the time. And we should.

Every time I create a system which offers a confidential medical report as a PDF, I cringe. I warn the users that they are responsible for the PDF once it hits their browser. I do the best I can to expire and obscure the PDFs. But I know that the average clinical user neither knows nor cares about ghost images of protected health information (PHI) floating around on his or her devices.
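"Expiring the PDF" mostly means expiring the link to it. One common mechanism, in the spirit of S3 presigned URLs, is a time-limited HMAC-signed link; here is a stdlib-only sketch where the secret, URL layout, and TTL are all hypothetical:

```python
import hashlib
import hmac
import time

SECRET = b"rotate-me"  # hypothetical server-side signing key

def sign_report_url(report_id, ttl_seconds=300, now=None):
    """Return an expiring, tamper-evident link to a PDF report."""
    expires = int(now if now is not None else time.time()) + ttl_seconds
    msg = f"{report_id}:{expires}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"/reports/{report_id}?expires={expires}&sig={sig}"

def verify(report_id, expires, sig, now=None):
    """Accept only an unmodified link that has not yet expired."""
    msg = f"{report_id}:{expires}".encode()
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    current = now if now is not None else time.time()
    return hmac.compare_digest(sig, expected) and current < expires
```

Note what this does not do: once the PDF has been downloaded, the copy on the user's device is out of my hands, which is exactly the cringe above.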

Targeted Attacks

Alas, the following is true: major medical centers have celebrities as patients; celebrity PHI is very valuable; human beings take bribes. But that is a problem for HR. What about users who are targeted, especially in this day and age of personal devices at work? I know of such attacks in other domains, such as finance, but I do not know of any against my clinical users. But does that mean that it hasn't happened? Or that it won't?

I do my best to make sure that my smart phone never sees PHI and that my laptop is physically secure and regularly checked for malware. But that won't help my users. So what is our professional obligation here? How do we foster a greater awareness of the risks so that appropriate action can be taken?

There is no point in trying to scare clinicians into not using their shiny, powerful, useful toys. Instead, we need to figure out how to help them use those toys more safely.

Thursday, November 21, 2013

Direct Interfacing?

This looks like an interesting idea for our lab connectivity start up:

http://directproject.org/faq.php?key=faq

Random connections over the Internet, secured by x509 certificates, with the payload format unspecified. Assuming we control the certificates, we can be confident that our connections are private and that we are talking to the right computers.

We would use HL7 to encode the payload, of course. But this gives us a validated and accepted model for transactions in a cloud-based environment. Interesting. I feel a proof-of-concept project coming, probably built on Amazon Web Services.
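The transport side of such a proof of concept is small: HL7 v2 messages travel inside an MLLP envelope, and the x509 trust model can be expressed as a TLS context that requires certificates on both ends. A sketch, with the certificate file paths hypothetical and therefore commented out:

```python
import ssl

VT, FS, CR = b"\x0b", b"\x1c", b"\x0d"  # MLLP start-block and end-block bytes

def mllp_wrap(hl7_message: str) -> bytes:
    """Frame an HL7 v2 message for transmission (MLLP envelope)."""
    return VT + hl7_message.replace("\n", "\r").encode() + FS + CR

def mllp_unwrap(frame: bytes) -> str:
    """Strip the MLLP envelope and recover the HL7 message."""
    assert frame.startswith(VT) and frame.endswith(FS + CR)
    return frame[1:-2].decode()

def direct_style_context() -> ssl.SSLContext:
    """TLS context that only trusts certificates we control."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.verify_mode = ssl.CERT_REQUIRED
    # ctx.load_verify_locations("trusted_partners_ca.pem")        # hypothetical
    # ctx.load_cert_chain("our_node.pem", "our_node.key")         # hypothetical
    return ctx

msg = "MSH|^~\\&|LAB|HOSP|EHR|CLINIC|202401011200||ORU^R01|0001|P|2.3"
assert mllp_unwrap(mllp_wrap(msg)) == msg
```

With the framing and trust model this simple, the hard part of the proof of concept is everything else: certificate distribution and the HL7 content itself.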

This architecture also lets us circle back to this at some point down the road:
http://www.mehi.masstech.org/health-it/health-it-learning-center