HIPAA Blog

[ Wednesday, March 04, 2015 ]

 

Not sure how useful these are, but here are 5 tips for avoiding a data breach.

Jeff [3:47 PM]

[ Wednesday, February 25, 2015 ]

 

Hippler (East Texas Hospital Data Thief) Gets 18 Months: Like Gibson years ago, a healthcare worker who stole PHI to use for fraud gets 18 months in jail.  Two interesting points: first, the US Attorney's office isn't saying what hospital he worked at, and in fact the case has been weirdly under the radar.  Second, is there a better name for a HIPAA violator than Hippler?

Jeff [8:08 AM]

[ Wednesday, February 18, 2015 ]

 

Health Data Identity Theft: Interesting article from NPR on the black market for stolen health data.

Jeff [10:55 PM]

[ Friday, February 13, 2015 ]

 

De-Identification Resources: Looking for de-identification tools?  Here's a good place to start, with some proprietary and open source options.  Nice to see my hometown UT-Dallas in the mix here. 

A big hat tip to Daniel Barth-Jones (@dbarthjones) for this link.

Jeff [2:24 PM]

 

One More Anthem Thought: What if the hackers were really looking for a needle, not the haystack?  What if they weren't after tens of millions of medical identities to conduct identity theft or something else, but were really looking for specific information on a handful of specific individuals, and only accessed the huge amount of data to cover their tracks?  If it really was a hack by Chinese nationals operating under the guise of the Chinese government, wouldn't that make more sense?

I'm not saying I believe it was the Chinese government, any more than I believe the Sony hack was the North Korean government.  Or, if it actually was, I'm not buying that the Norks were so torqued about a Seth Rogen film that they'd waste their resources hacking Sony.  I'm betting if it was them, they were looking for something else, perhaps something they could use to extort someone. 

I'm not paranoid.  Really.

Jeff [12:27 PM]

 

More Anthem, Lessons for IT Leaders: Encryption decisions (really, any protective decisions) can have much greater consequences tomorrow than you realize today.  And it is very important that you know the value of your information: not just the value to you, but the value to a hacker.  You may think the information you hold is mundane, but what you think doesn't really matter.  What matters is what the predator thinks. 

Jeff [12:21 PM]

[ Thursday, February 12, 2015 ]

 

More on the Anthem Hack, and "What It Means": I've written several posts on the Anthem hack, and I'm not the only one.  AHLA sent out an email to its HIT and Payers, Plans, and Managed Care practice groups explaining the hack and the class-action lawsuits already filed.  More news is out today, from experts who think 2015 will be "the year of the healthcare hack."  Maybe, maybe not, but the news does bring a few additional issues to mind:

First, as the AHLA email points out, the initial lawsuits and some of the initial reporting point to the lack of encryption as a big factor.  Some have indicated that the Anthem hack may cause HHS to harden the Security Rule's treatment of encryption (as you know, encryption is not a required element, only an addressable one, and HIPAA covered entities are free to forego encryption if they reasonably determine it's not right for them).  However, the hackers apparently got user credentials; even if the data had been encrypted, the hackers could have used the credentials to decrypt the data.  The fact that encryption would've been irrelevant probably won't stop those claiming encryption should become required, but it's worth considering.

Secondly, some of the reporting is highlighting the "monetization" issue, which I've always seen as the issue.  The hackers probably don't want the data because they're going to use the data; they want it so they can sell it to someone else who will use it for identity theft.  If that's the case, there is a multi-tier market, which could be good or bad: as the data changes hands, it's harder and harder to catch the initial culprit; on the other hand, if there are several steps between the point of theft and the point of use, there are several opportunities to put systems or safeguards in place to catch the actors and/or prevent the improper use.  In other words, you might not be able to stop the thief, but if you can stop the purchaser from using the stolen data, the criminal enterprise falls apart.  Something to consider.

Another issue I hadn't thought about previously: not only can the stolen medical identity be used to obtain needed healthcare services (an impostor uses the stolen identity to directly receive care), it could also be used to obtain unnecessary services.  I can think of two examples: a stolen identity could be used to obtain Oxycontin or other prescription drugs that could then be resold, or could be used to bill for services that are not actually provided.  In both cases healthcare providers would have to be part of the scam, either unwillingly (a convincing doctor-shopping patient gets painkiller prescriptions) or willingly (a doctor bills for services not provided), but that's not inconceivable.  My previous thoughts focused on the receipt of actual, needed services, in which case the value proposition is harder to see (you need an ultimate purchaser of the stolen identity who currently needs healthcare services); however, that needn't be the case, since you could also get prescription drugs to sell on the black market.  I hadn't considered that.

Finally, I had recently heard that while social security or credit card numbers don't bring much more than a couple of dollars each on the black market anymore, a stolen medical identity might be worth $50.  In today's news from Reuters, it seems that a stolen medical identity is now worth only $20.  These aren't hard and fast numbers, but still, that's a pretty big devaluation.  Maybe the supply of medical identities (and concomitantly, the amount of hacking) is growing so fast the price is dropping; maybe hacker buyers are determining that medical identities aren't all that valuable; or maybe there's really not that big a market of buyers out there after all.  I have no idea, but it's worth considering. 

Jeff [10:50 AM]

[ Wednesday, February 11, 2015 ]

 

Aspire Indiana Breach: Stolen laptops with unencrypted PHI result in a mental health provider sending HIPAA breach letters to 45,000 patients. 

Jeff [6:40 PM]

[ Tuesday, February 10, 2015 ]

 

More Anthem Fallout: Is Healthcare Particularly Vulnerable to Hacking?  There are a lot of people saying that; most of them stand to profit if you believe them (including me, in fact).  The Anthem breach gives an opportunity for a bunch of news articles on just this point.  Let's consider this for a moment.

Much hacking and phishing is aimed at access to quick-value money: credit card numbers that can be used right away (with the victim perhaps not knowing about the use until the bill comes, or perhaps not even noticing it when the bill comes), actual bank account or financial account data so current funds can be withdrawn, phony checks written, etc.  In this type of hacking, the reward comes quickly to the hacker, but might be small change and is usually not a long-term proposition.

Some hacking is designed to allow for real identity theft: the hacker acquires a social security number and other information, impersonates the individual to obtain credit cards, car loans, even house loans, runs up big debts, and when the credit card company or bank tries to collect, the impostor is gone with the loot and the victim is left to try to prove that it wasn't him that got/used the credit card, loan, etc.  The reward takes longer, but can be much bigger than snatching a credit card number.

With regard to both of these types of hacks, the victim, the bank or credit card company, and the vendor at which the stolen credit card is used are all incentivized to prevent the hack, since all of them stand to suffer substantial harm: the victim's credit might be ruined (or he might pay for something he didn't get), and the bank, the credit card company, or the later vendor might be left with the bill.

Health records sometimes contain credit card numbers, but often don't, making them not particularly useful for the first type of hack.  On the other hand, health records usually contain social security numbers and other demographic data that can be useful for the second type of hack.  Thus, medical records might be useful for traditional identity theft schemes.

The much bigger risk, and what medical records are particularly well suited for, is medical identity theft.  This type of hack targets patients with good insurance, and allows someone to impersonate the insured and receive the insured's health benefits.  The impostor gets free or reduced cost healthcare, but unlike most other hacks, the "victim" (the person whose data was stolen) doesn't necessarily suffer (or at least doesn't suffer immediately); in fact, the victim might benefit, since the impostor might actually pay a part of the victim's annual deductible.  Additionally, the person whose data was stolen is not in a very good position to know it was stolen, unless he regularly checks his EOBs (frankly, even if he scrupulously checks his EOBs, they can be hard enough to understand that the medical identity theft might not even be noticed).  Rather, the immediate victim is the insurer, who pays for care for someone who did not buy insurance.  And if the insurer discovers the identity theft, the care provider becomes the victim, since the insurer may try to recover the funds paid to the provider for the imposter's care.

Unlike a stolen credit card number, which can be used to purchase almost anything (including cash cards), a stolen medical identity is not as easy to immediately monetize.  However, the lower level of vigilance by the potential victim makes medical identity theft easier to pull off.

More importantly, however, the risks of medical identity theft far outweigh the risk of credit card theft or regular identity theft.  An impostor who receives care while posing as the insured will leave behind a medical record that might be relied upon by some future healthcare provider.  Perhaps the impostor is not allergic to penicillin, but the insured is; the impostor receives care at a hospital and the medical record says the patient may have penicillin.  When the real insured shows up, tragedy might occur.  Thus, while regular identity theft might cause financial ruin to its victims, medical identity theft can kill.

Does the Anthem hack indicate that an epidemic of medical identity theft is on its way?  Most criminals are looking for quick cash, and medical identity theft doesn't offer as quick a reward as access to a bank account or credit card number.  However, given that there is profit to be made in medical identity theft, and the risks are much greater, healthcare providers, insurers, and patients should all be on high alert for signs of it, and be prepared to quickly respond.

Jeff [11:12 AM]

 

Anthem Breach: Secondary Impacts on Employers.  One thing to think about when you hear of big insurers being subject to a data breach: in many cases, while the company usually does have a great many insured beneficiaries (either through direct insurance purchases or fully-insured employers), almost all cover a great many more beneficiaries as TPAs or otherwise.  For example, most Americans with private insurance are insured by employers who have self-funded insurance plans.  Those self-funded insurance plans then go and hire Anthem, United Healthcare, Blue Cross Blue Shield, Cigna, Aetna, or some other entity to administer those plans, and those third-party administrators (or TPAs) are usually insurance companies themselves; that makes sense, since if they can administer their own insurance products, they presumably know how to administer an employer's self-funded plan.

So, when an insurer like Anthem suffers a breach, many of the impacted individuals will be direct Anthem subscribers, but more will likely be beneficiaries of some employer who hired Anthem as a TPA of its self-insured plan.

Thus, in addition to pondering Anthem's fate, and what Anthem ought to do, it makes sense to also ponder what those self-insured plans and plan sponsors ought to do.  Interestingly, here's an employment law boutique with a blog post on just that.  Something for employer clients of Anthem to consider, for sure, and useful thoughts for all employers with either fully-insured or self-insured/TPA plans.  Additionally, it's worth it for employers to start thinking about what they would do if such a breach occurred with their own TPA.

Update: Here's another (shorter) blog post with an additional good point: check your BAAs to see who is responsible for notifications.  Of course, if you are (i) a HIPAA covered entity or (ii) a HIPAA business associate with any possible breach notification obligations, you should already have breach notification communication tools (set channels of communication, form letters, vendors chosen if not actually lined up, etc.) in place, ready to pick up and use.

Jeff [10:06 AM]

[ Thursday, February 05, 2015 ]

 

HIPAA for Paralegals Webinar: if you're a paralegal interested in how HIPAA works, why providers hesitate to give you medical records you've requested for litigation purposes, or how to get those covered entities to give you those records, you might want to check out this webinar I'm putting on next week.  You can get a 50% discount if you use priority code 15999 and discount code O7839374.

Jeff [2:22 PM]

 

Anthem Breach: By now you've heard of the latest huge data breach.  Just a reminder, PHI has value as a breach target on multiple fronts: direct theft of account numbers (particularly credit card numbers that can be used immediately), regular identity theft value (stealing SSNs to get credit cards or loans in the victim's name), medical identity theft (to pose as the victim and use up their insurance benefits), sensitivity/"hostage" value (to obtain information on particular individuals for extortive uses or to extort the covered entity), etc.  It appears that Anthem's credit card info was protected (probably in accordance with PCI standards), but the other PHI also has value.

Jeff [11:12 AM]

[ Tuesday, January 27, 2015 ]

 

Reporting Breaches of Fewer Than 500 Individuals: Don't forget that for "small" breaches (those involving fewer than 500 people), even though you need not report to OCR at the same time you report to the patient, you must still report to OCR within 60 days after the end of the calendar year.  We are about halfway through that reporting period, so don't forget to log those minor breaches.

In other words, if you sent a breach notification to anyone in 2014, and did not at the same time notify OCR, then you need to do so now.  You may have sighed with relief that you did not need to notify OCR (and the media) at the time, and your notification now will not lead to a "wall of shame" posting, but you must still notify OCR.

You can do so by going here and following the link to "Breaches Affecting Fewer Than 500 Individuals."

Jeff [12:50 PM]

[ Monday, January 26, 2015 ]

 

Not again: Indianapolis dentist throws un-shredded medical records in dumpster.

Jeff [11:44 AM]

 

Health Apps: How does HIPAA apply to health apps?  It seems pretty easy to me (follow the trail from the covered entity -- plan, provider, or clearinghouse -- to see if the app provider is a BA, subcontractor BA, etc.), but apparently there's some confusion in the industry, since OCR has indicated they will give guidance soon. 

Jeff [11:36 AM]

[ Sunday, January 25, 2015 ]

 

New Jersey Requires Encryption: Beginning August 1, Garden State insurers and healthcare providers must encrypt all PHI they collect or possess.  It's more restrictive than HIPAA (where encryption is not required but is an addressable standard), so it's not preempted.  This will raise issues for multi-state providers and insurers. 

UPDATE: the new New Jersey bill only applies to "health insurance carriers," not to providers.  The blog post I linked to implied that healthcare providers were also covered.  Not so.

Hat tip: Theresa Defino

Jeff [4:38 PM]

[ Tuesday, January 20, 2015 ]

 

Healthcare.gov: Who Has Access to What Data?  It seems a lot of vendors can access information of people who log onto healthcare.gov to look for Obamacare insurance or information.  I wonder if these vendors are subject to BAAs or other privacy and confidentiality restrictions.  I don't believe the website is a HIPAA covered entity. . . .

Jeff [1:56 PM]

 

Is HIPAA Enforcement Increasing?  Or is the delay in the roll-out of the 2nd phase of audits indicative of lackluster enforcement by OCR?  That's what some are saying.

Jeff [1:54 PM]

[ Monday, January 12, 2015 ]

 

Bleg: I'm looking for recommendations for HIPAA auditors, to review a covered entity for HIPAA compliance.  Let me know if you have any recommendations.

Jeff [3:16 PM]

 

Big Data: Do you have a lot of data and dreams of striking it rich in the Big Data future?  Here are some things to think about.

Jeff [2:06 PM]

 

Hospitals likely to be cyberattack victims in 2015.  Or later.

Jeff [1:06 PM]

[ Friday, January 09, 2015 ]

 

Bad Employee Does Not Always Equal Employer Liability: Kettering Health Network was sued by a woman whose estranged husband, while an employee of Kettering, wrongfully accessed her and her family members' medical records.  She brought a qui tam action, alleging the hospital wrongfully took Meaningful Use funds.  The court ruled against her, saying Kettering in fact did what it was supposed to under Meaningful Use: it installed controls and did the risk analysis.  The controls obviously weren't foolproof, because that fool husband got around them.  But it was actually the hospital that caught him, proving that the controls were at least of some use.

This is not exactly the opposite of the Indiana Walgreens case (there are tons of differentiating factors), but it does indicate that just because an employee goes rogue, the employer isn't necessarily liable.

Jeff [4:30 PM]

[ Monday, January 05, 2015 ]

 

Medical Reality TV Shows and Potential HIPAA Violations: Interesting (but obviously one-sided -- it is the New York Times) article on a family's lawsuit against a hospital for allowing seemingly de-identified PHI to be used in a reality TV show.  The patient's face was blurred, but the family was able to tell it was him.

Jeff [11:54 AM]

[ Tuesday, December 30, 2014 ]

 

Cool Data Breach Infographic.

Jeff [1:44 PM]

[ Tuesday, December 23, 2014 ]

 

2015: Year of the Hospital Hack?

Jeff [1:13 PM]

 

2014 OCR Enforcement Actions: Ed Zacharias runs down OCR's year of settlements.  I tend to agree that enforcement will continue to increase, and that having good, regularly updated policies and procedures (based on a valid and thorough risk assessment), and following them, might not completely insulate you, but will go a long way to keeping fines down.

Jeff [1:01 PM]

 

Boston Children's Hospital: a laptop stolen from a physician traveling in South America results in a $40,000 fine for the hospital.  (via BNA, subscription required).  This just settles the state law issues (Massachusetts' AG being one of the more aggressive ones), so HIPAA fines may loom on the horizon.  Guess what?  Encryption would've fixed the problem.  To be fair, the hospital did have an encryption policy, but the doctor didn't follow it.

Jeff [10:49 AM]

 

The 5 Biggest Healthcare Data Breaches of 2014.  Important takeaways: encrypt portable devices, protect and update your networks, and keep track of what your business associates are up to. 

Jeff [9:56 AM]

[ Thursday, December 18, 2014 ]

 

Illinois Hospital Blackmailed: at least it wasn't about a dumb movie.

Jeff [1:33 PM]

[ Wednesday, December 10, 2014 ]

 

Walgreens: Here's a good piece on the implications for employers.

Jeff [7:35 AM]

[ Tuesday, December 09, 2014 ]

 

Some Folks are Catching Up at Lexology: (i) on the Connecticut decision allowing a state cause of action to proceed using HIPAA as a guide (but acknowledging the lack of a private cause of action under HIPAA itself); and (ii) on the Indiana case holding Walgreens liable for an employed pharmacist's apparent improper access to PHI.  Of course, you read about them here first. . . .

Jeff [1:30 PM]

[ Monday, December 08, 2014 ]

 

$150,000 fine for Alaska Mental Health Agency's Failure to Protect ePHI: Malware on the computer system compromised data of 2,743 patients, but the bigger issue is the failure of the organization to keep its information systems up to date.  The malware apparently took advantage of security issues in the software for which patches had been issued, but the agency didn't keep track of patch management.  Basically, it's proof that adopting decent policies isn't nearly enough if you don't regularly make sure you've got reasonable risks covered.  The bulletin also pushes the HIT Security Rule Risk Assessment Tool: hint, hint, if you haven't reviewed this and compared your current security to what's in here, you're likely gonna get fined if there's a breach. 

Jeff [7:28 PM]

[ Friday, December 05, 2014 ]

 

80% of patients worry about health data security. 

Jeff [11:20 AM]

[ Wednesday, December 03, 2014 ]

 

Employee Snooping at Cleveland's University Hospitals.  It's being blamed on lax oversight; policies were good enough, but access auditing and other gatekeeper activities might have exposed the problem much earlier.  Trust but verify, a wise man once said. . . .

Jeff [2:41 PM]

 

How to Protect Patient Data: a cursory look.

Jeff [12:00 PM]

[ Tuesday, November 25, 2014 ]

 

Beth Israel Deaconess, BYOD angle: As previously noted here, someone stole a laptop from a physician at Beth Israel Deaconess hospital in Boston.  The laptop didn't belong to the hospital, but the hospital knew the doctor was using it for patient data, and (of course) it wasn't encrypted.  The hospital has settled the state-law breach issues (and the state AG HIPAA enforcement issue) with the Massachusetts state officials, for a $100,000 fine.  I assume there will be no OCR fine in this case, since HIPAA was specifically included in the settlement with the state AG. 

Jeff [10:46 AM]

[ Wednesday, November 19, 2014 ]

 

Shasta Update: Prime Healthcare Services' Shasta Regional Medical Center in California was fined by the State of California and OCR in a case involving an advocacy group trying to make the hospital look bad.  A patient disclosed her own medical information related to her stay at Shasta Regional Medical Center; when the press asked the hospital executives about the matter, they disclosed the patient's information in defending the hospital, which served as the basis for the state and federal fines.  Apparently the patient also sued, but lost; the court determined that the patient had implicitly waived her privacy rights by making the initial disclosure of her own information, and that therefore there was no improper disclosure of private information or any harm suffered by the patient. 

Hat tip: Theresa Defino

Jeff [2:16 PM]

 

Detroit: Hospital employees steal patient identities, file false tax returns.  HIPAA breach, but really just plain old identity theft.

Jeff [7:33 AM]

[ Tuesday, November 18, 2014 ]

 

Brigham & Women's Hospital Laptop and Phone Theft: approximately 1000 patients affected.  Can't really tell if the devices were encrypted, but don't know if that would matter if the robbers made the victim give up the codes.  As the commenter notes, I wonder why it took 2 months to report this -- hopefully that was at the request of the police.

Jeff [1:19 PM]

[ Monday, November 17, 2014 ]

 

Walgreens' $1.4 Million Verdict: an Indiana court has upheld a $1.4 million judgment against Walgreens.  A Walgreens-employed pharmacist accessed prescription records of her boyfriend's ex-girlfriend, and apparently disclosed the details to the boyfriend.  Presumably, the employee violated all sorts of rules, procedures, policies, and training, and I would assume Walgreens argued that she was acting outside the scope of her employment when she accessed the records.  But the court has held Walgreens liable, and the appellate court affirmed it, based on negligent supervision and retention, and invasion of privacy.

Jeff [1:13 PM]

[ Sunday, November 16, 2014 ]

 

Data Breach Response: Here's an interesting thumbsucker article on how to respond to a data breach.

Jeff [2:08 PM]

[ Friday, November 14, 2014 ]

 

$19,000 per victim?  That's the alleged cost per person of a HIPAA breach, although really it's the cost only when the breach victim actually becomes a victim of medical identity theft. 

Jeff [4:20 PM]

 

Are you a lawyer with covered entity clients?  Are you worried about HIPAA?  Want to know what your obligations are as a business associate?  You might want to check out this webinar next month.

Jeff [4:13 PM]

[ Thursday, November 13, 2014 ]

 

Ebola and HIPAA: I've heard lots of folks questioning how healthcare providers such as Texas Health Presbyterian Hospital here in Dallas were able to issue press releases and discuss the health condition and treatment of the Ebola patients they treated.  It's an interesting question: the providers can't talk about it unless the patients authorize them to do so, but they must also disclose data to governmental agencies when required by law to do so (whether those government agencies may then disclose the data depends on whether they are covered by HIPAA [usually not] or some other privacy law [usually they are]).

However, what normally happens in high-profile medical cases, whether they be "epidemic-disease-of-the-day" or some high-profile incident like a terrorist attack, is that the provider coordinates with the patients, asks them how much information they want disclosed (if any), and respects their wishes.

Here's a pretty good article on what Emory Healthcare did with their Ebola patients where the press was concerned.

Jeff [3:04 PM]

 

Oops: WellPoint email glitch puts colonoscopy test in the subject line.  WellPoint apparently believes this is not a breach, and they may be right (it may not exceed the "low probability of compromise" standard).

Jeff [2:45 PM]

[ Wednesday, November 12, 2014 ]

 

Encryption: This seems like a good place to implement data encryption.  It's not required, but sometimes it's just a really good idea.

Jeff [12:05 PM]

[ Tuesday, November 11, 2014 ]

 

HIPAA-compliant website issues: here's an interesting blog post on HIPAA issues encountered relative to a specialty pharmacy website hosting arrangement.

Jeff [7:27 AM]

[ Monday, November 10, 2014 ]

 

Ebola Reporting: Wondering how those hospitals are able to discuss the status and prognosis of their Ebola patients?  OCR has just recently published a Bulletin on "HIPAA Privacy in Emergency Situations" as a reminder to covered entities about the who/what/how/when of making these sorts of disclosures. 

Jeff [3:18 PM]

[ Wednesday, November 05, 2014 ]

 

HIPAA Private Cause of Action: Long-time HIPAAcrats know that there's no private cause of action for a HIPAA violation.  In other words, if your doctor violates HIPAA and discloses your PHI to the National Enquirer, you can't sue him for violating HIPAA.  Depending on where you live, you may be able to sue him for violating a similar state law, a state data breach law, a law requiring physicians to maintain confidentiality, or on common-law grounds such as invasion of privacy.  In such a suit, the doctor's failure to follow HIPAA would probably be pretty good evidence that he did not act reasonably, and would help your case.  But unless you had some statutory or common-law claim, you can't sue just for a violation of HIPAA.

A recent Connecticut case implies that you can sue for a HIPAA breach in that state.  Actually, a better description would be that "a violation of HIPAA regulations may constitute a violation of generally accepted standards of care."  In other words, you can sue for negligence based on a violation of HIPAA; you just can't sue based on the HIPAA violation alone.

Jeff [3:30 PM]

[ Wednesday, October 29, 2014 ]

 

It May Be a Dirty Little Secret, But It's Not Necessarily a HIPAA Violation: VentureBeat has figured out that a lot of healthcare providers text using unencrypted devices operating over regular cellular networks.  Yes, they do.  And yes, many of us strongly urge against them doing so.  But it's not necessarily a HIPAA violation to do so.  As I would've commented on the post itself if it didn't mean letting VentureBeat "manage my Google contacts":

To say "This is a clear violation of HIPAA" is fatuous and false. It's not very secure and not very smart; it could be a violation of an entity's policies and procedures; it could in some instances be a violation if it is absolutely and legally unreasonable to use such a communications device in such a fashion. But HIPAA is scalable and technologically neutral; encryption IS NOT A REQUIRED ELEMENT under HIPAA.

HIPAA covered entities should conduct risk analyses and do their best to secure their data as much as possible, including eliminating unsecure texting wherever possible. But just because it's a bad idea doesn't mean it's against the law (or, in this case, against the regulations).

Jeff [3:35 PM]
