HIPAA’s Right of Access Is No Joke, and the Holidays Are No Excuse for Noncompliance

On November 30, 2021, the Office for Civil Rights (“OCR”) at the United States Department of Health and Human Services (“HHS”) announced the resolution of five investigations in its Health Insurance Portability and Accountability Act (“HIPAA”) Right of Access Initiative. This brings the total number of this type of enforcement action to 25 since the initiative began. OCR originally launched this initiative in an effort to support individuals’ right to timely access their health records at a reasonable cost under the HIPAA Privacy Rule.

HIPAA grants individuals the right to see and obtain copies of their health information from their healthcare providers and health plans. Once a HIPAA-regulated entity receives a request, it has 30 days to provide the individual or their representative with the requested records. If an entity needs more time to comply, it may obtain a single 30-day extension by providing written notice to the individual who made the request, stating the reasons for the delay and the date by which the entity expects to complete action on the request.
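The response clock described above is straightforward calendar math. As a rough illustration only (the function name and date handling are our own, not part of any HIPAA tooling), the two deadlines can be sketched as:

```python
from datetime import date, timedelta

def access_request_deadlines(received: date) -> dict:
    """Sketch of the HIPAA Right of Access timeline: a covered entity has
    30 calendar days to respond, extendable once by 30 more days with
    written notice to the requester stating the reason and expected date."""
    initial = received + timedelta(days=30)
    extended = initial + timedelta(days=30)  # only one extension is available
    return {"initial_deadline": initial, "extended_deadline": extended}

# Using the date of the first request in the Monte Nido matter:
deadlines = access_request_deadlines(date(2019, 10, 1))
print(deadlines["initial_deadline"])   # 2019-10-31
print(deadlines["extended_deadline"])  # 2019-11-30
```

Measured against either date, the May 22, 2020 response in the Monte Nido matter was months late.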

Newly appointed OCR Director Lisa J. Pino has said that timely access to health records is a right under the law and a powerful tool for people to stay healthy and protect their privacy as patients. She added that OCR will continue its enforcement actions to hold covered entities responsible for their HIPAA compliance and to pursue civil monetary penalties for violations that go unaddressed.

For example, OCR has taken enforcement actions that underscore the importance of compliance with the HIPAA Right of Access. One was brought against Dr. Robert Glaser, a cardiovascular disease and internal medicine doctor in New Hyde Park, New York, who allegedly did not cooperate with OCR’s investigation or respond to OCR’s data requests after a hearing, and who did not contest the findings of OCR’s Notice of Proposed Determination. OCR closed that matter by issuing a civil monetary penalty of $100,000. In another, Rainrock Treatment Center, LLC, doing business as Monte Nido Rainrock (“Monte Nido”), a licensed provider of residential eating disorder treatment services in Eugene, Oregon, took corrective actions including one year of monitoring and a $160,000 settlement payment to HHS for alleged violations of the HIPAA Privacy Rule’s Right of Access. In the Monte Nido action, the patient requested records twice, on October 1, 2019 and again on November 21, 2019. Monte Nido complied with the request for access, but not until May 22, 2020, more than six months after the initial request. Even so, OCR moved to enforce.

These are just two examples of enforcement actions taken by OCR for violations of the HIPAA Privacy Rule’s Right of Access. To avoid an investigation and potential enforcement action like those noted above, first determine whether you are subject to HIPAA’s Privacy Rule as a covered entity, and if so, handle any requests for access to health information with the requisite speed and attention to avoid costly and time-consuming regulatory enforcement.

Krishna A. Jani, CIPP/US, is a member of Flaster Greenberg’s Litigation Department focusing her practice on complex commercial litigation. She is also a member of the firm’s cybersecurity and data privacy law practice groups. She can be reached at 215.279.9907 or krishna.jani@flastergreenberg.com.

Defending Patient Breaches for Hospitals – PODCAST

On this episode of Darshan Talks, we discussed health literacy with guest Krishna Jani.

Krishna Jani, Cybersecurity & Data Privacy Attorney at Flaster Greenberg PC, spoke about the relationship between data privacy, life sciences, and health issues in the legal domain. She highlighted that healthcare services should ensure that they do not compromise patients’ digital privacy in any way. What startups do with patient data matters for the legal liability or implications placed on them. For example, if they sell data for a profit, they might fall under California’s CCPA or the new CPRA.

Many hospitals are getting hacked despite hiring IT teams to prevent these incidents. This is often because they outsource the IT work to another company, have an outdated privacy policy, and do not discuss cybersecurity in board meetings. Beyond the breach of patients’ privacy, there is also a strong possibility that the hospital will be sued. Hospitals therefore need to hold themselves accountable, focus on data privacy, and keep themselves up to date with the latest digital security compliance requirements.

She cited a study from the 1980s showing that people could be re-identified from as few as one or two data points; with the technology available now, it is even harder to remain completely anonymous. It is therefore advisable to delete unnecessary patient data from systems used for clinical trials or for research and development purposes. A defense would arise only if hospital management made some substantial effort or exercised a standard of care to curb these cybersecurity attacks, even as attacks have become increasingly sophisticated over the years. She concluded with an interesting point: healthcare data is three times more valuable than financial data.

The Uniform Personal Data Protection Act Is Here

In July 2021, the Uniform Law Commission (“ULC”) voted to approve the Uniform Personal Data Protection Act (“UPDPA”). The UPDPA is a model data privacy bill designed to provide a template for states to introduce to their own legislatures, and ultimately, adopt as binding law.

The UPDPA

The UPDPA would govern how business entities collect, control, and process the personal and sensitive personal data of individuals. This model bill has been in the works since 2019 and includes the input of advisors, observers, the Future of Privacy Forum, and other stakeholders. This is significant because the ULC has set forth other model laws, such as the Uniform Commercial Code, which have largely been adopted across the states.

Interestingly, the model bill is much narrower than some of the recent state privacy laws that have been passed, such as the California Privacy Rights Act and Virginia’s Consumer Data Protection Act. Namely, the model bill would provide individuals with fewer, and more limited, rights including the right to copy and correct personal data. The bill does not include the right of individuals to delete their data or the right to request the transmission of their personal data to another entity. The bill also does not provide for a private cause of action under the UPDPA itself, but would not affect a given state’s preexisting consumer protection law if that law authorizes a private right of action. If passed, the law would, consequently, be enforced by a state’s Attorney General.

Applicability

The UPDPA would apply to the activities of a controller or processor that conducts business in the state or produces products or provides services purposefully directed to residents of this state and:

(1) during a calendar year maintains personal data about more than [50,000] data subjects who are residents of this state, excluding data subjects whose data is collected or maintained solely to complete a payment transaction;

(2) earns more than [50] percent of its gross annual revenue during a calendar year from maintaining personal data from data subjects as a controller or processor;

(3) is a processor acting on behalf of a controller the processor knows or has reason to know satisfies paragraph (1) or (2); or

(4) maintains personal data, unless it processes the personal data solely using compatible data practices.
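Read together, these four paragraphs function as a threshold test. As a loose illustration only (the function and parameter names are hypothetical, and the bracketed figures are placeholders an adopting state could change), the applicability test might be sketched as:

```python
def updpa_applies(
    in_state_business: bool,          # conducts business in, or targets residents of, the state
    resident_subjects: int,           # data subjects maintained, excluding payment-only data
    revenue_share_from_data: float,   # fraction of gross annual revenue from maintaining personal data
    processor_for_covered: bool,      # processes for a controller known to meet (1) or (2)
    maintains_personal_data: bool,
    only_compatible_practices: bool,
) -> bool:
    """Hypothetical sketch of the UPDPA applicability criteria. The 50,000
    and 50 percent figures are the model act's bracketed placeholders."""
    if not in_state_business:
        return False
    return (
        resident_subjects > 50_000                                       # paragraph (1)
        or revenue_share_from_data > 0.50                                # paragraph (2)
        or processor_for_covered                                         # paragraph (3)
        or (maintains_personal_data and not only_compatible_practices)   # paragraph (4)
    )
```

Under this sketch, for example, an in-state retailer maintaining data on 60,000 residents would be covered by paragraph (1) regardless of how it uses the data.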

The UPDPA defines “personal data” as a record that identifies or describes a data subject by a direct identifier or is pseudonymized data. The term does not include deidentified data. The bill also defines “sensitive data” as a category of data separate and apart from mere “personal data.” “Sensitive data” includes such information as geolocation in real time, diagnosis or treatment for a disease or health condition, and genetic sequencing information, among other categories of data.

The law would not apply to state agencies or political subdivisions of the state, or to publicly available information. There are other carve-outs, as well.

Notably, the model bill also contains several different levels of “data practices,” broken down into three subcategories: (1) a compatible data practice; (2) an incompatible data practice; and (3) a prohibited data practice. Each subcategory of data practice comes with a specific mandate about the level of consent required—or not required—to process certain data. For example, a controller or processor may engage in a compatible data practice without the data subject’s consent with the expectation that a compatible data practice is consistent with the “ordinary expectations of data subjects or is likely to benefit data subjects substantially.” Section 7 of the model bill goes on to list a series of factors that apply to determine whether processing is a compatible data practice, and consists of such considerations as the data subject’s relationship to the controller and the extent to which the practice advances the economic, health, or other interests of the data subject. An incompatible data practice, by contrast, allows data subjects to withhold consent to the practice (an “opt-out” right) for personal data and cannot be used to process sensitive data without affirmative express consent in a signed record for each practice (an “opt-in” right). Lastly, a prohibited data practice is one in which a controller may not engage. Data practices that are likely to subject the data subject to specific and significant financial, physical, or reputational harm, for instance, are considered “prohibited data practices.”
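The three consent tiers can be summarized as a small lookup. This is a simplified sketch of our own (the enum and function names are hypothetical, and it omits the Section 7 factor analysis that decides which tier a given practice falls into):

```python
from enum import Enum

class DataPractice(Enum):
    COMPATIBLE = "compatible"
    INCOMPATIBLE = "incompatible"
    PROHIBITED = "prohibited"

def consent_required(practice: DataPractice, sensitive_data: bool) -> str:
    """Consent tier attached to each category of data practice under the
    UPDPA model bill: none for compatible practices, opt-out or opt-in for
    incompatible practices, and an outright bar for prohibited practices."""
    if practice is DataPractice.PROHIBITED:
        return "not permitted"
    if practice is DataPractice.COMPATIBLE:
        return "no consent required"
    # Incompatible practice: opt-out right for personal data; affirmative
    # express (opt-in) consent in a signed record for sensitive data.
    return "opt-in consent required" if sensitive_data else "opt-out right"
```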

The model bill has built in a balancing test meant to gauge the amount of benefit or harm conferred upon a data subject by a controller’s given data practice, and then limits that practice accordingly.

What’s Next

After final amendments, the UPDPA will be ready for introduction to state legislatures by January 2022. This means that versions of this bill can be, and likely will be, adopted by several states over the next couple of years, perhaps eventually leading to some degree of uniformity among the states’ privacy laws.

Krishna A. Jani, CIPP/US, is a member of Flaster Greenberg’s Litigation Department focusing her practice on complex commercial litigation. She is also a member of the firm’s cybersecurity and data privacy law practice groups. She can be reached at 215.279.9907 or krishna.jani@flastergreenberg.com.

Cybersecurity & Data Privacy Updates

There is a lot going on in the world right now—and the world of data privacy is no exception.

Here is a snapshot of what’s on our radar:

Senators Jeff Merkley and Bernie Sanders introduced the National Biometric Information Privacy Act of 2020 on Tuesday, August 4, 2020.

This legislation would, among other things, prohibit private companies from collecting biometric data—including eye scans, voiceprints, faceprints, and fingerprints—without consumers’ and employees’ consent, or profiting from this data. This introduction comes amid growing concerns over the prevalence of biometric data collection among private companies, including the use of facial recognition technology.

This legislation limits the ability of companies to collect, buy, sell, lease, trade, or retain individuals’ biometric information without specific written consent, and requires private companies to disclose to any inquiring individual the information the company has collected about that individual. Importantly, this bill would allow individuals and State Attorneys General to bring lawsuits against companies that fail to comply.

Several United States Senators have urged Congress to include the privacy protections contained in the Public Health Emergency Privacy Act in any new stimulus package.

On July 28, 2020, several U.S. senators drafted a letter addressed to senate leaders urging them to include the privacy protections contained in the Public Health Emergency Privacy Act in any forthcoming stimulus package.

The senators emphasized the need for commonsense privacy protections for COVID data because “public trust in COVID screening tools will be essential to ensuring meaningful participation in such efforts.” Research shows that many Americans are hesitant to adopt COVID screening and tracing apps due to privacy concerns; therefore, the lack of health privacy protections could significantly undermine efforts to contain this virus and safely reopen—“particularly with many screening tools requiring a critical mass in order to provide meaningful benefits.”

As the drafters point out, “health data is among the most sensitive data imaginable and even before this health emergency, there has been increasing bipartisan concern with gaps in our nation’s privacy laws.” The drafters believe these common-sense protections are critical in quelling the spread of COVID-19 while at the same time protecting sensitive health and geolocation information.

We will continue to track this legislation and provide updates as they become available.

Schrems II invalidated the EU-US Privacy Shield.

On July 16, 2020, the Court of Justice of the European Union issued a decision in Data Protection Commissioner v. Facebook Ireland Ltd. and Maximillian Schrems. The decision, known as Schrems II, invalidated the European Commission’s adequacy decision for the European Union-United States (EU-US) Privacy Shield framework, which is critical for more than 5,000 United States-based companies that conduct trans-Atlantic trade in compliance with EU data protection rules.

The Court found the European Commission’s adequacy determination for the Privacy Shield invalid for two primary reasons: (i) the US surveillance programs, which the commission addressed in its previously-issued Privacy Shield decision, are not limited to what is strictly necessary and proportional as required by EU law; and (ii) with regard to US surveillance, EU data subjects lack actionable judicial redress and, therefore, do not have a right to an effective remedy in the US, as required by the EU Charter.

The Schrems II decision requires both data importers and data exporters to be reasonably certain that they can comply with their obligations in the Standard Contractual Clauses. Where they cannot comply, importers and exporters should likely stop transferring data, forcing some companies into data localization. Schrems II addresses a long-running series of issues regarding the appropriate role of surveillance in our society and its inevitable clash with privacy.

This decision also influences data flows across nations. Some data privacy professionals believe that we are moving away from global data flows and moving towards more fragmented data flows. This shift could have a particularly significant impact on e-commerce. For more, see the Court of Justice of the European Union’s Press Release on this decision.

The attorneys at Flaster Greenberg are following developments related to the COVID-19 pandemic and have formed a response team to work with businesses and keep them up to date on developments that impact them. If you have any questions about the information contained in this blog post, please feel free to reach out to Donna Urban, Krishna Jani, or any member of Flaster Greenberg’s Telecommunications or Privacy & Data Security Groups.

COVID-19 RESOURCE PAGE

To serve as a central repository of information and contributions from Flaster Greenberg attorneys on legal developments during the COVID-19 crisis, we have launched a COVID-19 Resource page on our website. Feel free to check back frequently for Flaster Greenberg’s ongoing analyses of important legal updates that may affect you or your business.

More Tips On Protecting Your Virtual Meetings to Avoid a Cybersecurity Breach: An Update

At this point, many of us are well into our fourth or fifth week of quarantine due to the outbreak of COVID-19. Even for those of us who are fortunate enough to be able to work remotely from our homes, this comes with certain challenges, including potential security issues with virtual conferencing. In our first installment about virtual meetings, and their unintended vulnerabilities, we provided some guidance on how you and your staff might implement certain strategies to keep your virtual conferences as safe as possible from hackers and trolls. In this new installment, we will provide further guidance on staying safe amidst emerging privacy and security concerns associated with virtual meeting platforms.

Zoom Announces Updates to its Data Privacy and Security Measures

On April 1, 2020, the Chief Executive Officer of Zoom, Eric Yuan, announced certain changes that Zoom is making to enhance the security of its virtual meeting spaces. On April 14th, the Chief Product Officer of Zoom, Oded Gal, provided clarification on those enhancements for those of us who are using Zoom during quarantine.

  • Have a plan and be prepared for interference in your virtual meetings. Zoom has encouraged its users to have a plan in place for their virtual meetings and to be prepared should any unwanted interference arise. This includes ensuring that the application has been updated to include the latest security features, co-hosting meetings whenever possible, and utilizing preexisting and new security tools built into the application. To check for updates to the app, click on the main menu, then click on “Check for Updates,” and then “Begin Upgrade” if any new updates are available. We recommend doing this every week or so to ensure that you and your staff are up to speed on all available cybersecurity protections.
  • Co-host and record your virtual meetings whenever possible. A meeting creator can designate a co-host when creating the meeting invitation or within the Zoom meeting itself. A co-host can monitor the virtual waiting room and assist with any disruptions. Furthermore, record your Zoom meetings whenever possible: a recording creates a forensic trail of the meeting, and of any bad actors who interfere with it, from the moment the meeting begins. The more data virtual meeting platforms can collect about bad actors, the better they are able to stop the threat of further disruption.
  • Zoom has increased access to its security features. Zoom has made its pre-existing security features easier to find. A “Security” button has been added to the bottom banner of virtual meetings and is now easily accessible to meeting hosts. By clicking on this new security feature, meeting hosts are able to enable a waiting room or lock the meeting. Moreover, a meeting host can also remove a participant from a virtual meeting. Once that participant has been removed, he or she cannot reenter the meeting, even if using a different username. This is because as a part of Zoom’s new security rollouts, Zoom has started to collect IP addresses, among other data, to be able to better respond to security threats. While removing a participant from a meeting will only remove the participant from that particular meeting, you have other tools available to permanently block that user.

For example, right now Zoom recommends recording your meetings whenever practicable to ensure a forensic trail is created, as stated above. In addition, Zoom recommends taking a screenshot whenever a bad actor enters your virtual meeting. Then, you can report this intruder on Zoom’s website. And starting this coming weekend, Zoom will be releasing a new security feature built into the app, which will allow users to send a report to Zoom right from the security button should any unwanted interference arise.

Other Noteworthy Developments

Zoom announced that as of April 1, 2020, it would freeze all future product development except for data privacy and security updates for the following 90 days. Moreover, beginning April 18, 2020, every paid Zoom customer will be able to customize which data center regions their account can use for its real-time meeting traffic. By default, however, there will be no connection to any data centers in China beginning April 18, 2020 for all users. Additionally, users with an “.edu” registered email address are automatically given the highest level of security in their meetings, and this will continue. Zoom has begun to address user demands for a “kid-friendly” interface, but it has not yet launched any such interface.

Other virtual meeting platforms, such as GoToMeeting, have also enacted enhanced security protections in their respective applications. For example, GoToMeeting gathers cyber threat intelligence through partnerships with external intelligence communities, personal and professional sharing groups, and its own internal research to collect Indicators of Compromise (IoC) data. IoCs can include forensic data such as IP addresses, domains, and hashes, which GoToMeeting pulls into its threat intelligence platform to reduce the risk of cyber threats.

Still, platforms like Zoom and GoToMeeting urge users to take the additional security measures outlined in our previous blog post, and above, to provide the greatest level of privacy and data security for your virtual meetings.

Updates on Regulatory Guidance

On April 8th, Senator Edward Markey, whose priorities include telecommunications, technology, and privacy policy, urged the Federal Trade Commission (FTC) to publish industry cybersecurity guidelines “for companies that provide online conferencing services, as well as best practices for users that will help protect online safety and privacy during this pandemic and beyond.”

In Senator Markey’s letter, he urges that the guidance cover, at a minimum, the following topics:

  • Implementing secure authentication and other safeguards against unauthorized access;
  • Enacting limits on data collection and recording;
  • Employing encryption and other security protocols for securing data; and
  • Providing clear and conspicuous privacy policies for users.

Senator Markey also requests that the FTC develop best practices for online conferencing users, so that they can make informed, safe decisions when choosing and using these platforms. He requests that these best practices cover at least the following topics:

  • Identifying and preventing cyber threats such as phishing and malware;
  • Sharing links to online meetings without compromising security;
  • Restricting access to meetings via software settings; and
  • Recognizing that different versions of a company’s service may provide varying levels of privacy protection.

To date, the FTC has not published new guidelines.

Remember to have a plan and be prepared. Stay safe, everyone!

If you have any questions, please feel free to reach out to Donna Urban, Krishna Jani, or any member of Flaster Greenberg’s Telecommunications or Privacy & Data Security Groups.

Donna T. Urban is a member of Flaster Greenberg’s Commercial Litigation and Environmental Law Departments concentrating her practice in telecommunications law, environmental regulation and litigation, and privacy and data security. She is a seasoned litigator, and for more than 20 years has successfully represented business clients in contract disputes, regulatory matters, and complex negotiations. She can be reached at donna.urban@flastergreenberg.com or 856.661.2285.

Krishna A. Jani is a member of Flaster Greenberg’s Litigation Department focusing her practice on complex commercial litigation. She is also a member of the firm’s cybersecurity and data privacy law practice groups. She can be reached at 215.279.9907 or krishna.jani@flastergreenberg.com.
