In my last blog post, I talked about compliance with the European Union’s General Data Protection Regulation (GDPR), why U.S. businesses need to worry about GDPR, and some steps businesses can take to prepare for GDPR’s compliance deadline. The previous post covered the basics of GDPR. This post expands on one aspect of GDPR: its information security requirements. The press has covered GDPR’s privacy protections extensively, but GDPR contains requirements for data security as well.

What does GDPR require regarding data security? GDPR has a general statement about security. Article 32(1) says, “Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk.” The term “controllers” refers to businesses that collect personal data from European citizens and determine the purposes or means of processing the data. “Processors” process personal data on behalf of controllers; examples include third-party service providers and outsourcing vendors.

Unlike laws such as the security regulations under the U.S. federal Health Insurance Portability and Accountability Act (HIPAA) in the healthcare field, GDPR does not attempt to offer a complete list of security controls a controller or processor must implement. Instead, it provides a general statement about “appropriate” measures and refers to “technical” and “organisational” security measures. In this way, GDPR is similar to the HIPAA Security Rule’s requirements for “administrative” and “technical” safeguards. GDPR provides a number of examples of controls, but the list is not meant to be exhaustive.

The hottest data protection issue for major U.S. businesses this year is compliance with the European Union’s General Data Protection Regulation (GDPR). Even small and medium-sized businesses may need to comply with GDPR. This post covers frequently asked questions about GDPR.

What is GDPR? GDPR[1] is the European Union’s comprehensive data protection law that takes the place of 1995’s Data Protection Directive 95/46/EC.[2] By “data protection,” I am referring to both privacy and security. GDPR collects, clarifies, harmonizes, and expands data protection requirements throughout the European Economic Area (EEA). The European Economic Area consists of the 28 countries of the European Union plus Norway, Iceland, and Liechtenstein.

Why is GDPR such a concern for U.S. businesses? First, the fines for violating GDPR are potentially heavy. EU data protection authorities can fine businesses up to 20 million euros ($23.5 million) or 4 percent of their global revenues for violations, whichever is greater. Fines, moreover, will likely be based on the revenue of the global parent and any subsidiaries involved with the violations. Second, U.S. businesses find GDPR to be complex and unfamiliar. Questions arise concerning jurisdictional scope, defining the kinds of personal data covered, obtaining consents from individuals, maintaining an audit trail of consents, managing cross-border data flows, and handling new forms of individual rights given to EEA residents.

WUN Systems, LLC is an award-winning and fast-growing leader in the coworking and shared workspace sector. Located in Florida, it serves more than 80,000 members in just under 500 locations globally. WUN Systems’ intelligent workspace management platform delivers the software, hardware, and support services required to open new shared workspaces, improve existing workspaces, or monetize vacant or underutilized space, with the goal of increasing revenue, maximizing productivity, and building community for its members. KUBE by WUN empowers operators and members with greater accessibility for billing, booking spaces, managing opportunities, and communication. To learn more about WUN Systems, visit wunsystems.com.

Yardi Systems, Inc., based in Santa Barbara, California, develops and supports industry-leading investment and property management software for all types and sizes of real estate companies and serves clients worldwide. For more information, visit yardi.com.

“The synergy between the real estate market and the ‘coworking and shared workspace’ market makes this relationship a natural fit and strengthens both of our offerings,” said Dale Hersowitz, CEO of WUN Systems.

Healthcare providers, health plans, healthcare clearinghouses, and their business associates have an obligation under the Health Insurance Portability and Accountability Act (HIPAA) to protect the patient health information the law covers. Regulations issued by the Department of Health and Human Services (HHS) under HIPAA require HIPAA covered entities and their business associates to implement policies and procedures addressing the disposal of electronic protected health information (PHI) and the hardware or electronic media on which it is stored. As a result, secure data disposal is a key process for HIPAA covered entities and their business associates.

The covered entity or business associate must have policies and procedures to ensure PHI cannot be inadvertently disclosed during or after disposal or reuse of its storage media. After lost and stolen laptops and media, the second most common subject of enforcement by the HHS Office for Civil Rights (OCR) has been improper disposal of PHI. For example, South Shore Hospital near Boston faced an attorney general enforcement action after the hospital retained a data management company to dispose of computer tapes containing PHI, but the tapes were lost in transit. The hospital had failed to delete the PHI from the tapes before shipping them.[1] In another case, OCR required Affinity Health Plan to pay over $1.2 million after it returned photocopiers to a leasing company without first removing electronic PHI from them.[2]

Some of OCR’s enforcement activities concerned cases involving the improper disposal of paper PHI. The security of paper PHI falls under the Privacy Rule, rather than the Security Rule.[3] In one OCR case, workers left boxes of paper medical records on a retiring physician’s driveway while he was away.[4] In one attorney general enforcement action, the Massachusetts attorney general sued the former owners of a medical billing practice and four pathology groups after they improperly disposed of paper records containing sensitive PHI in a public dump, where a Boston Globe photographer found them. The former owners paid $140,000 to settle the claim.[5]

People sometimes file lawsuits without the assistance of an attorney at law.  People who represent themselves in a court case are called “pro se litigants.”  Complaints filed by pro se litigants are common, including in California and in Santa Clara County and San Mateo County in particular.  Imagine that your business received a complaint from a pro se litigant.  Should you ignore it because it did not come from an attorney?  You might think that a pro se case is not serious, but taking a complaint lightly could be a big mistake.

Judge Richard Posner served as a Circuit Judge of the United States Court of Appeals for the Seventh Circuit from 1981 until his recent retirement on September 1.  Posner’s self-published book, Reforming the Federal Judiciary: My Former Court Needs to Overhaul Its Staff Attorney Program and Begin Televising Its Oral Arguments, was released earlier this month.

In an interview with the New York Times, Posner opined that most judges regard pro se litigants as “not worth the time of a federal judge” and that judges in the 7th Circuit (in the Midwest) generally rubber-stamp the recommendations of staff lawyers who review pro se appeals.  In addition, Posner said he was prevented from giving pro se litigants a better chance by reviewing the staff attorney memos before they were circulated to judges.

SVLG Shareholder Stephen Wu will host a conference call program on the recent Equifax data breach on October 25, 2017 at 10 am Pacific/1 pm Eastern. While the Equifax breach is not the largest ever in terms of the total number of records affected, by some estimates it affected about half of the population of the United States. With a breach that large, legislators and regulators are considering what new policies may help to prevent future large-scale breaches.

For businesses that create, receive, maintain, or transmit personal data, the Equifax breach raises the question of what changes are necessary to keep up with evolving data security threats. According to news reports, the breach occurred because of a failure in patch management — a failure to implement a publicly available patch for a known security vulnerability for a period of months. Are there emerging threats that warrant changes in patch management practices? Or did the Equifax breach occur because the company failed to take care of basic patch management steps? We will explore these questions in this program.

The program will generally explore the technical and legal ramifications of the breach.  What are the prospects for liability? What compliance challenges does the breach highlight? Are there changes in documented practice and procedure that the breach would suggest?

If your business provides services to healthcare providers or health insurance companies, your business may have data privacy and security requirements under a federal law called “HIPAA” (the Health Insurance Portability and Accountability Act). If your business offers an online service or application, the first time you may have heard of HIPAA is when your potential customer asks you to sign a “business associate agreement.” Even if you don’t sign a business associate agreement, you may have compliance obligations under HIPAA. And if you fail to comply with HIPAA, you may face penalties and liabilities for violations.

Health records are among the most sensitive sets of information about us. The results of an unauthorized disclosure of health records could be devastating. Leakage of health records could lead to victims’ embarrassment, stigma, job loss, and even identity theft. Following concerns about the privacy and security of health records in the 1990s, the public began to demand protection to ensure that the healthcare industry would implement controls over what information was gathered from patients, how the information could be shared, and the secure management of that information. When Congress overhauled the healthcare laws and called for greater use of electronic transactions, Congress was aware of the need for protections over the privacy and security of health information.

The need for simplifying the administration of healthcare, coupled with a public concern over privacy and security, prompted Congress to include requirements for privacy and security in landmark healthcare legislation enacted in 1996. The 1996 legislation, called the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”),[1] has had a broad impact on the healthcare industry since its enactment, transforming practices for creating, storing, managing, transmitting, and disclosing health information in the United States. Later, Congress passed the Health Information Technology for Economic and Clinical Health Act, also called the “HITECH Act.”[2]

On September 7, 2017, Equifax Inc. announced a security breach involving the compromise of sensitive personal information of approximately 143 million U.S. consumers. Attackers compromised Social Security numbers, birth dates, addresses, driver’s license numbers, and credit card numbers. This is the kind of information that could help attackers engage in identity theft. This isn’t the largest breach ever in terms of the number of unique accounts; the Yahoo breach involved approximately 1.5 billion accounts. Nonetheless, because the number of affected individuals approaches half of the U.S. population and the compromised information could be used for identity theft, the Equifax breach is far more concerning than the Yahoo breach.

What caused the breach? The Apache Software Foundation reported on September 9 that attackers compromised Equifax’s systems by exploiting a vulnerability in the Apache Struts web framework. It appears that Equifax failed to implement an update that would have prevented the attack. Thus, WIRED magazine reported that the Equifax breach was entirely preventable.
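To make the patch-management point concrete, here is a minimal sketch of the kind of version check an inventory tool might run. The fixed version shown (Struts 2.3.32) reflects public reporting that the exploited flaw, CVE-2017-5638, was patched in that release; the inventory contents and function names are hypothetical, and a real program would pull versions and advisories from automated tooling rather than a hand-typed dictionary.

```python
# Minimal patch-level check (hypothetical inventory).

def parse_version(version):
    """Turn a dotted version string like '2.3.31' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def needs_patch(installed, minimum_patched):
    """Return True if the installed release predates the patched release."""
    return parse_version(installed) < parse_version(minimum_patched)

# CVE-2017-5638 was fixed in Apache Struts 2.3.32 (and in the 2.5 branch).
inventory = {"apache-struts": "2.3.31"}
for component, version in inventory.items():
    if needs_patch(version, "2.3.32"):
        print(f"{component} {version} is below the patched release")
```

A check this simple obviously does not replace a patch-management program, but running even a naive comparison on a schedule would have flagged the months-old gap the news reports describe.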

The steady stream of news about data breaches emphasizes the importance of rigorous enterprise security programs. The consequences for the breached company are enormous. Companies sued for data breaches are paying staggering amounts to investigate and settle the cases against them. For instance, The TJX Companies set aside $107 million to cover the litigation and regulatory actions against it. Heartland Payment Systems set aside $73.3 million for breach expenses in 2009. The loss of sales, reputation, profit, and ultimately shareholder value may bring a company to its knees. At the time of the Sony breach, for instance, the company’s entire information technology infrastructure was down in order to mitigate the effect of the attack against it. Workers were using personal devices to continue conducting company business.

On January 19, 2017, SVLG attorney Stephen Wu will present a program at the Global Artificial Intelligence Conference entitled “Product Liability Issues in AI Systems.”  The talk will focus on product liability risks to companies providing AI-based products and services.  It will cover the sources of legal risk to manufacturers, and how manufacturers can manage those risks.  Many in the industry consider liability to be a chief obstacle to the widespread deployment of AI systems.  Nonetheless, it is possible to implement design practices and procedures to minimize and manage legal risk.

In preparation for the conference, Steve Wu addressed some questions posed by the conference organizers.  Some of the conference’s questions and Steve Wu’s answers are below.

Q.  Where are we now today in terms of the state of artificial intelligence, and where do you think we’ll go over the next five years?

Healthtech devices are increasingly common. People are wearing sensor devices that monitor fitness metrics. They can count steps and distance walked or run, calories burned, elevation changes, and heart rate. In the future, people may swallow sensor devices that can monitor or transmit video of the digestive system, may have sensor devices in their bloodstream monitoring the level of a medication, or may ingest smart pills that detect diseases. Devices can also be implanted in or attached to a patient, such as a catheter for an insulin pump, a pacemaker, or microchips placed under the skin.

With all of these devices, various security vulnerabilities may be present. Hackers can exploit vulnerabilities to take control of them or otherwise tamper with them. Devices that communicate with systems outside the body entail the risk of interception or interruption. Moreover, once systems collect data from devices on or in the body, the systems are potential targets for attack.

To mitigate these risks, the device manufacturers should design their products with security features in mind. They should thoroughly test the device during the design phase to determine if vulnerabilities pose risks to users. They should have an independent third party test the device to check for vulnerabilities and seek any available security certifications for the device. Finally, the vendor hosting the applications and data needs to secure the data and the systems collecting the data. It should use transmission security procedures and technology to secure the communications with the device, encrypt the collected data, and manage access to the infrastructure supporting the devices.
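As one small illustration of the transmission-security advice above, a server collecting device data can refuse legacy protocol versions and unverified certificates. This is a minimal sketch using Python’s standard-library ssl module; the specific policy choices (TLS 1.2 as the floor) are illustrative assumptions, not requirements drawn from any particular standard.

```python
import ssl

# Illustrative TLS policy for device-to-server connections.
# TLS 1.2 as the minimum is an assumption; choose the floor your devices support.
context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse SSLv3 / TLS 1.0 / 1.1
context.check_hostname = True                     # bind the certificate to the host
context.verify_mode = ssl.CERT_REQUIRED           # reject unverified servers

# The context would then wrap the device's socket, e.g.:
#   with socket.create_connection((host, 443)) as sock:
#       with context.wrap_socket(sock, server_hostname=host) as tls:
#           ...  # send sensor readings over the encrypted channel
```

Centralizing these settings in one context object makes the security posture auditable: a reviewer can confirm the protocol floor and certificate checks in one place rather than hunting through every connection site.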
