Articles Posted in Information Security

Published on:

Shareholder Stephen Wu will be speaking at the American Bar Association Annual Meeting in San Francisco. On August 9 at 2 pm, he will present on a panel in a Presidential Showcase continuing legal education program entitled “Law Firm Cybersecurity Requirements You Never Dreamed Of: Emerging Threats, Ethical Obligations, and Survival Tactics.” The program is sponsored by the ABA Cybersecurity Legal Task Force, of which Stephen is a new member. If you are attending the ABA Annual Meeting, please join us. For event details, please click the link below:

https://drive.google.com/file/d/1Wgdu-R8FKk6Ri1mYCPc63IvWcmeQs54v/view?usp=sharing

Published on:

On May 23, 2019, shareholder Stephen Wu spoke with Marianne Kolbasuk McGee of Information Security Media Group about a HIPAA enforcement case.

The case, brought by the Office for Civil Rights of the Department of Health and Human Services, emphasized the importance of conducting a security program risk assessment to prevent security breaches.

To read the article with Stephen’s comments, click here.

Published on:

The Internet of Things connects machines to other machines in a wide variety of fields and industries. In our digital lives, we are connecting devices to our networks at work and at home. In addition to work and home, however, we spend much of our waking time in transit from one place to another, often in our private automobiles. The Internet of Things is extending our digital lives to our cars, trucks, and other road vehicles. With this new integration come privacy, security, and other legal issues.

A 2015 episode of the CBS television show “60 Minutes” vividly illustrates what can happen when we connect cars to information technology networks. In the segment, which aired on February 8, 2015, reporter Lesley Stahl sat behind the wheel of a nondescript dark gray sedan, driving through a tree-lined suburban parking lot. In the passenger seat next to her was Kathleen Fisher, a veteran of the Defense Advanced Research Projects Agency, or “DARPA” for short. As Stahl navigated one end of the cleared parking lot, two men stood at the other end – Karl Koscher, a University of Washington Ph.D. student, and Dan Kaufman, who was then Director of DARPA’s Information Innovation Office. Koscher worked at a laptop resting on what appeared to be black equipment cases, while Kaufman provided instructions.

Kaufman told Koscher, “You wanna hit the fluids?” Koscher typed something on the laptop, and suddenly washer fluid sprayed onto the windshield of Stahl’s car and the wipers started moving back and forth. Stahl said “I did nothing” to turn on the spray; Koscher had taken control of the wipers and the fluid remotely. In a cut-away scene, Stahl explained that the hackers had contacted the car’s emergency communications system, flooded it with sound data, and inserted a piece of code that reprogrammed the car’s software so the researchers could take complete remote control of the car. Further demonstrating this control, Koscher caused the horn to sound, again without Stahl’s knowledge or action.

Published on:

The ISO 27001 standard[1] is a specification for managing an information security program in an organization. The International Organization for Standardization (ISO) developed and maintains this standard. Worldwide, ISO 27001 has become the most popular standard for managing information security programs, and many organizations have received a certification that their information security management system complies with the standard.

When companies obtain an ISO 27001 audit, they usually envision working with auditors to complete the project using operational, management, information security, and internal audit teams within the organization. What they may find surprising is that the ISO 27001 framework covers a number of legal topics, for which the input of the legal team is vital as well. Some organizations may consult legal counsel about these topics, but I believe most try to address them without legal help or rely on “off the shelf” documentation offered as samples by their auditors.

Writing documentation without the help of legal counsel creates risk for the organization. In the event of a security breach, for instance, government or plaintiffs’ lawyers will ask for, and be entitled to examine, the “off the shelf” policies and procedures adopted by the company. If the policies and procedures were never properly implemented, these lawyers will point to that gap as evidence of a knowing failure to implement security practices properly. If the policies and procedures are inconsistent with actual security practice, the inconsistency will make the company look lax in its security practices.

Published on:

In my first blog post on GDPR, I talked about why some U.S. businesses have an obligation to comply with the European Union’s General Data Protection Regulation (GDPR). This post expands on the territorial scope of GDPR. Which U.S. businesses have to comply with GDPR and which don’t?

Starting with GDPR’s direct coverage, GDPR addresses businesses established in the EU. If a business has a division or office in the EU, then it has activities covered by GDPR. Also, if a business offers goods or services in the EU (even if free), the business must comply with GDPR. In addition, if a business is monitoring the behavior of EU residents, GDPR applies. Finally, if EU law applies because of public international law, then GDPR applies there as well. This last basis will not affect most U.S. businesses. These bases for GDPR’s coverage all appear in GDPR Article 3.

In addition to direct coverage, a U.S. business may have compliance obligations indirectly because it is processing data on behalf of a company that is covered by GDPR. The customer may be a controller collecting personal data from EU residents or may itself be a data processor receiving data from a controller covered by GDPR. In either case, a controller and any downstream processor have an obligation to protect personal data collected from EU residents. They will need to make sure, through an agreement or other mechanism, that any U.S. businesses providing data processing services meet standards of the kind imposed by GDPR. The two most common mechanisms are “standard contractual clauses” – essentially an addendum to a service agreement that imposes data protection requirements – and a U.S. business’s self-certification under the EU-U.S. Privacy Shield program of the U.S. Department of Commerce. A U.S. business providing services for a European controller or processor can import EU personal data to the U.S. if it provides assurances of an adequate level of protection under either of these two mechanisms. These U.S. businesses, then, have an indirect obligation to comply with GDPR standards.

Published on:

This is my third blog post on the European Union’s General Data Protection Regulation (GDPR). For basic information about GDPR and why U.S. businesses need to watch out for it, see my first blog post in the series. For what GDPR says about information security requirements, see my second post.

What is the first thing your business should do in taking steps towards GDPR compliance? The short answer is you should assess your current privacy and security program.

More specifically, you will need to understand your organization, its business context, its culture, and the types of personal data it processes. That means understanding in detail what kinds of personal data the business collects and receives, how it uses personal data, its practices in sharing and transmitting personal data, how long it retains personal data, and how it disposes of personal data. Your business should understand its entire personal data lifecycle and how personal data flow through its systems. You then need to assess the strengths and weaknesses of your current data protection program. Once you have made this assessment, you can plan future steps to enhance the data protection function within your business, assess their effectiveness, and make improvements.
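
One way to document the data lifecycle described above is a personal data inventory, or data map. The sketch below is purely illustrative (the field names and sample values are hypothetical, not a format GDPR prescribes); it shows how a simple record structure can tie each category of personal data to its source, purpose, systems, recipients, retention period, and disposal method:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PersonalDataRecord:
    """One hypothetical entry in a personal data inventory (data map)."""
    category: str            # what kind of personal data this is
    source: str              # where the business collects or receives it
    purpose: str             # why the business processes it
    systems: List[str]       # systems the data flows through or is stored in
    shared_with: List[str]   # third parties that receive the data
    retention_period: str    # how long the data is kept
    disposal_method: str     # how the data is deleted at end of life

# Illustrative example entry
crm_contacts = PersonalDataRecord(
    category="customer contact details",
    source="web signup form",
    purpose="order fulfillment and customer support",
    systems=["CRM", "email platform", "nightly backups"],
    shared_with=["payment processor"],
    retention_period="3 years after last purchase",
    disposal_method="deletion from CRM followed by backup expiration",
)
```

An inventory like this, kept current, becomes the foundation for the gap assessment and the improvement plan that follow.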

Published on:

In my last blog post, I talked about compliance with the European Union’s General Data Protection Regulation (GDPR), why U.S. businesses need to worry about GDPR, and some steps businesses can take to prepare for GDPR’s compliance deadline. That post covers the basics of GDPR. This post expands on one aspect of GDPR: information security requirements. The press has covered the privacy protections under GDPR extensively, but GDPR contains requirements for data security as well.

What does GDPR require regarding data security? GDPR has a general statement about security. Article 32(1) says, “Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk.” The term “controllers” refers to businesses that collect personal data from EU residents and determine the purposes and means of processing the data. “Processors” process personal data on behalf of controllers, such as third party service providers or outsourcing vendors.

Unlike laws such as the security regulations under the U.S. federal Health Insurance Portability and Accountability Act (HIPAA) in the healthcare field, GDPR does not attempt to offer a complete list of security controls a controller or processor would need to implement. Instead, it provides the general statement about “appropriate” measures and calls for both “technical” and “organizational” security measures. In this way, GDPR is similar to the HIPAA Security Rule’s requirements for “administrative” and “technical” safeguards. GDPR provides a number of examples of controls, such as pseudonymisation and encryption of personal data, but the list is not meant to be exclusive.
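
As a minimal sketch of one such technical measure (illustrative only, and not a statement of what GDPR requires in any particular case), a keyed hash can replace a direct identifier with a pseudonym, with the key held separately from the pseudonymized data set:

```python
import hmac
import hashlib

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed-hash (HMAC-SHA256) pseudonym.

    The secret key must be stored separately from the pseudonymized data;
    without the key, the pseudonym cannot easily be linked back to the person.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical usage; a real deployment would draw the key from a key
# management system rather than hard-coding it.
key = b"example-key-held-in-a-separate-system"
print(pseudonymize("jane.doe@example.com", key))
```

Whether a measure like this is “appropriate” still depends on the risk-based analysis that Article 32(1) describes.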

Published on:

The hottest data protection issue for major U.S. businesses this year is compliance with the European Union’s General Data Protection Regulation (GDPR). Even small and medium-sized businesses may need to comply with GDPR. This post covers frequently asked questions about GDPR.

What is GDPR? GDPR[1] is the European Union’s comprehensive data protection law that takes the place of 1995’s Data Protection Directive 95/46/EC.[2] By “data protection,” I am referring to both privacy and security. GDPR collects, clarifies, harmonizes, and expands data protection requirements throughout the European Economic Area (EEA). The European Economic Area consists of the 28 countries of the European Union plus Norway, Iceland, and Liechtenstein.

Why is GDPR such a concern for U.S. businesses? First, the fines for violating GDPR are potentially heavy. EU data protection authorities can fine businesses up to 20 million euros ($23.5 million) or 4 percent of their global revenues for violations, whichever is greater. Fines, moreover, will likely be based on the revenue of the global parent and any subsidiaries involved with the violations. Second, U.S. businesses find GDPR to be complex and unfamiliar. Questions arise concerning jurisdictional scope, defining the kinds of personal data covered, obtaining consents from individuals, maintaining an audit trail of consents, managing cross-border data flows, and handling new forms of individual rights given to EEA residents.

Published on:

Healthcare providers, health plans, health care clearinghouses, and their business associates have an obligation under the Health Insurance Portability and Accountability Act (HIPAA) to safeguard the patient health information the law protects. Regulations issued by the Department of Health and Human Services (HHS) under HIPAA require covered entities and their business associates to implement policies and procedures addressing the disposal of electronic protected health information (PHI) and the hardware or electronic media on which it is stored. As a result, secure data disposal is a key process for HIPAA covered entities and their business associates.

The covered entity or business associate must have policies and procedures to ensure PHI cannot be inadvertently disclosed during or after disposal or reuse of its storage media. After lost and stolen laptops and media, the second most common subject of enforcement by the HHS Office for Civil Rights (OCR) has been improper disposal of PHI. For example, South Shore Hospital near Boston faced an attorney general enforcement action after it retained a data management company to dispose of computer tapes containing PHI, but the tapes were lost in transit. The hospital had failed to delete the PHI from the tapes before shipping them.[1] In another case, OCR required Affinity Health Plan to pay over $1.2 million after it returned photocopiers to a leasing company without first removing electronic PHI from them.[2]
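
Sanitization guidance such as NIST Special Publication 800-88, Guidelines for Media Sanitization, describes clearing, purging, and destroying media before disposal or reuse. As a rough, illustrative sketch only, assuming traditional magnetic disk storage (overwriting is not a reliable sanitization method for SSDs, copy-on-write filesystems, or data that also lives in backups), a disposal script might overwrite a file’s contents before deleting it:

```python
import os
import secrets

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file with random bytes, then delete it.

    Illustrative only: this does not sanitize SSDs or wear-leveled media,
    and real disposal programs should follow recognized sanitization
    guidance and verify the results.
    """
    size = os.path.getsize(path)
    chunk_size = 1024 * 1024  # overwrite in 1 MB chunks
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            remaining = size
            while remaining > 0:
                n = min(chunk_size, remaining)
                f.write(secrets.token_bytes(n))
                remaining -= n
            f.flush()
            os.fsync(f.fileno())
    os.remove(path)

# Hypothetical usage on an exported file containing PHI:
# overwrite_and_delete("/tmp/phi_export.csv")
```

The enforcement examples above show why policies cannot stop at deletion of live copies: media handed to vendors, shipped offsite, or returned at lease end must be sanitized as well.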

Some of the OCR enforcement activities concerned the improper disposal of paper PHI. The security of paper PHI falls under the Privacy Rule, rather than the Security Rule.[3] In one OCR case, workers left boxes of paper medical records on a retiring physician’s driveway while he was away.[4] In another enforcement action, the Massachusetts attorney general sued the former owners of a medical billing practice and four pathology groups after they improperly disposed of paper records containing sensitive PHI at a public dump, where a Boston Globe photographer found them. The former owners paid $140,000 to settle the claim.[5]

Published on:

SVLG Shareholder Stephen Wu will host a conference call program on the recent Equifax data breach on October 25, 2017, at 10 am Pacific/1 pm Eastern. While the Equifax breach is not the largest ever in terms of the total number of records affected, by some estimates it affected about half of the population of the United States. With a breach that large, legislators and regulators are considering what new policies may help prevent future large-scale breaches.

For businesses that create, receive, maintain, and transmit personal data, the Equifax breach raises the question of what changes are necessary to keep up with evolving data security threats. According to news reports, the breach occurred because of a failure in patch management – a failure to apply a publicly available patch for a known security vulnerability for a period of months. Are there emerging threats that warrant changes in patch management practices? Or did the Equifax breach occur because the company failed to take care of basic patch management steps? We will explore these questions in this program.
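
To make the patch management point concrete, here is a small, hypothetical sketch of the kind of automated check a patch management program might run: it compares installed software component versions against a locally maintained list of minimum patched versions. The package names and version numbers below are placeholders, not actual advisories.

```python
# Requires the third-party "packaging" library for version comparison.
from importlib.metadata import version, PackageNotFoundError
from packaging.version import Version

# Hypothetical minimum versions known to contain security fixes.
MINIMUM_PATCHED = {
    "requests": "2.31.0",
    "urllib3": "2.0.7",
}

def find_unpatched(minimums: dict) -> list:
    """Return findings for installed components below their patched version."""
    findings = []
    for name, minimum in minimums.items():
        try:
            installed = version(name)
        except PackageNotFoundError:
            continue  # component not installed in this environment
        if Version(installed) < Version(minimum):
            findings.append(f"{name} {installed} is below patched version {minimum}")
    return findings

if __name__ == "__main__":
    for finding in find_unpatched(MINIMUM_PATCHED):
        print(finding)
```

A real program would pull advisories from an authoritative feed and cover operating system and application components, not just libraries, and would feed its findings into a tracked remediation process.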

The program will generally explore the technical and legal ramifications of the breach.  What are the prospects for liability? What compliance challenges does the breach highlight? Are there changes in documented practice and procedure that the breach would suggest?