Welcome to the second issue of Decoded for 2023.
Congratulations to Alex Turner for his CIPP/US Certification. This is the first certification ever developed by the International Association of Privacy Professionals. As the global gold standard for privacy professionals, the CIPP/US certification is a key industry benchmark among top employers. Backed by ANSI/ISO accreditation, this certification shows the recipient has a strong understanding of U.S. privacy laws and regulations. This is a rigorous process and we appreciate that Alex devoted the time and energy to achieving this certification.
We have a couple of meetings that our legal professional readers may find of interest.
First, we are sponsoring the West Virginia State Bar 2023 Annual Meeting being held March 26-27, 2023. The Annual Meeting is the ideal venue to reconnect with lawyers and friends – and to build new relationships. Click here to learn more and register.
Second, for those of you interested in labor and employment law, we are pleased to sponsor the ABA's 2023 Employment Rights and Responsibilities Committee Midwinter Meeting being held March 13-17. The ERR Midwinter Meeting will offer a wide range of thought-provoking and topical educational programming. The subcommittees have joined forces to offer a mock trial, several years in the making, that combines theory and practice in a multi-panel presentation. Presented through four separate panels, the mock trial will explore investigations, whistleblower and retaliation claims, and collateral torts. Other panels will discuss the Supreme Court’s recent decision in Dobbs v. Jackson Women’s Health Organization and its effect on labor and employment matters, emergency ethics matters, the future of employment arbitration agreements, and new developments in international labor and employment law. Click here to learn more and register.
We hope you enjoy this issue and, as always, thank you for reading.
Nicholas P. Mooney II, Co-Editor of Decoded, Chair of Spilman's Technology Practice Group, and Co-Chair of the Cybersecurity & Data Protection Practice Group
and
Alexander L. Turner, Co-Editor of Decoded and Co-Chair of the Cybersecurity & Data Protection Practice Group
“The case involves Ohio-based fast-food company White Castle.”
Why this is important: Illinois has the strictest biometric privacy law in the country, the Biometric Information Privacy Act (“BIPA”). The BIPA requires employers who collect employees’ biometric data to follow a number of protocols. These protocols include (1) maintaining a written policy about the collection and storage of employee biometric data, (2) providing employees with written notice of that policy, and (3) obtaining informed consent from employees to collect biometric data. The BIPA also provides a private right of action for individuals harmed by violations of the BIPA, with statutory damages of up to $1,000 for each negligent violation and up to $5,000 for each intentional or reckless violation.
The question of what constitutes a separate violation was raised recently in Cothron v. White Castle System, Inc., 2023 IL 128004 (Feb. 17, 2023). Cothron involves claims brought by a White Castle restaurant manager on behalf of a putative class of White Castle employees regarding White Castle’s use of employee fingerprints to allow employees to access their paystubs. Plaintiff alleges that White Castle violated the BIPA by unlawfully collecting employees’ biometric information and improperly disclosing their biometric data to White Castle’s third-party payroll vendor. White Castle argued that Plaintiff’s claims were barred by the statute of limitations because claims brought pursuant to sections 15(b) and 15(d) of the BIPA accrue only once, when the biometric data is first collected. Plaintiff disputed White Castle’s interpretation and argued that a new claim under the BIPA accrued each time White Castle collected her biometric data and sent it to the third-party payroll vendor. In a 4-3 split decision, the Illinois Supreme Court sided with Plaintiff, holding that a separate claim for damages arises each time a business fails to seek permission to gather biometric data from employees or consumers, or fails to disclose retention plans for that information. While this decision appears to be a huge victory not only for Plaintiff, but for the Illinois plaintiffs’ bar, the Cothron court clarified that the award of damages is discretionary, not mandatory. So even if there is a finding of multiple violations of the BIPA for each biometric scan and subsequent transmittal, the court can decide not to award any damages. It is unclear what approach trial courts will take regarding the award of damages, so companies cannot base their biometric collection practices on the hope that trial courts will decline to award damages if violations of the BIPA are found. What is known is that the Illinois courts are now going to be flooded with the BIPA lawsuits that were stayed pending the decision in Cothron.
This ruling has significant impacts both inside and outside of Illinois. For employers who collect the biometric data of employees in Illinois, the statutory damages can quickly accumulate due to hundreds, if not thousands, of violations of the BIPA for just one employee. Multiply that by the total number of employees your company employs in Illinois, and one class action could be financially ruinous. Either your company invests in rigorous compliance with the BIPA, or it avoids the issue altogether by eliminating the collection of employee biometric data.
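To put that arithmetic in perspective, the sketch below is a purely hypothetical back-of-the-envelope calculation. The employee count, scan frequency, and the premise that every scan and transmittal counts as a separate violation are illustrative assumptions only, and Cothron leaves any award to the court's discretion.

```python
# Hypothetical illustration only: estimates potential BIPA statutory damages
# exposure under a per-scan accrual theory. Employee counts and scan
# frequency are assumptions; Cothron makes any award discretionary.

NEGLIGENT_PER_VIOLATION = 1_000   # statutory amount for each negligent violation
RECKLESS_PER_VIOLATION = 5_000    # statutory amount for each intentional/reckless violation


def estimated_exposure(employees: int, scans_per_employee: int, per_violation: int) -> int:
    """Total statutory exposure if every scan (and transmittal) is a separate violation."""
    return employees * scans_per_employee * per_violation


# Assumed example: 500 Illinois employees, two fingerprint scans per shift,
# 250 shifts per year.
scans_per_year = 2 * 250
print(f"Negligent exposure: ${estimated_exposure(500, scans_per_year, NEGLIGENT_PER_VIOLATION):,}")
print(f"Reckless exposure:  ${estimated_exposure(500, scans_per_year, RECKLESS_PER_VIOLATION):,}")
```

Even under these modest assumptions, the negligent-violation figure reaches nine digits, which is why compliance (or abandoning biometric collection) is the only prudent strategy.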
This decision also affects employers in other states because many states, such as Maryland, Mississippi, and New York, are looking to pass biometric privacy laws that allow for private rights of action. Because the pending legislation in these states closely resembles or is based on the BIPA, the ruling in Cothron may impact the judicial interpretation of those statutes once passed. This decision also impacts companies that are outside of Illinois but have employees and/or customers inside Illinois. A careful analysis of your company’s biometric data collection practices is needed if you do business in Illinois or any other state that has a biometric privacy statute allowing a private right of action. If you need assistance with compliance with the biometric privacy laws in the states in which you do business, then please contact Spilman’s Cybersecurity & Data Protection Practice Group. --- Alexander L. Turner
“But the agency’s actions are being watched closely because if it starts an official procedure, then it could have huge implications for all stablecoins including tether and USDC.”
Why this is important: This article discusses the SEC’s recent notice to Paxos that it is considering recommending action against it, alleging that a digital currency it issues, Binance USD (“BUSD”), is a security. BUSD is a stablecoin whose value is pegged to the U.S. dollar. It and other stablecoins allow people to move into and out of different digital currencies quickly without having to convert into and out of fiat currency (government-backed currency like the U.S. dollar). If BUSD is determined to be a security, then the SEC would have regulatory authority over it, and Paxos would need to register with the SEC. The implications for Paxos of the SEC’s notice are significant. But there also are implications for the stablecoin market as a whole. There isn’t yet an action against Paxos. If and when one is commenced, and depending on its allegations, it could extend to other participants in the stablecoin market, including the two largest, Tether and Circle/USDC. Last week, New York’s financial regulator ordered Paxos to stop issuing BUSD. We should see in the coming weeks whether the SEC recommends taking action, too. --- Nicholas P. Mooney II
“But the pandemic-era rise of telehealth and therapy apps has fueled an even more contentious product line: Americans’ mental health data.”
Why this is important: As consumer data collection continues to rise in the United States and around the world, aggregated health data is becoming a more common product bought and sold by data brokers. While that is worrying on its own, even more concerning is the growth in individually identifiable data being sold by private companies; such data could range from the number of occurrences of a certain condition in a given zip code to the names, addresses, and incomes of individuals with that condition.
Currently, the federal government does not provide comprehensive consumer data protections that would bar this behavior. HIPAA, the Health Insurance Portability and Accountability Act, governs how and when health care providers and servicers may share patient health data, but those protections do not extend to tech companies that collect data under terms and conditions accepted by the consumer, which often grant the company broad discretion in how that data may be shared and used.
The result is a patchy and largely unregulated system in which consumers often do not realize that their data is being collected at all, let alone aggregated and sold for a profit. Some states, notably California and Vermont, have made some headway in cracking down on the practice by requiring data brokers to register with the state, but this alone will not stop data sales. This article makes a good point in noting that the recent Supreme Court decision in Dobbs could be raising much more awareness of the real-world consequences of rampant data collection. Legislators are working on limiting the ability to share reproductive health data so that it cannot be misused. While this is a promising advancement, it is still a small piece of the puzzle when it comes to protecting consumer data. --- Shane P. Riley
“Campuses are limiting use on school devices and Wi-Fi.”
Why this is important: Some colleges and universities in the United States are banning TikTok, a popular social media platform, due to concerns over data privacy and security. TikTok is owned by the Chinese company ByteDance, which has faced scrutiny over its ties to the Chinese government and the potential for user data to be accessed by Chinese authorities.
Many colleges and universities are limiting the use of TikTok on school devices and Wi-Fi networks, citing concerns over the app's data collection practices and the potential for sensitive information to be shared with third-party companies or foreign governments. Some institutions have also raised concerns over the app's potential to distract students from their studies and to promote cyberbullying or other negative behaviors.
In addition to concerns over data privacy and security, the ban on TikTok may also reflect broader geopolitical tensions between the United States and China. The Trump administration previously sought to ban TikTok in the United States, citing concerns over national security, although the proposed ban was blocked by the courts. The Biden administration has also expressed concerns over TikTok's data collection practices and the potential for sensitive information to be shared with foreign governments.
Overall, the ban on TikTok reflects ongoing concerns over data privacy and security in the digital age, as well as broader geopolitical tensions between the United States and China. While some students may be disappointed by the ban, it is important for colleges and universities to prioritize the security and privacy of their students and to ensure that sensitive information is not put at risk. --- Kevin L. Carr
“Joint venture in more than 10 cities will enforce U.S. laws protecting U.S. advanced technologies from illegal acquisition and use by nation-state adversaries.”
Why this is important: The Department of Justice (“DOJ”) recently announced the creation of the Disruptive Technology Strike Force. This strike force consists of members of the DOJ National Security Division; the Commerce Department’s Bureau of Industry and Security (“BIS”); members of the FBI; members of Homeland Security Investigations (“HSI”); and 14 U.S. Attorneys’ Offices in 12 metropolitan regions, including Atlanta, Boston, Chicago, Dallas, Houston, Los Angeles, Miami, New York City (Southern and Eastern Districts of New York), San Jose, California, Phoenix, Portland, Oregon, and the Washington, D.C. region (District of Columbia and the Eastern District of Virginia). The strike force will be co-led by Assistant Attorney General Matthew G. Olsen of the Justice Department’s National Security Division and Assistant Secretary for Export Enforcement Matthew Axelrod of the Commerce Department’s Bureau of Industry and Security. The goal of the strike force is to target bad actors, strengthen the supply chain, and protect critical technological assets from harm by nation-state adversaries. These adverse nation-states include China, Iran, Russia, and North Korea. The strike force seeks to prevent these nation-states from acquiring critical technologies that would enhance their military capabilities or strengthen their mass surveillance programs that enable continued human rights abuses. The strike force’s focus is on prosecuting “criminal violations of export laws; enhancing administrative enforcement of U.S. export controls; fostering partnerships with the private sector; leveraging international partnerships to coordinate law enforcement actions and disruption strategies; utilizing advanced data analytics and all-source intelligence to develop and build investigations; conducting regular trainings for field offices; and strengthening connectivity between the strike force and the Intelligence Community.” --- Alexander L. Turner
“From reaching customers that are buying security solutions in new ways, to fighting cybercrime and emerging threats alongside their trusted channel partners.”
Why this is important: This article summarizes the many statements of CEOs of cybersecurity firms about what they anticipate prioritizing in 2023. Some are the bland statements that you would expect, like “we’re going to do our best to meet the cybersecurity challenges of the world head-on.” Others give a hint at some of the nuts and bolts of what those firms plan to do in the rest of the year. Many touted new additions to their core teams and growth globally. Others stated that they are prioritizing more investment in longer-term innovation. One stated that his firm is focusing on securing cloud assets, user identities, and endpoint devices, since these are areas of cyberattack that are most commonly exploited. Another stated that his goal is simplification of his firm’s products so that users are more easily able to fully implement the products and protect themselves. --- Nicholas P. Mooney II
“The pace is slow, and target dates for some of the most ambitious 3D plans are decades off.”
Why this is important: As this article notes, nearly 106,000 Americans are currently on waiting lists for donor organs, and 17 die each day while waiting. The holy grail for meeting this demand and ending this suffering will be the ability to “print” organs, muscles, and tissues from individually grown cells, lowering the need for human donors and the complicated, sometimes unbearable stress that goes along with donor waiting lists.
Recent advancements in the field have produced several different skin, bone, muscle, and vascular structures, and even whole non-human organs, such as a functional porcine pancreas and a rabbit-sized heart. The most recent use of 3D printing in humans was in 2022, when Dr. Arturo Bonilla constructed and implanted an outer ear structure for a woman born without one. The implant was made from the woman’s own cartilage cells using a 3D bioprinter.
In its most basic terms, bioprinting is a process in which individual cells are generated and instructed to become a specific cell type. They are then made into a “bioink,” which can be loaded into a bioprinter and laid down in layers with other cell types, resulting in a structure resembling naturally occurring tissue. Once supplied with oxygen and other necessary nutrients, the printed structure matures and develops higher function.
The promise of scaling up this process to create whole organs that can be readily used in humans comes down to accuracy, precision, and reduced costs. The process of making 3D printed organs can eventually be automated and can build cell structures to exact specifications using the patient’s own cells, which dramatically decreases the likelihood of organ rejection.
While estimates of when these 3D printed organs will become commonplace in health care range from a decade to several decades, the experts appear to agree that this is no longer a science-fiction fantasy, but a real inevitability. What has yet to be seen or broadly discussed, however, are the legal ramifications of artificial organs. How will they be treated by the FDA? Who will be liable when an artificial organ fails? Will companies that produce artificial organs be subject to product liability in the same way as automakers and toymakers? As 3D printing is on the rise, all eyes should be on Congress and the federal agencies responsible for oversight in the coming decades to see how they respond to a new era in regenerative health care. --- Shane P. Riley
“Higher-education institutions that handle federal financial aid data have until early June to comply with federal rules for protecting privacy and personal information.”
Why this is important: With only four months left before most changes to the federal Standards for Safeguarding Customer Information (“Safeguards Rule”) – a component of the Gramm-Leach-Bliley Act (“GLBA”) that provides for the protection of consumers’ privacy and personal information – take effect, the Federal Student Aid Office is focused squarely on postsecondary educational institutions and third-party servicers, according to its recent announcement. Is your institution ready for the June 9, 2023 deadline?
Colleges and universities that participate in federal student financial aid programs authorized under Title IV of the Higher Education Act of 1965 (“Title IV”) are obligated to protect student information under the GLBA. As belts and suspenders, each institution that participates in Title IV programs expressly agrees to comply with the GLBA Safeguards Rule through its Program Participation Agreement with the United States Department of Education. Third-party servicers have similar obligations. Along with postsecondary institutions, servicers must sign the Student Aid Internet Gateway Enrollment Agreement, ensuring that all federal student aid applicant information is protected and guarded against unauthorized access in the administration of Title IV programs.
On December 9, 2021, the Federal Trade Commission issued final regulations to strengthen consumer protections under the Safeguards Rule, which take effect June 9, 2023. Among the June 9 requirements, covered schools and servicers are required to have a written, comprehensive information security program that contains specific administrative, technical, and physical safeguards. Other mandates include risk assessments, implementation of risk control and testing safeguards, staff preparedness (necessitating training) to enact the information security program, and an incident response plan for institutions and servicers that maintain student information on 5,000 or more consumers.
Foreshadowing these expectations were multiple Dear Colleague Letters and electronic announcements from the Federal Student Aid Office over the past decade, informing schools of ways to strengthen their cybersecurity infrastructure to protect student financial aid information and emphasizing plans to enforce the GLBA through annual compliance audits. While all elements of the Safeguards Rule are vital, the Federal Student Aid Office indicates that an institution or servicer may significantly reduce the risk of a security breach “by encrypting customer information while it is in transit outside its systems or stored on its system and by implementing multi-factor authentication for anyone accessing customer information on its systems.”
As this article highlights, amendments to the Safeguards Rule come at a time when educational institutions remain significant targets of crippling ransomware attacks, including at least 35 colleges and universities in 2022 alone. In this environment, it is no surprise that failure to comply with the Safeguards Rule carries the potential for a heavy penalty – the inability to participate in Title IV programs. As the June 9 deadline approaches, schools and servicers should act now to ensure that their information security programs include the specific administrative, technical, and physical safeguards imposed by the Safeguards Rule and staff are effectively trained to implement these requirements. --- Erin Jones Adams
“The concerns leading organizations to withhold information are aplenty, including reputational damage and financial impacts.”
Why this is important: This article discusses an alarming trend. According to a recent report from the cybersecurity firm Arctic Wolf, nearly 75 percent of organizations that suffered a data breach in 2022 chose not to disclose it. Their reasons are many. Some believe there is a stigma to being a victim of a breach and that it is a sign of organizational weakness. Others fear reputational damage, increases in insurance premiums, potential follow-up breaches, and the like. The article also discusses those organizations’ top cybersecurity concern, ransomware, and how organizations dealt with it in 2022. According to the report, 40 percent of the surveyed organizations admitted they had been hit by ransomware in 2022. Despite advice from authorities not to pay ransom, nearly 75 percent of the organizations paid some part of the ransom, with nearly 40 percent of them paying the full ransom amount. At bottom, this article demonstrates the gap that exists between what should be done and what is done when responding to a breach and/or ransomware attack. --- Nicholas P. Mooney II
This is an attorney advertisement. Your receipt and/or use of this material does not constitute or create an attorney-client relationship between you and Spilman Thomas & Battle, PLLC or any attorney associated with the firm. This e-mail publication is distributed with the understanding that the author, publisher and distributor are not rendering legal or other professional advice on specific facts or matters and, accordingly, assume no liability whatsoever in connection with its use.
Responsible Attorney: Michael J. Basile, 800-967-8251