
Dan McMullen:

Good afternoon, all. My name is Dan McMullen. I'm a partner with Calfee, Halter and Griswold, where I practice intellectual property law and lead our firm's information technology practice. Along with my colleague, Billy Raulerson, we're very pleased to have you join us for a discussion of data privacy in times of public crisis, including a more specific look at what, in this pandemic-stricken world, Apple and Google are up to with their much-publicized COVID Exposure Notification system.

Dan McMullen:

For your information and benefit, if it's helpful, we are seeking one hour of continuing legal education credit in Ohio and Kentucky for today's program. And in the interest of making this program as interactive and useful as possible for all of you, you're invited to submit questions as we go. If we don't address them during the presentations, we'll get to as many as possible at the end.

Dan McMullen:

So to better understand and appreciate aspects of the Apple-Google undertaking, we start with an overview of the legal landscape around data privacy now two decades into the 21st century. As you know, issues of data privacy and cybersecurity have become increasingly prominent for individuals and businesses of all sizes. Both the risks and consequences of a data breach or other cybersecurity incident continue to grow in both likelihood and severity.

Dan McMullen:

A study several years ago from the [inaudible 00:01:32] Institute estimated that approximately 50% of small businesses had been breached in just the preceding 12 months. 50%. IBM executive chair and former CEO Ginni Rometty has pronounced that, quote, "cyber crime is the greatest threat to every company in the world." And as many of you have read, breach incidents experienced by companies like Wyndham Hotels, Target, Home Depot, Yahoo, and Equifax have resulted in costs, damages, and settlements totaling in the billions of dollars. Daily, we all read about the latest attacks by virus, malware, ransomware, phishing, spoofing, and all the other variations that are being developed and refined by people who do not have your best interests at heart.

Dan McMullen:

In view of that environment, we ask how the US legal system has responded, and what our legislators have done to address such threats to the digital environment. Politely put, one might call data privacy law in the United States a patchwork, somewhat daunting, even confusing. And note, that was before the pandemic. To digress for a moment, we might ask why that is. In the country that arguably invented and certainly promoted the growth of the internet as a global communications phenomenon, why is data privacy such a challenge?

Dan McMullen:

If you take a historical view, since the early days of the commercial internet, US law has taken a relatively hands-off approach to data privacy and security, except within specific industries or activities. The implicit bargain here in the United States has been to leave it to the tech industry to innovate and grow the commercial internet, creating and extracting vast fortunes in the process, of course, and for the government to stay out of its way. Translated to the individual level, the bargain has been for the tech industry to provide free, and for those of you who can't see, I'm emphatically using air quotes, "free" services, think social media and the like, in exchange for the largely unregulated collection and exploitation of individual users' information.

Dan McMullen:

The reality of that bargain for consumers has been distilled in the adage: if you're not paying for the product, you are the product. The legal effect of this bargain has largely been to leave it to private actors to address privacy issues, contractually or through individual litigation. And as most of you probably know, the terms of service and privacy policies of large online service providers often render such remedies futile. I would submit, however, that view of the world is not shared everywhere and is even changing here at home.

Dan McMullen:

So to take a brief walk through it, US data privacy law is a multi-layered amalgam of obligations arising from different levels of government, in some instances based on different data types and implicating different actors. Thus, we have special purpose federal laws directed to select industry verticals, such as HIPAA-HITECH. As I'm sure many of you are familiar, that's the Health Insurance Portability and Accountability Act, and those laws, of course, are aimed at protecting the confidentiality of individuals' health-related information.

Dan McMullen:

The next acronym in the list, GLBA, the Gramm–Leach–Bliley Act, requires privacy notices and certain protections for individuals' financial information. COPPA is the Children's Online Privacy Protection Act, which governs online collection and use of information about minors. FERPA, the Family Educational Rights and Privacy Act, protects the privacy of student educational records. Now, along with those statutes, the Federal Trade Commission conducts investigations and enforcement proceedings, and can sanction companies for their disclosures and handling of consumer information.

Dan McMullen:

As you may be aware, the FTC has imposed some particularly harsh sanctions where companies have published privacy policies or statements and then failed to adhere to their own stated policies. At the bottom of that list, PCI DSS is the Payment Card Industry Data Security Standard, and although it was not enacted by Congress, the PCI standards have acquired a quasi-legal status by virtue of their widespread adoption and recognition, and indeed some states have even proposed legislation to formally enact them into state law.

Dan McMullen:

Additionally, the legal landscape on privacy is populated by state laws. Legislatures in every state have enacted some form of data privacy statute, prominently including breach notification laws, now in effect in all 50 states, which require entities suffering data breaches that expose personal information to notify the individuals whose data was exposed, and often to notify state regulatory authorities as well. The states respectively have also enacted various special purpose laws. So in New York, New York's Financial Services Cybersecurity Regulation is directed to the financial services and securities industries. Illinois has a specific Biometric Information Privacy Act.

Dan McMullen:

The last example on this slide, California's Consumer Privacy Act, which took effect at the beginning of this year, is notable for its more comprehensive treatment of data privacy rights and obligations. I'll say more about that in a moment, but there's one other feature of the legal landscape that's highly relevant to many US enterprises, even if it is not a product of US legal process, and that is GDPR, the General Data Protection Regulation, which as you probably know was adopted by the European Parliament in 2016, took effect in May of 2018, and has the force of law in the EU, unlike its predecessor, the 1995 EU Data Privacy Directive.

Dan McMullen:

GDPR does not require individual member states to enact their own laws in order to give it effect; it is itself law. As a new law, GDPR is noteworthy on a number of fronts. Perhaps most fundamentally, it recognizes and declares that personal data protection is a fundamental right of EU citizens. A fundamental right. Let that sink in for a moment. To analogize, the Europeans express that commitment to data privacy as akin to our free speech rights under the First Amendment of the US Constitution. So the notion of letting Google, Apple, Facebook, Amazon, Twitter, and others do whatever they wish with the personal data of EU residents in exchange for the free services they provide is fundamentally anathema to the policy embodied in GDPR.

Dan McMullen:

Further, GDPR is extraterritorial in its effect. Its rights and obligations are not confined to the physical geography of Europe. In a sense, they travel with the data, which is why many US businesses are subject to the law, even if they don't have physical facilities in Europe. Moreover, the potential penalties for violating GDPR, especially if done willfully, are jaw-dropping: up to 4% of an enterprise's annual global revenue or 20 million euros, whichever is greater.
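To make that penalty ceiling concrete, here is a minimal sketch of the arithmetic. The function name is a hypothetical helper, not anything from the regulation itself; the "whichever is greater" rule is from the upper fine tier of GDPR Article 83(5).

```python
def gdpr_max_fine(annual_global_revenue_eur: float) -> float:
    """Upper-tier GDPR administrative fine ceiling: up to EUR 20 million
    or 4% of annual global revenue, whichever is greater."""
    return max(20_000_000.0, 0.04 * annual_global_revenue_eur)

# A company with EUR 2 billion in global revenue faces exposure up to
# EUR 80 million, while a EUR 100 million company still faces the
# EUR 20 million floor on the ceiling.
```

Note that for large enterprises the 4% branch dominates, which is exactly why the figure is described as jaw-dropping.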

Dan McMullen:

Now, under that general umbrella, GDPR recognizes that data subjects, also known as people, have various rights in their personal data: the right to grant or withdraw consent for another party to process and hold that data, the right to access it, to correct errors, to obtain their own copy of the data in question, the much-publicized right to be forgotten, meaning the right of an individual data subject to have their personal data deleted from certain sources under certain conditions, and the right to bring legal action to enforce the other rights under GDPR.

Dan McMullen:

Corresponding to those rights of data subjects, data controllers, meaning the parties responsible for collecting such data, and processors, those that are engaged to store and process data, have corresponding duties, starting with the fundamental duty to protect the data subjects' foregoing rights, but in more operational terms, to process personal data in ways that are lawful, that have a legitimate purpose, that minimize the amount of data collected and processed to that legitimate purpose, that are accurate, that protect the confidentiality of the data, and that preserve accountability on the part of the parties handling it.

Dan McMullen:

Processors and controllers are also obligated to employ appropriate security measures, like encryption or pseudonymization, in order to effect such protections. Additional obligations include that controllers and processors may not engage in unauthorized subcontracting; that is, it has to be documented and understood. They must also secure consent, when consent is required from a data subject, in an intelligible and easily accessible form, using clear and plain language. If you pause and think about that for a moment, compare it to the terms of use you may have read on many websites, and I know, because I've drafted some of them.

Dan McMullen:

In addition, data controllers and processors must ensure that cross-border transfers of personal data are done securely, into jurisdictions with appropriate legal safeguards, which, as you may be aware, does not include the United States. The EU does not believe that American law provides appropriate legal safeguards for personal data, so specific measures may need to be employed in order to transfer data for processing in the US. I'll say more about that in a minute. There's an obligation to give very prompt notice of a data breach, presumptively 72 hours, which is a very short window. And in circumstances where data processing is a core activity, businesses are required to appoint a data protection officer.

Dan McMullen:

Now for US businesses, of course, the threshold question is, does GDPR apply to us, to our business? And that question can only be answered starting with a data audit and inventory, so that an enterprise is expressly cognizant of what data it is collecting and processing. If so, then businesses need to develop plans to comply with GDPR requirements. Most large businesses have already done so or are well along that path, although many smaller and even some mid-market companies have not fully completed GDPR compliance plans.

Dan McMullen:

For companies processing EU personal data in the US, a key consideration is satisfying the condition that the jurisdiction ensures an adequate level of protection, since the Europeans have determined US law does not meet this condition. Alternatives for businesses include adopting binding corporate rules regarding data privacy and security, or standard contractual clauses that can be baked into agreements to meet such conditions, or certifying under the US-EU Privacy Shield, which is administered by the United States Department of Commerce and something we have helped a number of our clients to accomplish.

Dan McMullen:

Now, closer to home is California's new Consumer Privacy Act. And I think you'll recognize that in notable ways California's new law echoes GDPR. It reiterates the California constitutional principle that privacy is an inalienable right of all people, and establishes basic rights of California consumers in their personal information, including to know what personal information is being collected, from what sources, for what uses, and to whom it's disclosed.

Dan McMullen:

It includes the right to opt out of a third party selling personal information to others. It includes the right to have personal information deleted in certain circumstances, akin to the so-called right to be forgotten under the GDPR standards. And additionally, an interesting feature of the California statute is the right to be treated equally, commercially that is, regardless of exercising privacy rights. That is, the right to the same level of service and pricing even if the individual consumer elects to vigorously exercise the rights that the California statute affords. The California Act applies to businesses that operate in California, collect personal information of California residents, and meet one of the other specified criteria related to the size and handling of personal information.

Dan McMullen:

As noted, the law took effect January 1st of this year, and the active enforcement authority of the California Attorney General commenced just on July 1. So the California AG is now open for business on California Consumer Privacy Act enforcement activities. Its authority includes the right to seek injunctive relief or civil penalties in the event of a data breach. There is also a private right of action. So the California AG can enforce the law and impose sanctions for violating it, and in the event that there's a breach, individuals can bring their own separate private causes of action, including as class actions, which, as you can readily appreciate, was a contentious topic in the California legislature when the law was adopted.

Dan McMullen:

And again, those remedies for individuals include injunctions and statutory damages up to $750 per resident per incident. California's size and impact on the national economy, of course, underscore the significance of this law; even though it's only one of 50 states, it is disproportionately significant in terms of impact on the economy. And further, as should come as no great surprise, a number of other states are now considering legislation similar in varying degrees to the California statute. I want to say one more thing about a state law very close to home, specifically Ohio's Data Protection Act, adopted as Senate Bill 220, a law that was promoted in particular by then Ohio Attorney General, now governor, Mike DeWine as part of his CyberOhio initiative.

Dan McMullen:

Ohio's Data Protection Act took effect in November of 2018, and Ohio's law is in some respects a countertrend: unlike California's law and other legislation in the pipeline in various states that impose sanctions for failing to provide adequate data security and privacy protections, Ohio's law offers something of a carrot rather than a stick for businesses that implement a data security program that reasonably conforms to an industry-recognized cybersecurity framework. And you can see in this slide a number of the frameworks that are specifically recognized in the law.

Dan McMullen:

And so this affords a legal benefit, a degree of protection, for companies that affirmatively undertake to conform to an industry-recognized security framework. And I should add, importantly, that parties can make this law applicable to their commercial engagements by invoking Ohio law in their contracts. So among other considerations that may come into play in choice of law in commercial agreements, this is one that, at the margin, militates in favor of invoking Ohio law.

Dan McMullen:

So, having addressed briefly that legal landscape, understanding what your business can do to adapt to what is admittedly a rapidly evolving legal landscape around data privacy is probably a separate webinar for another day, but here are just a few high-level points for your consideration. Certainly steps like developing a cybersecurity plan, including in particular a data breach or incident response plan, and providing appropriate training, and practicing that training, are pretty common sense, but, though common sense, they are not necessarily easy to do well and require an investment of energy and resources. I'd also highlight the importance, in thinking about what your business might do, of reviewing and negotiating the contracts that affect cybersecurity in your environment. And I can say more about that at the end if we have enough time.

Dan McMullen:

So, with that brief flyover of the data privacy legal landscape in 2020, consider what happens when a public crisis suddenly makes the prospect of identifying individuals and sharing their health information seem critical to a public health response. And with that, let me introduce you to my colleague, Billy Raulerson, to tell you about Apple and Google's AGENS, their COVID Exposure Notification system. Billy, all yours.

Billy Raulerson:

Thank you, Dan, for that overview of privacy law. Hello, everyone. I'd like to begin and end my portion of the presentation by posing questions. My first question is, why should you care? Not the collective or hypothetical you, but each individual that's connected and listening now. My answer to that question is, because you are all data generators and you are all data consumers. And so each of you has a lot of information associated with you, whether it's inbound or outbound. And by data, we generally mean information; for purposes of this presentation, we will be talking about information in a digital form.

Billy Raulerson:

And so typically, we expect that our personal information will remain private. Indeed the law clearly recognizes certain types of data as warranting confidentiality. Some examples include medical records, financial information, attorney-client communications, but what about other types of personal data? What about texts that you only want your spouse to see? What about texts that you never want your spouse to see?

Billy Raulerson:

Data also can have value, monetary value. Companies might pay to know where you are going, what you are doing, what you are buying, what you are eating, et cetera. Indeed, the monetization of data is a constant pressure on privacy concerns. So how did we become the data generating generation? Primarily, in my opinion, this is the result of technological advances: from desktop PCs to laptops, to tablets, and now to smartphones. This is a picture of a Samsung Galaxy S10, the type of smartphone I always have with me.

Billy Raulerson:

So a smartphone is a smart device. It's finally small enough to be carried everywhere, pocket-sized if you will. And it's not just small, but it's incredibly powerful. You probably have heard anecdotal tales of the power of these devices, but it's true. I mean, the processor in this smartphone here is 100,000 times more powerful than the processor that was used in the Apollo missions to put a man on the moon. But it's not just small and it's not just powerful, it's actually cheap enough that my children have these.

Billy Raulerson:

And so the pocket-size computers that allow us all to be data generators and consumers are what's driving some of these privacy concerns. And you'll note I circled the phone icon on the bottom here. I recently heard a comedian say that the phone feature was just another pre-installed, seldom-used app, which I think illustrates that although we're talking about smartphones, they're not always, or even most frequently, used as phones.

Billy Raulerson:

And so, notwithstanding that we're talking about phones, this can extend to other smart devices that are small, powerful, affordable tech. I know many people have smartwatches now, and that may be the next generation. And you'll see on the left here, I've given some examples of how data generators and data consumers push and generate this information. This also illustrates that these really are, I mean, they're portable computers.

Billy Raulerson:

So if the smartphone is the body, the operating system is its brain, which brings us to Apple and Google. An Apple device uses a proprietary operating system called iOS, and non-Apple devices typically use the Android operating system. The Android operating system was actually developed by a consortium of developers, but it's most often associated with Google, which commercially sponsors it and often bundles its proprietary application suite with it. And so when we look at the market share, and again, these are 2019 figures, Apple's iOS had approximately 13.9% of the global smartphone operating system market share, while Android had 86.1%. In other words, Apple and Google have essentially locked up the smartphone operating system market.

Billy Raulerson:

So with that technology background in mind, I want to shift gears momentarily and talk about a public health tool that has frequently been discussed during the current COVID pandemic, and that is contact tracing. Contact tracing is the process of identifying persons, we'll call them contacts, that may have come in contact with an infected person, typically called an index patient. The public health aim is to trace, test, isolate, often called quarantining, and treat, which can cover immunizations and other treatments, these individuals to manage the infectious disease.

Billy Raulerson:

For decades, contact tracing has involved public health professionals conducting actual interviews and follow-up interviews to gather relevant information, including these other potential contacts. This form of contact tracing has proven to be a powerful tool in combating the spread of diseases such as smallpox and cholera. This type of conventional contact tracing is often referred to as manual contact tracing. But since smartphones can do everything, why not contact tracing?

Billy Raulerson:

So several states, Utah being an example, and countries, France being an example, have set out to build their own contact tracing applications, or apps, predicated on smartphone technology. Typically these apps use technology integrated into the smartphone, such as Bluetooth and/or GPS, to track who an infected person, and again, this typically requires self-reporting by that individual or an affiliated public health institution, has been around, then alerting those persons that they may have been exposed to the virus. This more modern type of contact tracing can be considered automatic or digital contact tracing.

Billy Raulerson:

So as governments and healthcare authorities rush to create these tracking apps, the potential privacy concerns are becoming apparent. Some examples: Is the confidentiality of your health status and your personal information being adequately protected, i.e., is there reliable anonymity? Is the information on your other habits, such as your movements and your purchases, vulnerable, i.e., is there overcollection and/or misuse of the collected personal information? Is this information being maintained in a manner where future abuses are possible, i.e., is the information stored in a manner where it can be used beyond the current health crisis?

Billy Raulerson:

I'd like to go back to Apple and Google and why they became involved. Indeed, why did mega-competitors Apple and Google decide to collaborate on their COVID Exposure Notification system, also known as AGENS? Well, as noted above, given their market share in smartphone operating systems, these two tech giants were uniquely positioned to assist in the development of COVID-related apps. AGENS, however, is not an app itself. It is an application programming interface, a set of tools that supports the development of software apps that work across both operating systems. Indeed, this was a critical aspect of the collaboration, a key goal being to allow development of apps that worked across both platforms.

Billy Raulerson:

Also importantly, AGENS was intended to facilitate the development of privacy-preserving contact tracing apps. That was actually noted in their joint mission statement, and we'll see that on the next few slides. Actually, before I move on, I think it's important to note that when we talk about digital contact tracing, it's not person-to-person tracking, it's device-to-device tracking, with the implication being that the device is an extension of each person.

Billy Raulerson:

So how does AGENS endeavor to protect privacy? Well, first and very importantly, each user must opt into the system. In this manner, it is a voluntary versus a compulsory system. Furthermore, it can be turned off at any time. Additionally, collected data is stored on the user's phone, not in a central repository such as a central database. Furthermore, it uses Bluetooth and periodic anonymous beacon transmissions. By anonymous, we mean no personal information is associated therewith. It also uses Bluetooth technology to record any received beacon transmissions. This is actually how the contact log is built up: devices that have been in proximity to one another record each other's beacons using the low-energy Bluetooth technology that already exists in most of these phones.

Billy Raulerson:

To provide a little more background on how this process works: periodically, let's say daily, the device will download from a server a list of keys that are associated with beacons that have been confirmed to belong to individuals that have been diagnosed, let's say with the COVID virus, in a particular geographic region. Then the device can compare that list of keys against the list of beacons stored on itself. If there is a match, the user, in this case the device owner, again being the implication, may receive a notification from AGENS of potential exposure and suggested next steps.
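The download-and-compare step just described can be sketched roughly as follows. This is a simplified illustration only: the function names are hypothetical, and the SHA-256 derivation stands in for the actual Exposure Notification cryptography, which uses an AES-based scheme with rotating identifiers.

```python
import hashlib

def beacons_for_key(diagnosis_key: bytes, intervals: int = 144) -> set:
    # Hypothetical derivation: regenerate every rolling beacon this key
    # could have produced during its day (e.g., one per ~10-minute interval).
    return {hashlib.sha256(diagnosis_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)}

def check_exposure(stored_beacons: set, downloaded_keys: list) -> bool:
    # On-device matching: derive beacons from the downloaded keys of
    # diagnosed users and look for any overlap with beacons this phone
    # actually received over Bluetooth.
    return any(stored_beacons & beacons_for_key(k) for k in downloaded_keys)
```

The point of the design is that only the keys of diagnosed users ever leave a device, and the comparison happens locally, so the server never learns who was near whom.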

Billy Raulerson:

Note, beyond the API that I discussed initially, this low-energy Bluetooth tracking feature has now become part of the operating systems and does not require a third-party app to function. I believe both Apple and Google pushed out this capability in one of their recent updates. So continuing with the AGENS privacy considerations: again, it's been made clear that no user-identifying information is ever made available to Apple, Google, or other users of the system. Furthermore, processing, not just the aforementioned storage of the information, occurs on each individual user's device.

Billy Raulerson:

The random Bluetooth identifiers also change periodically, let's say every 10 to 20 minutes, and are generated using encrypted keys. Individuals who test positive for the virus are not identified to Apple, Google, or others. Again, this is reporting to you that you may have been exposed to an individual that has been confirmed to have the virus. It doesn't tell you who, I don't believe it tells you when, and it does not tell you where. Interestingly, this is also true with conventional manual contact tracing. They generally don't tell you who you may have been exposed to; they generally tell you only that you may have been exposed.

Billy Raulerson:

So continuing on, AGENS is limited to contact tracing only by authorized government public health entities. So this is not something where individuals or private companies can download it from the app store or build their own app system around it. That adds, purportedly, some layer of trust. Furthermore, AGENS, importantly, is intended to be disabled on a regional basis when it is no longer needed, and that could be done simply by Apple and/or Google rolling out another software update.

Billy Raulerson:

Furthermore, Apple and Google have expressly come out and said that they have no intention of monetizing any of the data. It's important to note that everything on this slide and the previous slide were design choices made by Apple and Google. For example, the decision to use Bluetooth versus GPS as part of the tracking technology: whereas GPS can provide a lot of specificity on where you've been location-wise, that's generally not available with Bluetooth technology.

Billy Raulerson:

However, other apps or frameworks may involve different design choices. And we've seen examples of many states and countries opting not to adopt the Apple/Google framework, in many instances because they think it's too limited in the amount of data that it collects or in the availability of that data for further processing. So, as previously stated, I want to end the presentation with a question as well, and that question is simply, is this automated contact tracing flawed?

Billy Raulerson:

And again, I'll give you my answer. I believe from a technological standpoint, not at all. In theory, it should work, and it should work well. The problem is that it requires a certain level of adoption by a population within a region to be effective. And you've heard estimates of 50 to 70% of the US population needing to adopt this technology for it to be effective. However, only roughly 80% of the US population owns a smartphone. So it becomes more daunting when you realize the degree of adoption that's necessary among smartphone owners for this technology to work. And to date, few, if any, regions have been able to achieve a sufficient rate of adoption, based at least in part on a fear that the user data might be compromised or otherwise misused.
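The adoption arithmetic above is worth making explicit. Using the figures quoted in the talk (a 50-70% population target and roughly 80% smartphone ownership), the required opt-in rate among smartphone owners is even higher than the headline number; the function name here is illustrative.

```python
def required_owner_adoption(population_target: float,
                            smartphone_ownership: float = 0.80) -> float:
    # Share of smartphone OWNERS who must opt in to reach a
    # population-wide adoption target, since non-owners cannot participate.
    return population_target / smartphone_ownership

# A 60% population-wide target implies that three quarters of all
# smartphone owners must participate; a 70% target pushes that to
# nearly nine in ten owners.
```

This is why even the low end of the estimated range is daunting in practice.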

Billy Raulerson:

Additionally, Apple and Google have been clashing with government entities who want more access to the underlying data, for example, not just that a user was potentially exposed to the virus, but where that exposure happened. And from their perspective, Apple and Google have said that they never really intended to digitize contact tracing or to remove the human element of public health; instead, they were merely intending to address technical challenges that contact tracing apps developed by others using Bluetooth would likely encounter. All right. Well, thank you for attending this webinar. I believe we have plenty of time left now for questions, if anyone has any.

Maggie:

Yeah. Billy, it's Maggie. It looks like we got a few questions during that. So I'll go ahead and read them. We have two. The first one is, what are the prospects for seeing general privacy legislation adopted at the federal level in the US?

Dan McMullen:

[inaudible 00:39:49], it's Dan McMullen again. I'm happy to try to address that. I mentioned that, in the aftermath of the California Consumer Privacy Act's enactment, a number of other state legislatures are considering laws similar in varying degrees. And this prospective proliferation of state laws is prompting more interest in, and indeed pressure on, the Congress to enact a national data privacy law. In fact, even the major tech companies that in years past have routinely resisted such federal legislation in this arena are now seeing it, meaning national legislation, as preferable to having 50 separate state laws with often overlapping and/or conflicting provisions.

Dan McMullen:

I'll just add that a colleague of Billy's and mine, Tim Day, who's a principal in the Calfee Strategic Solutions practice, is actively involved in these legislative efforts at both the state and federal level. So I'd invite you to please look for a future Calfee webinar with Tim in particular on this subject and opportunities for your business to engage and help shape such legislation if you have an interest.

Maggie:

Okay, it looks like we have one more. It says one of your slides asked what businesses can be doing to address data security and referenced third-party contracts. Can you explain how or why such contracts are relevant?

Dan McMullen:

Right. This is Dan McMullen again. So I think that alluded to a point on one of the slides related to what businesses can do in addressing their own data privacy obligations and considerations. I made reference to developing a data privacy plan, of course, and conducting training, but also alluded to an appropriate review of third-party contracts. If you think a little bit about how the IT environment has evolved, particularly in the last decade or so, almost no business any longer operates all of its own IT infrastructure.

Dan McMullen:

Particularly as cloud computing and software as a service have grown in popularity and volume, for most businesses, reliance on a web of third parties to maintain and support software, databases, and networks has become routine. It's what IT means in 2020. But that means that all of those third-party vendors are also part of each individual business's data security environment, and it highlights the importance of making sure that relevant contracts with such vendors properly assign responsibility for data security commensurate with the services they are providing. It represents another one of the many ways that we have worked over time to help our clients manage their own data security environments. That environment no longer resides wholly within the four walls of a given business enterprise; it now involves lots of third parties, which makes contracts with those third parties correspondingly important.

Maggie:

Looks like we had one more just come in. It says, if this contact tracing had been active years ago under a different safety rationale, would there have been pushback?

Billy Raulerson:

I mean, I can give my views on that, Dan, and then you can weigh in from a privacy standpoint maybe. The issue truly is adoption, and if you go back in time, I mean, if you know the technology [inaudible 00:44:16] under, it depends on how far you go back. We're perfectly situated now in terms of market penetration. I mean, 80% of the US does have smartphones, and that includes the young, the old, the middle-aged, the working, the non-working. It crosses all demographics.

Billy Raulerson:

The problem is adoption. And if you dig through the interviews, there is a big distrust between users of smartphones and the government entities, and even the developers. I mean, Google, when COVID started, I believe in April, began sharing anonymized GPS information so that government entities could track social distancing trends. And so you have a history of these big companies and these government entities feeling like this data's out there for them to use as they see fit.

Billy Raulerson:

And as Dan was alluding to in his overview, the legal framework is popping up to say, well, we have to have some protections on this, or otherwise it will inevitably be misused, in our view. So adoption is going to always be the issue, and some of it's malaise, some of it's ... You have a disease that early on seemed to be particularly affecting an older demographic, who may not be as embracing of technology in general, and a younger demographic, who felt they were invulnerable to the disease but were probably better situated to use this type of technology.

Billy Raulerson:

So I don't know if I've entirely answered the question, but if it's not working now, I doubt it would have worked in the past. I mean, you actually have Apple and Google partnering, which is almost unheard of, to remove technical barriers at the operating system level to make this work. So it's not a question of technology; it's a question of, in my opinion, trust and willingness to adopt for the benefit of everyone.

Dan McMullen:

Well, I might just add a word. I agree with your characterization, Billy, and to underscore an important point you made there, the technology exists today that permits tracing with a degree of granularity that literally would not have been possible 25 or 30 years ago, via the smartphone, that supercomputer that each of us carries around in our pockets. To the point of adoption, though, I have a further question I'd be interested in your reaction to, Billy.

Dan McMullen:

In some countries around the world, we've seen the use of smartphone technology and the tracing associated with it apparently to considerable positive effect, and I point to the example of Korea in particular as a country that seemed to rather effectively suppress the spread of the infection through very active and aggressive use of tracing technologies. I'm curious whether you think there are technical differences or cultural differences in why the Koreans seem to have been so successful in their tracing efforts where I think we, at least so far, have not.

Billy Raulerson:

So there are a lot of tricks that countries are playing to ... They're waking up to adoption being the problem, not the technology. And so if you make a system compulsory rather than voluntary, then right out of the gate, you start to promote adoption. If you look at a country like China, they were kind of clever. They integrated it into, I believe, two of their primary social media apps, WeChat and I forget what the other one is. They actually embedded it into apps that already existed, that users were already comfortable with.

Billy Raulerson:

The problem with China from a privacy standpoint is that they centralize the data collection, and they're not just tracking who's contacted whom; they're saying, okay, well, what public transportation did you ride? What route did you take? What buildings have you entered? So some of these systems, I think, would get a lot more pushback in a country like the US, but you're not going to get that pushback in certain countries, and those countries I think raise more privacy concerns.

Billy Raulerson:

As for some of the statistics I looked at, I saw one article that said Iceland had a 40% adoption rate, which was one of the highest, but 40% is not enough. I mean, I think most people agree that's low. Then there's Singapore, which was one of the earlier movers; not using the Apple-Google framework, I believe, they developed their own app early on to get ahead of this, and their adoption rate was dismal. I think it was 10 to 20%.

Billy Raulerson:

So to answer the question and come full circle, I no longer think it's a technology issue. I think it's an adoption issue, in my opinion, and I think that part of the equation really implicates privacy. Privacy can be managed at the technological level, and that's actually what those slides I had on the Google-Apple collaboration touched on. You just make design choices when you're developing the API or the apps that are going to be supported by the operating systems or the API.
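[Editor's note: the privacy-by-design idea discussed here, broadcasting short-lived rolling identifiers derived from a secret daily key, with matching done entirely on the device, can be illustrated in code. This is a simplified sketch only, not Apple and Google's actual cryptography; the key sizes, derivation scheme, and function names are invented for illustration.]

```python
import os
import hmac
import hashlib

def daily_key() -> bytes:
    """Random per-day key that never leaves the device unless the
    user chooses to report a positive diagnosis."""
    return os.urandom(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Short-lived identifier broadcast over Bluetooth for one
    ~10-minute interval; unlinkable without the daily key."""
    msg = b"rolling-id" + interval.to_bytes(4, "big")
    return hmac.new(day_key, msg, hashlib.sha256).digest()[:16]

def exposure_match(observed_ids: set,
                   diagnosed_day_keys: list,
                   intervals: range) -> bool:
    """Matching happens locally: re-derive identifiers from the
    published keys of diagnosed users and compare them against the
    identifiers this device actually heard nearby."""
    for key in diagnosed_day_keys:
        for i in intervals:
            if rolling_id(key, i) in observed_ids:
                return True
    return False
```

The design choice this sketches is the decentralized one the speakers describe: the server only ever sees keys voluntarily published by diagnosed users, never a graph of who met whom or where.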

Billy Raulerson:

But the fact that Apple and Google came out and said that they're not going to monetize this data, I think they had to do that because each of them has a bad track record, I think. Bad in the sense that they're behaving like businesses do: they're trying to mine the data and extract the commercial value out of it, but not always in a way that protects some of those personal freedoms that you talked about.

Maggie:

We have another question. It says, can you speak a little more about how employers may be struggling with the privacy of their employees' personal health information in putting together return-to-work initiatives, specifically looking at the need to screen employees who are entering work sites and potentially needing to deploy a vendor solution to screen a large workforce?

Dan McMullen:

I'll say a word about that. I recognize that that is a very important practical consideration for every business now that is trying to figure out how to safely return to work. I would add that, while I spent some time discussing the legal landscape regarding data privacy generally in the US at the moment, the employment context presents a whole additional set of considerations that arise fundamentally out of the employer-employee relationship. Employers have considerably greater latitude to impose requirements as conditions of employment, to oblige their employees to observe protocols, to engage in safe practices, and the like, and that simply becomes a part of the employees' obligations in order to retain their employment.

Dan McMullen:

Now, certainly that doesn't give employers carte blanche to do whatever they wish with private health information. Some of those statutes that I alluded to, HIPAA and HITECH in particular, still operate, but within that framework, I think employers have a fair degree of latitude. This, by the way, is a subject that I think you may see again in a Calfee webinar in the not too distant future, because it is such a ubiquitous topic.

Dan McMullen:

I'll just conclude by saying, and I heard this in the question, that many employers are going to be relying on third-party providers with respect to many of those solutions, and scrutinizing carefully the terms of their agreements with those third-party providers is going to be critical to make sure that employers are observing the privacy obligations that they have, even though they may exercise greater latitude because of that employment relationship.

Maggie:

All right. It looks like that's it.

Dan McMullen:

All right. Well, thank you again to all of you who joined us this afternoon. We appreciate your interest. If you have any further questions that you'd like to pursue, you can see our contact information on this last slide. You are more than welcome to contact Billy or myself, and we'll be happy to follow up with you. In the meantime, enjoy the rest of your afternoon.
