Published: September 24, 2020
Podcast: Digital Detectives
Category: Legal Technology
What happens when biometric information is compromised? For too many lawyers, the risks associated with this technology have been flying under the radar, but that needs to change! Digital Detectives hosts John and Sharon welcome Judy Selby to discuss the full spectrum of what lawyers should know about biometric technology. They address its many uses, the risks involved, relevant laws, and insurance coverage for biometric lawsuits.
Judy Selby is a partner at Hinshaw & Culbertson LLP.
Digital Detectives
The Perils of Biometric Information: Relevant Laws and Insurance Coverage for Biometric Lawsuits
09/24/2020
[Music]
Intro: Welcome to Digital Detectives. Reports from the battlefront. We will discuss computer forensics, electronic discovery and information security issues and what’s really happening in the trenches; not theory, but practical information that you can use in your law practice, right here on the Legal Talk Network.
[Music]
Sharon D. Nelson: Welcome to the 119th edition of Digital Detectives. We’re glad to have you with us. I’m Sharon Nelson, President of Sensei Enterprises, a digital forensics, cybersecurity and information technology firm in Fairfax, Virginia.
John W. Simek: And I’m John Simek, Vice President of Sensei Enterprises. Today on Digital Detectives, our topic is “The Perils of Biometric Information: Relevant Laws and Insurance Coverage for Biometric Lawsuits.”
Sharon D. Nelson: Before we get started, I’d like to thank our sponsor. Thanks to our sponsor, Logikcull, instant discovery software for modern legal teams. Logikcull offers perfectly predictable pricing at just $250 per matter per month. Create your free account at any time at Logikcull.com/ltn.
John W. Simek: Today, our guest is our friend and co-speaker Judy Selby. Judy has over 25 years of complex insurance coverage litigation and international arbitration experience, handling large-scale, complex first- and third-party insurance matters. Her experience includes coverage opinions, all phases of coverage litigation through trial and appeal, and international arbitrations involving environmental, toxic tort, cyber/privacy, BIPA, TCPA, business interruption, bad faith, pharmaceutical products, and COVID-19 claims. She also advises clients concerning compliance with privacy and cybersecurity laws and regulations and with technology contracts, and she co-founded the e-discovery and information governance teams at an AmLaw 100 firm. It’s great to have you with us, Judy.
Judy Selby: Hey John, hey Sharon, thanks so much for the invitation. I appreciate having the opportunity to speak with you and your listeners today.
Sharon D. Nelson: Well it is good to have you here, but I got to tell you that Biometric Information and Biometric Lawsuits, the very thought of these make my head hurt.
So, before we get into the subject, maybe tell us a little bit about your background and what you’ve been seeing with regard to biometric information because I know it’s important and timely.
Judy Selby: Yeah, it sure is, and it is such a hot issue right now, with real, serious risks attached to it, that for some reason is flying under too many people’s radar screens. What I’m seeing now is more and more companies utilizing technologies that allow them to exploit biometric information, usually for identifying employees or customers, or something along those lines.
But they may not recognize all the risks associated with the use of that technology and the information they get from it. And then of course, being an insurance lawyer, we’re starting to see more and more claims coming in from companies that are sued for biometric privacy law violations, typically under the Illinois statute, and are looking to their insurance carriers for coverage of those claims. So, that’s another area we’ll discuss today: whether or not there’s coverage under various policies for these claims. And the claims out of Illinois can be very substantial because of the statutory damages involved.
John W. Simek: So, Judy, help us out a little bit here and give us some examples of biometric identifiers.
Judy Selby: Think of things like fingerprint scans, retinal scans, facial scans, things like that. I don’t know about you, but when I was a kid, at my first job I had to go in and punch a timeclock with a card. Now employees go in and put their fingerprint on a scanner, and that checks them in and out of the job. There’s also facial recognition technology that’s being used by retailers and employers. And there are other types of information too, including, by the way, behavioral information: characteristics around how people behave, such as how long it takes you to type in your password when you’ve enabled multi-factor authentication in, perhaps, a bank app.
All those types of things are typically used to identify human beings, grant access to buildings, things along those lines.
Sharon D. Nelson: Well, I think you’ve told us something about how they’ve been used. Are they used in some bad way?
Judy Selby: Well you know, there’s a lot of press lately, and some good scientific evidence to back it up, that some law enforcement uses of facial recognition, for example, are not accurate for people with darker skin.
(00:05:07)
And so, there are real issues around making identifications based on that technology if it’s suspect. And the real risk with using this information is that, unlike a credit card or a password or something like that, you can’t cancel it.
You know, in that way, it’s a little bit like protected health information. You can’t just say, “You know, somebody got my fingerprint scan, so just cancel that and give me a new fingerprint.”
Sharon D. Nelson: It doesn’t work that way, huh?
Judy Selby: It doesn’t work that way, and so they’re usually used for identification purposes. Oftentimes it’s in the employment context, but it could also be used for a person who subscribes to some type of an offering. There’s a case we’ll talk about in a little while where a tanning salon has its members provide a fingerprint scan, and then they can use any tanning salon in its network. You just walk in, put your fingerprint down, and you can get entry to the tanning salon. So, there are all kinds of uses, and the technology is moving so fast and developing so quickly. Now, we’re seeing things like remote proctoring of students taking exams at home because of the COVID situation.
There’s also facial recognition involved with temperature scanning for COVID, to enter a building, for example. So, we’re starting to see these new uses pop up all the time. That’s why it’s so important, and I’m very glad we’re having this discussion today, because people need to be aware of all these uses and, as I was saying earlier, the risks attached to the use of this type of data and the technology.
John W. Simek: I’m glad you mentioned all that, Judy, because that’s one of the things about biometrics: I personally am not a big fan, because of all that data collection. And being a technologist, I know that not everything works right every single time. I’m sure you know that too; even preparing for this podcast, we had some gremlins out there. Talk a little bit about some of the risks if something does go wrong. Obviously, they’re building these apps and doing this collection for a particular purpose, but not everything works 100% of the time. What if it does go wrong, you know? The data gets out, or it goes to somebody that shouldn’t have access to it?
Judy Selby: Yeah, the biggest risks in my mind are misidentification of somebody, or identity theft. So, let’s say that you work for a company that requires a facial scan for you to gain entrance into the building, and your facial scan data is compromised somehow. How does that affect your ability as the employee to continue working for that employer? How does the employer handle that? And what happens going forward if the employee wants to go work someplace else that uses that same type of technology, and that data has already been compromised? That, to me, is a really big issue. And with identity theft, what people can do is kind of “the sky is the limit,” which is scary, and sometimes you don’t find out about it until many years after the fact, which makes it even worse. And then the misidentification, of course, is a big issue. I know that some cities have prevented law enforcement from using facial recognition for this very reason.
But it will manifest in other ways as well as the technology continues to evolve. So, we just have to be aware that, yes, things can go wrong with the technology: all kinds of malfunctions or ineffectiveness. And then there’s also the risk of the information being compromised, and what can happen to the victim when the information falls into the wrong hands.
John W. Simek: Well, I’m sure there’s another squishy area there about data ownership too, right? I mean, who owns your iris scan, as an example?
Judy Selby: Yeah, well, that’s one of the big issues under the Illinois law, and I have referenced that a couple of times, so I should probably tell you what it is. It’s the Illinois Biometric Information Privacy Act, referred to as BIPA, B-I-P-A. It has a private right of action for consumers or individuals to bring a lawsuit against the entity that’s alleged to have violated the act. But the requirements of BIPA are all around getting the consent of the person and providing adequate disclosures about what information you’re taking,
(00:10:02)
why you’re taking it, and the retention period: how long you’re keeping it for. So, providing those types of disclosures in advance, at least under the Act in Illinois, is really, really key, and where we see companies getting into trouble, at least allegedly, is in not providing those disclosures and not getting those consents. There’s a lawsuit filed very recently against a major retailer that allegedly was doing facial recognition for employees, but also for all the customers who walk in and out of the store. Can you imagine a major retailer with locations all over the country having all of that information? And then, even if you wanted to get the consent, how do you do that under the relevant regulatory or legal scheme, in terms of posting signs or whatever it may be? So yeah, these are tricky issues that are being worked out as we speak.
Sharon D. Nelson: Well, beyond Illinois, Judy, are there other laws concerning biometric information elsewhere in the country?
Judy Selby: There are four specific biometric privacy laws in the United States: one in Illinois, one in Texas, one in Arkansas, and one in Washington State. Now, the other three, Texas, Arkansas, and Washington, don’t have a private right of action.
Some other states have added biometric information to their breach notification laws; New York did that fairly recently. And biometric information is included in the CCPA’s definition of personal information. So, there are these two different avenues where we’re starting to see biometric information creep into legislation. But there are some proposed laws out there that people should try to stay aware of. There’s one in Massachusetts right now that also has a private right of action, up to $750 per consumer per incident. The Illinois law goes all the way up to $5,000 per incident for intentional or reckless violations, by the way.
So, you can see it’s a very attractive type of statute for class action lawyers. It’s $1,000 per person for a negligent violation of the act. The proposed New York biometric act, which was introduced a couple of years ago but hasn’t gone through yet, has a very similar statutory damages scheme to BIPA’s. So it’s hard to keep track. People need to be vigilant and see what’s going on in whatever state they’re in, but a takeaway message from this discussion we’re having right now should not be “I’m not in Illinois, so who cares?”
These laws are coming everywhere, and so people really should start paying attention.
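To put those statutory damages in perspective, here is a minimal sketch, in Python, of the class action exposure arithmetic Judy describes. The class sizes and the split between negligent and intentional/reckless violations are hypothetical illustrations, not figures from any actual case.

# Rough BIPA statutory damages exposure, per the figures discussed above:
# $1,000 per person for a negligent violation, $5,000 per person for an
# intentional or reckless violation. All class sizes here are hypothetical.

NEGLIGENT_DAMAGES = 1_000  # dollars per person
RECKLESS_DAMAGES = 5_000   # dollars per person

def bipa_exposure(class_size: int, reckless_share: float) -> int:
    """Estimate total statutory damages for a putative class."""
    reckless = int(class_size * reckless_share)
    negligent = class_size - reckless
    return negligent * NEGLIGENT_DAMAGES + reckless * RECKLESS_DAMAGES

# A 2,000-employee class, all negligent violations: $2,000,000.
print(f"${bipa_exposure(2_000, 0.0):,}")
# A 10,000-member class, 10% found intentional or reckless: $14,000,000.
print(f"${bipa_exposure(10_000, 0.1):,}")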
Sharon D. Nelson: I’m glad we’re going to talk a little bit more about that later because BIPA has certainly made the headlines a lot.
Judy Selby: It sure has. And another takeaway message should be that it’s not just big tech companies or major retailers who are being sued. People like to talk a lot about the biometric claims against Google and Facebook and the major retailers, but you know, we see mom-and-pop fast food franchises, businesses large and small, impacted by this. So, it’s not just affecting the big tech companies. That’s another takeaway message as well.
John W. Simek: Well, before we move on to our next segment. Let’s take a quick commercial break.
[Music]
Does your law firm need an investigator for a background check, civil investigation, or other type of investigation? Pinow.com is a one-of-a-kind resource for locating investigators anywhere in the U.S. and worldwide. The professionals listed on PINow understand the legal constraints of an investigation, are up to date on the latest technology and have extensive experience in many types of investigation, including workers’ compensation and surveillance. Find a pre-screened private investigator today. Visit www.pinow.com.
[Music]
Trying to cut costs? You’re not alone. In today’s climate, a five-figure e-discovery bill per month is steep. Don’t pay that. Use Logikcull to reduce expense and control your discovery process. Get started today for only $250.00 per matter and they’ll waive migration costs from competing platforms. For more information, visit logikcull.com/ltn.
[Music]
Sharon D. Nelson: Welcome back to Digital Detectives on the Legal Talk Network. Today our topic is “The Perils of Biometric Information: Relevant Laws and Insurance Coverage for Biometric Lawsuits.”
(00:15:06)
Our guest is our friend and sometime co-speaker Judy Selby. Judy has over 25 years of complex insurance coverage litigation and international arbitration experience, handling large-scale, complex first- and third-party insurance matters.
John W. Simek: Well Judy, before the break, we were talking a little bit about some of the states and the laws they have out there. But are there any additional laws on the horizon, or, god forbid, is there going to be a federal one as well?
Judy Selby: Well you know, there was a federal law proposed about a month ago. I’m not expecting anything to happen anytime soon with that, but it was introduced in the Senate probably about four weeks ago, so we’ll see what happens. But I would urge people to pay attention to what’s going on in their specific state, or, if you’re working with companies that are in various states, in those different locations, because anybody can introduce a bill at any time for anything. I would pay particular attention to Massachusetts, though. The last I heard, that one is probably moving along faster than some of the other ones. So, I would watch Massachusetts.
Sharon D. Nelson: I mentioned earlier that BIPA — I see it in the headlines all the time. Why is that particular law garnering so much attention?
Judy Selby: Well, it really is that private right of action. We are seeing hundreds of class actions filed against companies large and small for alleged violations of the act. Oftentimes they’re in the employment context, as we were saying earlier, and oftentimes it’s a timekeeping type of situation, usually with a fingerprint scan. I don’t know how the plaintiffs are being found; I don’t know if they’re advertising on TV. I’m in New York, not Illinois, so I don’t know how they’re being found. But as I say, there are hundreds of these class actions being filed. The funny thing is that BIPA has been on the books since 2008, but the floodgates have really opened fairly recently, and I think a lot of that has to do with a decision from the Illinois Supreme Court, I believe in 2019, that said you don’t have to show any harm as the consumer, as the plaintiff bringing the claim. You can just allege the statutory violation. Meaning, you didn’t get my consent, you didn’t provide the adequate disclosures, and off you go.
There’s been a little bit more activity concerning which sections of BIPA would be appropriate for standing in Federal Court probably beyond the scope of this discussion. But to file a BIPA claim in State Court, all you need to do is allege that violation and as I say hundreds, hundreds of these lawsuits are being filed.
John W. Simek: What types of BIPA violations are you seeing?
Judy Selby: Yeah, as I say, it’s usually in the employment context. There are so many around hourly workers, people who clock in and out of their jobs. From what I’m seeing, those are the largest number of claims out there. And different issues are going to arise when you’re dealing with employees versus when you’re dealing with the general public, but it comes down to the same basic factors: you collected the biometric information, you didn’t get my consent, and you didn’t provide the appropriate disclosures.
And so, I would say, anybody who is dealing with companies that have hourly workers being clocked in and out should pay attention. As I said, the old-fashioned timecards are going away, and people are using this other type of technology now, and that’s where the risk is.
Sharon D. Nelson: Well, you know, our next question here is kind of complicated, so I’m going to say it slowly for the benefit of our listeners. When you’re looking at insurance coverage for these BIPA lawsuits, what insurance policies are potentially implicated? And are there exclusions or other policy provisions that come into play?
Judy Selby: What we’re seeing right now is that when a company is sued with a BIPA class action, they may tender the claim to their insurance company, meaning report the claim to their insurance company, under any variety of insurance policies, trying to find some coverage out there. It can be your general liability policy, it can be an employment practices policy, it can be a D&O policy, and, depending on the situation at issue, it can be a cyber policy.
(00:20:00)
And in my view, a BIPA lawsuit probably would fit best under a cyber policy, depending on the form. Sharon and John, we have discussed in other venues how all the cyber insurance policies are different, and you really have to look at them. But when you tender a claim under a general liability form, a CGL form as they’re frequently called, there are two types of coverage there. There is Coverage A, and that’s bodily injury and property damage-type claims.
And so, when an insured presents a claim under Coverage A, it may not seem intuitive that it would be a fit, because of the bodily injury or property damage requirements. Some claimants allege that they have mental anguish, mental distress, or emotional distress associated with a BIPA claim. Those allegations are probably put in there to try to trigger some insurance coverage, I suspect.
But that’s probably not a great fit under many of the policies. The policies require an occurrence, which is often defined as an accident that leads to property damage or bodily injury. And so, getting over that hurdle of “Was this an occurrence?” could be very difficult. In some states and in some policy forms, there’s no coverage for mental distress or emotional distress unless it’s accompanied by actual physical injury. So it seems like it would be a difficult hurdle, I would think, for a lot of BIPA claims to show that there was any type of physical injury there.
Under Coverage B, the second type of coverage under a CGL policy, there can be coverage for what are called “wrongful acts,” for personal injury or advertising injury. One of the types of wrongful acts that’s oftentimes included is violation of somebody’s right to privacy, and so we’re seeing some action in that area. The policies require that the insured published information that violated somebody’s right to privacy.
And so, we’re getting down to this issue of what constitutes a publication of the biometric information in the context of an insurance policy. There has been only one reported case, believe it or not, concerning coverage for a BIPA claim under Coverage B of a CGL policy, and it was in Illinois, not surprisingly. It was that tanning salon case that I mentioned earlier. The insured in that case was the tanning salon, and the claimant was a customer who provided her fingerprint so that she could utilize any other salon. The fingerprint was given by the tanning salon to one single vendor, and that vendor then implemented the technology that would allow the underlying plaintiff to use any of the other salons.
So, when the tanning salon tendered the claim to the insurance company, the insurance company said, “Well, wait a minute. There hasn’t been a publication of the information within the meaning of the insurance policy, because the fingerprint data wasn’t widely distributed.” And the court said, “Well, insurance company, if you wanted that to be the meaning of publication, you should have said so in the policy, and we’re going to say that you have a duty to defend this case.” So, it’s a little bit of a controversial decision, because there is some earlier Illinois precedent that requires a more widespread distribution to satisfy that publication requirement. I expect that to be a pretty hot issue in insurance coverage circles going forward. Under a lot of CGL policies, there are also exclusions for violations of different types of laws, such as laws around obtaining and sharing confidential information, and exclusions for things arising out of the employment context.
And if you look at an employment practices insurance policy, they often have a similar violation-of-laws exclusion. There’s probably going to be a knowledge component there, so whether the insured knew they were violating the law may be a relevant issue in these cases. These things are going to be very, very fact-specific. Workers’ comp types of exclusions in CGL policies may be applicable as well.
(00:25:08)
Some policies also will say, “Well, we don’t cover you for fines or penalties or for punitive damages.” In fact, sometimes the state law, whatever law is governing the insurance policy, will say, “You know, it’s against our public policy to allow insurance for punitive damages or for fines and penalties.”
And the interesting thing about BIPA, by the way, is the statutory damages. You can try to recover your actual compensatory damages, or you can default to the statutory damages. I haven’t seen any claim where somebody was going in there and saying, “This is how much this alleged violation impacted me; here are my receipts.” It seems that people are typically relying on the statutory damages. And so, then you get into this issue: are these really damages? Depending on what the policy says, if the policy defines covered damages as something compensatory in nature, there could be an argument that these are not compensatory, that this is really a fine or a penalty that wouldn’t be covered, either under the wording of the policy or by the applicable law. So that’s a pretty big issue right now. Under cyber policies, we probably have a better shot at coverage. You want to make sure that the policy applies to biometric information.
If the policy defines protected information, or whatever term they use, in a more narrow way that doesn’t include biometric information, then there might be a challenge to coverage. But if the policy says, “We cover information that’s protected under any privacy law in the United States or in other jurisdictions,” you probably have a much better shot of saying, “You know, at least it covers this type of information.” But those policies will also typically contain an intentional acts exclusion; you can’t go around intentionally harming people and intentionally breaking laws and expect insurance companies to pay for it. So that could be problematic as well. And then there’s another issue that I know, Sharon, you and I have discussed on some panels: a regulatory claim. In other words, not a private right of action, but let’s say a regulator or an AG or somebody brings a claim against a company for violation of a biometric law or regulation.
Some cyber insurance policies limit their regulatory coverage to data breach or security event types of situations, and that may not apply to the typical type of claim we’re seeing now, where the issue is that you didn’t get consent or you didn’t provide the right disclosures, but nothing happened; there was no data breach where the information was stolen, for example. So, I know that’s a mouthful, but those are the big issues that come to mind when you’re looking at whether there is going to be coverage for these types of claims under all types of policies, including cyber.
Sharon D. Nelson: Well, it’s a very dense topic, and I regret that we are out of time today. But I know that a lot of people will learn a lot from this particular podcast. It really is very difficult to understand, and I’ve heard a lot about it; it’s been all over the place, and now we’re talking about it. Today was also TikTok day, of course, so we’ve all been reading about TikTok. So maybe what we need to do to lighten up BIPA is to do a BIPA boogie-woogie on TikTok. So, let’s combine the events —
John W. Simek: Oh, I don’t want to see that.
Sharon D. Nelson: Oh, Judy’s going to do all the dancing.
John W. Simek: Oh okay.
Judy Selby: The main thing I wanted to get across today was that these laws are out there, and there are more coming. Don’t assume they don’t apply to you, don’t make assumptions about your insurance coverage, and do your homework. And try not to be caught flat-footed as well.
Sharon D. Nelson: Well, I think everybody came away with that. And I know how valuable your time is Judy. So, I really want to thank you for lending your expertise on this very — it can be a dense subject. I know it makes my head hurt and I’m sure I’m not alone. So, thank you very much for being our friend and for being our guest.
Judy Selby: My pleasure, thanks guys.
John W. Simek: Well, that does it for this edition of Digital Detectives. And remember, you can subscribe to all the editions of this podcast at legaltalknetwork.com or on Apple Podcasts. And if you enjoyed our podcast, please rate us on Apple Podcasts.
(00:30:04)
Sharon D. Nelson: And you can find out more about Sensei’s digital forensics technology and cybersecurity services at senseient.com. We’ll see you next time on Digital Detectives.
John W. Simek: Thanks for listening to Digital Detectives on the Legal Talk Network. Check out some of our other podcasts on legaltalknetwork.com and in iTunes.