Lawyer 2 Lawyer

Facial Recognition Technology: Security vs. Privacy Concerns

Imagine a computer thousands of miles away recognizing you in a camera at an intersection. Now consider being tracked and monitored from your home to your place of work every day. Facial recognition technology makes this type of identification possible, and it is being rapidly developed for national defense and law enforcement purposes. On this episode of Lawyer 2 Lawyer, host J. Craig Williams interviews Ed Tivol from EWA Government Systems, Inc. and Jennifer Lynch from the Electronic Frontier Foundation. Together, they discuss the tension between security and privacy when it comes to biometric modes of identification. They also consider how this data is being collected, who is collecting it, and for what purpose. Tune in to hear about your evolving First and Fourth Amendment rights in the face of national security, crime prevention, and the private sector.

Jennifer Lynch is a Senior Staff Attorney with the Electronic Frontier Foundation and works on open government, transparency, and privacy issues as part of EFF’s Transparency Project. She is a writer and frequent speaker on government surveillance programs, domestic drones, intelligence community misconduct, and biometrics. Lynch has testified about facial recognition before Senate Subcommittees and prior to joining EFF, she was the Clinical Teaching Fellow with the Samuelson Law, Technology & Public Policy Clinic at UC Berkeley School of Law.

Ed Tivol is the Vice President of the Intelligence and Operations Division for EWA Government Systems, Inc., a defense contractor actively developing facial recognition technology for the federal government. He is a 1964 graduate of The Citadel and served in the Army’s Military Intelligence branch for 24 years. Tivol completed two tours in Vietnam and retired with the rank of Colonel in 1990. That same year, he began his work with EWA and has been there ever since. He holds master’s degrees from the University of Maryland and the Army War College. Today Mr. Tivol and his wife raise racehorses and Angus cattle outside of Bowling Green, Kentucky.

Special thanks to our sponsor, Clio.


Ed Tivol: None of us, none of us, wants to have law enforcement or surveillance in our bedrooms, but fortunately we do live in a country that has three branches of government and a Constitution. I think the Supreme Court will ultimately make the decisions on privacy issues and I feel comfortable with the court taking those steps to balance our Constitutional rights to privacy with the national security and public policy concerns.

Welcome to the award-winning podcast, Lawyer 2 Lawyer, with J. Craig Williams and Robert Ambrogi, bringing you the latest legal news and observations with the leading experts in the legal profession. You’re listening to Legal Talk Network.

Craig Williams: Hello and welcome to Lawyer 2 Lawyer on the Legal Talk Network. This is Craig Williams coming to you from Southern California. I write a blog called May it Please the Court. Before we introduce today’s topic, we’d like to thank our sponsor Clio, an online practice management software program for lawyers at goclio.com. My co-host, Bob Ambrogi, is off today.

In recent news, there have been many stories regarding the topic of facial recognition and other biometric modes of identification. Highlighted in these accounts is the scope of use for these ID systems. Initially developed for the defense in the war on terror, the applications have begun shifting toward law enforcement. As a result, some watchdog groups are becoming concerned that the government may be going too far and thus violating legitimate rights to privacy.

Here to discuss that topic today we have Jennifer Lynch. She is a Senior Staff Attorney with the Electronic Frontier Foundation, and she works on open government, transparency, and privacy issues as part of EFF’s Transparency Project. Jennifer is a writer and frequent speaker on government surveillance programs, domestic drones, intelligence community misconduct, and biometrics. She has testified about facial recognition before Senate subcommittees and, prior to joining EFF, she was a clinical teaching fellow with the Samuelson Law, Technology & Public Policy Clinic at UC Berkeley School of Law. Welcome, Jennifer Lynch.

Jennifer Lynch: Thank you for having me on the show.

Craig Williams: In addition, we have joining us today Mr. Ed Tivol. He is the Vice President of the Intelligence and Operations Division for EWA Government Systems, Inc., a defense contractor actively developing facial recognition technology for the federal government. Ed is a 1964 graduate of The Citadel, and served in the Army’s Military Intelligence branch for 24 years. He completed two tours in Vietnam and retired with the rank of Colonel in 1990. That same year, he began his work with EWA and has been there ever since. He has a master’s degree from the University of Maryland, and another master’s from the Army War College. Today, Mr. Tivol and his wife raise racehorses and Angus cattle outside of their Bowling Green, Kentucky home. Welcome, Mr. Tivol.

Ed Tivol: Hello. Glad to be here.

Craig Williams: Before we start with the questions, I wanted to mention that we invited into our program representatives from the FBI, the CIA, the NSA, and the Department of Homeland Security. We also extended invitations to the Obama administration, NTIA, and Senator McConnell’s office, in order to provide additional views to our panel. We’ll extend that invitation continually for future shows so that our audience can benefit from their contributions to this topic.

Mr. Tivol, we’ll start with you. Your company’s been developing the Biometric Optical Surveillance System, or BOSS for short, since 2010. To the degree permissible, and recognizing you can’t tell us everything, tell us how that project started and a little bit about how it works.

Ed Tivol: Yes, I’d be glad to, and I thank you for this opportunity. I would first like to say that I don’t work for any of those three-letter agencies you mentioned or the administration or NTIA; we are a private contractor.

The BOSS project began with a contract from the Department of Defense, and the goal was to be able to provide reliable results of facial recognition at a distance of approximately 80 to 125 meters. You mentioned, I think earlier, that you’d seen this effort. There was an article in the New York Times last year concerning this written by Mr. Charles Savage, and that article pointed out that using facial recognition at a distance was a significant challenge and so our goal was to try to see if that could be accomplished.

Craig Williams: Jennifer, domestic use of these technologies is what seemingly concerns the EFF. Your organization’s been involved with the Freedom of Information Act suit against the FBI regarding the use of facial recognition data and other biometric identifiers. Give us a little bit of history about that FOIA request and the lawsuit and a little bit about your concerns.

Jennifer Lynch: Sure. We’ve been concerned about the increasing use of facial recognition by the federal government and by state and local law enforcement agencies for several years now. We first learned about the FBI’s efforts to develop a large facial recognition database a few years ago. The FBI currently maintains a national fingerprint database, and has done so since 1999. Within the last 5 to 10 years, the FBI has realized it is outgrowing its fingerprint database, and has been working with government contractors to develop a large multi-modal biometric database, a database that would include many different forms of biometrics, one of which is facial recognition.

We were concerned about that and sent several Freedom of Information Act requests to the FBI to get more information about this system, which is called Next Generation Identification or NGI. We asked specifically for information on how the FBI has been working with states to collect facial recognition data, and also about the reliability of the system, and then finally about the FBI’s plans to combine civil and criminal, or criminal and non-criminal images in the Next Generation Identification database.

We filed those requests, Freedom of Information Act requests, with the FBI in 2012. The FBI failed to respond to those, so we filed suit in 2013. Over the course of the last year or so, we’ve received many hundreds of pages of records from the FBI on the Next Generation Identification system. They’ve been really interesting and pretty concerning. They’ve revealed that the FBI’s already working with about a half dozen states inside the United States to incorporate those states’ entire mugshot databases into NGI, and also that the FBI is building out its database and plans to have the capacity to include up to 52 million images by 2015.

We also know that the FBI plans for that capacity to include up to 4.3 million non-criminal photos. Those are especially concerning because these are photographs that are taken for, let’s say, a background check for a job. If you wanted to work with children or be a police officer, you might have to submit a photograph for a background check, and the FBI plans to allow law enforcement agencies to search all of those photos along with the criminal photos, the criminal mugshot photos. That means that if you have to take a photograph and submit a photograph for a background check for a job, you could end up being a suspect in a criminal case.

Craig Williams: Jennifer, obviously with Facebook, millions of people put their pictures up on the internet every day. What’s the privacy concern?

Jennifer Lynch: With Facebook or with government collection?

Craig Williams: If you put your picture up on Facebook, can’t the government access it?

Jennifer Lynch: It’s a little complicated, sort of depends on how you’re restricting access through Facebook. In general, what we’ve seen through several Freedom of Information Act requests is that, if the government wants access to Facebook’s data, they need to provide a warrant to get access to that data. The other thing is that Facebook has its own proprietary facial recognition algorithm, and doesn’t share that algorithm or that underlying faceprint for the images with the government. I think that’s the first thing. That’s the first difference.

The other thing is that … I think that most people would think that, if they’re sharing information on Facebook, they’re sharing that information with a limited group of people who they have decided should have access to their information, whether that’s their family or friends, or maybe an extended network, or community. I think that most people wouldn’t think that they’re just automatically sharing information with the government, with law enforcement.

Craig Williams: I don’t know. From some of the conversations we’ve had over the last couple of years on this show, it seems like everybody believes that nothing’s private anymore. But, Ed, what do you know about NGI and do you have any idea about the government’s plans to extend the use of this facial recognition technology and biometric use to states, local agencies, and use it for law enforcement?

Ed Tivol: No. I certainly am not an expert. I have heard of NGI and in fact, I read Miss Lynch’s blog and it was quite interesting. With regards to the other part of your question, it certainly is a difficult issue, but it’s unclear to me whether the men who were apprehended in the aftermath of the Boston Marathon bombing would have been identified without facial recognition. That appears to me to be a very good indicator of its value, whether in a local crime that affects an entire nation, or as a matter of national defense. If that’s the case, then I think that we have to take a hard look at how we can best apply this technology to improve and increase the security of all of us.

Jennifer Lynch: I just want to respond to that [crosstalk 00:10:48] point. In the Boston Marathon bombing case, I think there’s a lot of misconception about whether and how facial recognition was used. It actually was not used to identify those individuals who were alleged to be responsible for the bombing. Those individuals were identified through video surveillance camera footage but not through the use of facial recognition, and I think that that presents a pretty good example of how policing really can work. Those suspects were caught pretty quickly after the bombing occurred and are now in custody. We didn’t have to rely on any sort of newfangled technology to do that. We relied on old fashioned policing.

Craig Williams: Jennifer, we’ve seen a lot of newfangled technology increases in crime fighting over the years. The use of chemical analysis and different things like this. How is facial recognition any different?

Jennifer Lynch: I think one of the big concerns about facial recognition, and possibly with a technology like the one Mr. Tivol’s company is developing, is the fact that it can be used to identify people at a distance without their knowledge. Now we’re not yet at a point where we can just identify a random face in the crowd, but I think that will become possible in the future as computing power increases and gets faster, and as digital storage continually decreases in cost.

The concern, and I think the difference, is that people who are walking around in society can be identified without their knowledge. We have a pretty strong tradition in the United States, going back hundreds of years to the Founding Fathers, of the ability to participate in society anonymously. We don’t have a belief that you need to identify yourself in order to engage in political debates or to communicate with people. That could all change with facial recognition as it becomes deployed more widely.

Craig Williams: Do you think there’s a foundational right in the Constitution for being in society anonymously?

Jennifer Lynch: There definitely is a foundational right in the Constitution. It’s the First Amendment right to free speech. Many court cases, including at the Supreme Court, have established that there’s a right to speak anonymously and participate anonymously. And of course, the Federalist Papers, which were a big debate between the Founding Fathers about what should be in a constitution and how our government should be set up, were written anonymously.

Craig Williams: Ed, how necessary is BOSS and what’s the rationale behind using facial recognition technology at a distance? Why do we need it?

Ed Tivol: That’s actually a pretty easy question to answer. It can save lives. It can save lives by identifying potential threats before they can get close enough to implement whatever plan they have. When we look at situations such as we have seen with suicide bombers and others who are trying to cause us significant harm, the ability to identify that threat, contain that threat, neutralize that threat before it can reach its target is essential. Facial recognition is simply one additional tool in a whole variety of tools that could be employed. It seems to me that the rationale is very clear.

I’d like to make one other quick comment. The issue of national security that we’re facing and our country’s been facing for a number of years, it seems to me is no longer the sole responsibility of the federal government. If we’re to be effective, this is going to require that all levels of government, federal, state and local, have the means to detect, identify threats and the means to share that information in a timely manner to protect all of us.

Craig Williams: Jennifer, we’ve seen the rise in the Middle East and other areas of suicide bombers and other types of attacks. I think that’s one of the concerns that Ed’s pointing out. What is your response to the need to prepare for that coming to our country? Certainly, I think it’s unrealistic for us to expect it’s not going to come, because it will. If we don’t have this system, how are we going to defend ourselves, at the city, county, state, and national level? It’s not just a federal problem here, is what I think Ed is trying to say.

Jennifer Lynch: Yeah, of course it’s a concern. I don’t want to come across as saying that I want the terrorists to win or that we shouldn’t stop crime, but I think that our system of government is set up very differently from other countries and I think that we can’t make comparisons to what’s going on in the Middle East.

It’s highly likely that we could catch more criminals and prevent more crimes if there were a camera in every room of every house, but we don’t do that. Part of the reason that we don’t do that is because we have a Fourth Amendment to the U.S. Constitution. That amendment was drafted to create a balance, to prevent against indiscriminate searches and seizures, and to require that searches are based on individualized suspicion.

The reason for that is because of what happened prior to when the Constitution was drafted, that the British government was going around and searching everybody’s house under general warrants, with no individualized suspicion. Our system of government is set up to protect the people and to create a balance and recognize that unfortunately indiscriminate searches and seizures can be used to target disfavored communities and minority communities and people who have less political power in society.

Ed Tivol: [crosstalk 00:16:50] I’d like to just make one comment, because I’m in complete agreement with Miss Lynch when it comes to her statements regarding the value of our form of government and our Constitution. None of us, none of us, wants to have law enforcement or surveillance in our bedrooms, but fortunately we do live in a country that has three branches of government and a Constitution. I think the Supreme Court will ultimately make the decisions, as they have in the past, on privacy issues and I feel comfortable with the court taking those steps to balance our Constitutional rights to privacy with the national security and public policy concerns. That’s really where I come down on that.

Craig Williams: Ed, what are we looking at in terms of time frame for this? Are we going to see this in the next year, two years, or five or ten? If we’re going to see it shortly, do we expect private companies to be utilizing the technology that you’re developing?

Ed Tivol: I think it’s pretty clear and it’s certainly been in the open press and other media. Facial recognition technology is being used today, by various agencies, and I certainly am not privy to all that are using them but I’m pretty sure they are. As far as their use in private industry, I really don’t have any insight. We have never been contacted by private companies with regards to BOSS or any of our other projects, but facial recognition technology is certainly something that is in place and is used today, I’m sure, in a number of government agencies.

Craig Williams: Jennifer, where do you draw the line here? Obviously you’re going to draw the line at not being in someone’s home. Maybe even at not entering or exiting a home. What about something a little bit more common, like a public square, where there’s little expectation that your being there is a private affair?

Jennifer Lynch: I think that the way we look at privacy in public is really changing, and when I say “we,” I mean the legal system and the way the courts are looking at privacy in public is really changing. I think that that’s being spurred on by the big data movement, the fact that we can collect and store and combine data so easily, and that that is being done across society. What we’ve seen recently with court cases in the last few years is that courts have looked not just at the fact that a person is in public when information is collected on that person, but how much information has been collected and over what time period and how revealing that data could be.

For example, in a case from a couple years ago, United States v. Jones, the Supreme Court looked at the privacy concerns and the Fourth Amendment concerns of putting a GPS tracking device on a car and monitoring that car’s movements for a month. The main holding in the case was that placing the tracking device on the car was a violation of the Fourth Amendment, essentially because of a trespass.

But five justices on the court also looked at the privacy concerns based on that car’s movements over time. Through tracking a car for 28 days, even if the car is traveling in public, you could learn where a person goes to pray, when they’re staying out late at night, when they’re going home early. You could learn a lot about who a person associates with. That’s very private information.

Taking that back to facial recognition, if we’re ultimately setting up a system where we can identify a face in the crowd, where we are linking the surveillance cameras if those surveillance cameras have a facial recognition system built into them, and it’s sophisticated enough to identify people as they move about, then suddenly you can track a person’s patterns of movements. That is very, very sensitive and private information, and I think that’s where it gets concerning. When you live and work in an area, and you go to a doctor in that area, and all of your friends live in that area and the government has a database on you of all of your movements.

Craig Williams: Before we move on to our next segment, we’re going to take a quick break to hear a message from our sponsor.

Kate: Hi, my name is Kate Kenney from Legal Talk Network and I’m joined by Jack Newton, President of Clio. Jack takes a look at the process of moving to the cloud. Now, how long does it take to move to the cloud and is it a difficult process?

Jack: No. With most cloud computing providers, moving your data into the cloud is something that takes just minutes, not hours or days to do. You can get signed up and running with most services in just a few minutes. Even if you have an existing, legacy set of data that you want to migrate to a web-based practice management system like Clio, there are migration tools and migration services that we’re able to offer to ease that process. Most firms can be up and running in the cloud in less than five minutes, and can have their data imported in a matter of hours or days.

Kate: We’ve been talking to Jack Newton, President of Clio. Thank you so much, Jack.

Jack: Thank you and if you’d like to get more information on Clio, feel free to visit www.goclio.com. That’s G-O-C-L-I-O, dot com.

Craig Williams: Welcome back to Lawyer 2 Lawyer. I’m Craig Williams and with us today is Mr. Ed Tivol from EWA Government Systems, Inc. and Jennifer Lynch from the Electronic Frontier Foundation. Before our break, Jennifer was talking about the scope of privacy. Ed, what is your sense of the BOSS system’s, and facial recognition technology’s, capability of tracking people: first identifying them, then tracking them over a distance, and keeping track of them until they may disappear into their own home?

Ed Tivol: It’s all a matter of physics and geometry. The ability to physically do that, if a person is in the field of view of a camera, if the lighting conditions and the geometric conditions are correct, and identification is made and they move to another field of view, it certainly is technically possible to do that.

The question, of course, is who you would want to be tracking and for what reason: because they’re on a watch list perhaps, or because some other information has been generated. From a strictly technical standpoint, that’s not difficult. However, you do have these conditions of light and shadow, you have angles, you have the occlusion of the face. All of those impact the ability to successfully identify and track someone.

Craig Williams: Jennifer, what limits do you want to see put on facial recognition technology?

Jennifer Lynch: I think, as Mr. Tivol noted earlier, we’re already seeing facial recognition use in society and in government, by law enforcement, and by the State Department. Many different levels of government are already using facial recognition. There’s nothing we can do at this point to prevent that collection of information, but I think what we can do is place limits on how it’s used. We can place limits on who has access to information in the Next Generation Identification database, how the data is shared between federal and state agencies, and how the data is collected. What kind of restrictions are there on data collection?

For example, we’ve seen that in San Diego, law enforcement officers are able to stop people on the street, take a picture of them using a Samsung tablet, and put that picture into a facial recognition database. It’s unclear what rules are in place to prevent the misuse of that technology. Are law enforcement officers stopping people for a legitimate law enforcement reason, or are they stopping people because, say, an African American or a Latino person is in a wealthy neighborhood and they don’t know why that person is there?

I think that my main point is that we need to place meaningful restrictions on how the data can be collected and used, and we need to start doing that now before the technology gets to the point where it’s ubiquitous in society.

Craig Williams: Jennifer, let’s assume that facial recognition technology is available to everybody, easily. I can download an app on my computer and I can install a security camera on my home. Do you have any objection to me having a security camera on my home to identify people that are walking up to my home on my property?

Jennifer Lynch: If it’s your own property, you definitely have a right to have a security camera on your home and to collect information using that camera. I think my bigger concern is what happens to that information. I haven’t consented for you to collect my information, obviously, if I’m your neighbor and I’m just walking out on the street. I would be pretty upset if you were just indiscriminately sharing that with a law enforcement agency and that image was going into a law enforcement database. I think that there need to be restrictions on how the image can be used and shared, even if it’s collected by a private person.

Craig Williams: Ed, what’s your sense of how this technology is going to evolve over the next ten years? What kind of changes are we going to see? Will it become, do you think, available to the general public?

Ed Tivol: I think we have to look back a little bit and also I’m going to draw a little from popular culture. I’m a big fan, as many are, of NCIS, and when I see the amazing efficiency of those supposed tools that they’re using, I’m really impressed because I’ve yet to see anything quite that good in real life, but perhaps someday it will happen.

The entire field of biometrics, fingerprints, iris scans, DNA, and facial recognition, is evolving. I would think, as has been the case, that as technology improves, so too will our ability to be more precise with these things, and if there is a market, because our country is clearly a market-driven economy, then I wouldn’t be surprised to see a scenario such as you just described where individuals could download an app.

But as you’re aware, fingerprint technology, which was unheard of 150 years ago, is essential as a biometric tool in law enforcement today. DNA has proven so effective, not only to convict, but more importantly I think, to exonerate innocent people of crimes for which they had been convicted. If I’m not mistaken, there were some grave concerns raised about DNA and the storing of DNA in those databases. But I don’t think anyone would say that the way DNA is being used today is a bad thing. I think facial recognition technology will follow that same path in the future, and we’ll just have to wait and see.

Craig Williams: Jennifer?

Jennifer Lynch: In terms of facial recognition, I agree that the technology is improving year after year. We’ve seen huge improvements over the last five to ten years, and I think that in the next five to ten years, technology like Mr. Tivol’s that is able to identify faces out in a crowd, in society, will become more widely used. I hope that before we get to that point, that we can have a conversation as a society about how we want this technology to be used and what kind of restrictions we want to place on its use.

Craig Williams: Ed, what kind of limits do you want to see on this technology? What concerns you, as one of the people that are developing it?

Ed Tivol: I find myself in broad agreement with Miss Lynch. We all want privacy, but we all also want security. I’m pretty confident, although I won’t say this from personal direct knowledge, that other nations and groups, both state and non-state actors, are developing and using this technology. It is critical to our security that we be on the cutting edge, and perhaps even the leading edge, of facial recognition technology. But I think Miss Lynch has a good point. One of the beauties of our society is that there will be a dialogue, a discussion, and that we have a system of laws and courts to administer it in such a way that our rights are protected.

Craig Williams: Are there places that you think that facial technology should not be used?

Ed Tivol: I imagine there are. I don’t know what … Nothing specific comes to mind, but it would seem to me that one would always have to have an underlying rationale that has been clearly developed, looked at by the appropriate authorities, that would warrant the use of facial recognition or any other type of surveillance technology.

Craig Williams: Jennifer, we’re already opening up our phones with fingerprint technology. I know because I’ve used it. I’ve entered buildings using iris recognition technology. I don’t think it’s unrealistic to expect that somewhere along the line, either my phone or a company is going to scan my face and use facial recognition technology to determine whether to let me in or not. Is that fair? Is that reasonable? What are your concerns about that?

Jennifer Lynch: That’s already being used. I think that we have to make a distinction between that kind of a use of the technology and a broader, law enforcement use of the technology. Consider a private company using facial recognition as a verification system to determine whether you should be allowed in a building or not. In general, that private company has a smaller database of images, and all it is doing is checking your image against that database when you walk into the building. The camera looks at you and asks: is this person in the database or not?

Craig Williams: Is your concern the size of the database?

Jennifer Lynch: No, I think my concern is with law enforcement use of the information and misuse of the information. A private company using facial recognition to provide access or your phone using facial recognition to give you access to your phone, that data is stored by either the private company or on the phone. It’s not necessarily shared with law enforcement agencies and so it doesn’t implicate the same kind of Fourth Amendment concerns that some of the uses of the technology that we’ve been talking about earlier do.

Craig Williams: Some people would argue that some of our large corporations are akin to our federal government, or for that matter, that most of them are bigger than state governments. Doesn’t that concern you? Companies like Google and Microsoft and Apple have those kinds of technologies, and they can use them on a much broader scale and without government supervision.

Jennifer Lynch: I think that there are two ways of looking at that. The first is that Google and Apple can’t put me in jail. They can’t restrict my liberty, and law enforcement can very easily do that. That’s the first thing.

But I think that there is a concern about private companies collecting too much data and how they use that data. A lot of that concern has to do with the fact that we don’t really know exactly how they’re using the data and who they’re sharing it with. We saw this come up in several reports, the President’s report and the FTC report on big data that came out within the last month or so, where those reports looked specifically at the misuse of data and how it can be used to discriminate against people in society, whether it’s through advertising different products or denying access to a loan, for example.

I think that that’s something we definitely need to be concerned about with private companies. The Department of Commerce has a division called the NTIA or the National Telecommunications and Information Administration. Currently that agency is working on a multi-stakeholder project to develop a voluntary code of conduct for commercial use of facial recognition. That’s an effort that I’ve been involved in and several other civil liberties organizations have been involved in. I think that those conversations are occurring right now about how private companies should and can collect facial recognition data and combine it with other data.

Craig Williams: We’ve just about reached the end of our program, and we’d like to invite Jennifer and Ed to share their closing thoughts and their contact information. Ed, let’s start with you.

Ed Tivol: First, I appreciate the opportunity to participate. I think that this is the kind of healthy dialogue that we need. As for the work we’re doing, we’re very glad and very proud to be doing it, because we think it’s going to help ensure the security of our country and hopefully save lives, so I appreciate the opportunity. Thank you.

Craig Williams: Your contact information if people would like to reach out to you, if you’re able to share it?

Ed Tivol: Certainly, if they want to go to www.ewa.com.

Craig Williams: Wonderful. Jennifer, your final thoughts and your contact information.

Jennifer Lynch: Thank you again for having me on the show. I think it’s been a great discussion and I agree fully with Mr. Tivol that it’s important to have these discussions and for people to talk about the uses of technology before we just accept them into society. I can be reached through EFF at www.eff.org, and people can also follow me on Twitter at @lynch_jen.

Craig Williams: Would you like to identify your blog and where it can be found for our listeners?

Jennifer Lynch: People should just check out the EFF website, so www.eff.org.

Craig Williams: We’ve got to the point in the show where I’ve got 30 seconds to share my closing thoughts before I get cut off by the buzzer, so here we go. I agree with both Jennifer and Ed. I think we need to place limits on this, both federal government and private company limits on the use of facial recognition technology. Frankly, I also see the need for a large, shared database that everybody can access, so this information can be put to its most effective use, which seems to make sense to me.

That’s it. There’s the buzzer. That brings us to the end of our show. I’m Craig Williams and thank you for listening. Join us next time for another great legal topic. When you want legal, think Lawyer 2 Lawyer.

Thanks for listening to Lawyer 2 Lawyer. Produced by the broadcast professionals at Legal Talk Network. Join J. Craig Williams and Robert Ambrogi for their next podcast covering the latest legal topic. Subscribe to the RSS feed on LegalTalkNetwork.com or on iTunes. The views expressed by the participants on this program are their own, and do not represent the views of, nor are they endorsed by, Legal Talk Network, its officers, directors, employees, agents, representatives, shareholders and subsidiaries. None of the content should be considered legal advice. As always, consult a lawyer.