In July, a sniper, later identified as Micah Xavier Johnson, opened fire at a march against fatal police shootings, held in downtown Dallas, Texas, killing 5 police officers and wounding many others. After a 45 minute gun battle and hours of negotiation with the sniper, who was holed up in a parking garage, Dallas Police Chief David Brown gave an order to his SWAT team to come up with a plan to end the mayhem before more police officers were killed.
This led to the use of a robot, the Remotec ANDROS Mark V-A1, manufactured by Northrop Grumman, which was sent in with a pound of C-4 explosive, eventually killing the sniper.
Today on Lawyer 2 Lawyer, hosts J. Craig Williams and Bob Ambrogi join attorney Edward Obayashi, deputy sheriff and legal advisor for the Plumas County Sheriff’s Office and Dr. Peter Asaro, assistant professor and director of graduate programs for the School of Media Studies at the New School for Public Engagement, as they take a look at the recent tragedy in Dallas, the use of robots by law enforcement, criticism, ethics, policy, and regulation when it comes to the use of robots.
Attorney Edward Obayashi is deputy sheriff and legal advisor for the Plumas County sheriff’s office and a licensed attorney in the State of California. Ed’s law office specializes in providing law enforcement legal services to California law enforcement agencies and he also serves as the legal advisor and a legal consultant for numerous law enforcement agencies in California. His duties include patrol, investigations, administration, training, and providing legal advice to department management and personnel.
Dr. Peter Asaro is a philosopher of science, technology, and media. Dr. Asaro is assistant professor and director of graduate programs for the School of Media Studies at the New School for Public Engagement in New York City. He is the co-founder of the International Committee for Robot Arms Control and has written on lethal robotics from the perspective of just war theory and human rights. Dr. Asaro’s research also examines agency and autonomy, liability and punishment, and privacy and surveillance as it applies to consumer robots, industrial automation, smart buildings, and autonomous vehicles.
Special thanks to our sponsor, Clio.
Lawyer 2 Lawyer: Law News and Legal Topics
Law Enforcement and the Use of Robots
Edward Obayashi: So from a legal standpoint, which is hopefully what your listeners are interested in, yeah, the military has nothing to worry about when it comes to civil liability. Law enforcement has everything to worry about when it comes to civil liability, and under those circumstances they are never going to let a device like a robot or other mechanical device operate independently without human control.
Dr. Peter Asaro: What you had in Dallas was really an explosive-removal robot being used for a different kind of purpose, and I would think also that even the use of explosives by police in that case was pretty novel and poses significant kinds of risks. So it was really kind of an exception in a number of different ways, but I think going forward, if we are going to see the widespread adoption of these kinds of robotic technologies by police when they are doing other sorts of activities, there would need to be some kind of standardization, I would think, and you would want to try and do that on a federal level.
Intro: Welcome to the award-winning podcast Lawyer 2 Lawyer with J. Craig Williams and Robert Ambrogi, bringing you the latest legal news and observations with the leading experts in the legal profession. You are listening to Legal Talk Network.
Bob Ambrogi: Hello and welcome to Lawyer to Lawyer on Legal Talk Network. This is Bob Ambrogi coming to you from Boston, Massachusetts, where I write a blog called LawSites, and also co-host another Legal Talk Network podcast called Law Technology Now with Monica Bay.
J. Craig Williams: And this is Craig Williams coming to you from scorchingly hot sunny Southern California. I write a legal blog called May It Please the Court.
Bob Ambrogi: And for once Craig I have got to say it’s scorchingly hot and sunny here in Boston as well, so it’s not just Southern California. Before we introduce today’s topic, I would like to thank our sponsor Clio. Clio is the world’s leading cloud-based legal practice management software. Thousands of lawyers and legal professionals trust Clio to help grow and simplify their practices. Learn more at clio.com.
Well, in July a sniper, later identified as Micah Xavier Johnson, opened fire at a march against police shootings, in downtown Dallas, Texas, killing five police officers and wounding many others. After a 45 minute gun battle and hours of negotiation with the sniper, who was holed up in a parking garage, Dallas Police Chief David Brown gave an order to his SWAT team to come up with a plan to end the mayhem before more police officers were killed.
J. Craig Williams: And Bob, this led to the use of a robot, the Remotec ANDROS Mark V-A1, manufactured by Northrop Grumman, which was sent in with a pound of C-4 explosive, eventually killing the sniper; no real word on how the robot is doing.
So today on Lawyer to Lawyer we are going to take a look at the recent tragedy in Dallas, Texas, the use of robots by law enforcement, the criticism of them, and ethics and policy regulations when it comes to the use of robots.
Bob Ambrogi: And to help us talk about this today we have two guests joining us. Let me introduce our first guest, he is Edward Obayashi, Deputy Sheriff and Special Counsel for the Plumas County Sheriff’s Office, also a licensed attorney in the State of California. Ed’s law office specializes in providing law enforcement legal services to California law enforcement agencies, and he also serves as the legal advisor and legal consultant for numerous law enforcement agencies in California.
His duties as a Deputy Sheriff include patrol investigations, administration training and providing legal advice to department management and personnel. He is also an official US Government and international use of force expert. So we are very happy to welcome to Lawyer to Lawyer today Edward Obayashi. Thanks for joining us.
Edward Obayashi: Thank you Bob. Thank you Craig.
J. Craig Williams: And Bob, our next guest is Dr. Peter Asaro. He is a philosopher of science, technology and media. Dr. Asaro is an Assistant Professor and Director of Graduate Programs for the School of Media Studies at The New School for Public Engagement in New York City.
He is the Co-Founder of the International Committee for Robot Arms Control and has written on lethal robotics from the perspective of just war theory and human rights. Dr. Asaro’s research examines agency and autonomy, liability and punishment, and privacy and surveillance, as it applies to consumer robots, industrial automation, smart buildings, and autonomous vehicles. Welcome to the show Dr. Peter Asaro.
Dr. Peter Asaro: It’s a pleasure to be here.
J. Craig Williams: So I am going to turn to you first, Dr. Asaro. Peter, give us a little bit of an overview and background of how robots are used and how they have come to be used to kill people.
Dr. Peter Asaro: Well, I think it’s important to draw the distinction between policing systems and military systems. A lot of what we are doing at the International Committee for Robot Arms Control is working with an international campaign to stop killer robots at the UN, really trying to develop an international treaty that would prohibit fully autonomous weapons systems, and those would be weapons systems that would automatically select targets and engage them with lethal force independently of meaningful human control.
We have seen the development of a lot of robotic systems in the military, including remote operated drones, but you still have humans that are controlling those systems and crucially making the targeting decisions for those systems. What we are really trying to prohibit is the extension of that sort of technological evolution into systems that would use software, sensors, cameras to make those kinds of decisions.
What we have also seen is a lot of militarization of police and a lot of transfer of these kinds of military technologies into police forces. So within the United States, it has been mostly bomb disposal robots and some small drones for surveillance purposes, but we have seen certain companies go further; there is a South African company that has developed a small drone that’s armed with tear gas, and there have been some attempts to put tasers and other kinds of weapon systems onto these drones or small robots.
And so I think there is a concern both at the military level of the autonomy question, but I think even within police forces, are we going to have armed robots, what kinds of situations are those being developed for, how are they actually going to end up being used?
I think they go a long way towards protecting police officers in dangerous situations, and that’s a lot of the motivation behind the bomb disposal robots and why they were adopted to begin with. Using one as a weapon, or weaponizing these systems, I think raises a number of concerns.
This was certainly a very exceptional circumstance, but if it became a more regular occurrence or many police departments have this kind of system in their arsenal, as it were, they would find new kinds of uses for it, and that would be, I think, potentially very concerning.
Bob Ambrogi: Edward Obayashi, we have read reports saying that this was the first I guess lethal use of a robot by a police department, but what have you seen in terms of police departments using robots? Peter has talked a little bit about the military uses, what have you seen going on with police departments around the country?
Edward Obayashi: What Peter says is very well taken. There is a perception issue here when it comes to police tactics. We all know that whatever the news of the day may be, if the police have shot someone, or especially something like this, which seems a matter of first impression, it’s going to take the headlines, especially in the public perception.
With today’s media, and I am talking about news, entertainment, et cetera, everybody has an impression of RoboCop or Terminator, and it conjures all sorts of images of a futuristic police enforcement type atmosphere. So police get that.
The reality though is that these devices, these robots or mechanized devices, have been around for decades. They have just gotten more and more advanced, to the point where there are a lot of options for law enforcement to utilize them under certain circumstances.
I will give you a good analogy. Several months ago in San Bernardino, California there was a high-speed pursuit down the interstate highway, I believe it was the 215. The vehicle being pursued was traveling at over 100 miles an hour. Deputy vehicles were pursuing; it was a long chase, especially in that area of Southern California. Traffic was mounting. And the Sheriff’s Office decided to terminate the pursuit by shooting the vehicle from a helicopter.
There was a lot of coverage of the ethical debate over terminating a pursuit using deadly force on a vehicle; that had been previously unheard of. However, again, a helicopter is another technological tool. It is advanced. It’s a weapons platform when it needs to be, and in this case, whether it’s a robot or a helicopter, when it’s utilized in that manner for the first time, in other words when death results, everyone understandably is going to ask questions.
Now, the key here is whether it’s autonomous or not. I can’t imagine a day coming, or maybe I can, but not in the near future, and I hope never, when police departments will relinquish human control over a device, no matter what it is. The key difference between military use and police use is that the military, yes, they do have autonomous capability. Police will never do that, because of one major difference between the military and the police, and that’s the issue of civil liability.
So from a legal standpoint, which is I believe what your listeners are interested in, yeah, the military has nothing to worry about when it comes to civil liability. Law enforcement has everything to worry about when it comes to civil liability. And under those circumstances they are never going to let a device like a robot or other mechanical device operate independently without human control.
J. Craig Williams: What are the ethics, Dr. Asaro, in these issues? How do we decide whether a robot can take a life or whether it’s a human that takes the life, and is there an issue with respect to the differing degrees of detachment that you have? Is it a Palsgraf type of situation, where if you’re so far away there’s no liability, but if you are right on top of it there is?
Dr. Peter Asaro: Yeah. I think there are a lot of interesting questions here. I think the basic legal and ethical question in one sense is unchanged regardless of the tool or weapon or system; in a sense, you have to justify the use of lethal force from the start.
And I think a lot of questions come up, even in the example just given about high-speed pursuits, about whether it’s really necessary to kill this person or let them go. If they pose an imminent and immediate threat to another individual, then that’s a justification, ethically and legally, for using lethal force.
If you simply suspect or have a sense of some probability that they could kill somebody in the future or cause significant bodily harm, then you might not have that full legal standard for the use of lethal force. And that’s something that’s sort of interesting. Most of the state laws and federal requirements for the use of lethal force by police don’t actually match the international United Nations standards for the use of force in this regard.
Primarily, they permit police in many states to shoot fleeing suspects, which would be prohibited under the UN guidelines.
So part of that is whether the US standards that we are following are up to international standards. But I think you can have a situation where the use of a kind of robotic technology is in fact justified, especially a remote technology where a human is still making the decision about the use of lethal force, which I think is a moral requirement: that you actually have a human being who makes the legal and ethical judgment. That is why autonomous lethal systems should be prohibited, because they’re not really capable of making a legal judgment.
But if you have the human making that legal judgment and now they are just remote, so they have this remote-control robot or they are using a drone of some sort, I think what you have is different from the case where the police officer is face-to-face with the suspect, because that police officer is not in danger of losing their own life.
So self-defense is not a legitimate argument for the use of lethal force in that case, because it’s the robot that is there; the robot is the thing that might be damaged, and that’s a sort of property damage, it’s not a life-threatening situation.
Now, if the person is threatening others, third parties in the vicinity, or is about to commit some egregious crime like setting off bombs or something like that, then I think you have a different situation, and being remote, again, doesn’t matter, because the threat doesn’t depend on the presence of the police officer themselves.
So it’s a sort of complicated transformation of the situation by introducing these kinds of technologies. So on the one hand it’s easy to sort of simplify it and say, well, there is no difference, it’s just another way to kill people. But it does in fact change the way you make these judgments about whether the use of force is really necessary.
J. Craig Williams: Is there also a concern that if you are making it easier to kill people and if you are making it possible to kill people in a situation where your own — your own safety doesn’t have to be threatened that it will be used more frequently?
Dr. Peter Asaro: I think it’s definitely a concern, and there is also the kind of mission creep, which we have seen already I think with SWAT teams. SWAT teams actually evolved as a concept to respond to the sniper at the University of Texas in the 1960s, and when you have that kind of sniper situation, it’s really good to have a trained SWAT team.
But we also see SWAT teams being used to serve regular warrants, break up poker games, bust people for growing marijuana in their closet, and things like that. And so that kind of mission creep, the use of that style of technique or technology in different ways, is concerning.
Bob Ambrogi: Edward, what do you think, is there a difference here between killing somebody with a robot and police use of a gun to kill somebody?
Edward Obayashi: No, there is no difference. As Peter stated earlier, regardless of the device, whether it’s a knife, a gun, a taser, as long as lethal force is justified under the circumstances, it’s not going to matter what was used.
Officers have lost their guns during fights and have had to resort to secondary weapons, even sharp-edged weapons, a backup knife, whatever it may be. Not to be flip about it, but let’s say tomorrow suddenly a raygun, a Star Trek-type phaser or PhaZZer, was available; the courts would not differentiate between a traditional handgun and that weapon. It’s how it was used and when it was used; as long as the circumstances justified it, it’s the same as with the robot here.
As I mentioned earlier, it’s the perception of the newness of this, the fact that this was apparently the very first time that a robot has been used to kill a suspect under these circumstances. But that touches upon a point you asked about earlier: police departments across the nation are not going to reveal what kind of weaponry they have or what kind of tactics they are going to utilize, for obvious security reasons. And I think this is one of those situations where it took many by surprise, the fact that it was just so new in its application, and again, it creates that perception of where we are headed.
And believe me, law enforcement managers get that, they do get that, but they can’t sacrifice public security, departmental security, operational security for that purpose. It’s always a balancing act.
J. Craig Williams: How does artificial intelligence play into this? Are we going to see artificial intelligence develop as a defense when it takes over and kills a person?
Dr. Peter Asaro: I don’t think the kinds of systems we are looking at today are what we think of as a human-like intelligence, or artificial intelligence in that sense.
In the sort of engineering sense of artificial intelligence, what you really have is a lot of pattern recognition, machine learning algorithms, things like that, which are going to be used more and more for things like predictive policing, trying to figure out which areas or which individuals are likely to be suspects in the future. We have already seen some of that technology being used in Chicago; that’s the big-data kind of artificial intelligence. But the smaller systems, again, are mostly remote control, because you rely on the human’s vision and manual control to do most of the tasks. Because you don’t really know how bombs are going to be built, it’s very difficult to pre-program robots to disarm bombs in general.
And AI might serve useful functions in the future in terms of grasping and manipulating and problem solving in those kinds of situations. But I don’t think they are going to be replacing humans in the end and they are certainly not going to be anywhere near the capability of making these contextual decisions about whether the use of force is necessary in a given situation or even whether an individual poses a significant threat in a given situation.
You would have to be able to understand the physical dynamics of the world in a very sophisticated way to understand when somebody is using a hammer or a stick as a weapon. Even visual recognition of knives and guns would be fairly tricky, and open carry can be legal, so simply seeing a gun doesn’t necessarily mean it’s a threat.
And then there is understanding the psychological intention of individuals and whether they intend to carry out a threat or commit violence. That’s very difficult for humans to judge, and it’s much more difficult for computers to try to figure out. We just don’t know how to program that.
Bob Ambrogi: Is there a more basic security issue here? I assume that police are communicating with these robots wirelessly in some way. Can those communications be hacked or intercepted, so that a third party could take control of a robot in a crisis situation and turn it on the police?
Dr. Peter Asaro: Potentially; it could be very sophisticated. One of my students actually operated the EOD robots in the U.S. Army in Afghanistan, and they would regularly use electronic jamming systems on the whole surrounding area before they would even deploy the robot. That was mostly because they were concerned about remote triggering devices being used to set off the roadside bombs. But it also meant shutting down their communications with everything but their robot, which is designed to operate specifically in those contexts, with every frequency jammed except the one the robot uses.
So there are ways to make those secure, at least in a military sense. Hacking into drones has been done; you can spoof them, you can take control of them, all sorts of things. So in a wireless situation that’s always possible.
Bob Ambrogi: We need to take a short break at this point, please stay with us and we are going to be back after a few messages from our sponsors to talk more about the legal issues of robots and policing.
Clio is an invaluable software solution for law firms of all sizes, handling all the demands of your growing practice from a single cloud-based platform. Clio enhances your firm with features such as matter and document management, time tracking, and even billing. Clio is an effortless tool that helps lawyers focus on what they do best, which is practice law. Learn more about Clio at clio.com.
J. Craig Williams: And welcome back to ‘Lawyer 2 Lawyer’. I am Craig Williams and with us today is Attorney Ed Obayashi, the Deputy Sheriff and Special Counsel for the Plumas County Sheriff’s Office and a Licensed Attorney in the State of California, as well as Dr. Peter Asaro, Assistant Professor and Director of Graduate Programs for the School of Media Studies at The New School for Public Engagement in New York City.
In our last segment, we were discussing the tragedy in Dallas and the use of robots in law enforcement, and most recently we were talking about ethics, wireless security issues, and the possibility of robots being taken over.

What are the militaristic applications of these robots in terms of the police’s use of them? Should the police really be using a militaristic-style robot in a public, civilian environment where we are not at war with our citizens, presumably?
Edward Obayashi: As you may recall, and I am sure our listeners can all remember, Ferguson gave impetus to the notoriety, if you will, of this issue of surplus military equipment being made available to civilian agencies: sheriff’s offices, police agencies, and other law enforcement-type agencies. The public may not have known that this has been going on for decades in most departments. When I say “most”, keep in mind that the large departments in the country, LAPD, LASO, Chicago PD, NYPD, Houston, all those large agencies, are the exceptions.
I don’t have the exact statistics, but by far the overwhelming majority of law enforcement agencies in the United States probably have fewer than 100 sworn officers, and many, many departments have fewer than 50 sworn officers. Those departments rely heavily on this government program, and with the images coming out of Ferguson, again understandably so, the perception was that we may be becoming a police state.
Speaking to managers of police departments, they are very much aware of that perception, but at the same time there is today’s ever-changing, ever-growing threat. I hate to use the issue of terrorism as a crutch, but it is not a crutch, it’s not exaggerated. As terrorist acts become much more sophisticated, as you and I and everybody else have begun to appreciate, the traditional tools that police have used just are not effective enough in a life-in-danger situation.
Decades ago none of these issues were present or of concern. The cop on the beat walking without a vest, without body protection, just walking a beat with his nightstick, a pair of handcuffs, and a revolver, that was fine and well. But not anymore; no wonder, when terrorists and criminals are free to arm themselves as they desire. The police have always had to react to that, and so again, when you have a new technology that is suddenly introduced under these circumstances, and especially amid the tragedies involving officer-involved shootings, yes, the perception regarding the concern for a police state is magnified.
But think about when I first started. Not to be funny here, and I don’t know how old my colleagues on the show are, but I started 25 years ago, and that’s when tasers were just coming into law enforcement use. I clearly recall the outcry and the concern from the general public, the media, and legal commentators regarding these tasers; they thought, where are we going with this?
But now there have obviously been a number of landmark guiding cases regarding the use of tasers, when it’s appropriate and when it’s not, et cetera. So again, new technology is always going to be, for lack of a better term, a point of interest, especially when it’s in the hands of law enforcement and concerns how it’s used, just because of the mere fact that it is law enforcement and everybody obviously has a stake in how law enforcement conducts itself. So again, it’s always the newness of the tactic or the tool. Who knows what tomorrow may bring.
J. Craig Williams: I read that most of the major police departments in the country, and many smaller police departments, have robots; in fact, I am not sure a lot of people are aware of that. Do we need, in either of your opinions, legislation of any kind, either on a state level or a federal level, to address the use of robots in law enforcement?
Edward Obayashi: I don’t believe so. I don’t know what area of the use of robots any type of legislation would address, other than, say, Fourth Amendment issues dealing with privacy, like a robot being used to sniff around a residence for the presence of illegal drugs. But when it comes to the issue at hand, the use of force, again, the law that exists under Graham v. Connor and other US Supreme Court case law is not going to specify that you can’t use this particular type of device; it’s always going to be what is reasonable under the circumstances, regardless of the type of force. And that’s why no law, no legislation, no court is going to say, well, you can’t use this device; it’s always going to come down to the principle of what constitutes deadly force.
So regardless of whether it’s a robot, a taser, a raygun, a bazooka, or whatever it may be, it’s going to be what was reasonable under the circumstances, regardless of the means that were used to kill someone.
J. Craig Williams: Well, we have just about reached the end of our program and it’s time to wrap up and get your final thoughts as well as your contact information. So, Ed, let’s turn to you.
Edward Obayashi: My contact information: it’s Edward Obayashi, and my phone number, which is always accessible, is area code (619) 857-2359. My law office email is [email protected], and I am always available for any questions, comments, et cetera. So thank you very much for having me today.
Dr. Peter Asaro: Yeah, and you can reach me on the Internet at peterasaro.org, that’s my website, or @peterasaro, that’s my Twitter handle, if you want to hear all about military robots and robotics technology in general.
And just, I guess, in response to a couple of those last questions, there has been some legislation in the State of Virginia that requires warrants for police to use drones for surveillance and things like that, very much under the Fourth Amendment as was mentioned. And I think there are concerns as we move into weaponizing these systems, and there would need to be some kind of regulation, I think, placed on the companies who would manufacture such systems, for that to be an acceptable use.
What you had in Dallas was really an explosive-removal robot being used for a different kind of purpose, and I would think also that even the use of explosives by police in that case was pretty novel, and poses significant kinds of risks in terms of the structural integrity of a building and the potential for hurting other individuals in the area.
So it was really kind of exceptional in a number of different ways, but I think going forward, if we’re going to see the widespread adoption of these kinds of robotic technologies by police that are doing other sorts of activities, there would need to be some kind of standardization, I would think, and you would want to try to do that at a federal level to ensure that civil rights are being respected and so forth.
J. Craig Williams: Great. Well, thank you both for participating in our show today, that brings us to the end of our show. Bob?
Bob Ambrogi: Thanks to both of you for taking the time to be with us today and to all of our listeners out there. Thanks for listening. Join us next time for another great legal topic on ‘Lawyer 2 Lawyer’.
Outro: Thanks for listening to Lawyer 2 Lawyer, produced by the broadcast professionals at Legal Talk Network. Join J. Craig Williams and Robert Ambrogi for their next podcast, covering the latest legal topic.
Subscribe to the RSS feed on legaltalknetwork.com or in iTunes.
The views expressed by the participants of this program are their own and do not represent the views of, nor are they endorsed by Legal Talk Network, its officers, directors, employees, agents, representatives, shareholders, and subsidiaries. None of the content should be considered legal advice. As always, consult a lawyer.