Lawyer 2 Lawyer
J. Craig Williams is admitted to practice law in Iowa, California, Massachusetts, and Washington.
Are you tired of driving yourself to work? Have you always wanted a chauffeur but never could afford one? If this sounds like you, then happy days are here with the advent of the driverless car. Institutions like Google, Carnegie Mellon, and Uber are developing what they hope will be totally autonomous vehicles capable of ushering passengers to and from destinations without the need for a human driver. But what does that mean for the law, safety standards, and our freedoms?
In this episode of Lawyer 2 Lawyer, host J. Craig Williams interviews attorney and author of Robots Are People Too John Weaver, researcher and writer for Michigan Auto Law Todd Berg, and litigator and author of Motorista Anna Eby. Together they discuss liability for passengers, possible federal regulations, and risks associated with vehicle hacks. In addition, they debate when the government might pilot your driverless car, how medical emergencies in autonomous vehicles will be handled, and the possibility of the repo man summoning your automobile. Tune in to hear about existing driverless car laws and much much more!
John Weaver is the author of Robots Are People Too, which explains amendments to existing laws that will become necessary as artificial intelligence and autonomous technology become more widespread. In addition, he is a contributing writer for Slate magazine, addressing similar topics, and an associate attorney at Morgan, Lewis & Bockius LLP where, among many areas, he provides legal services for property acquisition, financing, and development projects.
Todd Berg is a former trial attorney and legal news reporter with Michigan Lawyers Weekly. Today he provides analytical research and writing support to the attorneys at Michigan Auto Law, a firm entirely dedicated to the representation of clients involved in auto accidents. They boast the largest auto/truck verdict in their home state during the last 15 years and guarantee a victory for their clients or they won’t charge a dime.
Anna Eby is a business litigator and appellate attorney in Austin, Texas, where she represents businesses and individuals in complex commercial disputes before Texas state, federal, and appellate courts. In addition, and relevant to our episode, she is an avid car photographer and motorsports attendee who maintains a car blog called Motorista, which covers legal issues in the automotive industry.
Special thanks to our sponsor, Clio.
Lawyer 2 Lawyer: Driverless Cars: Who’s “Driving” and Who’s Responsible? – 2/21/2015
Advertiser: If we do drive into a new driverless car world, there are certain things in the law that are going to have to change. Everyone is used to having new technology introduced every year, every month. Now the law is always following behind the technology, so I don’t know that the law is ready for it yet. That gets us into a dangerous gray area where we’re either trying to apply existing laws that really don’t fit the situation, or we’re not addressing it at all, which is even more dangerous.
Welcome to the award-winning podcast Lawyer 2 Lawyer, with J. Craig Williams and Robert Ambrogi, bringing you the latest legal news and observations with the leading experts in the legal profession. You’re listening to Legal Talk Network.
John Weaver: Thank you for having me.
Todd Berg: Thanks a lot Craig, it’s great to be here.
Anna Eby: Thank you, happy to be here.
Anna Eby: I don’t know that the law is ready yet; I think consumers are. Everyone is used to having new technology introduced every year, every month. So I think folks are ready for it; now, the law is always following behind the technology, so I don’t know that the law is ready for it yet.
Todd Berg: I would agree with Anna. Aside from the need to perfect a technology as much as possible, I think that if we’re going to move into this brave new world, we need to have some brave new laws to deal with the implications of that for people’s safety in using driverless cars.
John Weaver: I think some people are and some people very much aren’t, and some of the people who aren’t never will be. If you listen to manufacturers of some of the higher-end automobiles in the market right now, they might be developing self-driving technology or looking into it, but they’re not exactly embracing it.
John Weaver: Their customers – or at least the ones they’ve identified as their ideal customers – want to drive; that’s part of what they’re looking for, and they’re buying a certain type of machine to enhance their driving experience. But a lot of drivers, like commuters – and I certainly count myself among them – are looking forward to having a car that does the driving for them, so that they don’t have to.
Todd Berg: That’s a great question; in a way, it’s all of the above. I think that even though we have the technology involved, it’s still going to come back to the same thing: who’s in control of the vehicle. If you’ve got a driverless vehicle, certainly the technology qualifies as an operator or a driver, but I think whoever else is in the vehicle who is probably directing the technology, which is then directing the travel of the vehicle, I would say that they also qualify as a driver or operator of the vehicle. If you’re just riding in there as a passenger, either in the passenger seat or in the backseat and you have no say in how fast the car goes and what the direction is, I think the argument there would be that you’re just a passenger.
Todd Berg: Yeah, I think so, and I think if you do that you appoint yourself as a driver to some extent.
Todd Berg: Theoretically, he had the option of not following those instructions, but I think a person in the backseat of a driverless vehicle who maybe gives a voice command to turn here or take I-275, I think they’re an operator or they’re controlling the destination of the vehicle as much as if they reached over and hit the accelerator or turned the steering wheel.
Anna Eby: That’s an interesting question, and since I’m not an engineer, I don’t know if I have a scientific answer to it. I would hope that there would be one person designated the operator, if you will, separate and apart from the actual technology and the vehicle – someone who would be responsible for engaging the autonomous system, so that they would be like the driver even though they don’t actually have their hands on the steering wheel. Otherwise, I agree with what Todd was saying, and I think that creates some issues if you’ve got four people in a vehicle and everyone can give voice commands. It’s hard for me to imagine a system where you can just put someone, like a child in your example, in a car and just send them off. I’m hopeful that we will always have a responsible person who knows how to operate a vehicle in the vehicle.
Anna Eby: I hope not, although we’ve got plenty of adults who act like 12 year olds when they drive, so maybe we’ll be better off.
John Weaver: This is a topic that I address in my book pretty extensively: where does the liability go, and who is the operator? In some situations, typical product liability law is going to govern. If a mechanism in the car breaks down and it’s a manufacturing flaw or a design flaw, then yes, absolutely, the liability is going to go to the manufacturer. If the car works exactly as it’s supposed to and still gets into an accident, I think the line as to where the liability should go gets a little murkier. It’s entirely possible that a car driving lawfully gets into a situation where a deer jumps into the middle of the road and the car has to make a split-second decision, and even the best possible decision can still cause an accident – it should cause the least harmful accident it can, but liability obviously has to go somewhere. Some states – California, Florida, and Nevada – have indicated in their statutes exactly what Anna was talking about: the operator, the person behind the wheel, has liability as the operator, even if the self-driving mechanism is engaged. Or, if there isn’t somebody in the driver’s seat, whoever turns on the car is the operator under state law.
Todd Berg: Well, not in Michigan you’re not. That’s kind of what I was saying at the beginning of the show: if we do drive into a new driverless car world, there are certain things in the law that are going to have to change to account for situations where tragedy occurs. And one of those is the liability situation. Here in Michigan, we’ve got a very strict product liability law, and if I had a wish list in a driverless car world, the top of that wish list would be to amend or change the law so that manufacturers wouldn’t be exempt in situations like this. Under the existing law, it’s very difficult to establish liability for a manufacturer in a situation like this; you’d practically have to jump over the moon to do it. And recently, at the beginning of March of last year, a law was passed which allows manufacturers to put driverless cars on the road to road test here in Michigan. There was a liability provision in that bill, and I think that is probably – for lack of anything else – a predictor of what the future may hold. That bill said that liability for the manufacturer would only exist if a defect could be shown, and it could be shown that the defect was there at the time of manufacture. And that provides, as you might guess, a very broad immunity for the manufacturer. I just think that if these cars become the thing of the future and they’re all over the road, you have a situation where they’re creating a possibility of accidents, injury, and death. Manufacturers have to be held accountable on some level. I understand that it’s a developing technology and they need room to grow and flexibility to move, but at the same time the public needs to be protected. And somebody needs to be held accountable when things go wrong and someone gets hurt or someone dies.
Todd Berg: That’s a great question. Yeah, I think we are going to need federal regulations, on two levels. One, setting minimum operational or functional standards to make sure that these driverless vehicles can handle everything that’s out on the road – what do they need to show to actually get on the road. But also, to deal with these issues of product liability – and Michigan is not totally unique in the sense that we have a very broad immunity in our law, and I think a lot of states have that – I think the best way to deal with that, especially when you’re going to have manufacturers selling all across the country, is federal regulations that establish some level of liability on the part of these manufacturers. Either require them to start a compensation fund, akin to the vaccine compensation fund, or maybe require them to take out a liability insurance policy for the vehicles that they’re putting out on the road.
John Weaver: One of the mechanisms in place, at least under the law – and the states have addressed this – is a killswitch that would turn off the autonomous functioning. California, Nevada, and Florida all require some form of this, where the driver or somebody in the car is able to easily turn off the self-driving function and take regular control of the car. Presumably – and as I’ve said, I think we’re all attorneys here and not engineers – the technology that is being developed now will be designed so that there is a firewall between the autonomous system that could be hacked into and the regular car functioning, so that when the killswitch goes, the driver or would-be driver doesn’t have to worry about a hacker taking over the car once a human being is responsible for it. However, in terms of where the liability goes if a hacker takes control of a car and runs it into another car or a building, I think that’s another layer of complexity that requires some government attention, and then some assistance in the form of legislation or regulation indicating where, collectively, we have decided as a country we want liability to go and how we want to address that. The laws right now don’t say anything about that, and that gets us into a dangerous gray area where we’re either trying to apply existing laws that really don’t fit the situation, or we’re not addressing it at all, which is even more dangerous.
Anna Eby: Well, in Texas, I think the possibility is nil. We’re not very interested in having too much government control in Texas, so I don’t see that happening ever. As for the rest of you folks, I don’t know. That’s an interesting question that remains to be seen. I would be surprised if there was a great deal of government control over autonomous vehicles. I have a hard time envisioning that, that sounds like the dystopian novels that we read and think hopefully that would never happen in real life.
Todd Berg: I think it’s a great question, and it definitely gives me a chuckle when I see it, because it’s really forward thinking and it’s possible. If we have driverless cars on the road, this is possible as well. The government always has a police power; they have the ability to tell people what to do and what not to do under very strenuous circumstances. So no doubt, the government might say, we’re invoking our police power, we’re going to take over, and we’re going to stop all of these cars going onto the road because it’s a public health hazard. Maybe that’s justifiable. But if they tried to do it because there’s too much traffic and it’s causing too many delays, obviously there would be abundant pushback on the police power.
Todd Berg: There would definitely be a lot of scrutiny on decisions like that, but there’s also the other constitutional side of this, which is Fourth Amendment privacy rights. The Supreme Court has ruled in the last couple of years that you’ve got to have probable cause and a search warrant to put a GPS tracker on a vehicle, and that you’ve got to have probable cause and a search warrant to check out the cell phone of somebody who’s been arrested. I think those two cases and the constitutional principles in them would come to the surface very fast as soon as the government started trying to tap into the computer, or whatever it is, that’s governing a vehicle.
Kate Kenny: Hi. My name is Kate Kenny from Lawyer 2 Lawyer, and I’m joined by Jack Newton, President of Clio. Jack takes a look at the process of moving to the cloud. Now, how long does it take to move to the cloud, and is it a difficult process?
Jack Newton: No. With most cloud computing providers, moving your data into the cloud is something that takes just minutes, not hours or days, to do. You can get signed up and running with most services in just a few minutes. Even if you have an existing legacy set of data that you want to migrate to a web-based practice management system like Clio, there are migration tools and migration services that we’re able to offer to ease that process. Most firms can be up and running in the cloud in less than five minutes, and can have their data imported in a matter of hours or days.
Kate Kenny: We’ve been talking to Jack Newton, President of Clio. Thank you so much, Jack.
Jack Newton: Thank you, and if you’d like to get more information on Clio, feel free to visit www.goclio.com. That’s G-O-C-L-I-O.com.
Anna Eby: Well, I think the driverless vehicles will be much better behaved than the ones being driven by the rest of us, to the extent that the autonomous cars have systems where they can communicate with each other. That can create some interesting situations where you’ve got both regular, traditional vehicles and these newer ones that are on a different wavelength than the rest of us. But I don’t envision there being a huge issue with having both types, and I also don’t think we will – hopefully not in my lifetime, anyway – ever have roads where we only have autonomous vehicles. I think they will coexist; I think they already do to a certain extent, more than we even realize. There are vehicles on the road right now that have some amazing autonomous ability – the new Mercedes-Benz S-Class, for example – and they’re already out there and being used. So I think they’ll be able to coexist.
John Weaver: I think that a medical emergency in a car will be handled relatively similarly to what happens now when somebody in a car has a medical emergency: the car will make its way, as safely as it can, to a hospital with all haste. However, I doubt that the self-driving car will go as quickly as, say, the story about that expecting father whose wife was in the back seat and who drove 125 miles an hour to the hospital so that the child could be delivered more safely. He ended up getting a police escort, and the police gave him a ticket for speeding right after the child was born. I don’t think a self-driving car is going to be going 125 miles an hour; I just don’t think it’s going to be programmed to do that. But other than that, I think the car will make its way as quickly as possible to a hospital.
Todd Berg: John had talked earlier about the killswitch idea, and it seems to me like that type of mechanism might be appropriate in a situation like this – a killswitch or some kind of override so that somebody can grab the wheel and take over – and/or something akin to the OnStar system. Because if you are having a medical emergency, maybe the last thing in the world you should be doing is grabbing the wheel and driving like crazy to the nearest hospital. If the computers are controlling your vehicle and you’re tapped into the internet in every other respect, it seems like that would be a perfect opportunity for a feature like OnStar to kick in, detect what’s wrong, alert the authorities, and have them come to you.
Anna Eby: I think that’s an interesting scenario. I would be very interested to see how different states handle that. I can tell you, again, in Texas, I don’t see that happening. But I think the law will need to address that. That’s a very good example of-
Anna Eby: Well, I would prefer to keep the system we already have, so that we’re not giving increased power – whether to the government, or to business, or even to individuals – through the use of this autonomous technology. I’m sure there are some clever folks out there who are going to figure out a way to turn that to their advantage.
Todd Berg: Hopefully that day never comes.
John Weaver: That’s a terrible idea. It would be like Lord of the Flies on wheels; it’s a terrible idea. I think there will always be a need to chaperone kids, and I think the school bus is a good example of that. Even if there is a self-driving school bus, there would be an adult at the head of the bus who will ostensibly be in charge of the kids.
Todd Berg: I think it still goes back to what we were talking about earlier as to who is the actual operator and who is giving the vehicle direction as to how it’s supposed to function. I know it seems crazy, but if you’re the intoxicated individual and you’re programming the vehicle to take a certain route or do something in particular, and as a result of that command a crash occurs, it seems that you would have to be the responsible party.
Todd Berg: But I think the responsibility is going to flow to whoever has control over the vehicle, and arguably that would be you because you’re giving the commands to the vehicle. Because the vehicle wouldn’t actually do anything unless it was commanded to do so.
Anna Eby: Well, maybe – and I am a happy user of Uber; I think it’s a great service. Potentially, I think it also creates a huge opportunity for all sorts of other apps and services. Yeah, I would say that it could very well be a boon for a lot of creative folks out there.
John Weaver: Sure. I think that driverless cars are an exciting form of technology; I think they’re going to be here, commercially available, by the end of the decade. Some forms of them already are, in limited fashion, and I’m looking forward to getting one myself so that I don’t have to drive anymore. People can reach me at [email protected], and they can find me on Twitter, @RobotsRPeople.
Anna Eby: Yes, well, I really appreciate the opportunity to be on here today. There’s one question that I keep coming back to, and it relates to our talking about the driverless school buses and some of these other issues, like inebriated drivers. I think we’re going to see a potential issue with the liability of individuals in autonomous vehicles who engage the autonomous feature at times when they shouldn’t. That’s an issue we didn’t really discuss today, but it’s something that’s been in the back of my mind and something that we will probably continue to talk about: when is it negligent to engage the autonomous vehicle? So, just throwing that thought out there; that’s something that I think a lot about. And I can be reached via email. My email address is [email protected], and also on my blog, that’s MotoristaBlog.com.
Todd Berg: Craig, thank you very much for having me on Lawyer 2 Lawyer today, I really appreciate it. My closing thoughts would be two things. One, driverless cars are going to be driving into our world sooner or later, and if the safety benefits are half as great as what people are saying, I think everybody will ultimately end up welcoming them in, because they will reduce the number of accidents and injuries and deaths dramatically, and that will be a happy day. The second point is that in order to prepare for that world, certain changes do need to be made to the law. Immunity laws, like we have here in Michigan for manufacturers, need to be reconsidered so that when and if something bad happens with a driverless car, people who are injured are able to get the compensation they deserve. And whether that’s adjusting immunity laws, setting up some kind of compensation fund, or increasing insurance liability limits – any one of those items, or all three, would be great. So thank you very much for having me on today. I do appreciate it.
Todd Berg: Again, my name is Todd Berg, I’m at Michigan Auto Law in Farmington Hills, Michigan, and the best way to reach me is on email at [email protected].
Advertiser: Thanks for listening to Lawyer to Lawyer, produced by the broadcast professionals at Legal Talk Network. Join J. Craig Williams and Robert Ambrogi for their next podcast covering the latest legal topic. Subscribe to the RSS feed on legaltalknetwork.com or in iTunes. The views expressed by the participants of this program are their own, and do not represent the views of, nor are they endorsed by, Legal Talk Network, its officers, directors, employees, agents, representatives, shareholders, and subsidiaries. None of the contents should be considered legal advice. As always, consult a lawyer.
Published: February 20, 2015
Podcast: Lawyer 2 Lawyer
Category: News & Current Events, Legal Technology & Data Security
Lawyer 2 Lawyer
Lawyer 2 Lawyer is a legal affairs podcast covering contemporary and relevant issues in the news with a legal perspective.