The rapid development of ever-advancing driverless technologies brings with it a whole slew of legal questions. For one, who is responsible for regulating the use of this new technology? Can the term “autonomous” really be used to describe any of today’s vehicles? And, if a vehicle boasting some level of autonomy is involved in an accident, who is at fault? State Bar of Texas host Rocky Dhir welcomes Quentin Brogdon and Ron Hedges to answer common questions about autonomous vehicles and their many intersections with different areas of the law.
Quentin Brogdon is a partner at Crain Brogdon Rogers in Dallas, where he specializes in personal injury trial law.
Ronald J. Hedges is senior counsel with Dentons US.
State Bar of Texas Podcast
Autonomous Vehicles: Legal Implications of a Driverless Future
Intro: Welcome to the State Bar of Texas Podcast, your monthly source for conversations and curated content to improve your law practice, with your host Rocky Dhir.
Rocky Dhir: Hi and welcome to the State Bar of Texas Podcast. What are you doing this very moment while you listen in to the podcast? Do you have the podcast on while you work at your office, maybe you are doing the dishes, working out maybe?
I wager there is a good chance that you are on the road while you listen and I bet there is a decent chance that you are stuck in traffic. For those of us with commutes, traffic can make all the difference, right, unexpected delays due to construction or an accident make for a rough start to the morning.
For me an uncharacteristically light traffic day can add a little pep to my step.
Wow, I sound like an actual adult now, a really lame adult to be honest.
But anyway, kidding aside, according to a New York Times article from January of 2019 titled “Stuck and Stressed: The Health Costs of Traffic”, traffic related stress correlates to, believe it or not, an increase in domestic violence. This is serious business for sure.
But what if we no longer have to navigate through traffic, what if someone or something does it for us? That same New York Times article poses the following question, and I quote, “Although self-driving cars won’t cure traffic woes on their own, they may be able to reduce stress. If you’re crawling along in traffic and are late to an appointment, but are allowed to take a nap, play video games, watch your favorite TV show or sip on a cocktail, will that reduce your stress?”
Hmm, self-driving cars, to us lawyers that question is chock-full of even more questions, questions that we lovingly refer to as “legal issues”.
To help us drive through the traffic jam of legal issues posed by the prospect of self-driving cars, we have with us today Quentin Brogdon and Ronald Hedges.
Quentin is a partner at Crain Brogdon Rogers in Dallas, where he represents plaintiffs in complex and catastrophic personal injury cases. Quentin has an illustrious résumé filled with accolades and credentials, but what might be most important for our purposes is that he has represented plaintiffs against carmakers whose products boast some form of autonomous driving, and he has spoken frequently and articulately about the legal issues surrounding those vehicles.
Ron Hedges is Senior Counsel in the New York City office of Dentons US. Ron’s legal talent is multifaceted. He is an expert on electronic discovery, he was a Federal Magistrate Judge for 31 years and he knows a thing or two or 100 about self-driving cars, especially from the defense perspective having written an article titled “Can States Steer Clear of Liability for Accidents Involving Autonomous Vehicle Technology?” That article appears in the fall 2019 edition of TortSource, a publication by the Tort Trial & Insurance Practice Section, aka TIPS of the American Bar Association.
TIPS also has an Auto Law Committee where you can learn more about this fascinating and growing topic.
Now, without any further ado, let’s welcome Quentin and Ron. Thank you both for being here.
Quentin Brogdon: Thank You Rocky.
Ronald J. Hedges: Thank you for having us Rocky.
Rocky Dhir: Oh absolutely. So Ron, let’s start with you and let’s get some basics. What is a self-driving car and do any actually exist?
Ronald J. Hedges: We have to actually step back a little from the concept of the self-driving car and talk about the vehicles that are on the road today. Are there autonomous vehicles on the road today? Yes, but they are being tested more than anything else, so some states are beginning to regulate for it. And we are seeing areas where AVs, autonomous vehicles, are being operated.
But in addition to that, we have vehicles on the road that are not autonomous, but have automatic systems or the like within the vehicles to assist in driving. So we have those on the road all the time, and I am sure that Quentin can talk about those, given how you described his practice. So I expect he has had some litigation involving that.
So we have two types of vehicles. The last estimate I have seen for when all vehicles on the road will be autonomous says that should happen around 2040. So we have a couple of years to go before we worry about that, and I am assuming we are going to have a legacy system, if you will, where there are going to be automated or autonomous vehicles, driver-assisted vehicles, and old vehicles, maybe antiques, whatever you want to call them, that are going to be on the road until we regulate somehow to separate those vehicles or eliminate one from the road.
Rocky Dhir: All right, so let's maybe walk through those different categories now. First of all Quentin, do you concur with that overview of this industry or is there some other nuance that we need to be aware of?
Quentin Brogdon: No, I concur with Ron’s view of the industry.
Rocky Dhir: All right, so let's walk through these. So I think obviously we have got what I will call traditional cars, the ones that we operate completely on our own, and those go back to the 1960s and before, when people did all their driving themselves.
Now Ron, you also mentioned where you have got cars with some kind of driver assistance, would an example of that be just your traditional cruise control or are you talking about something else?
Ronald J. Hedges: Cruise control is what I expect most people would think of, but now we have other aiding mechanisms, for example, that might apply brakes automatically, that might allow you to park automatically and the like. So those are some examples I would give of these assisted vehicles.
We have some other vehicles that are self-driving, but they still have drivers there as a safety and we have seen a couple of accidents unfortunately, one or two fatal, involving vehicles like that. So we are running the gamut of things now.
Rocky Dhir: Sure. And you mentioned the regulatory hurdles that some states, maybe some local governments, are trying to work through as they figure out how to regulate, I guess, autonomous vehicles or semiautonomous vehicles. What are those regulatory hurdles and what are the issues surrounding those regulations?
Ronald J. Hedges: First there is the question of which governmental entity is going to be doing the regulating. The federal government sets a number of standards now for vehicles on safety and the like, but the federal government seems to be taking a pretty, not lax, maybe a step back attitude to allow experimentation by the states, at least as far as allowing these vehicles on the road.
What I expect we will see one of these days are federal regulations that are going to be talking about the levels of autonomy and the like, that may be prescribing safety-related matters to be applied uniformly across the United States. The actual operation of vehicles is something that I believe is going to be regulated at the state level.
And the US DOT, the Department of Transportation, just came out with a document earlier this year talking about autonomous vehicles. I think I am summarizing accurately; that's a document, 3.0, and another earlier document, 2.0, that describe where the federal government is going as far as regulation.
Rocky Dhir: Now, at that point we are talking about regulation from the federal or state level. There is also the issue of individual liability in the event of a collision, whether that’s the driver’s liability or it’s an automaker’s liability.
So Quentin, why don’t we have you weigh in on that question? So we know that there is this question that Ron just posed about the regulations and how you — who regulates autonomous vehicles and how they are to be regulated, but in the event of an accident, is there — are we relying simply on a regulatory scheme or are we still going back to driver error or manufacturer error? Can you walk us through that tightrope especially in the era or the impending era of autonomous vehicles?
Quentin Brogdon: Rocky, there is a raging debate going on as to whether the existing regime can adapt. The Brookings Institution is representative of those who say the current regime can adapt and that we do not need — the federal government does not need to preempt state court authority regarding autonomous vehicle liability, that that would be a mistake and that our existing regime of products liability law has proven to be remarkably adaptive to new technologies and it will do so again in the field of AVs.
The RAND Corporation on the other hand is representative of those who say that we cannot adapt and that Congress should consider preempting inconsistent state court remedies to basically minimize what the manufacturers and others have to deal with across state lines.
And so I for one believe that our existing regime has been adaptable traditionally to airplanes and automobiles when they first came out, for example, and that it can be adaptable again and that the industry does not need to be coddled, it needs to be held accountable when shortcuts are taken with respect to safety.
If you have just a profit motive and an incentive to be the first out there to race with the new technologies onto the road, then on occasion and we have already seen this happen, not just arguably, but I think in fact shortcuts have been taken and technologies are being beta tested in essence on our roadways and we know crashes are happening with some regularity now. And if a phone, an iPhone is being beta tested and it crashes, you lose data, but if an AV is being beta tested and it crashes, you lose lives.
And so traditionally when you have had a car crash, you have had one driver suing another in state court, and then on occasion you will have products liability cases where you are suing a manufacturer and often those end up in federal courts. But what we may now be trending towards in the future is more and more cases being in essence miniature products liability cases involving AV manufacturers and designers and software developers and component parts manufacturers and those cases may become more expensive and may become more often cases that end up in federal court.
And our insurance companies are having in essence emergency meetings now because of the unknowns. We know that more than 90% of all crashes are caused by human error. We know that in the long-term probably as crashes decrease, premiums will go down, but in the short-term it’s unclear and premiums probably will go up because of the unknowns, because of the increased cost of repairing these more sophisticated vehicles, the AVs and we are going to see a migration of liability from the individual driver more and more to the manufacturers and to corporations and that’s going to change automobile insurance dramatically.
Rocky Dhir: So it sounds, Quentin, like what you are saying is that right now we are shifting from individual driver liability to some sort of hybrid liability between the manufacturer and the driver, and so now you have got two defendants whereas before you maybe just had one. Is that a pithy way of summarizing or is that oversimplifying?
Quentin Brogdon: No, I think that’s accurate and furthermore we are in really a phase of partial autonomy, not full autonomy in spite of manufacturers such as Tesla who arguably promote and sell their vehicles as fully autonomous, at least those have been the claims.
I think Robert Sumwalt, the Chairman of the National Transportation Safety Board, said this week or last week, and this was his quote, “If you own a car with partial automation, you do not own a self-driving car. Don’t pretend that you do.” And some have argued, mainly plaintiffs’ attorneys but also the families of some who have been killed, that Tesla in particular and the designers of these vehicles have hyped them as fully autonomous.
And when you call a feature on a Tesla, for instance, Autopilot, are you not engendering at least the thought that I can turn over the driving of the automobile completely to the machine? And those cars are not ready for that to occur. We have seen that when they crash, not just Teslas, but others.
So we are still testing the technology. One day we will get there, and there are going to be ethical issues in the meantime as well. Do the AVs need to be perfectly safe, or do they just need to be safer than the average human driver? That's a big ethical question that will control how quickly these get rolled out: what is our tolerance for deaths of passengers and drivers in these autonomous vehicles?
Rocky Dhir: Well, and if morning rush hour is any indication, the average human driver is a pretty low bar when you are looking at safety for an AV.
Quentin Brogdon: Sure.
Rocky Dhir: Now Ron, I want to hear your perspective on what Quentin just laid out for us. So he talked about Tesla's Autopilot, or I think Enhanced Autopilot as they have rolled out in the last year or two. Now, I have actually been in cars with autonomous control, I have been in cars with Autopilot or Enhanced Autopilot, and the one thing I have always noticed is all these warnings that go up and say keep your hands on the wheel, stay attentive, don't play a video game while your car is on autopilot. Is that enough for Tesla to avoid liability or is there more that's needed? I want to hear your perspective on what Quentin just laid out.
Ronald J. Hedges: The example you just gave about what you experienced is a perfect example of the types of liability that you are going to be looking at with these vehicles.
If a vehicle, a Tesla vehicle or another vehicle gets involved in a crash, the immediate thing that’s going to come to mind with regard to the vehicle is products liability. And we have to talk about reasonableness in design and manufacture and the like, just like we do with any other product.
But at the same time the warnings are going up. And if you are an individual driving the vehicle and, as appears to have happened in a fatal accident in Arizona, you are being inattentive at the time, it sounds like a negligence cause of action. We don't insure against everything going wrong under products liability or negligence regimes; we insist on a reasonableness standard.
So you would think if all the warnings go up, that might be enough to protect the manufacturer from liability. But then someone could say, and I could easily see this argument being spun out, it's not enough to have a visual sign going up, if that's what it is; you have to have an audio alert or whatever in the car as another reminder to someone. And a jury one of these days is going to have to decide whether, if a visual image goes up, that is enough, or whether you need an audio alert as well; the same thing with the negligence of an operator, was the operator being inattentive, did the operator not look at anything, and the like.
But one thing I want to throw in on top of everything else, we are really talking about multiple layers of liability here beyond the manufacturer. If the autonomous vehicle transmits information through some type of provider somewhere else, then the question is going to be going back into discovery where along the chain of communication might there have been some type of problem.
I will give you another example, something that might be farfetched but we talked about it yesterday. There are a lot of systems and phones and whatever now that allow you to have remote access to stoves or whatever in your home. If you are in a vehicle and you are using a communication system in the vehicle to communicate with the home and if a mistake somehow is made somewhere and the home burns down because you turned on the stove, where is the liability along that?
That’s the Internet of Things, so it’s a little beyond autonomous vehicles, but I am trying to make a point that if you are in situations where there are accidents, you as an attorney are going to have to be able to talk to your client or engage in discovery to find out where something might have gone wrong along the chain of events that led up to an accident.
Rocky Dhir: So Quentin, let's stick with what Ron just said. He laid it out in multiple layers, but let's take the example of, if I am driving a Tesla and I decide to engage Autopilot. And I think, Quentin, in the report that you were alluding to earlier, in that case the driver was actually playing a game on his phone when his car crashed while it was engaged in Autopilot. So at that point is that strictly driver error, or is there something else that the auto manufacturer has to do to make sure that the human does not start playing video games or read a book or take a nap or do something else while that car is engaged in its Autopilot function? What else should Tesla do from a plaintiff's perspective?
Quentin Brogdon: As far as the warnings and things beeping at you, anybody who has sat in a hospital room with a loved one and heard the alarms and beeps going off and seen the nurses and even the loved ones of the patient ignoring them after a period of time knows that warnings and beeps quickly become white noise, it’s just human nature.
As far as negligence, I think there is a longstanding principle in products liability law that misuse of a product is foreseeable. And simply because you warn or put something in an owner’s manual does not in and of itself necessarily absolve you as the manufacturer of liability in the situation, it’s a case-by-case determination.
Some would argue, indeed many have argued, that the very name Autopilot creates an expectation on the part of the users that it is indeed safe for me to check my emails as my Tesla drives itself down the highway. And you have seen in some of these crashes that literally these aren't situations where the hand was off the wheel for 20 minutes before the crash; the hand will be off the wheel for 20 or 30 seconds in the minute preceding the crash.
So it's not an extended period of time that hands are off the wheel, you will notice as you study some of these crashes, but it's enough. The human vigilance has been down, and it's an open question whether the designers of these systems are up to the task of keeping the human driver vigilant and engaged.
And if you contrast how airplane systems are designed, there is a dramatic contrast: pilots in airplanes, the way the system is designed, are expected to stay and must stay engaged, and then they are warned when they make a mistake.
The Tesla design on the other hand lets you disengage and then warns you to reengage, and it has been shown time and time again that once you allow a human operator to disengage, it is very foreseeable that it is going to be difficult to reengage that driver. Vigilance has dropped, attention is elsewhere, whether it's on a video game or checking an email, and the beep and the warning in and of itself does not get the driver engaged when it should on many occasions.
Rocky Dhir: Now, if I am trying to play the other side of that coin, I would say, look, you get plenty of drivers without Autopilot or without some form of autonomous control who are still checking their emails while their hands are on the wheel, and accidents happen that way too, but we don't call that a product defect; that becomes driver negligence. So how do you parse that, and is that even a fair question? Ron, is that the argument you would make in response to Quentin or have I just totally botched the defense side of that?
Ronald J. Hedges: I think the argument you're making is something that would be very good to present to a jury, and the jury will have an opportunity to say that there was no negligent design or manufacture of the product, or whatever, that reasonable warnings were given, and it's entirely possible in a situation like that if you are in a comparative negligence regime.
I can easily see a jury say, well, the driver is responsible for part of it and the manufacturer is responsible for part of it. One thing you could say, for example, is that you are looking at devices to put into vehicles to avoid the problem: the vehicle will not engage autopilot unless both hands of the driver are on this particular pad of the wheel.
So, like everything else in this area, it's a question of the cost to do something, whether it's reasonable to incur that cost, whether the solution is a reasonable one. But then again, I don't know about making the vehicle manufacturer the insurer of an individual who's negligent. And I think that's a lot of what goes into the question you just posed.
Rocky Dhir: Well, so Quentin, what would be your take on that question? So you raised the analogy of people sitting in a hospital room and ignoring the beeps, or maybe even a fire alarm going off in a building and people staying at their desks might be another example, of people just ignoring warnings, and therefore that means there is a foreseeable misuse of the product, if I understood the argument correctly. But then people are misusing non-autonomous cars by checking their emails or talking on a handheld phone, without their hands-free system engaged, just talking on a phone while driving a vehicle. So at that point is that a product defect on the vehicle's side or is that purely user error? I mean, can you maybe answer that question from a plaintiff's perspective?
Quentin Brogdon: It's a situationally dependent analysis. If you are driving your grandfather's non-autonomous car and you are distracted, checking your emails or playing a video game on your phone, you are unambiguously in control of that vehicle and you are negligent as the operator if you smack into the back of another car and someone is injured. So that's the easiest extreme of the spectrum to analyze.
The other end of the spectrum is, if you are in a completely autonomous vehicle in 2040 or whatever year that occurs and you are going down the road and there’s no expectation, no legal requirement, no regulatory requirement that you remain vigilant and you are literally allowed to read a book, assuming books still exist or check your cell phone messages or do whatever as you go down the road, and there’s a crash, you should be completely absolved.
What we have today is what one AV professor, and believe it or not, there are professors of AVs now, calls the mushy middle of partial autonomy. We are somewhere in between on the spectrum, and Ron is correct that in a comparative negligence or comparative fault regime the jury would have to weigh in and balance: what were the expectations, what was the parsing of responsibilities here, what responsibility did the operator retain, was the operator apprised of the limitations of the system and of when the operator would need to step in, and did the operator somehow fall down in carrying out the duties of the operator?
But now some of these operators at least are being lulled, some would argue, into a false sense of dependency on a system: the belief that the system has capabilities that it does not, the belief that the system literally can drive the vehicle without human input and that human input and human control are the exception rather than the rule. And there's an argument to be made, that's all I am saying, there's an argument to be made that the designers of the systems can and should do a better job of keeping the drivers engaged and of alerting the drivers when they fail to stay engaged.
And for instance, with respect to Tesla, one of the discussion points was about knowing when the driver is disengaged. Tesla uses a torque sensor on the steering wheel to monitor whether the user is engaged in control of the vehicle, and that has been criticized by the NTSB and others in comparison to, for instance, what GM's Cadillac uses, which is an actual camera monitoring the eyes of the user. That system monitoring the eyes of the user, most believe and agree, is much more effective at truly gauging whether the operator is staying vigilant. However, it has implications of Big Brother filming you as you are driving down the road.
Rocky Dhir: Or cybersecurity, I would imagine, too, right? Somebody hacks into your car and now suddenly your privacy has been compromised.
Ronald J. Hedges: Well, let me raise something else just on what Quentin said.
Rocky Dhir: Please.
Ronald J. Hedges: If this vehicle is being used by a California consumer and biometric information is being collected, it’s entirely possible that the California Consumer Privacy Act may come into effect. Same thing in New York State because New York has the Shield Act now that is intended to protect biometric information. Illinois has that.
So imagine you are in a situation where I am not the owner of the vehicle and I haven't given whatever consent I suppose we give to allow my biometric information, my eyeballs, to be looked at and recorded. Or I let a friend drive the vehicle; where's the consent that the friend gives when he uses a vehicle where he knows his biometric information is being recorded? Which is probably going to happen here; it's got to go somewhere to be recorded someplace. What are the privacy interests in that?
Rocky Dhir: So, again, I don't know much about these privacy acts, but I would think arguably the consent would follow the vehicle as opposed to following the individual. Is that not how consent works in those particular jurisdictions?
Ronald J. Hedges: My understanding is consent means you have to effectively give informed consent to what's going on, and in some jurisdictions it may even have to be in writing. Now, assuming that's right, whether it's informed consent or written consent or whatever, use my example. I own a car and I just go to my friend: hey, drive for me for 15 minutes, I am tired. I don't know where the consent is when he sits behind the wheel and operates the car and a screen pops up and says, by the way, you are consenting to the capture of all your biometric information so we can monitor your driving. And then I have got to assume in these vehicles the manufacturers, or someone, are collecting the information and maybe even using it for research purposes at some point.
I think that's an area of regulation by litigation, if you want to call it that, or regulation in California by the California Attorney General, as well as possible class actions to think about in this area. And cybersecurity is something else I was positing before: what happens when you need to do discovery to find out the way electronic communications or electronic information is transmitted? What cybersecurity controls are in place for all this information? Because Quentin is positing a lot of personal information that may fall within the scope of privacy statutes going out there. So what reasonable security measures are being applied to data that's being transferred?
Rocky Dhir: Or somebody hacks into your car for whatever reason and wants to monitor what’s going on, I mean, that’s a distinct possibility if it’s being sent over the airwaves, is it not?
Quentin Brogdon: It's a fun thing to think about. It's like some instances we seem to have seen now where medical devices are being hacked.
Rocky Dhir: Right. Now, one thing that I think both of your perspectives have in common, and you both said this, is that ultimately this is up to the jury and it's very case-dependent. So if we look at this from a litigation perspective, whether it's plaintiff's side or defense side, is there a situation where you can get summary judgment, or is this purely a factual inquiry every single time, where you have to go in front of a jury in order to get a definitive response?
Now I know, Ron, you were a Magistrate Judge for a number of years, but Quentin, I know you have had your share of summary judgment hearings. So let's start with you, Quentin: do you think this is always going to be a factual inquiry that has to go in front of a jury, or do you see there being a situation where summary judgment would be warranted one way or the other?
Quentin Brogdon: I think it will be fact-specific. I do not think summary judgment will always be warranted or always not be warranted. I can foresee a set of circumstances or facts in a scenario where it is unambiguously shown that I fell asleep or I checked out and didn’t have my hand on the steering wheel for the last 30 minutes or whatever it may be and it’s just unambiguously clear that I failed to stay vigilant, that I was warned repeatedly and then the crash happens. In that scenario, perhaps a summary judgment would be completely warranted.
Rocky Dhir: Ron, what about you?
Ronald J. Hedges: There are so many variables that go into an award of summary judgment in a products liability case. One of the major things you have to consider, which we haven't even talked about today, is how you go about putting evidence in when you're dealing with these systems. We're talking about expert testimony. That's going to drive up the cost of discovery; it's going to drive up the cost for a party getting an expert report, getting a rebuttal report.
Is summary judgment possible? Sure. And there have been a number of situations where summary judgment can be granted in products liability cases. Causation is probably one of the best, and simplest, bases for an award of summary judgment. I just think it's going to be a long road to get there.
Rocky Dhir: Interesting. Now, let’s think about — for any of our listeners who want to maybe get more steeped into issues involving autonomous vehicles or self-driving cars, as some people call them, where can you go to learn about this? Are there publications, are there forums, are there CLE programs?
Quentin, let’s start with you. I know you’re very knowledgeable in this area, how did you come about learning all this, was it on-the-job or did you read a book of some kind?
Quentin Brogdon: Well, it's ripped from today's headlines. There has been a whole series of spectacular crashes, and then there have been follow-up investigations by the NTSB and the National Highway Traffic Safety Administration and others, and detailed reports have been issued. There have been governmental studies, there have been studies by think tanks such as RAND and Brookings and others. There have been law review articles.
And so in the last five to seven years there's been just sort of a tsunami, if you will, of legal commentary and writings and analysis and studies. The initial waves seemed to predict and prognosticate that this will be here tomorrow or in five years, and everyone, it seems, is now taking a little bit of a step back and realizing there are issues that still need to be resolved, not just technological issues, but even ethical issues and regulatory issues. And our government has sort of been a day late and a dollar short; they're working on that now, belatedly. There have been some bills circulating in Congress in recent months.
And so, if it's something you're interested in, you will see an article that touches upon autonomous vehicles in some way almost every single day, if not every day.
Rocky Dhir: Well, it's interesting. One of the other issues, and I think Ron might have touched on this, was the insurance aspect of it and how you insure these vehicles. I know I mentioned earlier the TIPS Auto Law Committee. There's been a lot of talk at that Committee about the insurance ramifications and liability ramifications. So, yeah, there are many, many layers to this. Ron, where do you recommend lawyers go to learn more about the legal issues surrounding these types of cases?
Ronald J. Hedges: There are a number of CLE programs out there that are talking about this and related technologies, and I’m coming to autonomous vehicles from a couple different perspectives.
In addition to what you mentioned before for TIPS, I've written articles and I've spoken on the Fourth and Fifth Amendment implications of individuals using autonomous vehicles. I've done a lot on other types of technologies, drones for example, and the conversation we're having today about AVs kind of mirrors a webinar I did yesterday or a couple days ago on drones: who's monitoring them, who's regulating them, what are the liability schemes. So coming out of my background, doing a lot with electronic information and litigation, I've gotten into these new technologies.
Something else that's related to this, that Quentin and I were talking about a little bit before with the biometrics, is facial recognition technology.
Rocky Dhir: Sure.
Ronald J. Hedges: I don’t know how that might fit into the vehicle, but now that I’m thinking of it, I could easily see an argument being made. You’re not going to be able to use this vehicle unless your biometrics match what we have on file.
Rocky Dhir: Yeah, what happens if you choose to wear glasses that day, or didn't shave for a few days, grew a beard, whatever? Your biometrics may change, so that's interesting.
Ronald J. Hedges: Well, eyeballs theoretically don't, but you're right, if you've got glasses on. I wish I had good answers for all that, but the way technology is advancing in these areas now, it's just exponential, and we as lawyers need to figure out how we deal with it in the regimes we have now. So we have products liability law, we have negligence law, where, for example, you're thinking about the law differently again, thinking about the law of trespass: does the law of trespass help you when a drone is flying over your property?
So there are a lot of different things we have to think about, and until there are new regulatory or liability regimes out there, we take what we've got and we fit everything in, like we've been doing for a very long time.
Rocky Dhir: Wow! Well, obviously there’s many layers to this and it’s probably more than we can fit into one podcast. I could talk about this all day. This is fascinating stuff and we’ve got two great guests who helped us with this, but unfortunately, that is all the time we have for today.
I do want to thank my guests, Quentin Brogdon and Ronald Hedges, for joining us.
Thank you both!
Quentin Brogdon: Thank you! Thank you, Rocky!
Ronald J. Hedges: Well, I enjoyed it. Thank you for having me!
Rocky Dhir: Absolutely. And of course, I want to thank you, the listener, for tuning in. If you like what you heard today, please rate us and review us in Apple Podcasts, Google Podcasts, or your favorite podcast app.
Until next time, remember, life is a journey, folks.
I am Rocky Dhir, signing off for now.
Outro: If you would like more information about today's show, please visit legaltalknetwork.com. Go to texasbar.com/podcast. Subscribe via Apple Podcasts and RSS.
Find both the State Bar of Texas and Legal Talk Network on Twitter, Facebook, and LinkedIn or download the free app from Legal Talk Network in Google Play and iTunes.
The views expressed by the participants of this program are their own and do not represent the views of, nor are they endorsed by the State Bar of Texas, Legal Talk Network, or their respective officers, directors, employees, agents, representatives, shareholders, or subsidiaries. None of the content should be considered legal advice. As always, consult a lawyer.