Are you tired of driving yourself to work? Have you always wanted a chauffeur but never could afford one? If this sounds like you, then happy days are here with the advent of the driverless car. Institutions like Google, Carnegie Mellon, and Uber are developing what they hope will be totally autonomous vehicles capable of ushering passengers to and from destinations without the need for a human driver. But what does that mean for the law, safety standards, and our freedoms?
In this episode of Lawyer 2 Lawyer, host J. Craig Williams interviews attorney and author of Robots Are People Too John Weaver, researcher and writer for Michigan Auto Law Todd Berg, and litigator and author of Motorista Anna Eby. Together they discuss liability for passengers, possible federal regulations, and risks associated with vehicle hacks. In addition, they debate when the government might pilot your driverless car, how medical emergencies in autonomous vehicles will be handled, and the possibility of the repo man summoning your automobile. Tune in to hear about existing driverless car laws and much, much more!
John Weaver is the author of Robots Are People Too, which explains amendments to existing laws that will become necessary as artificial intelligence and autonomous technology become more widespread. In addition, he is a contributing writer for Slate magazine, addressing similar topics, and an associate attorney at Morgan, Lewis & Bockius LLP where, among many areas, he provides legal services for property acquisition, financing, and development projects.
Todd Berg is a former trial attorney and legal news reporter with Michigan Lawyers Weekly. Today he provides analytical research and writing support to the attorneys at Michigan Auto Law, a firm entirely dedicated to the representation of clients involved in auto accidents. They boast the largest auto/truck verdict in their home state during the last 15 years and guarantee a victory for their clients or they won’t charge a dime.
Anna Eby is a business litigator and appellate attorney in Austin, Texas, where she represents businesses and individuals in complex commercial disputes before Texas state, federal, and appellate courts. In addition and relevant to our episode, she is an avid car photographer and motorsports attendee who maintains a car blog called Motorista, which covers legal issues in the automotive industry.
Special thanks to our sponsor, Clio.
Lawyer 2 Lawyer: Driverless Cars: Who’s “Driving” and Who’s Responsible? – 2/21/2015
Advertiser: If we do drive into a new driverless car world, there are certain things in the law that are going to have to change. Everyone is used to having new technology introduced every year, every month. Now the law always is following behind the technology, so I don’t know that the law is ready for it yet. That gets us into a dangerous gray area where we’re either trying to apply existing laws that really don’t fit the situation, or we’re not addressing it at all, which is even more dangerous.
Welcome to the award-winning podcast Lawyer 2 Lawyer, with J. Craig Williams and Robert Ambrogi, bringing you the latest legal news and observations with the leading experts in the legal profession. You’re listening to Legal Talk Network.
J. Craig Williams: Hello and welcome to Lawyer 2 Lawyer on the Legal Talk Network. This is Craig Williams, coming to you from a very sunny and warm Southern California. I write a legal blog called May It Please the Court. My co-host, Bob Ambrogi, is off today visiting the Massachusetts State House. Before we introduce today’s topic, we’d like to thank our sponsor, Clio, an online management system for lawyers at www.GoClio.com. Imagine a world where you no longer have to drive your car to work. Simply type in the destination, and your automobile takes care of the rest, leaving you free to do work, drink a coffee, or otherwise relax as you hurtle down the road in an autonomous vehicle. In recent news, there’s been a fair amount of talk about Google and a number of other companies’ driverless cars. Though many applaud the idea of an automobile that can both drive and park itself, there remains a lot to consider about the licensing, safety, liability, and freedom issues associated with these vehicles. So to discuss this topic, we have three guests joining us today. First is John Weaver. John is the author of Robots Are People Too, which explains amendments to existing laws that will become necessary as artificial intelligence and autonomous technology become more widespread. In addition, he is a contributing writer for Slate magazine, addressing similar topics, and he’s also an associate attorney at Morgan, Lewis & Bockius LLP where, among many areas, he provides legal services for property acquisition, financing, and development projects. Welcome, John Weaver.
John Weaver: Thank you for having me.
J. Craig Williams: In addition, we have joining us Todd Berg. Todd is a former trial attorney and legal news reporter with Michigan Lawyers Weekly. Today he provides analytical research and writing support to the attorneys at Michigan Auto Law, a firm entirely dedicated to the representation of clients involved in automobile accidents. They boast the largest auto/truck verdict in their home state during the last 15 years and guarantee a victory for their clients or they won’t charge at all. Welcome, Todd Berg.
Todd Berg: Thanks a lot Craig, it’s great to be here.
J. Craig Williams: And last, certainly not least, we have Anna Eby. Anna is a business litigator and appellate attorney in Austin, Texas, where she represents businesses and individuals in complex commercial disputes before Texas state, federal, and appellate courts. In addition, and more on point to our episode, she is an avid car photographer and motorsports attendee who maintains a car blog called Motorista, which covers legal issues in the automotive industry. Welcome, Anna Eby.
Anna Eby: Thank you, happy to be here.
J. Craig Williams: Well, let’s just start with Anna. Is the United States ready to have driverless cars?
Anna Eby: I don’t know that the law is ready yet, I think consumers are. Everyone is used to having new technology introduced every year, every month. So I think folks are ready for it; now the law always is following behind the technology, so I don’t know that the law is ready for it yet.
J. Craig Williams: Todd, what’s your perspective?
Todd Berg: I would agree with Anna. Aside from the need to perfect a technology as much as possible, I think that if we’re going to move into this brave new world, we need to have some brave new laws to deal with the implications of that for people’s safety in using driverless cars.
J. Craig Williams: John, let’s tweak this question just a little bit. For consumers, for the people who are buying cars, are they ready to start buying consumer driverless cars?
John Weaver: I think some people are and some people very much aren’t, and some of the people who aren’t never will be. If you listen to manufacturers of some of the higher-end automobiles in the market right now, they might be developing self-driving technology or looking into it, but they’re not exactly embracing it.
J. Craig Williams: So we can expect Rolls Royce to remain a Rolls Royce and not a driverless car.
John Weaver: Their customers – or at least the ones they’ve identified as their ideal customers – want to drive; that’s part of what they’re looking for, and they’re buying a certain type of machine to enhance their driving experience. But a lot of drivers, like commuters – and I certainly count myself in that group – are looking forward to having a car that does the driving for them so that they don’t have to worry about dropping the ball.
J. Craig Williams: Todd, what’s your understanding of who’s going to be the driver in these driverless cars from a legal liability standpoint? Will it be the person that sits behind the wheel – if, in fact, there even is a wheel? Or will everybody sitting in the car be considered passengers, and the driver will be the car itself?
Todd Berg: That’s a great question; in a way, it’s all of the above. I think that even though we have the technology involved, it’s still going to come back to the same thing: who’s in control of the vehicle. If you’ve got a driverless vehicle, certainly the technology qualifies as an operator or a driver, but I think whoever else is in the vehicle who is probably directing the technology, which is then directing the travel of the vehicle, I would say that they also qualify as a driver or operator of the vehicle. If you’re just riding in there as a passenger, either in the passenger seat or in the backseat and you have no say in how fast the car goes and what the direction is, I think the argument there would be that you’re just a passenger.
J. Craig Williams: What about the situation where the car takes voice commands? Can’t anybody in the car issue a voice command?
Todd Berg: Yeah, I think so, and I think if you do that you kind of appoint yourself as a driver to some extent.
J. Craig Williams: I remember when my mother used to drive with my father and she would regularly instruct him how to drive.
Todd Berg: Theoretically, he had the option of not following those instructions, but I think a person in the backseat of a driverless vehicle who maybe gives a voice command to turn here or take I-275, I think they’re an operator or they’re controlling the destination of the vehicle as much as if they reached over and hit the accelerator or turned the steering wheel.
J. Craig Williams: Anna, when I was talking with my wife about this program, she was like, “Great, how do we use these cars? Can I just pack up my grandson in the car and send him back to his mother? And what about crimes, in a situation where someone says to the car, drive into this building, or hit that person?” Is that even possible?
Anna Eby: That’s an interesting question, and since I’m not an engineer, I don’t know if I have a scientific answer to it. I would hope that there would be one person who would be designated the operator, if you will, separate and apart from the actual technology and the vehicle – someone who would be responsible for engaging the autonomous system, so that they would sort of be like the driver even though they don’t actually have their hands on the steering wheel. Otherwise, I agree with what Todd was saying, and I think that creates some issues if you’ve got four people in a vehicle and everyone can give voice commands. It’s hard for me to imagine a system where we’ve got cars where you can just put someone – like a child, in your example – in a car and just send them off. I’m hopeful that we will always have a responsible person who knows how to operate a vehicle in the vehicle.
J. Craig Williams: I guess that means that there won’t be any 12-year-olds driving cars.
Anna Eby: I hope not, although we’ve got plenty of adults who act like 12-year-olds when they drive, so maybe we’ll be better off.
J. Craig Williams: What kind of liability, John, do you think that Google and whoever else invents these driverless cars are going to have? I mean, certainly, since there’s more control invested in the car, the people who make the car are going to be liable.
John Weaver: This is a topic that I address in my book pretty extensively: where does the liability go, and who is the operator? In some situations, typical product liability law is going to govern. If the body mechanism breaks down and it’s a manufacturing flaw or a design flaw, then yes, absolutely, the liability is going to go to the manufacturer. If the car works exactly as it’s supposed to and still gets into an accident, I think the line as to where the liability should go gets a little murkier. It’s entirely possible that a car driving along gets into a situation where a deer jumps into the middle of the road and the car has to make a split-second decision, and the best-case decision – the one causing the least severe accident possible – can still cause an accident; liability obviously has to go somewhere. Some states – California, Florida, and Nevada – have addressed this in their statutes, indicating exactly what Anna was talking about: that the operator, the person behind the wheel, has liability as the operator, even if the self-driving mechanism is engaged. Or, if there isn’t somebody in the driver’s seat, whoever turns on the car is the operator under state law.
J. Craig Williams: Well, Todd, you work for a firm – you’re going to sue the manufacturer of the car every time, aren’t you?
Todd Berg: Well, not in Michigan you’re not. That’s kind of what I was saying at the beginning of the show: if we do drive into a new driverless car world, there are certain things in the law that are going to have to change to account for situations where tragedy occurs. And one of those is the liability situation. Here in Michigan, we’ve got a very strict product liability law. And if I had a wish list in a driverless car world, the top of that wish list would be to amend or change the law so that manufacturers wouldn’t be exempt in situations like this. Under the existing law, it’s very difficult to establish liability for a manufacturer in a situation like this; you’d practically have to jump over the moon to do it. And recently, at the beginning of March of last year, a law was passed which allows manufacturers to put driverless cars on the road for testing here in Michigan. And there was a liability provision in that bill. And I think that is probably – for lack of anything else – a predictor of what the future may hold. That bill said that liability for the manufacturer would only exist if a defect could be shown, and it could be shown that the defect was there at the time of manufacture. And that provides, as you might guess, a very broad immunity for the manufacturer. I just think that if these cars become the thing of the future and they’re all over the road, you have a situation where they’re creating a possibility of accidents, injury, death. Manufacturers have to be held accountable on some level. I understand that it’s a developing technology and they need to have room to grow and move and flexibility, but at the same time the public needs to be protected. And somebody needs to be held accountable when things go wrong and someone gets hurt or someone dies.
J. Craig Williams: Is this going to be a subject of federal regulation? We see new technologies and new things come forward all the time – especially in the biological world, where there’s all types of designing that can be done for babies and the like. I’m familiar with a couple of law schools’ model rules or model laws projects, where the class will sit down with a professor and come up with an entire structure of laws for new technology. And you think about whether or not this is a state issue – and it’s certainly a state-by-state issue because each state has its own set of laws – but are we going to need federal regulation on this? And who’s going to sit down and draft up a new set of laws that address this kind of stuff? Todd?
Todd Berg: That’s a great question. Yeah, I think we are going to need a set of federal regulations. I think we’re going to need a set of regulations on two levels. One, setting minimum operational or functioning standards to make sure that these driverless vehicles can handle everything that’s out on the road; so that’s one set. What do they need to show to actually get on the road; but also, I think that in order to deal with these issues of product liability – and I think Michigan is not totally unique in the sense that we have a very broad immunity in our law and I think a lot of states have that – and I think the best way to deal with that, especially when you’re going to have manufacturers that are selling all across the country, I think we need federal regulations that may establish some level of liability on the part of these manufacturers. Either require them to start maybe a compensation fund, akin to the vaccine compensation fund, or maybe require them to take out a liability insurance policy for the vehicles that they’re putting out on the road.
J. Craig Williams: What happens in situations – since these cars are obviously going to be computer-based – where somebody finally figures out how to hack into one and then, say for example, just directs the car to drive over a cliff and kills the driver? What kind of fail-safes are in these systems, and who is liable for what the car does when it’s been hacked? John?
John Weaver: One of the mechanisms in place, at least under the law – and the states have addressed this – is a killswitch that would turn off the autonomous functioning. California, Nevada, and Florida all require some form of this, where the driver or somebody in the car is able to easily turn off the self-driving function and take control of the car manually. Presumably – and as I’ve said, I think we’re all attorneys here and not engineers – but presumably, the technology that’s being developed now will be designed so that there is a firewall between the autonomous system that could be hacked into and the regular car functioning, so that when the killswitch goes, the driver or would-be driver doesn’t have to worry about the hacker taking over the functions that human beings should be responsible for. However, in terms of where the liability goes, if a hacker takes control of a car and runs it into another car or a building, I think that’s another layer of complexity that requires some government attention and then some assistance in the form of legislation or regulation indicating where, collectively, we have decided as a country we want liability to go and how we want to address that. The laws right now don’t say anything about that, and that gets us into a dangerous gray area where we’re either trying to apply existing laws that really don’t fit the situation, or we’re not addressing it at all, which is even more dangerous.
J. Craig Williams: Anna, what about the situations where the government wants control over the killswitch for these cars? I could imagine that if I were in Boston and I were the mayor of the city, I might want to just flip a switch and turn off all the driverless cars so that nobody can get out onto the roads in these massive snowstorms that they’ve been facing. Other, less enticing ideas can come to mind about government control, but what’s the possibility there, and who’s going to get control over these cars?
Anna Eby: Well, in Texas, I think the possibility is nil. We’re not very interested in having too much government control in Texas, so I don’t see that happening ever. As for the rest of you folks, I don’t know. That’s an interesting question that remains to be seen. I would be surprised if there was a great deal of government control over autonomous vehicles. I have a hard time envisioning that, that sounds like the dystopian novels that we read and think hopefully that would never happen in real life.
J. Craig Williams: I can imagine a situation where Boston would, say, set up a WiFi network and just jam all of the cars coming into the city so that nobody can come in. They don’t have a switch in the car, but they have a methodology of keeping a car out of a particular area. Todd, what’s your sense?
Todd Berg: I think it’s a great question and it definitely gives me a chuckle when I see it, because it’s really forward thinking and it’s possible. If we have driverless cars on the road, this is possible as well. The government always has a police power, they have the ability to tell people what to do and what not to do under very strenuous circumstances. So no doubt, the government might say, we’re invoking our police power and we’re going to take over this and we’re going to stop all of these cars going onto the road because it’s a public health hazard. Maybe that’s justifiable. Maybe if they try to do it because there’s too much traffic and it’s causing too many delays, obviously there would be an abundant pushback on the police power.
J. Craig Williams: Gas rationing, just imagine that. You can drive your car every other day, and it’s turned off on odd days and you can drive on the even days.
Todd Berg: There would definitely be a lot of scrutiny on decisions like that, but there’s also the other Constitutional side of this, which is Fourth Amendment privacy rights. The Supreme Court has ruled in the last couple of years that you’ve got to have probable cause and a search warrant to put a GPS tracker on a vehicle. You’ve got to have probable cause and a search warrant to check out the cell phone of somebody who’s been arrested. I think those two cases and the constitutional principles in them would definitely come to the surface very fast as soon as the government starts trying to tap into the computer, or whatever it is that’s governing the vehicle.
J. Craig Williams: Before we move on to our next segment, we’re going to take a quick break to hear a message from our sponsor. We’ll be right back.
Kate Kenny: Hi. My name is Kate Kenny from Lawyer 2 Lawyer, and I’m joined by Jack Newton, President of Clio. Jack takes a look at the process of moving to the cloud. Now, how long does it take to move to the cloud, and is it a difficult process?
Jack Newton: No. With most cloud computing providers, moving your data into the cloud is something that takes just minutes, not hours or days, to do. You can get signed up and running with most services in just a few minutes. Even if you have an existing legacy set of data that you want to migrate to a web-based practice management system like Clio, there are migration tools and migration services that we’re able to offer to ease that process. Most firms can be up and running in the cloud in less than five minutes, and can have their data imported in a matter of hours or days.
Kate Kenny: We’ve been talking to Jack Newton, President of Clio. Thank you so much, Jack.
Jack Newton: Thank you, and if you’d like to get more information on Clio, feel free to visit www.goclio.com. That’s G-O-C-L-I-O.com.
J. Craig Williams: Welcome back to Lawyer 2 Lawyer. I’m Craig Williams, and with us today is Todd Berg from Michigan Auto Law, John Weaver, a lawyer and author of Robots Are People Too, and Anna Eby, a lawyer and author of the blog Motorista. We’ve been talking about government control of cars and its potential to exclude driving on particular days or in particular areas. Anna, what’s it going to be like when you have both driverless cars and conventional vehicles on the road? How are they going to interact with one another?
Anna Eby: Well, I think the driverless vehicles will be much better behaved than the ones being driven by the rest of us, to the extent that the autonomous cars have systems where they can communicate with each other. That can create some interesting situations where you’ve got both regular, traditional vehicles and these newer ones that are on a different wavelength than the rest of us. But I don’t envision there being a huge issue with having both types, and I also don’t think we will – hopefully not in my lifetime, anyway – ever have roads where we only have autonomous vehicles. I think they will coexist; I think they already are to a certain extent, more than we even realize. There are vehicles on the road right now that have some amazing autonomous ability – the new Mercedes-Benz S-Class, for example – and they’re already out there and being used. So I think they’ll be able to coexist.
J. Craig Williams: What happens in a situation where you’re in your driverless car and you have a health emergency, like a heart attack or a seizure? How is the car going to handle it? Is it going to turn on the sirens and speed and get you to the hospital in time? Is it going to give you instructions on what you should be doing, or instruct your passengers? Are we going to have births in driverless cars? How is all of this business about the health of humans going to be handled in a driverless car? John? Todd?
John Weaver: I think that in terms of when there’s a medical emergency in a car, I think that will be handled relatively similarly to what happens now when there’s somebody in a car with a medical emergency. The car will make its way, as safely as it can, to a hospital with all haste. However, I doubt that the self-driving car will go as quickly as, say, the story about that expecting father whose wife was in the back seat and he drove 125 miles an hour to the hospital to get her there so that the child can be delivered more safely. He ended up getting a police escort and the police gave him a ticket for speeding right after the child was born. I don’t think a self-driving car is going to be going 125 miles an hour, I just don’t think it’s going to be programmed to do that anyway. But other than that, I think that the car will make its way as quickly as possible to a hospital.
J. Craig Williams: Well, somebody’s going to ask that question after somebody dies of a heart attack in a car. Somebody’s going to say, why isn’t there a siren and why doesn’t it speed?
Todd Berg: John had talked earlier about the killswitch idea, and it seems to me like that type of mechanism might be appropriate in a situation like this; a killswitch or some kind of override so that somebody can grab the wheel and take it. And, or, something akin to the OnStar system. Because if you are having a medical emergency, maybe the last thing in the world you should be doing is grabbing the wheel and driving like crazy to the nearest hospital. If the computers are controlling your vehicle and you’re tapped into the internet in every other respect, it seems like that would be a perfect opportunity for a feature like OnStar to kick in and detect what’s wrong and alert the authorities and have them come to you.
J. Craig Williams: Anna, what happens now that we have these driverless cars and we have banks that own an interest in the car along with the individual? Are we going to be putting the repo man out of business because the bank flips the switch and says, you missed two payments, and boom – car, turn around and return to the bank parking lot?
Anna Eby: I think that’s an interesting scenario. I would be very interested to see how different states handle that. I can tell you, again, in Texas, I don’t see that happening. But I think the law will need to address that. That’s a very good example of-
J. Craig Williams: I’m an owner. If I’m a bank, I’m an owner of that car and I want control over what that car does. If you’re not making payments, that car needs to return to me.
Anna Eby: Well, I would prefer to keep the system we already have, so that we’re not giving increased power – whether it’s to the government or business or even to individuals – through the use of this autonomous technology. I’m sure there are some clever folks out there who are going to figure out a way to make that beneficial for them.
J. Craig Williams: What are we going to do when we get driverless school buses? Is the thing just going to drive around without a driver and pick up kids at various locations and shuttle them to school and then bring them back home?
Todd Berg: Hopefully that day never comes.
John Weaver: That’s a terrible idea. It would be like Lord of the Flies on wheels; it’s a terrible idea. I think there will always be a situation where we have the need to chaperone kids, and I think that the school bus is a good example of that. Even if there is a self-driving school bus, there would be an adult at the head of the bus who will ostensibly be in charge of the kids.
J. Craig Williams: What about situations where you go out to the bar with your friends – is the car now the designated driver?
Todd Berg: I think it still goes back to what we were talking about earlier as to who is the actual operator and who is giving the vehicle direction as to how it’s supposed to function. I know it seems crazy, but if you’re the intoxicated individual and you’re programming the vehicle to take a certain route or do something in particular, and as a result of that command a crash occurs, it seems that you would have to be the responsible party.
J. Craig Williams: Yeah, but I’m not operating the vehicle, I’m just telling it where to go; I just get in it and say take me home.
Todd Berg: But I think the responsibility is going to flow to whoever has control over the vehicle, and arguably that would be you because you’re giving the commands to the vehicle. Because the vehicle wouldn’t actually do anything unless it was commanded to do so.
J. Craig Williams: One last question before we get to the final thoughts segment of our program. Anna, is this just going to be a boon for Uber? Now everybody that’s got a car drives it to work, where it’s going to sit there for 5 or 6 or 8 hours – why not just put it into service?
Anna Eby: Well, maybe, and I am a happy user of Uber; I think it’s a great service. Potentially, I think it also creates a huge opportunity for all sorts of other apps and services. Yeah, I would say that it could very well be a boon for a lot of creative folks out there.
J. Craig Williams: Well, it looks like we’ve just about reached the end of our program, so this is the point in time that we ask our guests to share their final thoughts and relay their contact information. So John, let’s start with you.
John Weaver: Sure. I think that driverless cars are an exciting form of technology, and I think they’re going to be here, commercially available, by the end of the decade. Some forms of them already are, in limited fashion, and I’m looking forward to getting one myself so that I don’t have to drive anymore. People can reach me at [email protected], and they can find me on Twitter, @RobotsRPeople.
J. Craig Williams: Excellent. Anna?
Anna Eby: Yes, well, I really appreciate the opportunity to be on here today. There’s one question that I keep coming back to, and it relates to our talking about the driverless school buses and some of these other issues, like inebriated drivers. I think we’re going to see a potential issue with the liability of individuals in autonomous vehicles who engage the autonomous feature at times when they shouldn’t. I think that’s an issue we didn’t really discuss today, but that’s something that’s been in the back of my mind and something that we will probably continue to talk about: when is it negligent to engage the autonomous feature? So, just throwing that thought out there; that’s something that I think a lot about. And I can be reached via email – my email address is [email protected] – and also on my blog; that’s MotoristaBlog.com.
J. Craig Williams: Excellent, thank you very much. Todd?
Todd Berg: Craig, thank you very much for having me on Lawyer 2 Lawyer today, I really appreciate it. My closing thoughts would be two things. One, driverless cars are going to be driving into our world sooner or later, and the safety benefits – if they are half as great as what people are saying – I think everybody will ultimately end up welcoming them in, because they will reduce the number of accidents and injuries and deaths dramatically, and that will be a happy day. The second point is that in order to prepare for that world, certain changes do need to be made to the law. Immunity laws, like we have here in Michigan for manufacturers, need to be reconsidered so that when and if something bad happens with a driverless car, people who are injured are able to get the compensation they deserve. And whether that’s adjusting immunity laws, setting up some kind of compensation fund, or increasing insurance liability limits – any one of those items, or all three, would be great. So thank you very much for having me on today. I do appreciate it.
J. Craig Williams: Great, and how can our listeners reach you if they want to get a hold of you?
Todd Berg: Again, my name is Todd Berg, I’m at Michigan Auto Law in Farmington Hills, Michigan, and the best way to reach me is on email at [email protected].
J. Craig Williams: Wonderful. Well, we’d like to thank Todd Berg, John Weaver, and Anna Eby for being with us today. That brings us to the end of our show. I’m Craig Williams, thank you for listening. Join us next time for another great topic. When you want legal, think Lawyer 2 Lawyer.
Advertiser: Thanks for listening to Lawyer to Lawyer, produced by the broadcast professionals at Legal Talk Network. Join J. Craig Williams and Robert Ambrogi for their next podcast covering the latest legal topic. Subscribe to the RSS feed on legaltalknetwork.com or in iTunes. The views expressed by the participants of this program are their own, and do not represent the views of, nor are they endorsed by, Legal Talk Network, its officers, directors, employees, agents, representatives, shareholders, and subsidiaries. None of the contents should be considered legal advice. As always, consult a lawyer.