Leela Madan is a licensed patent attorney and owner of Madan Law PLLC. She holds a Bachelor...
Hope Shimabuku is the Regional Director of the Texas Regional United States Patent and Trademark Office (USPTO)....
Catherine “Cat” Rifai is a senior associate in Munck Wilson Mandala’s robust group of trademark, copyright, and...
In 1999, Rocky Dhir did the unthinkable: he became a lawyer. In 2021, he did the unforgivable:...
Published: July 8, 2024
Podcast: State Bar of Texas Podcast
Category: State Bar of Texas Annual Meetings, Legal Technology, News & Current Events
What can we learn about law from a sci-fi TV show? Rocky Dhir welcomes Leela Madan, Hope Shimabuku, and Catherine Rifai to talk about Netflix’s “Black Mirror” and the potential ethical implications of AI-fueled technology in the real world. Many now question how the rapid development of AI tech should be regulated and held accountable. The group explores ethical rules and existing laws that could apply to different scenarios—both real and imagined.
Rocky Dhir:
Hello and welcome to another episode of the State Bar of Texas podcast. We are recording live from our state bar annual meeting in Dallas, Texas. This is your host Rocky Dhir. So I dunno how many of you guys have watched Black Mirror the show? I think it’s on Netflix, but it’s apparently a pretty popular show. And so we decided we’re going to go a little crazy today. We’re going to bring in three ladies to talk to us, not only about Black Mirror, but what we can learn from it. Alright, so joining me today we have the amazing Leela Madan, Cat Rifai, and Hope Shimabuku. Welcome ladies, welcome to the podcast.
Hope Shimabuku:
Thanks for having us, Rocky. Thank you. Good to be here.
Rocky Dhir:
Absolutely. Alright, so first of all, I got to tell you, alright, opening question. Alright, you guys are all patent and trademark lawyers? Correct. Okay, so there’s no right or wrong answer, I just want your honest opinions. Alright. Who’s the nerdier of the lawyers? Is it the patent and trademark lawyers or is it the appellate lawyers?
Hope Shimabuku:
I would say the patent lawyers.
Leela Madan:
I would say patent lawyers, definitely patents.
Catherine Rifai:
Yeah. It’s a specific type of lawyer, yeah.
Hope Shimabuku:
Because well, especially on patent prosecution, the majority of the people are going to have, well you have to have
Rocky Dhir:
Technical, the science backgrounds have
Hope Shimabuku:
The technical backgrounds to practice in that area.
Rocky Dhir:
That’s true. But now they’ve thrown down the gauntlet. I think the appellate lawyers are now going to have a word next time they get on here
Hope Shimabuku:
We have appellate lawyers at the USPTO. That’s
Rocky Dhir:
True. So it’s
Hope Shimabuku:
Super nerdy, double nerdy at that point in time.
Rocky Dhir:
Oh yeah, yeah, yeah.
Hope Shimabuku:
They argue and they defend our cases before the Supreme Court as well as the Fed Circuit.
Rocky Dhir:
Oh dang. Okay. These are people, I probably wouldn’t be able to understand them if I was in the same room. Like, wait, what? Excuse me.
Leela Madan:
They would not be in the same room as you though because they’re extremely introverted and prefer to be found in their office with the door closed.
Rocky Dhir:
That is Leela’s very nice way of saying they’re too smart to be seen in the same room as me. I get it. I understand. Alright, so I got to tell you, this is how out of touch I am. Okay. I’ve never watched Black Mirror, I’m going to confess. So what is this show about? Can somebody walk me through what’s the premise behind the show?
Catherine Rifai:
I think the best explanation is it’s similar to Twilight Zone, but with modern technology.
Rocky Dhir:
Oh, I love Twilight Zone.
Catherine Rifai:
Each episode takes a different scenario and kind of takes it to its most terrifying conclusion. And it’s a lot of AI, a lot of deepfakes, a lot of things that pop up that honestly we’re going to start seeing here in the next few years in our practice.
Leela Madan:
And we’re already seeing it. It’s technology that in many cases is real already. Yeah,
Rocky Dhir:
It sounds like your premise is that Black Mirror is supposed to be fictional, but it’s really got elements of reality mixed into it. Is that
Catherine Rifai:
Absolutely, yeah. We found some instances where the technology we talked about that we thought was so far off and so strange is actually happening.
Leela Madan:
So for example, in one of the episodes that we will discuss today, there is a company that is doing what the episode fantasizes.
Rocky Dhir:
You know what, let’s just get into this. So when you guys were talking here at the annual meeting, y’all were talking about ethics, right? The ethical implications of Black Mirror and these AI types of scenarios. So let’s talk first. I understand there’s a show or episode called Hated in the Nation, which honestly it sounds like my biopic, but okay, whatever. So Hated in the Nation, what was this episode about? And what do we learn when it comes to AI and ethics and lawyers?
Hope Shimabuku:
Well, Hated in the Nation actually is not specifically addressing AI. Hated in the Nation, I would call it cancel culture on steroids. And so what happens is that in this particular episode, there is a technology, a drone technology, that has been developed, and that drone is shaped like a
Rocky Dhir:
Bee. And actually, Hope, I need to interrupt. For anybody who is just tuning in, spoiler alert: you might learn the endings of some of these episodes. We’re going to talk about three of ’em. So don’t get upset. If that’s what you want to do, go back and watch Hated in the Nation, Joan Is Awful, and Be Right Back, and then you can come back and tune right back in where you left off. Hope, I’m so sorry. Please
Hope Shimabuku:
Continue. That’s okay. That’s okay. That was a great spoiler alert. And so with this, the drone technology has been developed, as I mentioned. And what happens is that there’s a hacker, and what he has done is that he has asked people to post this hashtag, #DeathTo, and you’ll put the name of the person as well as a picture of the person. And at the end of the day, by five o’clock, the person who gets the most votes will be killed, and they’re killed by this specific drone technology. The bee will come and find them. And so they do this, and the clock resets every midnight. And at the end of the entire show, one of the ironic things is that anyone who actually voted, who had inputted into #DeathTo whoever, they also ended up getting killed by the drones at the end of the show.
Rocky Dhir:
So the voters as well as the target, everybody gets killed.
Hope Shimabuku:
Yes. And so like I said, it’s cancel culture on steroids, or taken to the extreme, as I would like to call it. And so some of the things that we were looking at from an ethical standpoint, there are a lot of questions that are actually arising with respect to that. That’s
Rocky Dhir:
A great way to get an extension on a deadline though. I mean it’s like
Hope Shimabuku:
It is.
Rocky Dhir:
Okay, opposing counsel, guess what? Yeah, okay.
Hope Shimabuku:
Right. And speaking of which, one of the ethical rules that we talk about is the rights of a third person. A lot of times as an attorney we think about the rights and our obligations to the client themselves, but we don’t think about our obligations to a third person. And so specifically, a lawyer or practitioner cannot use any means to embarrass, delay, or burden a third person or violate the legal rights of such a person. And specifically, they should not present or threaten criminal or disciplinary charges to gain an advantage in a civil matter. And so these are things that you could think about, in which, hey, if you actually put death to, right, if you’re participating in this, are you actually threatening the rights of a third person? I mean, that’s a really great question, and possibly the answer at the end of the day is yes.
One of the other questions that we have to ask is, in light of that, is there misconduct if you do participate? Is there ancillary misconduct that you are participating in? Because a lawyer and practitioner shall not knowingly violate these rules, knowingly assist or induce another to violate these rules, or do so through the acts of another. And so if you are voting as a lawyer and you actually contribute to this, are you also committing misconduct, because you are actually contributing possibly to the murder of someone, or to a criminal action of someone else who is acting on your vote or your behalf? And so those are some of the things that we were talking about.
Rocky Dhir:
But then in the actual law practice context, what can we liken this to? Is there an analogy? Because I don’t think too many lawyers are going to be saying hashtag death to opposing counsel. They might want to, but it doesn’t usually happen. What’s kind of the real-world corollary, maybe, that might
Hope Shimabuku:
Arise? That’s a really good question. I mean, I don’t think that we would specifically say, hey, death to, but the question is when
Rocky Dhir:
You’re sanctions upon,
Hope Shimabuku:
Right? Or when you’re liking something or you are doing some sort of hashtag on social media, the question really is, what is your moral obligation as you are liking something, you’re reviewing something, and you actually do comment on that?
Rocky Dhir:
Do
Hope Shimabuku:
You have some sort of ethical obligation if there is some result, it may not be as farfetched as someone dying, but do you have some obligation at the end of the day with what you are liking and commenting on
Rocky Dhir:
To balance and temper your judgments. Interesting. Okay. Okay, interesting. So that’s Hated in the Nation. Now there’s another one. I dunno why everybody in the nation is hating on Joan, but apparently there’s another episode called Joan Is Awful. So what did Joan do?
Catherine Rifai:
Joan just tried to live her life, and there’s a streaming service that really took advantage of her. So Joan is just kind of your average nine-to-five employee. You open the episode going through her day: she goes into work, she’s singing along with Cardi B, she has to fire someone because she’s a manager at her job. She goes to a therapy session, goes about her day, comes home, sits on the couch with her fiance, opens up the streaming network, which they call Streamberry in the show
Rocky Dhir:
Streamberry?
Catherine Rifai:
Streamberry.
Rocky Dhir:
Yeah. Okay.
Catherine Rifai:
And opens it up, looks for something to watch, and popping up is the show Joan Is Awful, and it’s the likeness of Salma Hayek going through her entire day. So everything we just saw her do gets played out for her on that show. So
Rocky Dhir:
It’s Salma Hayek as Joan doing exactly what Joan did that day. Correct. And I wonder, did the makers of the show have to trademark Streamberry?
Catherine Rifai:
That’s actually a really key point to the show. She goes and she consults a lawyer, I think as we all would in this situation, to say, what are my rights? How do we stop this? How do we get my whole life off of this TV
Rocky Dhir:
Show? I don’t know, how do I make money off of this? I want to monetize this.
Catherine Rifai:
I think that’s an important thing they didn’t explore here.
Rocky Dhir:
No such thing as bad publicity.
Catherine Rifai:
Exactly. Except in this case there is some bad
Rocky Dhir:
Publicity going on. She’s getting hated by the nation. Yeah,
Catherine Rifai:
Exactly. They all tie in together. So she goes and she consults with a lawyer, and the lawyer describes the technology that Streamberry is using as the super advanced deepfake quantum computer mumbo jumbo.
Rocky Dhir:
So
Catherine Rifai:
Basically says that when you signed up for Streamberry, you signed away all of your rights. They are fully allowed to do this and there’s nothing the lawyer can do, essentially, which I think we’ve all questioned. That makes no sense; there’s plenty you could do in this situation. But from an ethical standpoint, this lawyer consultation comes in because, as lawyers, we have a duty, especially when it comes to AI as it’s growing, to be reasonably informed of what these technologies can do and how they’re working. So as this lawyer is explaining to Joan how Streamberry is gathering her information, which turns out to be from cell phones, from video footage, from online emails, everything that she does day to day. They’re gathering that information and then immediately making this TV show about her life. And as this lawyer is describing this to her, the lawyer’s cell phone is right next to her in the room. So
Rocky Dhir:
The attorney-client privilege is getting all… wow.
Catherine Rifai:
Everything the attorney is saying, Streamberry is hearing. And then lo and behold, the show later that night shows her attorney consultation. So obviously that lawyer, I think, breached some ethical duties. No matter what the technology is or when it comes up, that was a pretty obvious one: not to have your phone in the room while that’s going on. The other thing this brings up is Salma Hayek’s likeness. So it’s not actually Salma Hayek going into a studio acting this out. It’s using her likeness that’s been licensed to the studio, and it’s a deepfake technology.
Rocky Dhir:
They probably got her off of Streamberry because she subscribed to it. There you go. Oh, is that what happened? I’m just guessing. I’ve not even seen the show. Okay, I know.
Catherine Rifai:
So they broadly
Rocky Dhir:
Say, I should be a producer. I should be a Hollywood producer. I’m so good at this.
Catherine Rifai:
Unfortunately they didn’t let us review the terms and conditions, but they broadly say that she licensed her rights to the show and that’s why they’re allowed to use this. Now, Salma Hayek is not acting this out on a day-to-day basis. And this obviously brings us to deepfakes, which is the technology where someone’s likeness is being used so that when you see a video, it looks like someone’s speaking, but it’s not really them, and it’s usually used in a malicious way. And we’re seeing that technology evolve at a scary rate.
Rocky Dhir:
It’s happening I think even in presidential campaigns. Exactly
Catherine Rifai:
Right. And there’s not a clear way to determine when something’s a deepfake right now. So there are general regulations out there that are trying to be put in place. Tech companies are saying they’re trying to put invisible watermarks onto these, and ways that we can authenticate videos or see if they’re deepfakes. Now, the problem is these are currently not very tamperproof. Okay. So it’s almost impossible for us as laypeople, and even IT experts, to determine when something’s a deepfake. And where this comes up for us as attorneys is when we’re introducing evidence into court. It’s going to become increasingly difficult to authenticate the evidence that we’re introducing into court. And we have a supervisory duty as lawyers to make sure that we’re using reasonable efforts to make sure that what we’re presenting is correct.
Rocky Dhir:
So here’s a question that, I dunno if it’s ever come up in these discussions, but if we’re talking about lawyers’ ethical duties: do you think there might’ve been a breach from, say, the lawyer who wrote and drafted the terms and conditions for Streamberry? Could he or she have maybe committed some ethical violations by putting these into print, knowing that people are going to sign this, are going to accept the terms and conditions without reading them? Could we foresee someday maybe trying to say that in and of itself is an ethical breach? Because now, going back again to Hope’s episode, you’re harming people knowing full well that they won’t have the capacity or the time or the inclination to read all these terms and conditions. I dunno if that’s come up in the conversations, but I wonder if that’s a way to control this AI as it moves forward.
Catherine Rifai:
Yeah, I imagine that’s an argument that will come up at some point. But then we have the flip side of that, which is that we have a duty to our client to try and reasonably represent them as well. So if this is what they’re asking for, can we say no? Where’s the line that we’re drawing here?
Rocky Dhir:
This is going to get interesting.
Catherine Rifai:
In our bigger presentation we were talking about criminal acts as well and not acting on that. So honestly it touches on so many things in this episode, it’s hard to condense it down.
Rocky Dhir:
Absolutely. So speaking of condensing down, I wish we could keep talking about how awful Joan is, but before we go anywhere, we’re going to be right back. That’s the next episode; it’s called Be Right Back. Alright, so Leela, tell us about this episode and the ethics you guys were discussing.
Leela Madan:
So Be Right Back is an episode where a young woman’s boyfriend dies in a car accident, and she is very upset about his death. She finds out that she’s pregnant, and she’s really struggling with this grief of losing her guy. So she employs an AI bot, essentially, to read all of his emails and read his social media stuff and look at his pictures, and it creates an AI version of him. Version one of him is just a chatbot that she chats with, and it’s got his personality and it tells his jokes, and she enjoys chatting with him. And then the tool says to her, version two, you can call me. And so she starts having phone calls with this AI version of him. And then version three is a video call, like a FaceTime, and it looks like him, and she even remarks, oh, you look so good. And then as this tool progresses, it lets her know, hey, there’s actually a version where you can build an android doll that looks, walks, talks, and acts like him, and she goes for it. So now she’s got this very tall (he was like six foot two or something) doll that she has got in her house that is
Rocky Dhir:
Basically me. I’m making one of those for myself. I can be taller and better looking and never age. Okay.
Leela Madan:
And the scary thing is that, I know this sounds really farfetched, but I did come across a company that’s actually doing this. It’s an AI tool that you can personally opt into to create a legacy of who you are after you pass away. So there’s a first couple who is opting into this. The husband has been diagnosed, I think, with pancreatic cancer, and he wants to stick around, I guess, and his wife is okay with that. And that’s a real-life company that’s doing that right now. And on their website they talk all about ethics and how you can only opt into this for your own self and not for another person, like your dead ex-boyfriend, as the girl in the episode did. But I question just the ethics overall. And anyway, let’s go into the ethics of what we talked about today in our presentation. Sure. So we talked about competency and communication and confidentiality in particular. So personally, I think that as attorneys, if we’re going to employ AI technology, well, we’ve all heard about the New York attorney who used AI to draft his brief, right? Yes. So I think a big part of that is that we have a duty to communicate to our clients that we’re using those AI tools, and part of that communication is not just, hey, we’re using this tool, but here are the potential risks involved.
And we also have a duty of competency. We have to understand the technology that we’re using. If we’re just blindly going in and using a tool where we don’t know what it’s doing with the information that we input, or whether it’s potentially breaching confidentiality or hallucinating, right? We need to communicate that and understand that and protect our clients’ rights in those cases. So the attorney in the New York case, which is a real case, explained to the judge that he didn’t understand what ChatGPT was, that he thought it was a super search engine, and the judge basically said, that’s not an excuse, and sanctioned him, his law partner, and his firm, and made him apologize to all of the judges whose names were used in connection with fictitious cases. And I think in that case it had a negative impact on his client, because it ultimately resulted in the client missing out on a statute of limitations filing. Right, right.
Rocky Dhir:
So again, this kind of brings me back to this overarching question for all three of you as IP lawyers, and as folks that have been thinking about the ethics of IP law. Obviously, if you’re fighting against the AI, the services we talked about (like Cat, you talked about where somebody’s getting cheated by the AI through Streamberry in that episode), as lawyers representing the victim, we would be fighting against the company that’s employing the AI. What do you think are, or ought to be, the ethical obligations of the lawyers representing these AI-type companies? Should they be acting more responsibly, or should legal authorities and the bar associations be putting guardrails on what attorneys are allowed to do in terms of representing these clients? So for example, terms and conditions: should we be saying, look, if you’re going to put terms and conditions like these in, then you’re violating the public at large by effectively perpetrating a fraud on people? Because in these episodes we’re talking about, somebody’s getting defrauded, right? So is this going to change the way ethics is thought about in the
Catherine Rifai:
Country? So this is a developing area. Texas right now has what they’re calling the TRAIL committee, which is a task force that was appointed by the state bar to explore the overlap of AI and the law and specifically put forth a report that’s going to outline where we can strengthen our own ethics rules and address these a little bit better when they come up. They did put out a preliminary report in January of 2024, I believe, and they’re going to follow up with their official recommendations at some point; we don’t know when. The ABA already put some guidance out regarding AI in 2019, and now we’ve seen Florida and California put forth some guidance as well. So I think from our state bars we’re getting a little bit of guidance on how this is going to develop, but so much of this, I think, is going to come from individual ethics opinions, and it’s going to be a really long process for the law to catch up with where this technology is going.
Rocky Dhir:
Okay. So maybe final question, final question of the day is, aside from Black Mirror, for each of you, what is your favorite guilty pleasure when it comes to watching something on a streaming network? So like your favorite TV show, movie, whatever, to watch. What’s your favorite guilty pleasure? Hope we’ll start with you.
Hope Shimabuku:
Well, right now I’m really into K-dramas, Korean dramas, and Netflix actually has a lot of them. And so I’ve watched an amazing number of them, and a lot of them actually are lawyer shows. And so I just finished the series called Stranger, which is an attorney prosecution show in which they’re investigating crimes.
Rocky Dhir:
Interesting. Okay. Leela, your turn.
Leela Madan:
My favorite show is Survivor. I am a Survivor Super fan.
Rocky Dhir:
Oh, this is, okay. You’re obsessed. You’re going back. Okay. This is,
Leela Madan:
I’m not going back. It’s still on. It’s still happening right now. They’re on season 46 right now, and they’re going to 50, baby. 46.
Rocky Dhir:
Wow. I didn’t even know that. I thought they got rescued and everything. I never watched Survivor, but I thought
Leela Madan:
It’s amazing. And there are a lot of attorneys who go on the show and a lot of law students who go on the show,
Rocky Dhir:
The ultimate survivor is the show survivor. It’s gone this long. That’s amazing. Okay, Cat, your turn.
Catherine Rifai:
Well, actually, Leela and I were just talking this morning about Bridgerton. I think that’s a bit of a guilty pleasure show. And then honestly, I have a bunch of shows that I rewatch. I don’t know about you guys, but I can rewatch New Girl a million times and never get tired of it.
Rocky Dhir:
Okay. Okay. Well, I have to say I’m a little disappointed that nobody mentioned either Ted Lasso or Cobra Kai, but maybe that’s just
Catherine Rifai:
Ted Lasso.
Rocky Dhir:
Ted Lasso. Yeah,
Catherine Rifai:
I’ll add that to my list.
Rocky Dhir:
That’s a good one. Well, unfortunately that is all the time we have for this installment of the State Bar of Texas podcast. But do not be a goldfish. Don’t forget what you learned here in the next 10 seconds. Okay. Remember it. I want to thank our guests. Thank you, all three of you, for joining us today.
Hope Shimabuku:
Thank you.
Catherine Rifai:
Thank you, Rocky. Thank you so much for having us.
Rocky Dhir:
Absolutely. And also, I want to thank you, our listeners for tuning in. If you like what you heard, please rate and review us wherever you get your podcasts. I’m Rocky Dhir. Until next time, thanks for listening.