How easy has it become to fake photos, video, audio, and other data, and how will this affect your practice of law? In this edition of the Kennedy-Mighell Report, hosts Dennis Kennedy and Tom Mighell discuss what lawyers need to know about the deep fake phenomenon. They canvass the increasingly sophisticated fakes cropping up online and give their take on the implications of this issue for the legal industry. In their second segment, Dennis and Tom respond to comments they received on their previous episode about digital mindfulness.
As always, stay tuned for the parting shots, that one tip, website, or observation you can use the second the podcast ends.
Have a technology question for Dennis and Tom? Call their Tech Question Hotline at 720-441-6820 for the answers to your most burning tech questions.
The Kennedy-Mighell Report
Deep Fakes: Preparing Lawyers to Combat Counterfeits
Intro: Web 2.0, Innovation, Trend, Collaboration, Software, Metadata… Got the world turning as fast as it can, hear how technology can help, legally speaking with two of the top legal technology experts, authors and lawyers, Dennis Kennedy and Tom Mighell. Welcome to The Kennedy-Mighell Report here on the Legal Talk Network.
Dennis Kennedy: And welcome to Episode #238 of The Kennedy-Mighell Report. I am Dennis Kennedy in Ann Arbor.
Tom Mighell: And I am Tom Mighell in Dallas. Before we get started, we would like to thank our sponsors.
Dennis Kennedy: First of all, thanks to TextExpander for sponsoring our show. Communicate Smarter with TextExpander. Gather, Perfect, and Share Your Knowledge. Recall your best words instantly and repeatedly. Learn more at textexpander.com/podcast.
Tom Mighell: And we would also like to thank ServeNow, a nationwide network of trusted, prescreened process servers. Work with the most professional process servers who have experience with high-volume serves, embrace technology, and understand the litigation process. Visit serve-now.com to learn more.
Dennis Kennedy: Well, in our last episode we talked about Tom’s notion of digital mindfulness as opposed to digital sabbaticals, detoxes and just plain quitting social media and digital altogether. I was personally happy to work in a Marie Kondo reference before Tom did in that episode.
In this episode however we want to take a look at the implications of how easy it has become to fake photos, to fake video, and to fake other data and how it will probably get even easier and how that might impact you and also the practice of law.
Tom, what’s all on our agenda for this episode?
Tom Mighell: Well, Dennis, in this edition of The Kennedy-Mighell Report, we will indeed be discussing deep fakes; fake photos, fake videos, other types of fakes and how we think the technologically competent lawyer might need to think about them and get prepared.
In our second segment we are going to reply to some comments about our digital mindfulness episode. Remember that we love comments; we encourage you to send them to us, either by email or by our phone line, our voicemail, which we will talk about at the end. And as usual we will finish up with our parting shots, that one tip, website, or observation that you can start to use the second that this podcast is over.
But first up, deep fakes, that’s a term, it’s a real term, it’s something that really exists, but what’s interesting is it seems like only yesterday we were talking about fake news and now that seems kind of like old news, because deep fakes have a potential to be a much bigger problem. They are fake photos, fake videos, fake audio that is starting to show up all over the place, and what’s interesting is that artificial intelligence is playing a role in this phenomenon.
Dennis, what made you think that now is the time to start talking about the deep fake issue?
Dennis Kennedy: Well, there were two things and they are sort of at the opposite ends of the spectrum. So one was there was — I saw this thing on Twitter where somebody had taken the Mona Lisa and used this AI programming to turn it into a smiling video, and that’s sort of the tip of the iceberg of what people are starting to do with video. So that was on the one end.
And the other, our friend Craig Ball had a recent post about the war for E-Discovery being over and that E-Discovery education needs to focus on the basics, and he was sort of bemoaning the fact that conferences went to these very esoteric E-Discovery issues instead of focusing on the basics.
So I think that that — it’s that sort of continuum that got me thinking Tom, because I feel that lawyers especially look at this world of email and documents and there is all these other things going on, where there is more video, there is more audio, there is more photos, and we used to say the picture was the proof, but we can’t rely on that anymore, and in fact, we probably couldn’t have relied on it for at least the past maybe 30 years, but we have.
But now, as you said, you could do even more, so it really leads you to question how much confidence you have in what you see and then what you need to know to determine whether something can be relied on. So that’s what got me thinking about this Tom.
Tom Mighell: I want to take a quick detour and veer off into Craig Ball's post for a minute, because I am not totally sure I read it the same way. When you put it the way that you put it, that things are moving into more esoteric topics, well, he did mention that some of these conferences are talking more about AI and blockchain and other things that are not really strictly related to E-Discovery but are more interesting offshoots.
I read that article in a slightly different way that initially I didn’t catch the connection you were making with it. And I totally agree that we are moving beyond documents and that’s why we are talking about the whole deep fake issue today. I don’t think I agree that the focus on email and documents may be misplaced. I think that there is still plenty of room there.
When I saw Craig's article, and this is my quick detour and we will come right back to deep fakes in a second, his article was really making the point that this is something that happens when new regulations or new rules appear to deal with something: industry inevitably turns its attention to minimizing it, to figuring out how can we not have to do all this stuff. Instead of complying with discovery rules and learning how to do things, let's figure out how to make the problem go away in the first place. Let's figure out how to utilize proportionality to make sure that we are doing the right thing. Here is how to negotiate things so we don't have to produce anything at all.
I totally get his point and I think that that is to me one of the bigger issues. I think it’s going to happen with privacy the same way that it’s happened with E-Discovery. I don’t know that it’s totally related to this subject. I guess I would come back to you and say Dennis, I don’t see the connection, although I totally agree with Craig’s point.
Dennis Kennedy: Well, Craig’s post has a lot of depth to it and in a way I am creating a sort of fake post, if you will, by emphasizing what I chose to emphasize, but I think that there is — when I hear lawyers talk about what they are doing in E-Discovery, you are right, they are trying to avoid the difficult issues.
And what I am saying is that when you think about that and you are saying oh, there are all these difficult things out there like AI and blockchain and why can’t we just focus on the basics, this came back to me, this is like hey, there is some basics out there that are being called into question and probably will even more so and what assumptions are we allowed to make and what gives us the confidence that something we are looking at, which used to be kind of the gold standard, like saying if you don’t have a selfie of it or you don’t have a photo of it, it didn’t happen. Now, the question is, if you do have that photo, how can you be sure that it happened?
So part of what this is with the fake photos is asking, okay, what is it that we can rely on, and what do we need to do about that? And if my technology competence is in sort of the basics of E-Discovery, how broad does that have to be? Because I think you do need to know something about photos and videos that's probably beyond what most lawyers expect these days.
Tom Mighell: Well, and I think we are 100% back on track. I think that the whole deep fake issue and the idea of being able to fake photos and videos has elevated photos and videos from what we would have thought was basic to complicated, and to potentially a big problem in litigation and as part of evidentiary considerations.
So yeah, I totally agree with you there. I mean do we want to start kind of with, I hate to say it, with the basics of deep fakes and kind of how easy it is to get it done and then kind of move into the more complicated stuff?
Dennis Kennedy: Well, let's just talk about some super basic stuff. I think about just how easy it is these days on your phone to take a picture and then edit it, enhance it, filter it, crop it and do other things with it, just on your phone, right after you have taken that picture. It's just kind of like swipe, swipe, do this, do that, and you have a completely different photo than what actually happened. It's so incredibly easy, and that's just the basics.
And then Tom, I think as you step it up a level, you are talking about, you go into the Photoshop and the more detailed editing things, where you can almost make magic happen.
Tom Mighell: Yes, using Photoshop to alter an image is often successful, but I would say that depending on your skill using it, it may not come across that way, it may not be as easy. I think we are going to find the same with some of these deep fakes that we are going to talk about, but I think that a lot of it depends on the talent of the person who is doing it.
And I think the real fear that we should have is that somebody really talented gets a hold of some software and some tools that are really advanced and starts to make things that are really believable but also completely fake. I think that’s where we have the problem.
I think that when you are dealing with tools like the things on your smartphone, and to a certain extent Photoshop, it's easier to tell fakes that way than it is with some of these new methods that we are talking about, using artificial intelligence, which we will talk about a little bit later. I think that makes it difficult. But to be quite honest, Photoshop has been around a long time, and we don't hear a lot of stories, I am not aware of any frankly, there may be some, where Photoshopped pictures were used as evidence in cases. I just don't hear that that sort of thing is happening.
So even though that’s relatively, maybe not simple, but anyone can get Photoshop, anyone can use it, so it’s available to anybody, I would think that we would be seeing more of that happening if it were really something that could be successful.
Dennis Kennedy: Well, I think — so we rely on a couple of things. So one is we have our ethical rules which would say it is totally wrong to use Photoshop in a bad way. But it’s not that far of a stretch from saying, let’s just make this a little more clear and let’s take this out of the picture and let’s change the lighting here just to make it look better to actually really distorting the story.
So there are sort of two developments that have happened, or are happening, that have really brought this into play. One is that I have seen in the last year or so, in this highly partisan world we live in, that photojournalists have kind of hurt their cause, because they will have an image that's very affecting, very emotional, sort of like the classic photojournalist photo, and then there will be people at the scene who photographed the whole setup and showed the whole scene. You see that the image was really cropped and done in a way that really distorted what was happening there. So I think we have lost confidence in some of the traditional photography.
Then I think this other thing is just starting to happen, and Tom, I always have the sense you are more of a camera person than I am, but the new cameras are moving toward just grabbing all the data about a scene that would be in the image. So there is not an actual image that you start with; you create the image from all that raw data.
And so there is this notion that instead of just like mirroring the scene or capturing the scene, as a photographer you are starting to create what the scene would be with the data there so you can make things look better or sharp or all of those things. And again, it becomes — as the technology happens in photos, then I think there is a certain amount of editing and changing that we are going to start to take for granted and there is this line that we need to be aware of.
And then you would also want to say like how do I know if somebody is using Photoshop, like you say, I am not aware of anything either, but that doesn’t — I mean part of that reason is that people don’t know how to trace that. So there are a couple of things happening out there before we get the video even that are changing our approach to how much we rely on photos.
Tom Mighell: And I will say, you bring up the ethical rules as a potential barrier, but I wasn't even thinking about lawyers altering photos; I was thinking about clients bringing in photos that have been altered. To me, a client bringing a digital photo to a lawyer and saying here is what happened stands the better chance of being the scenario. I can see that happening more often than the lawyer doing it.
But I will say I have been in positions where lawyers have taken videos in the past. I mean it doesn’t need any fancy technology, you shoot the scene or you take the photo to best advantage your position, and that’s something that lawyers have been doing forever. And I think there is a difference there between advantaging your position and making it in the best light and faking something, something that wasn’t there. I think those are two very different things.
You make a very good point about the abilities of cameras, and I am thinking more of smartphone cameras these days, because the best example really is the Google Pixel phone. The camera hardware on a Pixel phone is not any better than the camera in any other phone, any iPhone or anything else, but it's generally considered that the software on the Pixel phone is really what makes the camera so good. It takes in all the information, decides how it wants to compose the picture, and provides the most pleasing delivery of pixels it can to show the right picture. And I think when we give over that much control to the software, there are all sorts of things that can happen.
So maybe let's transition a little bit now and talk about video, and I think the same ideas are going to apply, but here is what's really different with video, something that I imagine is happening with photos too but is going to happen more often with video, because you need something smarter than software to do it: instead of sitting in Photoshop and manually setting up a revised video, artificial intelligence is actually taking it to a new level, and you are training artificial intelligence to change a video for you. And in many cases, as you see with that Mona Lisa, pretty realistically, doing something that actually looks like the Mona Lisa is smiling.
Now, there are some examples of fake videos that aren’t quite this sophisticated, they aren’t using artificial intelligence. At the time that we are recording this episode, there was a video circulating of the Speaker of the House who appeared to be drunk and slurring her words. And it was pointed out and kind of debunked rather quickly that they had basically taken that video and slowed it down just enough to make it look like it was still going in regular time, but slow enough to where it did look like she was slurring her words.
And so some are sophisticated and some are less sophisticated, I think artificial intelligence makes the possibility that there is going to be — there are going to be more sophisticated videos that are capable of popping up in the future.
Dennis Kennedy: So a couple of things. One is, Tom, I remember, God, it was probably like 10 years ago, we were being videoed and we had a script that you and I were following, and I can't remember whether we were reading it or we had memorized it, but somebody was taking a video of us, and I did this whole long thing and I messed up the very last sentence. And you were running late to something, so I irritated you even more than usual by doing this.
And I am thinking, this is like audio, I can just do the last sentence over. But we couldn't; we had to do the whole thing, because you couldn't splice the video together in quite that way without a very sophisticated arrangement.
So I sort of feel what's different is, and I don't know whether it's AI, but we do have all these tools out there that give you the ability to do what I think of as smoothing, so that you can do something with video where it's not herky-jerky like we think of with film, but it's very smooth. And so I can speed things up, slow things down, do a little distortion, maybe do something even more than that, like you said with the Speaker of the House, and it's just not noticeable.
And so I think that aspect of the video editing technology, and you put that together with CGI, where we could create this whole thing in front of a green screen to have somebody appear as if they were in a place they weren’t, it just raises all kinds of questions.
And then to touch back on what you were saying, there is this concern that you have a client who is bringing you a photo or a video, say one that shows a spouse with someone they are alleged to be having an affair with, and it's very obvious the two people are in the same picture, and then you find out later that the person has been Photoshopped in from something else, or something else has been taken out of that picture. So all this video editing technology has now changed what's possible, and I guess we notice it less when we watch.
Tom Mighell: Well, and one of the more interesting sites that came up in the past six or so months is, if you just go in and google or just actually go to thispersondoesnotexist.com, you will be presented with a photo, and you refresh the screen, and it’s a new photo every time you refresh the screen. None of those photos are of real people. Those are photos that were generated using artificial intelligence. And some of them you can see a little bit, I mean some of them are not absolutely perfect, but most of them are close enough, and the environment in which their faces appear, like maybe as part of a group photo, some of the details make it look real to you, whether it’s real or not, they have managed to make the photo look real.
I think that for evidentiary purposes it really could be a huge issue. I mean, so far there hasn't been that one video that really has done it, a piece of fake information that has really altered the public's trust in video or photos as evidence. But frankly, one single convincing video could make it very difficult in the future for us to trust what we see on the news or what we see in the courtroom, and I think it's really kind of a worrying situation, because the tools are getting better and because the people who can wield those tools are probably getting smarter at using them.
So I guess, Dennis, how do you fight against it? How do you show something is fake? What are lawyers supposed to do to make sure that the evidence that gets put in against them isn’t a fake video or a fake photo?
Dennis Kennedy: Well, this is where I think the whole technology competence notion comes in. So you would ask: what do I need to learn, who do I have to engage, do I need a forensics person, what are the telltale signs (at a minimum, that would be one thing you're looking at), and what's the chain of custody, which we may talk about a little bit more in a second, Tom.
But I think you get into some of those things where you say, what is it that I really need to know, and then you become aware of just what you can do yourself. It's pretty amazing how, just with the simple editing tools on your smartphone, you can not only create a better picture and filter things so it looks different, but you can crop things, take people out of shots, and do other things; it's super-easy, and it changes the information that comes from that photo.
And I sort of think that as a profession and probably the courts as well need to kind of step back and say, okay, what’s in these situations like who should have the burden of proof on showing that something is fake or that it’s real, does that change giving sort of the background and circumstances of what’s going on in a case. Is it different in civil versus criminal? I mean, like a fake video in a criminal case could have devastating consequences.
So I think there is a whole range of things where technology has once again moved way ahead of where the law is, and it's going to be difficult, but I think the lawyers who can step up to this are going to have, as they had in the early days of eDiscovery, a big advantage.
Tom Mighell: I agree and there are a couple of other things that I was thinking about could be used to show that something is fake. Some of these lawyers can use today, some of these may be coming in the future to help lawyers to the extent that they need it. One of the ideas would be if you’re looking at a photo, it’s possible that that photo was taken from something that’s already available out on the Internet and there are reverse image search engines where you can plug that photo in and it’ll go look for similar images and bring them back.
You may be able to find a fake that way; that's happening today. Sometimes, magnification of the image can show differences. Another thing is that the government, the Defense Department in particular, is working a lot on deep fakes, and the thinking is that if you can use AI to create a deep fake, then why not use AI to detect the deep fake.
And what they are doing is they are feeding them real videos, they are comparing them against fakes, and they are training them to spot the differences and it turns out that those differences tend to be around the edges of the face where you put somebody’s face on somebody — on another person, there are digital errors around the edges of the face that are hard to smooth over completely. And an AI can detect it where a human doesn’t necessarily have the ability.
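The reverse image search idea mentioned a moment ago typically rests on perceptual hashing: reducing an image to a short fingerprint that survives light edits like brightening or recompression, so near-duplicates can be matched. Here is a minimal sketch of one such technique, a difference hash; the pixel grids are made-up stand-ins for downscaled grayscale images, and real search engines use far more robust features than this.

```python
def dhash(pixels):
    """pixels: rows of grayscale values, each row one longer than the
    hash width. Each bit records whether brightness rises or falls
    between horizontal neighbors."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left < right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; a small distance suggests the images
    are near-duplicates (e.g., one is a recolored copy of the other)."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x5 "image" and a uniformly brightened copy of it.
original = [[10, 20, 30, 25, 40],
            [12, 18, 33, 21, 44],
            [50, 40, 30, 35, 20],
            [15, 25, 20, 30, 10]]
brighter = [[v + 5 for v in row] for row in original]

# Brightening preserves every neighbor comparison, so the hashes match.
assert hamming(dhash(original), dhash(brighter)) == 0
```

Because the hash encodes relative brightness rather than absolute pixel values, simple global edits leave it unchanged, while pasting in different content flips many bits.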
One thing that is very difficult in a fake video is lip synching, getting the actual motion of the words down correctly, and so that may be one way to tell a fake. It's been suggested that some fakes can be detected using forensics.
Dennis, you mentioned forensics, and that's right: just like an electronic document has its own unique hash identifier, photos and videos ought to have metadata that could be examined to determine if a file was real or if it had been altered somehow. One potential fix that I heard about was that you could embed some type of cryptography into cameras that would make it relatively simple to verify footage.
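The two verification ideas here, a hash that reveals any alteration of a file's bytes and device-embedded cryptography that vouches for the source, can be sketched with Python's standard hashlib and hmac modules. The footage bytes and the device key below are made up for illustration; a real camera scheme would more likely use public-key signatures than a shared HMAC key.

```python
import hashlib
import hmac

footage = b"\x00\x01frame-data..."  # stand-in for a video file's bytes

# A cryptographic hash acts like the "unique hash identifier" of a
# document: any change to the bytes produces a different fingerprint.
fingerprint = hashlib.sha256(footage).hexdigest()
tampered = footage + b"\x00"
assert hashlib.sha256(tampered).hexdigest() != fingerprint

# Camera-side: sign the footage with a key embedded in the device
# (hypothetical key, standing in for the camera-cryptography idea).
camera_key = b"secret-key-burned-into-camera"
tag = hmac.new(camera_key, footage, hashlib.sha256).digest()

# Verifier-side: recompute the tag and compare in constant time.
expected = hmac.new(camera_key, footage, hashlib.sha256).digest()
assert hmac.compare_digest(tag, expected)
```

The hash alone proves a file hasn't changed since the fingerprint was recorded; the keyed tag additionally ties the file to a device holding the key, which is the extra guarantee camera-embedded cryptography would aim for.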
Obviously, you’d have to make that a standard, all cameras would need to have that sort of thing so I don’t know that that’s ever something that’s going to gain traction but it’s one way to verify against fakes. I think we ought to consider all of these as open options until we kind of figure out what needs to happen but I think that the bottom line like you say is lawyers who know about this and who are thinking about ways to protect against it, already have a leg up on the situation.
Dennis Kennedy: Well, a lot of photos do have dates, timestamps, and location information, which raises its own set of privacy issues, so we do have that balance taking place. It's possible, I think, that you're looking at a chain of custody issue in some of these things.
So if you said we have a — it’s like a body-cam from a police officer, we want to know that nothing happened to that.
When you bring in chain of custody, Tom, you potentially have a blockchain use case over time to give some kind of certainty that something hasn't been altered. So there are some things out there, but I think we really are at the basics. And if somebody listening to this deals with photos all the time in cases and isn't aware of some of these editing tools, some of these developments, the metadata associated with photographs, this should be the wake-up call to say, hey, this is something I really need to get on top of, because it's going to happen and there are bad actors out there.
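The chain-of-custody use case Dennis raises is, at its core, a hash chain: each custody event records the hash of the previous entry, so altering any earlier entry breaks every later link. A real blockchain adds distribution and consensus on top of this; the custody events below are hypothetical.

```python
import hashlib

def add_entry(chain, event):
    """Append a custody event linked to the previous entry's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    entry = {"event": event, "prev": prev}
    entry["hash"] = hashlib.sha256((event + prev).encode()).hexdigest()
    chain.append(entry)

def verify(chain):
    """Recompute every link; any alteration upstream breaks the chain."""
    prev = "0" * 64
    for entry in chain:
        if entry["prev"] != prev:
            return False
        if entry["hash"] != hashlib.sha256((entry["event"] + prev).encode()).hexdigest():
            return False
        prev = entry["hash"]
    return True

chain = []
for event in ["officer uploads body-cam file",
              "lab receives copy",
              "expert reviews copy"]:
    add_entry(chain, event)

assert verify(chain)
chain[0]["event"] = "altered description"  # tamper with an early entry
assert not verify(chain)
```

This is why the structure suits evidence: you cannot quietly rewrite one step of the custody history without invalidating everything recorded after it.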
And we know that — well, we know, I have no doubt that in the election in 2020 we’re going to hear a lot of talk about videos and photos being faked. So we all need to become much more aware of that.
So I don't know, Tom, I think this is an area where, when technology and law come together, there are new career opportunities. There could be some interesting things that happen here for people who really like this stuff, whether it's forensics or some aspect of law; some things are opening up there.
So that's my thought. I guess, Tom, you can wrap up, but my thought is that these are really building issues that we're probably not paying enough attention to, and maybe at least spending a few minutes or one class in an eDiscovery course on photography and video would be useful for today's lawyers.
Tom Mighell: Yep, I'm actually going to head in a non-technology-based direction to wrap this up, which is one area that may prove effective: it's one thing for ethical rules to prevent lawyers from doing things like this; it's another thing if you criminalize the practice, so that if you introduce evidence that is fake, that is a crime you can be punished for.
Back a number of years ago, when I was a younger lawyer in Texas, there was a group here that had basically set up their own courts and issued their own judgments. While they weren't doing anything but holding their own hearings at the Airport Holiday Inn, it wasn't causing a lot of problems, but then they actually started filing these fake judgments with the county clerk, and those filings became liens, and it caused lots of people lots of problems to have to deal with those liens. The minute the Legislature criminalized filing fake documents in court, it stopped completely; they stopped doing it. And this is just a different kind of fake.
So maybe criminalizing it also would help, although this is a technology podcast, don’t count on that happening so make sure you do what you need to do to keep up with the latest technology on this issue.
All right, before we go to our next segment, let’s take a quick break for a message from our sponsors.
Advertiser: Looking for a process server you can trust, ServeNow.com is a nationwide network of local prescreened process servers. ServeNow works with the most professional process servers in the industry, connecting your firm with process servers who embrace technology, have experience with high volume serves, and understand the litigation process and rules of properly effectuating service. Find a prescreened process server today. Visit www.serve-now.com.
Dennis Kennedy: TextExpander is a productivity multiplier. Lawyers love TextExpander, because with a short abbreviation or search, while typing, TextExpander can produce cover emails for invoices or signing instructions, insert templates for consistent meeting notes, perform accurate date math on-the-fly, and instantly present things you retype all the time. TextExpander runs on Macs, iPhones, iPads and Windows and works in any application. Visit textexpander.com/podcast for 20% off your first year.
Tom Mighell: And now, let’s get back to The Kennedy-Mighell Report. I am Tom Mighell.
Dennis Kennedy: And I am Dennis Kennedy. We’d love getting comments and questions from our listeners about our show. Our friend Steve Embry listened to our digital mindfulness episode and wrote a blog post called ‘You Disagree With Kennedy & Mighell? Are You Out of Your Mind?’ That was the actual title.
Our running gag on the show is that Tom and I always start out trying to disagree with each other on the topic and then we end up on the same page, well except for maybe blockchain and Google.
So we thought we’d reply to Steve’s post in this segment and we thank him for writing the post.
So Tom, digital mindfulness was your idea, can you — can Steve possibly have anything to disagree with us about?
Tom Mighell: Well, and Steve, you're welcome to send in a voicemail at the number we're going to give you at the end here, but the way that I would interpret it is that Steve thought he disagreed with us, but I think he doesn't disagree with us. That's I guess the position that I'm going to take. And really what Steve is doing in his post is comparing his approach with what we talked about, which is that instead of going cold turkey you should find other ways to reduce your use of technology, and we talked about using some helpful tools on both iPhones and Android phones that can help you do that.
Steve's solution is that he completely unplugs every Sunday, and it's just a day without tech, and he enjoys that; it frees him up and it helps him connect with his family, and that's his method of the detox. When I think about that, to me that doesn't feel like the detox I hear others talk about, which is unplugging and going cold turkey for much longer than a day. To me a day doesn't feel like a detox; it feels like a break, and a healthy break at that, and there's nothing wrong with that in my opinion.
And I think what his point was and what it really revealed to me is there’s really no one-size-fits-all for how you unplug from technology. You do what works for you. If taking a day off makes sense, then take the day off. If using one of these apps to understand how you use it and adjusting your usage of technology accordingly, do that.
I think my point in the podcast before is that totally unplugging for long periods of time doesn't necessarily benefit you if there are benefits to be had from technology that don't require you to be looking at your screen 100% of the time.
So I will say, I agree with him, I don't have a disagreement, and I wish I could totally unplug on Sundays as well. I don't do it as much as I should, but more power to Steve for doing it.
Dennis Kennedy: Ah, I am going to agree with you, Tom. I mean, I think that Steve —
Tom Mighell: What? I am happy if that’s happening.
Dennis Kennedy: I mean he raises a good point but I think it felt like — it’s one of those things where you go, I think this was what we’re hoping to be the takeaway is that this is very personal to people. And Steve has come up with this great approach and I think if you feel more stressed by your digital life, then you’re going to have to take a more aggressive approach.
But, Tom, you also know that I hate this sort of one-size-fits-all, prescriptive, you-can't-do-this, you-can't-do-that, one true way of doing things, which usually just happens to be the way of the person telling you about it or writing about it; they want you to adopt it as well.
And so I think that’s difficult. You do kind of have to step back. There are people who say you should never look at an iPad or your smartphone in the bedroom, let alone in bed, and you go, I read at night and I read eBooks, so to me that’s a crazy notion. So it’s just a matter of stepping back. But I would also say that if I’m on Facebook and I have people stalking me and it’s truly unpleasant, then actually getting off of Facebook becomes an option. I would still probably address it with settings changes and the like myself, but the higher the level of stress, the more aggressive you want to be. So I think Steve has developed this nice approach that works for him right now, but I don’t know that it works for everybody. It’s a good reminder that there are many ways to come to terms with the different things that you do.
I still go back to saying, and I still feel, that watching television gets a free pass on this for most people; people spend a lot more time on TV than they are willing to admit.
So now it’s time for our parting shots, that one tip, website, or observation you can use the second this podcast ends. Tom, take it away.
Tom Mighell: All right, in the past week or so there was a conference that debuted kind of the latest in new laptop technology, and there were some crazy laptops announced. I’m going to talk about two things, one that was shown at this trade show and one that was not. The first one is the Asus ZenBook Pro Duo, and what’s interesting about it is that it has this beautiful full screen, but then on the bottom, where the keyboard should be, half of that space is dedicated to the keyboard and the upper half is actually an extension of the screen above.
So it’s designed to be kind of a second workspace. You can keep things there to refer to as you’re working on a document above, and you can move things from that workspace to the main monitor of the laptop. It is, I think, an ingenious use of space to create another monitor. I think it looks beautiful, and I can’t wait to see it when it comes out. It’s just been announced, so it hasn’t shipped yet, but seeing what this does makes me excited about the future of laptops.
By the same token, Microsoft is teasing a dual-screen Surface device that really looks a little bit like a real notebook. It looks like it has kind of a coiled binding, and you open it up and it’s got a screen on either side, like a double Kindle or something like that. We are starting to see foldable devices come out that aren’t quite living up to expectations, so I’m not sure if this is going to be exactly what we’re looking for, but I’ve got to tell you, I’m interested in new ways to have displays, and I think Microsoft and Asus are doing some interesting things here.
Dennis Kennedy: You know, that’s cool, because for some of the things you’re describing, especially the two screens bound together like a notebook, there have always been drawings and people playing with prototypes, going back many years, and it’s really cool that the technology has now reached the point where some of these things are actually possible. And again, it goes to that whole personalization thing: that might not be the right thing for me, but it could be perfect for somebody else, and it’s great to have those options.
So my parting shot goes back to something we’ve talked about in the past, and it’s this site called If This Then That, at ifttt.com. What it does is allow you to set up these little, oh, I don’t know the word for it, but basically these little flows; in a way it’s an API, but it allows you to say, if I like this on Twitter, then send a copy of it to my OneNote notebook.
And I use that example specifically because I realized that I use likes on Twitter as kind of a way to say, oh, here’s some interesting stuff I might go back to or might like to scroll through from time to time. I use a tool called TweetDeck, and I thought this was something I would be able to do there easily, and maybe it is if I dig into the instructions, but it wasn’t for me. And then I thought about If This Then That.
And there’s this little thing, I guess they might call them recipes, but it just says, hey, take a Twitter like and send it to OneNote. So I just do that, I log in with my credentials for Twitter and my credentials for OneNote, and then anytime I like something on Twitter it goes to a little notebook called Twitter in OneNote, and I have it all in one place. It’s this great little thing that made me want to go through If This Then That and find everything I could that would make my life just a tiny bit easier, so that’s what I liked.
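For readers curious how the trigger-and-action pairing Dennis describes works conceptually, here is a minimal, hypothetical sketch in Python. Real IFTTT applets are configured on ifttt.com rather than written in code, and every name below is invented purely for illustration; this just models the "if a tweet is liked, copy it to a notebook" recipe.

```python
# Hypothetical sketch of the IFTTT trigger/action ("recipe") model.
# Nothing here is IFTTT's actual API; the names are illustrative only.

onenote_twitter_notebook = []  # stands in for the "Twitter" notebook in OneNote


def send_to_onenote(text):
    """Action: copy the liked tweet's text into the notebook."""
    onenote_twitter_notebook.append(text)


def on_twitter_like(tweet_text):
    """Trigger: fires whenever a tweet is liked, then runs the paired action."""
    send_to_onenote(tweet_text)


# "Liking" a tweet lands a copy in the OneNote notebook.
on_twitter_like("Interesting article on legal tech")
print(onenote_twitter_notebook)
```

The appeal of the service is exactly this shape: you pick a trigger from one service and an action from another, authorize both accounts once, and the connection runs automatically from then on.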
Tom Mighell: The best and the worst thing about services like If This Then That is that there are so many tools there that can make your life easier. It drives me crazy going through all of them; there are just so many things to look at, I can never decide what to do.
So that wraps it up for this edition of The Kennedy-Mighell Report. Thanks for joining us on the podcast. You can find show notes for this episode at tkmreport.com.
If you like what you hear, please subscribe to our podcast in iTunes or on the Legal Talk Network site, where you can find archives of all of our previous podcasts. If you’d like to suggest a topic, please visit us at bit.ly/2QNwhZu.
If you’d like to get in touch with us, you can reach out on LinkedIn, or, as I promised, here is that voicemail number: (720) 441-6820. We would love to hear your voice.
So until the next podcast, I am Tom Mighell.
Dennis Kennedy: And I am Dennis Kennedy, and you have been listening to The Kennedy-Mighell Report, a podcast on legal technology with an Internet focus.
If you liked what you heard today, please rate us in Apple Podcasts, and we will see you next time for another episode of The Kennedy-Mighell Report on the Legal Talk Network.
Outro: Thanks for listening to The Kennedy-Mighell Report. Check out Dennis and Tom’s book, ‘The Lawyer’s Guide to Collaboration Tools and Technologies: Smart Ways to Work Together’ from ABA Books or Amazon, and join us every other week for another edition of The Kennedy-Mighell Report, only on the Legal Talk Network.