
AI at ILTACON 2017 - Part 2

By Joe Davis posted 02-12-2018 12:25


While the first session in ILTACON’s AI series was about the big picture, session #2 focused on how firms and corporate legal departments are actually using AI technology today.  Five very quotable panelists shared their outlook and experiences about implementing AI.


Moderator:  Andrew Arruda - CEO and co-founder of ROSS Intelligence

Ron Friedmann - Partner at Fireman & Co.

Katie Debord - Chief Innovation Officer at Bryan Cave

Steve Harmon - VP and Deputy General Counsel, Legal at Cisco

Stephen Allen - Global Head of Legal Service Delivery at Hogan Lovells

Neil Cameron - Partner at Neil Cameron Consulting Group


Ron Friedmann on how to successfully implement AI:

“The elements of a successful AI transformation are listed here: use cases, data ecosystem, techniques and tools, workflow, an open culture and organization. I would just ask you to think about, for yourselves, how many of those boxes your law firm or your law department ticks - because that's what it takes.”


Ron Friedmann on AI usage by firm size:

“In firms of 1,000 lawyers or more, you have over half that are beginning to make use [of AI] and almost all the rest are beginning to explore, but [there's] a big drop-off once you get to firms below 1,000.”


Andrew Arruda on firm size:

“It actually does make sense that you'd start to see that penetration in the larger firms due to budgets, and also a lot of these companies will use data to train their algorithms, and those are the firms with large sets of data.  So I think the trend you're going to start seeing is now that these tools have matured with larger law firms, smaller law firms are going to be able to benefit from that maturity.”


Katie Debord on being an early AI adopter:

"Sometimes I think we all feel like AI is this force of nature that's moving us, but we're not really sure where it's moving us or where we'll end up, and where we're going to go."


Katie Debord on understanding AI:

"It's much more important for me and for our firm to understand the technology than it is just to get deployments out to say we're doing AI deployments."


Katie Debord on creating an internal AI-focused group:

“...it felt like the attorneys were not necessarily always part of the conversation, and they should be.  We have a group of attorneys throughout the firm who are really interested in this stuff.  A lot of them tend to be younger, but not all of them - we have partners in there too.”


Katie Debord on tech fatigue:

“...You need to be careful putting too much on your attorneys. I'm really worried about tech fatigue, and I talk about this a lot within our firm and externally. I don't want our attorneys to get tired of hearing about AI, but not really having what they're seeing in practice mesh with what they're reading about. I'm worried they're going to get tired of exploring and working with this stuff. So part of our consideration is that we only put tools in front of our attorneys that we think have a viable use case. ... If our attorneys are having to log onto ten different tools, we're doing something wrong. So ‘playing well with others’ is the first consideration we make as we're looking at what kind of tools we want to put in front of our attorneys for experimentation.”


Stephen Allen on evaluating AI:

“I think that we've gone around this in the wrong way as an industry.  We started off with ‘there's loads of cool stuff,’ ‘let's look at what this cool stuff does,’ ‘how can we use it?’  I actually think what we need to do is go back to ‘what are the problems we're trying to solve?’  ‘What are the problems in the business?’”


Stephen Allen on Hogan Lovells' organizational structure as it relates to AI:

“We operate quite a lot of the time in what we call the 'collaborative space between the blurred lines.'  This is no one person's responsibility.  This is not owned by the lawyers, it's not owned by IT, it's not owned by finance, this is a space where we work collectively. So I think that's really important - to get the structure in which you operate.  ... You want a bunch of people [for whom] it's either their whole day job or part of their day job, and their job is to get to a solution, not to follow the boundaries of their territory.”


Stephen Allen on the relationship between AI and pricing of legal services:

“Price is important because post the global financial crisis, two things have happened.  One, legal stopped being the 'too dangerous to touch' group.  All the other businesses within our clients have been outsourced and sent off, and automated.  Legal stopped being the 'it's too difficult, it's too risky, you can't do it to us [group].’  Because we were the last man standing that hadn't been reformed.  Two -  and this is where I think there's a ray of hope in what we're talking about - is actually we've seen a burgeoning level of regulation.  Year on year, we've seen a 10 percent compounded increase in the amount of regulation on our clients, and that presents us with an opportunity, but to seize that opportunity, we have to be able to do it at a price and service point.  I think that's really important.”


Neil Cameron on pricing:

"Clients want to know how much something's going to cost, and in the UK the courts now want to know how much litigation will cost, and the price of getting that guess - sorry, estimate - wrong is that if you can't come up with a good enough excuse for why your initial cost estimate changed, you - the law firm - pay for the litigation.  That cost a law firm in the UK 5 million pounds last year.  The other part of the problem is that attorneys are not good at this.  In fact they're actually awful at it.  They suffer from all the normal planning cognitive biases, and then some new ones of their own.  In particular, they conflate costing and pricing, and the difference between those two is called profit.  And they're bad at learning.  They're bad at building up an institutional memory or even a personal memory of how much something cost to do last time."


Neil Cameron on write-offs:

“This is the biggest single hole in law firm profitability there is.  If you're a law firm turning over a billion dollars a year, you will have written off 428 million.  So in fact, you're turning over 1428 million, and you're throwing away 428 million, straight out the window.  And if you can make any small difference to that, you get big returns.”
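(Working through the arithmetic in that quote: collecting $1,000 million after writing off $428 million means the firm billed roughly $1,428 million, so the write-off comes to about 30 percent of everything billed - which is why even a small improvement to that number produces big returns.)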


Neil Cameron on using AI to categorize matters and feed knowledge management:

“If you can get this working properly you can use it to fingerprint matters, essentially to answer the question most asked by lawyers to which they seldom get an accurate answer, and that is this question: find me matters like this one. Essentially once we've got the complexity factors running, we'll be able to fingerprint matters and group them together, and that will help you analyze working practices, potential improvements, areas of risk, and knowledge management. Finally, you'd be able to get around the fact that lawyers only use a knowledge management system when they're desperate and they can't find anything. Or it's not in their head. They don't use it when they ought to. We'll be able to use this to push them relevant knowledge - highly relevant and accurate knowledge which is targeted to the phase of work they're doing - and they won't mind. They hate being pushed all sorts of generic rubbish, but if you push to them accurate, relevant knowledge at the right time, you'll help them to look at the right knowledge resources when they should.”


Neil Cameron on where AI is most useful:

“Overall, you'll notice that this application is not going to be replacing lawyers, and I think that we should not just be looking at using robots to replace lawyers; we should be using this mantra: before we start using AI to replace what attorneys do, let's start using it to undertake what attorneys can't or won't do, and one of them is costing matters. Another one is analyzing timecards to the right area.”


Steve Harmon on how Cisco is using AI:

“We are looking to use artificial intelligence to build a recursive analysis that allows us to run the same documents through the same tool sets to ask different questions. At the other end of the spectrum - and we also have a pilot going on in this space - is to take a document we've never seen before and analogize from it - try to figure out how this document relates to what we're used to seeing.”


Steve Harmon on lawyers as artists, and judgement vs. access:

“Lots of lawyers that I meet believe that they're artists, and removing a brush stroke from their art will somehow impugn the integrity of what they've created. Well, the reality is that in the in-house environment that I live in, we don't need art, we need prints. We need to produce them rapidly, reliably, with a reasonable level of quality, but quickly and efficiently. This is where artificial intelligence is going to play. It's not about replacing judgement. The notion that we're going to have robot lawyers that are out there arguing cases, doing motion practice - maybe. Not in my lifetime - good luck with that. Where we're going to be most successful, I think - where you as vendors to clients like me will be most successful - is if you can help me with the access problem. We have a large information infusion every time we do an acquisition; every time we look at our own contracts we have a large data problem. If you can facilitate access, leave judgement alone. Judgement is what our lawyers think they're good at now, and we'll let them continue to think they're good at that until we can prove that they're not.”

Listen to the audio recording, download the slides and view the illustrations created during the session here:
https://www.iltanet.org/viewdocument/artificial-intelligence-in-law-ai

Read Part 1 of this post here:
https://www.iltanet.org/blogs/joe-davis/2018/02/05/ai-at-iltcon-2017-part-1
