The Ethical Implications of Artificial Intelligence

In the judicial system, one prominent example of AI's ethical risks is judges making sentencing decisions based in part on AI-driven software that claims to predict recidivism, the likelihood that a defendant will commit further crimes. There is concern that the factors used in such software's algorithms could correlate with race, which judges are not allowed to take into account when sentencing.
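
The mechanism behind that concern can be illustrated with a minimal synthetic sketch: even when race is excluded from a model's inputs, a proxy feature that correlates with race (here, a made-up neighborhood code) can cause risk scores to diverge by group anyway. All names and data below are illustrative assumptions, not details of any real sentencing tool.

```python
import random

random.seed(0)

# Synthetic population: two groups with IDENTICAL actual reoffense risk,
# but a "neighborhood" feature that correlates with group membership.
population = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    neighborhood = random.random() + (0.5 if group == "B" else 0.0)
    reoffended = random.random() < 0.2  # same base rate for both groups
    population.append((group, neighborhood, reoffended))

# A naive "risk model" that never sees group, only the proxy feature.
def risk_score(neighborhood: float) -> float:
    return neighborhood

avg_score = {"A": 0.0, "B": 0.0}
counts = {"A": 0, "B": 0}
for group, neighborhood, _ in population:
    avg_score[group] += risk_score(neighborhood)
    counts[group] += 1
for g in avg_score:
    avg_score[g] /= counts[g]

# Group B receives systematically higher scores despite identical risk.
print(avg_score)
```

Because the proxy carries group information, excluding the protected attribute from the inputs does not, by itself, make the scores neutral.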


The challenges that artificial intelligence poses to legal ethics, while significant, can and should be addressed, so that lawyers can take advantage of the powerful tools driven by AI.

Education and Training

Part of the lawyer’s duty of competence involves keeping abreast of changes in law and in legal practice – and these changes, in 2018, inevitably involve technology.

“Large numbers of lawyers don’t take this duty to keep up with technology seriously enough,” according to David Curle of Thomson Reuters. “It’s not just AI-based technology but even more mundane things like practice management platforms, and other tools that make it easier and more efficient to practice law.”

“The ethical duty of competence requires being appropriately up to speed on technology,” says Chris Mammen of Hogan Lovells. “So AI is not something you can stick your head in the sand over, just as you couldn’t try to conduct a document review in a major litigation entirely in paper.”

Lawyers must therefore have a general understanding of technology and artificial intelligence. And they must also understand the general operation of the specific AI tools that they use in their own practices.

“We need to have some understanding of what’s going into an AI tool and what’s coming out of it,” according to ethics lawyer Megan Zavieh, who represents lawyers facing disciplinary charges. Just as lawyers can’t prove they satisfied their ethical duties simply by hiring an outside consultant, they similarly can’t establish ethical compliance simply by using an AI tool.

At the same time, lawyers are not programmers – and the ethical rules recognize this, as David Curle notes: “The current rules of professional responsibility are general enough to cover the situation. They suggest two things: that lawyers must understand enough about a new technology to see the risks, and that lawyers must understand enough to see the benefits.”

What this means in practice is that lawyers need to find trusted providers of AI-based solutions, and they need to pose smart questions to the providers whose AI tools they are considering using. Lawyers need to understand, at a basic level, how the solutions work and how the solutions were developed.

What to Look for in an AI Tool

  • Understand the technology

No one expects you to understand exactly how the technology works, but you must have a basic enough understanding to weigh the benefits and risks.

  • Consider data security

    Lawyers need to be conscious of choosing a solution that ensures client confidences remain secure at every stage of the process.

  • Understand data quality

    If the data used to train the AI contains unfair biases, then the results of the AI could be correspondingly biased. Find a provider with trusted data.

  • Make sure the legal research work is being done competently

Even after choosing the right solution, be mindful that some tasks are not appropriate for AI to handle, while for others it would be unethical not to use the technology, and you must know how to tell them apart.
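
One concrete way to probe an AI tool's output for the kind of bias the checklist warns about is the "four-fifths rule" used in US employment law: the rate at which any group is flagged should be at least 80% of the highest group's rate. The sketch below applies that check to synthetic records; the field names (`group`, `flagged`) and sample data are illustrative assumptions.

```python
def selection_rate(records, group):
    """Fraction of records in `group` that the tool flagged."""
    members = [r for r in records if r["group"] == group]
    return sum(r["flagged"] for r in members) / len(members)

def four_fifths_check(records, threshold=0.8):
    """True for each group whose rate is >= threshold * the top rate."""
    groups = {r["group"] for r in records}
    rates = {g: selection_rate(records, g) for g in groups}
    top = max(rates.values())
    return {g: rate / top >= threshold for g, rate in rates.items()}

# Synthetic sample: group A flagged at 50%, group B at 30%.
sample = (
    [{"group": "A", "flagged": i < 50} for i in range(100)]
    + [{"group": "B", "flagged": i < 30} for i in range(100)]
)
print(four_fifths_check(sample))  # B's rate (0.30) is 60% of A's (0.50)
```

A check like this is a screening question to put to a provider, not proof of fairness, but it turns "understand data quality" into something you can actually ask for in writing.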


Source: The Ethical Implications of Artificial Intelligence