Washington Lawyer - October 2019 - 16

IMPLICIT BIAS continued from page 14
bank lending demands a higher standard than data magnitude. Data also must
be stripped of demographic information that can lead to explicit bias in terms
of race, age, disability, and gender, as well as information interwoven into the
fabric of the data set, such as zip codes and speech patterns, that can serve as
a proxy for those characteristics.
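That kind of preprocessing can be sketched as a simple filter over feature names. The field names below are hypothetical, and a real pipeline would also have to catch subtler proxies than a fixed blocklist can:

```python
# Minimal sketch: removing explicit demographic fields and known proxy
# fields (e.g., zip code) from a loan-application record before training.
# Field names here are invented, not from any real lending system.

EXPLICIT_FIELDS = {"race", "age", "disability", "gender"}
PROXY_FIELDS = {"zip_code", "speech_pattern"}  # stand-ins interwoven with demographics

def strip_demographics(record: dict) -> dict:
    """Return a copy of the record without explicit or proxy demographic fields."""
    blocked = EXPLICIT_FIELDS | PROXY_FIELDS
    return {k: v for k, v in record.items() if k not in blocked}

application = {
    "income": 72000, "debt_ratio": 0.31, "zip_code": "20001",
    "age": 47, "gender": "F", "credit_history_years": 12,
}
clean = strip_demographics(application)
# 'clean' keeps only income, debt_ratio, and credit_history_years
```

A blocklist like this only removes fields someone thought to name; correlated features left in the data can still reconstruct the removed ones, which is why critics say stripping alone is no guarantee.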
Even the best efforts by technologists to remove implicit bias are no guarantee
of success, critics say, and those biases can make AI tools particularly vulnerable
to error. AI relies on certain suppositions: assumptions that can be built
on incomplete data, bad coding, or inadequate models.
"The concept of 'garbage in, garbage out' generally applies to problem
solving or policy making based on data analytics and AI," says Adrienne
Fowler, a partner at Harris, Wiltshire & Grannis LLP and chair of its hiring
committee. "Many data sets are inherently biased and will result in biased
outcomes. But if you're vigilant, you can still work to reduce, or in some
cases mitigate the impact of, bias in your data sets."
For example, Fowler notes, facial recognition systems have more trouble
accurately identifying people of color than white people. If an AI program is
built on large-scale data analysis and machine learning over facial recognition
data, then that data is likely biased, and AI that offers insights and intelligence
based on biased data will produce equally flawed outcomes.
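One way to surface this kind of skew is to score a model's accuracy separately for each demographic group rather than as a single aggregate. A minimal sketch, with invented labels and results:

```python
# Minimal sketch: per-group accuracy for a classifier's predictions.
# The records below are invented to illustrate the measurement, not real data.

def accuracy_by_group(records):
    """records: list of (group, predicted_id, true_id) tuples."""
    totals, correct = {}, {}
    for group, predicted, true in records:
        totals[group] = totals.get(group, 0) + 1
        if predicted == true:
            correct[group] = correct.get(group, 0) + 1
    return {g: correct.get(g, 0) / totals[g] for g in totals}

results = [
    ("group_a", "id1", "id1"), ("group_a", "id2", "id2"),
    ("group_a", "id3", "id3"), ("group_a", "id4", "id9"),
    ("group_b", "id5", "id5"), ("group_b", "id6", "id7"),
    ("group_b", "id8", "id9"), ("group_b", "id2", "id2"),
]
rates = accuracy_by_group(results)
# group_a: 3/4 correct; group_b: 2/4 correct. A single aggregate accuracy
# of 5/8 would hide the gap between the two groups.
```

Reporting only the aggregate is how a vendor can truthfully claim high accuracy while the tool still fails one group far more often than another.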
"The games potentially allow you to achieve a certain degree of blindness in
race, gender, and other factors that have led to discriminatory outcomes," says
Fowler. "If you combine the games with good information about the skills
you're looking for, try to test for analytical decision-making skills, and gather
data that enables you to see whether or not the process you employ has a
disproportionate impact on historically disadvantaged groups, then maybe
you have a process that will work."

Still, there are concerns that smart algorithms are disruptive because they are
built on a weak structure. Some detractors say that cognitive games aren't
likely to guarantee the results that law firms are looking for in recruiting.

"It's easy to assume that we're looking at causation here with cognitive
empathy and high-performing partners," says Larry Richard, a lawyer and
founder of LegalBrain, a consulting firm specializing in the psychology of
lawyer behavior. "But does success cause the trait, or does the trait cause the
success? The only way for any law firm to know for sure is to hire half the
associates based on the games and [the other] half based on the old criteria
and process. Six months out, the firm can look and see who is performing
better, and then they'll know whether the games work."

Despite the best intentions of everyone involved, it can be almost impossible
in some cases to remove implicit bias because there are so many ways that
bias can be introduced, whether by accident, error, or fraud. However, some
observers wonder if AI is being unfairly judged considering the alternative.

"AI always has this impossible standard to meet," says Dennis Kennedy, a legal
innovation consultant and author. "We keep moving the goal posts on AI.
I think at some point tech people could come back at critics and say, 'AI is
bad compared to what?' It's not like humans are unbiased. In some cases,
many of us would prefer to take our chances with AI."

If it is possible to introduce neutral, scientific, and reliable data into an
algorithm, some observers argue, there are few disadvantages to relying on
its outputs. The machine itself does not innately incorporate bias.

"Machines are dumb boxes until you start feeding them data," says Arriènne
"Angel" M. Lezak, a shareholder with Polsinelli. "It's always going to come
down to who is feeding the data and where they are getting it from. The
human factor is going to be there because someone has to trigger the
machine to make assumptions. The key is to have the data be as neutral as
possible at the start."

Meanwhile, courts, prosecutors, and parole boards are turning to algorithms
to objectively measure the likelihood that a defendant or parolee will commit
more crimes, using that information to inform bail, sentencing, and early
release decisions.

In recent years, algorithm-driven risk assessments of whether defendants are
likely to commit new crimes have been used widely in the criminal justice
system and applied in a variety of cases, from determining bond amounts to
criminal sentencing.

Correctional Offender Management Profiling for Alternative Sanctions
(COMPAS) is one such program. COMPAS incorporates 137 variables in its
scoring algorithm. Research shows it has been moderately accurate in
identifying recidivism, but African Americans are more often wrongly
identified as a reoffending risk, according to ProPublica, the nonprofit
newsroom that produces investigative reports.

"If computers could accurately predict which defendants were likely to
commit new crimes, the criminal justice system could be fairer and more
selective about who is incarcerated and for how long," ProPublica noted in its
2016 risk assessment reporting, "Machine Bias." "The trick, of course, is to make
sure the computer gets it right. If it's wrong in one direction, a dangerous
criminal could go free. If it's wrong in another direction, it could result in
someone unfairly receiving a harsher sentence or waiting longer for parole
than appropriate."

Gascón made a media splash when he announced his plans in San Francisco,
in part because he hoped to measurably reduce racial bias. Under the new
system, after a prosecutor makes an initial charging decision based on the
AI-redacted materials, he or she will review an unredacted version of the
police report and other evidence that likely would reveal the race of the
person involved. Only then will a final charging decision be made.

"This technology will reduce the threat that implicit bias poses to the purity
of decisions which have serious ramifications for the accused," Gascón said
in a statement. "That will help make our system of justice more fair and just."
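The ProPublica finding cited above turns on comparing error directions, not overall accuracy: what share of people who did not reoffend were nonetheless flagged as high risk, computed separately per group. A minimal sketch with invented data:

```python
# Minimal sketch: comparing false positive rates across groups, the kind of
# check behind ProPublica's COMPAS analysis. All data here is invented.

def false_positive_rate(outcomes):
    """outcomes: list of (flagged_high_risk, reoffended) boolean pairs.
    FPR = flagged-but-did-not-reoffend / all who did not reoffend."""
    non_reoffenders = [flagged for flagged, reoffended in outcomes if not reoffended]
    return sum(non_reoffenders) / len(non_reoffenders)

group_a = [(True, False), (False, False), (False, False), (True, True)]
group_b = [(True, False), (True, False), (False, False), (True, True)]

fpr_a = false_positive_rate(group_a)  # 1 of 3 non-reoffenders wrongly flagged
fpr_b = false_positive_rate(group_b)  # 2 of 3 non-reoffenders wrongly flagged
```

Two tools with identical overall accuracy can still differ sharply on this measure, which is why "moderately accurate" and "biased against one group" can both be true of the same algorithm.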





