NOWCAST KOAT Action 7 News at 10pm

ARTIFICIAL INTELLIGENCE IS BIASED. SHE’S WORKING TO FIX IT.

SOLEDAD: If you plan to visit the White House anytime soon, expect your face to end up in a Department of Homeland Security database. The Secret Service is using a facial recognition system in cameras outside the White House. They say they're only testing the system, and the artificial intelligence, or AI, will only be searching the crowds, looking to identify Secret Service agents. But the ACLU, the American Civil Liberties Union, is worried that the searches will include regular people just walking near the White House, who will also end up in a government database. This is just the latest concern about the government's expanding use of facial recognition software, in the wake of research that finds there is bias being built into the technology's algorithms. Joy Buolamwini was one of the first to raise the issue with computer giants like IBM. I spoke with her at MIT's Media Lab in Cambridge. When did you start studying artificial intelligence?

JOY: Well, I got into artificial intelligence more so through art projects. I came to the MIT Media Lab, and I was really excited to take courses that looked at both art and science, so I started a course called "Science Fabrication." And the idea was, you read science fiction, and you try to build something you probably wouldn't be able to otherwise. So I built something called an Aspire Mirror. And the idea was, for the Aspire Mirror, I could look at what seemed like a regular mirror and put anything I wanted onto my face.

SOLEDAD: So your face would come up. And then Serena Williams' face would be superimposed on top of yours, and then that's your Aspire Mirror.

JOY: Then I would be the greatest of all time.

SOLEDAD: Good way to start your morning.

JOY: Great way to start the morning. So this is what I was trying to do: trying to apply existing technology to an art project. And while I was doing that, I ran into the issue of having to wear a white mask to have my face detected.

SOLEDAD: Why? What do you mean?
What happened?

JOY: I was using a webcam, and the webcam had to have some computer vision software so it could track my face. But instead of following my face, it did nothing. So this is what got me into questioning: why am I having a different experience with technology that I would otherwise think is neutral?

SOLEDAD: When did it become clear to you that AI was biased?

JOY: Well, this is the question I was curious about. I started testing AI systems from Microsoft, from IBM, and from Face++, a leading billion-dollar tech company in China that's used by the government for surveillance.

SOLEDAD: And what did you find?

JOY: I found that it wasn't just my face, right? Overall, the systems work better on male faces than female faces. They work better on lighter faces than darker faces. And they worked especially poorly on darker female faces. So you didn't have error rates that exceeded 1% for lighter-skinned men, but you had error rates as high as 35% for darker-skinned women.

SOLEDAD: Did you go back to the tech companies and say, "You have a problem"?

JOY: So, IBM got back to me within 24 hours and said, "We acknowledge this is important. We're going to look into it." When they looked into it and replicated our study, they found similar results, and so they invited me to speak to their senior technologists and some of their executives. And they then rolled out a new product that was substantially better.

SOLEDAD: Amazon has a program called Rekognition, yes? And you reached out to Jeff Bezos about that.

JOY: It was important to reach out to Amazon, because they are currently selling Rekognition to police departments right now. And in follow-up studies we did, we even showed that Amazon was not getting Oprah's face correct.

SOLEDAD: If you're failing on Oprah's face, you cannot have very good software.

JOY: I would question it highly, for sure.
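The audit Buolamwini describes works by breaking a benchmark's results down by demographic subgroup rather than reporting one overall accuracy number. A minimal sketch of that idea, using hypothetical data (a real audit, like the Gender Shades study, would use a labeled benchmark of face images):

```python
# Sketch of an intersectional error-rate audit: compute the
# misclassification rate separately for each (skin type, gender)
# subgroup instead of one overall accuracy figure.
# The records below are hypothetical, for illustration only.
from collections import defaultdict

# Each record: (skin_type, gender, was_the_prediction_correct)
results = [
    ("lighter", "male", True), ("lighter", "male", True),
    ("lighter", "female", True), ("lighter", "female", False),
    ("darker", "male", True), ("darker", "male", False),
    ("darker", "female", False), ("darker", "female", False),
]

def error_rates(records):
    """Return the error rate for each (skin_type, gender) subgroup."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for skin, gender, correct in records:
        group = (skin, gender)
        totals[group] += 1
        if not correct:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

for group, rate in sorted(error_rates(results).items()):
    print(group, f"{rate:.0%}")
```

A single aggregate accuracy over these records would hide the disparity; the per-subgroup breakdown is what surfaces the kind of gap (under 1% for lighter-skinned men versus up to 35% for darker-skinned women) reported in the interview.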
SOLEDAD: Is the idea that human beings are biased, so anybody who's a programmer is bringing their bias into this program, and if it's AI, it's really human bias that's just being reflected?

JOY: I call it the coded gaze, right? The coded gaze, like the male gaze or the white gaze, reflects the priorities, reflects the preferences, and also reflects the prejudices of those who have the power to shape technology.

SOLEDAD: You wanted a moratorium on this technology.

JOY: Absolutely. So, right now, because of what we know about the errors with facial recognition technology --

SOLEDAD: It's clearly biased.

JOY: Clearly biased. It's being used right now, and it's being shown not to be effective. And there's no oversight. So until we have processes where we can actually see -- is the technology working as we intended? And do we have continuous reports, so we see that the technology's not being abused? I definitely believe there should be a moratorium as we work on having federal standards and regulations for this kind of technology.
Tech companies, lawmakers, and activists say bias is baked into facial recognition. So why is the government expanding its use of facial recognition? Soledad O'Brien speaks with MIT researcher and founder of the Algorithmic Justice League Joy Buolamwini, who says facial recognition technology needs more regulation to ensure its accuracy.
