Are We Automating Racism?

Mar 31, 2021
509,317 views

Many of us assume that tech is neutral, and we have turned to tech as a way to root out racism, sexism, or other “isms” plaguing human decision-making. But as data-driven systems become a bigger and bigger part of our lives, we also notice more and more when they fail, and, more importantly, that they don’t fail on everyone equally. Glad You Asked host Joss Fong wants to know: Why do we think tech is neutral? How do algorithms become biased? And how can we fix these algorithms before they cause harm?

Comments
  • On this season of Glad You Asked, we explore the impact of systemic racism on our communities and in our daily lives. Watch the full season here: bit.ly/3fCd6lt Want updates on our new projects and series? Sign up for the Vox video newsletter: www.vox.com/video-newsletter For more reading about bias in AI, which we covered in this episode, visit our post on Vox.com: bit.ly/3mcZD4J

    Vox, 12 days ago
  • Yeah, it's not bad until it comes to crime. To this day, Black people are still being arrested over mistaken identity at an alarming rate. So whether it's machine or human, something has to be done!

    victor 91, 13 days ago
  • Best series yet! This channel never disappoints

    Huey Freeman, 13 days ago
  • I'm surprised the health risk model didn't factor in race, because everything else in health care does. You are likely to be screened for heart disease earlier as a black man because the data shows they tend to get it. Even simpler: women are more likely to get breast cancer, even if a tiny minority of men can actually get it. Lupus is more common in women as well. Health care shouldn't be color blind.

    sor3999, 13 days ago
  • I feel like a lot of these things aren't the result of anything racist, but of other external factors that end up contributing to it. The example of the hospital algorithm looking at expensive patients, for instance, isn't inherently racist. The issue there should be with the factors that cause minority groups to cost less (i.e., worse access to insurance), not with the software.

    George Willcox, 13 days ago
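The proxy-label effect this comment describes is easy to demonstrate. Below is a minimal sketch with made-up numbers: two groups are equally sick, but one receives less of the care it needs and so generates lower costs, and a model that scores "risk" as predicted cost then under-ranks that group without race ever appearing as a feature.

```python
# Minimal sketch of the cost-as-proxy effect; all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

illness = rng.normal(50, 10, n)        # true health need: same distribution for all
group_b = rng.random(n) < 0.5          # group B has worse access to care
access  = np.where(group_b, 0.6, 1.0)  # fraction of needed care actually received
cost    = illness * access + rng.normal(0, 2, n)   # observed spending (the proxy)

# Even a *perfect* cost predictor ranks group B as lower-"risk" on average,
# although both groups are equally sick.
print("mean risk score  A:", cost[~group_b].mean(),    "B:", cost[group_b].mean())
print("mean true need   A:", illness[~group_b].mean(), "B:", illness[group_b].mean())
```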
  • this was actually a really interesting video! definitely makes me think more deeply about how my biases may affect technology and how they affect me O_O

    CeeSaw, 13 days ago
  • Why did you make tea in a saucepan and not a kettle?

    George Willcox, 13 days ago
  • 7:45 me seeing a little hand wave on the edge of the screen

    Anonymous bub, 13 days ago
  • Thought provoking video! However, using AI generated faces is probably the worst thing to do in this case, since whichever model generated the faces would presumably suffer the same systemic bias. There is a reason we bother collecting actual real-world data instead of using simulated data.

    Vince B, 13 days ago
  • Wait, the cameraman is in both rooms but they are face timing each other???

    Kianoush Keykhosravi, 13 days ago
  • I just love their props lol

    dnyalslg, 13 days ago
  • Very good videos. Keep them coming

    george so___s, 13 days ago
  • ♥ Joss

    kithytom007, 13 days ago
  • Oh my goodness the handwashing thing always happens to me and I am brown 🤷🏾‍♀️

    raquellochoa, 13 days ago
  • Racism is taught. I would start by questioning a racist person's parents.

    Khaled K, 13 days ago
  • So, about the music. I see all music on videos as manipulative. WTF were they trying to make me feel with this stuff? Anyway, good stuff.

    Mark Wallace, 13 days ago
  • Let's not forget how light works in a camera. I am a dark-skinned person and I can confirm that light skin physically reflects a higher number of photons, which results in a higher probability of the camera capturing that picture better than that of a black counterpart. The same goes for computational photography and basic algorithms that are based on photos we upload. It only makes sense that it would be biased towards white skin. Why does everything have to be taken as an offensive scenario? We are going too far with this political correctness bullshit. Again, I am a person of dark skin and even I think this is bullshit. Now, if you mean this is an issue in identifying a person's face for security reasons or such, then yes, I am all for making it better at recognizing all faces. But please, please make this political correctness bullshit stop.

    Firoj Ahmed, 13 days ago
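The reflectance point in the comment above can be put in rough numbers. A minimal signal-to-noise sketch, assuming illustrative reflectance values and a simple shot-noise-plus-read-noise model (not measurements of any real sensor or skin tone):

```python
# Rough SNR sketch: darker surfaces reflect fewer photons, so at a fixed
# exposure the signal-to-noise ratio drops. All values are assumptions.
import numpy as np

exposure, read_noise = 1000.0, 5.0      # arbitrary photon budget and sensor noise

for name, reflectance in [("lighter skin", 0.55), ("darker skin", 0.12)]:
    signal = exposure * reflectance                   # photons captured
    shot_noise = np.sqrt(signal)                      # shot noise ~ sqrt(signal)
    snr = signal / np.hypot(shot_noise, read_noise)   # noise adds in quadrature
    print(f"{name}: signal {signal:6.1f}, SNR {snr:5.1f}")
```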
  • Hello Vox team, I'm personally a big fan of your channel's videos. Thank you for the sensational presentation of various topics, and keep up the great work.

    karthick v, 13 days ago
  • There was a Better Off Ted episode like this. Corporate head office decided to discontinue use of the energy-saving technology to save money.

    Shion Chosa, 13 days ago
  • Love the production, especially the set!

    CheesecakeLasagna, 13 days ago
  • This is a really interesting look into machine learning - great job Glad You Asked team! It stands to reason that there would be bias no matter what because even if the machine doesn't have any inherent bias or self-interest in focusing on one face over another, people are still feeding information into the machine and the machine is basing its results on that information. And humans are still flawed beings who bring with them their own personalities, thought patterns, biases, childhood backgrounds, class backgrounds, et cetera. The only solution is to focus on what information we're feeding machines.

    Riley Madison, 13 days ago
  • Algorithms are our opinions written in code.

    Anil Kaundal, 13 days ago
  • Data scientists and AI researchers, please do your job ethically without bias.

    Seth Deegan, 13 days ago
  • At work, I have an IR camera that automatically measures your temperature as you walk into my facility. It's supposed to do this by locking on to your face, then measuring your temperature. Needless to say, I want to take a sledgehammer to it. When it actually works, it's with a dark face. The type of face it has the most problems with is a light face. If you also have a bald head, it will never see you.

    Rodney Kelly, 13 days ago
  • 4:30. Why are you filming and driving!! No don’t read a quote!! JOSS NOOO. *boomp* *beep beep beep*

    TheSTRVman, 13 days ago
  • Joss Fong!❤️🔥

    Pavan Yaragudi, 13 days ago
  • Same with medicine. They used to test it on white males, later on white females as well, because they found out men and women had different reactions to the same medicine. Now we know black people react differently to medicine compared to other races... so basically the same thing is happening with tech...

    Gustavo Medrano, 13 days ago
  • The machines are programmed by people. Racist and bigoted and biased people. We are not at sentient machines yet.

    diane ridley, 13 days ago
  • Unity in Diversity

    Vance H, 13 days ago
  • ooo this is so fascinating

    Maya PS, 13 days ago
  • Algorithms of Oppression is a really good book if you want to learn about racial and gender bias in big tech algorithms, like Google searches. It shows that machines and algorithms are only as objective as we are. It seems like machine learning and algorithms are more like groupthink and are not objective.

    Meredith White, 13 days ago
  • Machines aren't racist. The outcome feels racist due to bias in training data. The model needs to be retrained.

    Atiqur Rahman, 13 days ago
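One hedged sketch of what "retrain the model" can look like in practice: audit accuracy per group, then rebalance the training mix and retrain. The groups and data below are synthetic stand-ins, and training and scoring on the same samples is acceptable only because this is an illustration:

```python
# Audit-then-rebalance sketch on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_group(n, shift):
    """Synthetic group whose true decision boundary sits at x0 + x1 = 2*shift."""
    X = rng.normal(shift, 1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * shift).astype(int)
    return X, y

X0, y0 = make_group(9000, 0.0)   # majority group
X1, y1 = make_group(300, 1.5)    # under-represented group

def group_accuracy(model):
    return model.score(X0, y0), model.score(X1, y1)

# 1) Audit: train on the skewed mix, then measure accuracy per group.
model = LogisticRegression().fit(np.vstack([X0, X1]), np.hstack([y0, y1]))
print("before rebalancing:", group_accuracy(model))

# 2) Rebalance: oversample the minority group to parity and retrain.
idx = rng.integers(0, len(X1), size=len(X0))
model = LogisticRegression().fit(np.vstack([X0, X1[idx]]),
                                 np.hstack([y0, y1[idx]]))
print("after rebalancing: ", group_accuracy(model))
```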
  • How did y’all de-age Steve Coogan?

    Jake from StarHeart, 13 days ago
  • also wtf shaking hands?? hello 'rona

    alex d, 13 days ago
  • Weapons of Math Destruction by Cathy O'Neil really digs into this subject, where the biases of the designers or the customers of the algorithm have big negative impacts on society. There seriously needs to be some kind of ethical standard for designing algorithms, but it's so damn hard... :/

    Sam Lee, 13 days ago
  • There's also the possible issue of "white balance" of the cameras themselves. My understanding is that it's difficult to set this parameter in such a way that it gives acceptable/optimal contrast to both light and dark skin at the same time.

    Bersl, 13 days ago
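For readers wondering what that parameter does: auto white balance is essentially per-channel gain correction. A minimal sketch of the common "gray world" heuristic, which assumes the scene averages to neutral gray; the comment's point is that any single global correction like this is a compromise across all the skin tones and light sources in one frame:

```python
# "Gray world" white balance: scale each channel so its mean matches the
# overall mean. A toy frame with a warm (reddish) cast is used as input.
import numpy as np

def gray_world(img):
    """img: float array of shape (H, W, 3) with values in [0, 1]."""
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gain = channel_means.mean() / channel_means      # per-channel gain
    return np.clip(img * gain, 0.0, 1.0)

frame = np.clip(np.random.rand(4, 4, 3) * [1.2, 1.0, 0.8], 0, 1)
balanced = gray_world(frame)
print(balanced.reshape(-1, 3).mean(axis=0))   # channel means now roughly equal
```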
  • How was the M.I.T. lady writing on the glass? Surely she was writing backwards so we could see it?

    will_ press, 13 days ago
  • how are you still using skype?

    alex d, 13 days ago
  • 1:55 Dàmn 😐👀😐

    joel Winchester, 13 days ago
  • What I got out of this is... We need more data?

    Christian morales, 13 days ago
  • 18:36 why are they pretending they are talking on a video chat, when they had a crystal-clear picture from another camera? Reality and perception, subtle differences.

    wj35651, 13 days ago
    • We had a camera crew on each end of our zoom call, since we couldn't travel due to Covid. - Joss

      Vox, 8 days ago
  • Wow, people shouldn't be viewed as data/numbers. We are all unique individuals. Software should not be used for calculating/predicting crime and health. We are all too dynamic, especially during different times and places in our lives.

    C.L. Brown, 13 days ago
  • Are we stealing Netflix ideas?

    Milena, 13 days ago
  • As a black person living in North America this sucks...

    tatianazim1, 13 days ago
  • My range of emotions went from: “this is so dumb lol cmon Vox” to “holy fuck Vox thank you for highlighting a story I never would’ve known about”

    Jaryd B, 13 days ago
  • People seem to be noticing how nicely the professor can write backwards... Fun fact: that's a camera trick! uzworld.info/player/video/no-0iKeHZay7paI She is actually writing normally (so the original video shows the text backwards), but in editing the video was flipped, making the text appear normal. Notice that she is writing with her left hand, which should only be a 10% chance. Great video btw! I thought that the visualization of the machine learning process was extremely clever.

    GreyOwul, 13 days ago
  • We need more black and brown engineers to work on these machines. And every single person in the US should own their data and have control over it.

    Thus Spoke Moe, 13 days ago
  • Wait.. how did the soap dispenser differentiate between the two hands???

    Kuldeep Μaurya, 13 days ago
  • These videos are so solid! I'm about to graduate college with a degree in sociology and so far these videos are hitting a ton of the main points that I've learned about over my four years of education.

    Noah Johnson, 13 days ago
  • Another great, interesting video!

    Sasha Lee, 13 days ago
  • I missed seeing Joss in videos. Glad she's back.

    Pranav Kakkar, 13 days ago
  • Wait, how was the professor from Princeton able to write in reverse on that glass, so that we could read it straight?

    Zubin Siddharth, 13 days ago
  • This feels like a PBS Kids show, with the set and all!

    Killian Becker, 13 days ago
  • 16:15 that got me trippin for a second until I realized they probably just mirrored the video so that the writing comes out right, and she's not actually writing backwards.

    Syazwan, 13 days ago
  • the level of production on this show is just * chef's kiss *

    The_void_screams_back, 13 days ago
  • love this topic!

    Sofi, 13 days ago
  • Designers are biased, and so are their machines. Simple as that

    Shahrukh Shikalgar, 13 days ago
  • Thank you

    Fandy Hadamu, 13 days ago
  • I never thought about it before. Thanks Vox!

    Danu Setia Nugraha, 13 days ago
  • Wow. Really smart and great guests on this video.

    Yellowsnow69420, 13 days ago
  • as always the editing is absolutely superior. keeps me hooked.

    daylightinsomniac, 13 days ago
  • Thanks for these Joss

    CHIEF HASKY CREATES, 13 days ago
  • Omg Dawood Khan is really cute! Who else agrees?

    Thomas Mastrogiacomo, 13 days ago
  • The worst is the courts using AI to decide how long to lock people up. If you're black, the machine said you should get locked up longer, while recommending setting whites with the same crimes free.

    QOOQ8808, 13 days ago
  • It's been forever since I've last seen Joss in a video. I've almost forgotten how good and well-constructed her videos are.

    Elogene Karl Gallos, 13 days ago
  • Not enough black people in China. Most of the datasets every algorithm uses were built from CCTV data from Chinese streets and Chinese ID cards.

    TheAstronomyDude, 13 days ago
  • So Twitter said they didn't find evidence of racial bias when testing the tool. My opinion is that they were not looking for it in the first place.

    Michael Fadzai, 13 days ago
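A hedged sketch of the kind of paired audit the comment implies Twitter could have run: put a lighter and a darker face in one composite image, randomize which side each is on, and count which face the cropper keeps. `crop_choice` is a hypothetical stand-in for the real saliency cropper, wired here to be unbiased so the test logic itself is visible:

```python
# Paired-image audit sketch with an exact two-sided binomial test.
import math, random

def crop_choice(pair):
    """Hypothetical cropper stub: returns the index of the face it keeps."""
    return 0 if random.random() < 0.5 else 1

random.seed(0)
trials, kept_lighter = 1000, 0
for _ in range(trials):
    pair = ["lighter", "darker"]
    random.shuffle(pair)                     # randomize left/right position
    kept_lighter += pair[crop_choice(pair)] == "lighter"

# p-value against the fair null hypothesis p = 0.5
k = kept_lighter
p = sum(math.comb(trials, i) for i in range(trials + 1)
        if abs(i - trials / 2) >= abs(k - trials / 2)) / 2 ** trials
print(f"kept the lighter face {k}/{trials} times, p = {p:.3f}")
```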
  • This reminds me of that book, "Weapons of Math Destruction." Great read for anyone interested; it's about these large algorithms which take on a life of their own

    terrab1ter4, 13 days ago
  • Me at first: "who even asks these questions, seriously?" Me after finishing the video: "Aight, fair point."

    Armando René Cantú, 13 days ago
  • The hand soap dispenser is a real thing. Straight up

    NATHAN BRADSHAW, 13 days ago
  • What a thought-provoking episode! That young woman Inioluwa not only knew the underlying problem, but she even formed a solution, when she said that it should be devs' responsibility to proactively be conscious of those who could be targeted or singled out in a social situation, and to do their best to prevent it in advance. She's intelligent, and understands just what needs to be done and stated in a conflict: a solution. Hats off to her.

    Jordan JJ, 13 days ago
  • I am amazed by the look of the studio. I would love to work there; the atmosphere is just different, unique, and everyone has a place there 😍

    Alya Nur Aeni, 13 days ago
  • • 2:58 - Lee was on the right track: it's about machine vision and facial detection. One test is to try light and dark faces on light and dark backgrounds. It's a matter of contrast and edge- and feature-detection. Machines are limited in what they can do for now. Some things might never be improved, like the soap dispenser; if they increase the sensitivity, it will leak soap.
    • 8:13 - And what did the search results for "white girls" return? What about "chinese girls"? 🤨 A partial test is useless. ¬_¬
    • 9:00 - This is just regular confirmation bias; there aren't many articles about Muslims who… sculpted a statue or made a film.
    • 12:34 - Yikes! Hard to deny raw numbers. 🤦
    • 12:41 - A.I.s are black boxes; you _can't_ know why they make the "decisions" they make.
    • 13:33 - Most of the people who worked on developing these technologies were white (and mostly American). They may or may not have had an inherent bias, but at the very least they used their own data to test stuff at the beginning, while they were still just tinkering around on their own, before moving up to labs with teams and bigger datasets. And cats built the Internet. 🤷
    • 14:44 - I can't believe you guys built this thing just for this video. What did you do with it afterwards? 🤔

    -, 13 days ago
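The light/dark faces on light/dark backgrounds test suggested in the first bullet above can be scripted directly. A minimal sketch using the WCAG relative-contrast formula on illustrative average luminances (assumed values, not measurements): detectors need edges, edges need contrast, and the dark-face-on-dark-background cell is the one that bottoms out.

```python
# Contrast grid for the four face/background combinations.
def contrast_ratio(l1, l2):
    """WCAG contrast ratio between two relative luminances in [0, 1]."""
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

faces = {"light face": 0.60, "dark face": 0.10}                    # assumed values
backgrounds = {"light background": 0.80, "dark background": 0.05}  # assumed values

for fname, fl in faces.items():
    for bname, bl in backgrounds.items():
        print(f"{fname} on {bname}: {contrast_ratio(fl, bl):.1f}:1")
```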
  • A takeaway from this is also further proof that racism is taught... AI is neutral but then becomes inherently racist due to how it was taught. Learn to love instead of hate

    Robert N, 13 days ago
  • As more and more governments, like say China, India, and Middle Eastern countries, are employing face recognition tools that use such AI for law enforcement and surveillance, and they're buying said software from Western countries, I am wondering how accurate these systems are, seeing as the AIs were trained on primarily white faces. Do these AIs then learn "locally", and if so, can this data then be fed back into the original AI to make it learn how to recognise those ethnicities in Western countries with an ethnically diverse population, like the USA, UK, etc.?

    Croissants & Coffee, 13 days ago
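On the "learn locally" question above: one standard pattern is transfer learning, i.e. freezing a pretrained backbone and fine-tuning a small new head on locally collected data. A minimal PyTorch sketch; the model choice, labels, and random tensors are placeholders, not any vendor's actual pipeline:

```python
# Transfer-learning sketch: frozen backbone, trainable head.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights="IMAGENET1K_V1")
for p in model.parameters():
    p.requires_grad = False                      # freeze the pretrained backbone
model.fc = nn.Linear(model.fc.in_features, 2)    # new trainable head

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

local_images = torch.randn(8, 3, 224, 224)       # stand-ins for local face data
local_labels = torch.randint(0, 2, (8,))

model.train()
for _ in range(5):                               # a few passes over the local batch
    opt.zero_grad()
    loss = loss_fn(model(local_images), local_labels)
    loss.backward()
    opt.step()
```

The fine-tuned weights (or the local data itself) could in principle be sent back upstream, which is the feedback loop the comment asks about.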
  • While I do think that machines are biased, I think that saying they're racist is an overstatement.

    Hope Rock, 13 days ago
  • I'm a simple man. I see Joss, I click.

    LorentianElite, 13 days ago
  • What do you have to say about Pinterest and Tumblr? 🙄

    Ruchi Yadav, 13 days ago
  • I have a phone that groups faces of people. It straight up just does not pick up black people

    Mohammed Ayesh, 13 days ago
  • Aren't the programmers POC themselves? Why are they programming light faces to read better? Sounds like self-hate mixed with socioeconomic biases (who has money based on skin color) within the programmers themselves and the companies they work for.

    S L E, 13 days ago
  • Dang!!! This is a good documentary!!

    happy feet, 13 days ago
  • twitter is gonna get so much backlash for this lol

    Arthur Marchetto, 13 days ago
  • the background music is way too loud 😶

    Orphouille, 13 days ago
  • You need to be high to think about this

    Ayoub Feddaoui, 13 days ago
  • A machine can't be racist. Only the programmer that writes the code.

    Alejandro Guevara, 13 days ago
  • Great vid GYA team!

    Raz T., 13 days ago
  • I would appreciate it if you could bring back the title sequence from season 1. Who's with me 🙋

    thilak sundaram, 13 days ago
  • As a machine learning enthusiast, I can confirm there isn't a very diverse set of data available out there. It's sad, but it's alarmingly true.

    Kumar Abhirup, 13 days ago
  • An AI Karen, then?

    Yojiv Iriak, 13 days ago
  • Funnily enough, this reminds me of an episode of Better Off Ted (S1:E4), where the central parody was on automated recognition systems being "racist" and how the corporation tried to deal with it. Well, that was in 2009...

    Books and Tea, 13 days ago
  • "the human desicions in the design of something (technology or knowledge)" is what actually means when academics say "facts are a social construction" it doesn't mean it is fake (which is the most common and wrong read), it means that there are some human externalities and non intended outcomes in the process of making a technology/knowledge. Tech and knowledge is presented to the public as a finished factual black box, not many people know how them were designed, investigated, etc

    Agustín Bertelli, 13 days ago
  • anyone else think of Better off Ted watching this

    Wally Thunderm, 13 days ago
  • Yes, as a Computer Engineering bachelor and someone who's been working with cameras for almost 4 years now, it's good to address how the apparent weakness of cameras at capturing darker objects can mess up AI detection. My own bachelor thesis was about implementing pedestrian detection, and it's really hard to make sure the camera is taking a favourable image... And since I am from Indonesia... which, you guessed it, has a less white-skinned population... it's really hard to find a good experiment location, especially when I use an already-developed algorithm as a backbone. There are a lot of false positives... ranging from missed counts because the person is darker, to double counts when a fairer-skinned person passes while there are human-shaped shadows. We need to improve AI technology with better diversity in its training datasets. It's better to address that weakness to create a better technology than to point fingers... Learn from our mistakes and improve from that... If a hideous person like Edison could do that with his electric lightbulb, why aren't we doing the same while developing even more advanced tech than him? The title is very nuanced... but hey, it got me to click... and hopefully others can see past the headline.

    Ari Bantala, 13 days ago
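For context on the "already-developed algorithm as a backbone" mentioned above: a classic off-the-shelf pedestrian detector is OpenCV's HOG-plus-linear-SVM model. A minimal sketch (the image path is hypothetical); HOG features are gradient-based, so low-contrast pedestrians produce weaker gradients, which is one mechanism behind the misses described in the comment:

```python
# Off-the-shelf HOG pedestrian detector from OpenCV.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

img = cv2.imread("street_scene.jpg")             # hypothetical test frame
boxes, weights = hog.detectMultiScale(img, winStride=(8, 8), scale=1.05)
for (x, y, w, h) in boxes:                       # draw each detection
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detections.jpg", img)
```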
  • This is why diversity is important. If the technology is developed by predominantly white male engineers, then you're going to have that bias in the data. I'd love to see a study on how diversity in a software engineering team might improve data, or whether it's an issue with public data sets as a whole.

    Allan Reford, 13 days ago
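A first step toward the study the comment above asks for is simply auditing a dataset's composition. A minimal sketch with hypothetical labels and counts (the skew loosely mirrors imbalances reported for public face datasets, but the numbers are made up):

```python
# Dataset composition audit: count groups and flag under-representation.
from collections import Counter

labels = (["lighter-skinned male"] * 700 + ["lighter-skinned female"] * 180 +
          ["darker-skinned male"] * 80 + ["darker-skinned female"] * 40)

counts = Counter(labels)
total = sum(counts.values())
for group, n in counts.most_common():
    share = n / total
    flag = "  <-- under-represented" if share < 0.15 else ""
    print(f"{group:24s} {n:4d}  {share:5.1%}{flag}")
```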
  • 22:09 Software "engineering" is one of the few engineering disciplines that is not regulated by states/provinces. Structural engineers, electrical engineers, etc., on the other hand, have accreditation bodies that decide who can practice as an engineer and impose codes of conduct to censure those who do not measure up to the standard of a professional engineer. Engineers have a duty of care not only to their employers and clients, but to the safety of the public. The follies of software engineers can be just as harmful to the public as miscalculating the structural forces on a beam, or the electrical current through a wire.

    William Thompson, 13 days ago
  • Aren't we the creators of the machines, passing our own image (strengths & shortcomings) to what we create? If they were to become sentient and seek to learn from the human race, won't the machines pick up bias and hatred?

    Howard Mnengwa, 13 days ago
  • very good information, it reminded me of your video from 5 years ago... "Color film was built for white people. Here's what it did to dark skin"

    Alberto Rodriguez, 13 days ago
  • This was a good video

    Emerito Nacpil, 13 days ago
  • At the beginning of the video I thought this was dumb, but by midway through I'm like, this is what we need.

    Mequelle Keeling, 13 days ago
  • they are created by man, so I'm not surprised 😑

    iulix max, 13 days ago