“I don’t think that the everyday civilian is aware of the role that algorithms play in our day-to-day lives.” - Jason Downs. Watch #BADINPUT by Consumer Reports w/ #KaporFoundation to explore bias in algorithms + data sets & the resulting potential for harm BADINPUT.ORG
Racial and gender bias in facial recognition can harm communities of color. Watch the #BADINPUT film by Consumer Reports in partnership with Kapor Foundation on facial recognition at BADINPUT.ORG and demand #AI fairness! @[email protected] on Mastodon
“Is one group experiencing a positive outcome at a higher or lower rate than another group?” - Kareem Saleh. Watch #BADINPUT by Consumer Reports w/ #KaporFoundation to explore bias in algorithms + data sets and the resulting potential for harm BADINPUT.ORG
“Tools we use every day are rooted in data that are both very old + very racist.” - Tom Valley. #BADINPUT by Consumer Reports w/ #KaporFoundation highlights how bias in algorithms results in unfair practices toward communities of color, often w/o their knowledge. BADINPUT.ORG
What if you were denied a home loan based on how an algorithm made a decision about you? You’d probably have no idea when or why it happened. - Allison Scott, Ph.D. Watch #BADINPUT by Consumer Reports in partnership with #KaporFoundation to learn more! BADINPUT.ORG
BAD INPUT by Consumer Reports is a Finalist for the 3rd Annual #AnthemAwards! 👏 Help us win an Anthem Awards Community Voice Award by voting for our #BADINPUT project here: celebrate.anthemawards.com/PublicVoting#/…
“If an algorithm is involved in you not getting something important, like insurance or some other thing, you have the right to know.” @[email protected] on Mastodon Watch #BADINPUT by Consumer Reports w/ Kapor Foundation to explore bias in algorithms! BADINPUT.ORG
“Is one group experiencing a positive outcome at a higher or lower rate than another group?” - @KareemSaleh, CEO of FairPlay AI. Watch #BADINPUT by Consumer Reports w/ Kapor Foundation to explore bias in algorithms + data sets. BADINPUT.ORG
#ICYMI We partnered w/ Consumer Reports on #BADINPUT, three short films on racial bias in algorithms, a continued commitment through our Equitable #Tech Policy Initiative! BADINPUT.ORG
“We have the opportunity to correct the wrongs that have already been made in the first several decades of #tech’s innovation.” - Allison Scott, Ph.D. Watch #BADINPUT by Consumer Reports w/ #KaporFoundation to explore bias in algorithms! BADINPUT.ORG