Female geniuses take on artificial intelligence – and racial and gender bias at Amazon

The movement against faulty artificial intelligence began with an investigation by Joy Buolamwini, a computer scientist and doctoral candidate at the MIT Media Lab. Buolamwini, whose exchange with Representative Alexandria Ocasio-Cortez went viral after she testified before the U.S. Congress in 2019, stumbled upon the racial and gender bias of mass-market AI software while working on a class project at MIT. Only when Buolamwini, who is Black, put on a white mask did the leading facial recognition software pick up her face. Suspecting the problem was not an isolated failure, she went on to test several more facial recognition systems and discovered that the flaw was standard. From there grew the concept of algorithmic bias, and with it the possibility of algorithmic justice to remedy it.
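The kind of disaggregated audit that exposes such a skew can be sketched in a few lines of code. What follows is a minimal illustration under assumptions of my own (an images/<group>/ folder layout and demographic labels chosen by the auditor), not Buolamwini’s actual methodology: it runs OpenCV’s stock Haar-cascade face detector over portraits grouped by demographic label and reports the detection rate per group, the disparity between groups being exactly what an audit of this kind measures.

```python
import cv2
from collections import defaultdict
from pathlib import Path

# Hypothetical layout: images/<group>/<file>.jpg, where <group> is a
# demographic label supplied by the auditor (e.g. "darker_female").
IMAGE_ROOT = Path("images")

# OpenCV ships this pretrained Haar-cascade frontal-face detector.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

hits, totals = defaultdict(int), defaultdict(int)
for path in IMAGE_ROOT.glob("*/*.jpg"):
    group = path.parent.name
    gray = cv2.imread(str(path), cv2.IMREAD_GRAYSCALE)
    if gray is None:
        continue  # unreadable file: skip rather than miscount
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    totals[group] += 1
    hits[group] += int(len(faces) > 0)

# A detector made mainly for white faces shows up as a gap between rates.
for group in sorted(totals):
    print(f"{group:>16}: {hits[group] / totals[group]:.1%} "
          f"of {totals[group]} portraits detected")
```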

Coded Bias, director Shalini Kantayya’s new documentary, essentially chronicles the women-led movement against this strange new bias, and it is to both Buolamwini’s and Kantayya’s credit that they see that AI failing to detect a face is not the whole problem. Being “recognized” by this software is usually worse than being ignored by it. Everyone from Facebook to Google to Amazon to the FBI and US police departments uses software that can falsely flag people as undesirable customers, unqualified job seekers, likely recidivists, or wanted criminals. These facial recognition systems most often misidentify Black people of all genders and women of all races, and frequently mistake the gender of the person entirely. The software, it becomes clear, was essentially made for white people.

Where Coded Bias seems headed is toward federal AI regulation, particularly of facial recognition software. It is not clear, however, that this regulatory model would suffice, since it would need to be strictly enforced by an independent body with no financial stake in waving a shoddy product through. In the documentary, the Food and Drug Administration is held up as the desired analogue, but the FDA has notoriously let harmful drugs and food products slip through (such as the diabetes drug troglitazone, which was introduced in the United States in 1997 but which the FDA did not have withdrawn until 2000) and has delayed approval of life-saving measures, as during the HIV/AIDS crisis.

Coded Bias follows up when Buolamwini meets with tenant organizers in Brownsville, Brooklyn, where mostly Black women are fighting their landlord’s bid to replace the building’s existing key-entry system with facial recognition. Surveillance systems already monitor the Black and brown tenants inside this building and flag behaviors that building management does not like. This Big Brother situation in Brownsville illustrates how the most carceral technologies end up in the most marginalized communities, where poor Black people serve as guinea pigs, one might even say as the literal test subjects, for new technologies. The tenants refuse to be left behind, and elderly and young residents alike join Buolamwini in campaigning against the assault on their privacy and freedom.

The documentary does show, however, that even greater state intervention than in a society like the United States is no improvement: China openly engages in, and benefits from, virtually universal surveillance of its citizens through a system of “social credit” that rewards or punishes citizens on the basis of their public actions. In fact, one researcher notes, the only difference between China and the United States when it comes to surveillance is that China is transparent about its scoring system. In the United States, citizens and residents are also given “social credit scores”; they are simply hidden from us. If you have trouble getting a mortgage or car insurance, you have already been scored and sorted by various private companies and even the government. If you inexplicably fail to get an interview for a job you are qualified for, the reason may be that HR departments use biased artificial intelligence to screen résumés (when Amazon built such a tool, it screened out women and favored men). If you land on a watch list despite never having been convicted of a crime, an algorithm or surveillance system has likely pegged you as a risk even though you have never engaged in violent behavior.
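It is worth making concrete how a hiring model can come to exclude women without ever seeing a “gender” column. Below is a toy sketch on synthetic data, with invented feature names, and not a reconstruction of Amazon’s actual tool: a classifier trained on historically biased hiring decisions learns, on its own, to penalize a feature that merely correlates with being a woman.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
years_exp = rng.normal(5, 2, n)           # legitimate signal
womens_college = rng.integers(0, 2, n)    # proxy correlated with gender

# Synthetic "historical" labels: past recruiters rewarded experience but
# also systematically passed over candidates from women's colleges.
hired = (years_exp + rng.normal(0, 1, n) - 2.0 * womens_college) > 5

X = np.column_stack([years_exp, womens_college])
model = LogisticRegression().fit(X, hired)

print("weight on years of experience: ", round(model.coef_[0][0], 2))
print("weight on women's-college flag:", round(model.coef_[0][1], 2))
# The second weight comes out strongly negative: without ever seeing a
# gender column, the model has encoded the old bias and will keep
# screening out the same candidates.
```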

These revelations raise the interesting question of why Coded Bias does not land on surveillance capitalism as the force behind our highly imperfect justice system. Shoshana Zuboff, the Harvard Business School professor who wrote the authoritative The Age of Surveillance Capitalism and coined the phrase herself, is neither mentioned nor interviewed in the documentary. Yet her account of “behavioral futures markets” goes a long way toward explaining predictive policing, namely the expansion of police power through artificial intelligence, machine learning, and the big data that underpins them, as a product of surveillance capitalism itself. Coded Bias might have benefited from putting Buolamwini’s vision of algorithmic justice in conversation with Zuboff’s account of surveillance capitalism.

“Buolamwini does get a rise out of Amazon, whose vice president attempts to discredit her work in response to criticism she has leveled against the company.”

Coded Bias is hyper-focused on Buolamwini as a leader and advocate in the algorithmic justice space, and consequently never questions her approach of pushing the Googles and IBMs of the world (and especially the United States) to improve corporate behavior rather than challenging their power at its core. Buolamwini does get a rise out of Amazon, whose vice president attempts to discredit her work in response to criticism she has leveled against the company; IBM, on the other hand, absorbs some of her criticisms and rebrands itself around diversity and inclusion. What about the problems of co-optation and partnership with corporate technology companies? We never find out, because director Shalini Kantayya never asks Buolamwini about them.

Mathematician Cathy O’Neil, data journalist Meredith Broussard, and computer science and African American studies scholar Dr. Safiya Umoja Noble, among others, could have been given more room for their ideas. Instead, Coded Bias seems content to brand Buolamwini as the leader of a movement supported by other brilliant women. It is this very branding that obscures the foundation of Buolamwini’s mission, and that of many others: to keep corporations out of our future decision-making.