/mdt/ - Ctrl+V № 25

Ctrl+V № 25 (49 Replies)

1 .


2 .


3 .


4 .


5 .


6 .

Joy Buolamwini was conducting research at MIT on how computers recognized people’s faces, when she started experiencing something weird.

Whenever she sat in front of a system's camera, it wouldn't recognize her face, even though it worked for her lighter-skinned friends. But when she put on a simple white mask, the face-tracking animation suddenly lit up the screen.

Suspecting a more widespread problem, she carried out a study on the AI-powered facial recognition systems of Microsoft, IBM and Face++, a Chinese startup that has raised more than $500 million from investors.

Buolamwini showed the systems 1,000 faces, and told them to identify each as male or female.

All three companies' systems did spectacularly well at classifying white faces, and men in particular.

But when it came to dark-skinned females, the results were dismal: there were 34% more errors with dark-skinned females than light-skinned males, according to the findings Buolamwini presented on Saturday, Feb. 24th, at the Conference on Fairness, Accountability and Transparency in New York.

As the women's skin shades got darker, the chances of the algorithms predicting their gender accurately "came close to a coin toss." For the darkest-skinned women, the face-detection systems got their gender wrong close to half the time.

Buolamwini’s project, which became the basis of her MIT thesis, shows that concerns about bias are adding a new dimension to the general anxiety around artificial intelligence.

While much has been written about ways that machine learning will replace human jobs, the public has paid less attention to the consequences of biased datasets.

What happens, for instance, when software engineers train their facial-recognition algorithms primarily with images of white males? Buolamwini's research showed the algorithm itself becomes prejudiced.

Another example came to light in 2016, when Microsoft released its AI chatbot Tay onto Twitter. Engineers programmed the bot to learn human behavior by interacting with other Twitter users. After just 16 hours, Tay was shut down because its tweets had become a stream of sexist, pro-Hitler messages.

Experts later said Microsoft had done a fine job teaching Tay to mimic human behavior, but not enough to teach it which behavior was appropriate.

Suranga Chandratillake, a leading venture capitalist with Balderton Capital in London, UK, says bias in AI is as concerning an issue as job destruction.

“I’m not negative about the job impact,” he says. The bigger issue is building AI-powered systems that take historical data, then use it to make judgements.

“Historical data could be full of things like bias,” Chandratillake says from his office in Kings Cross, which is just up the road from the headquarters of Google’s leading artificial intelligence business, DeepMind.

“On average people approve mortgages to men or people who are white, or from a certain town.” When the power to make that judgement is given to a machine, the machine “encodes that bias.”

So far the examples of bias caused by algorithms have seemed mundane, but in aggregate they can have an impact, especially with so many companies racing to incorporate AI into their apps and services. (Mentions of "AI" in earnings calls have skyrocketed over the past year, according to CB Insights, even from unlikely companies like Procter & Gamble or Bed Bath & Beyond.)

In recent months several researchers have pointed to how even Google Translate has shown signs of sexism, automatically suggesting words like “he” for male-dominated jobs and vice versa, when translating from a gender-neutral language like Turkish.

Camelia Boban, a software developer in Italy, also noticed on Feb. 4th that Google Translate didn’t recognize the female term for “programmer” in Italian, which is programmatrice. (She said in a recent email to Forbes that the issue has since been corrected.)

Such examples might sound surprising when you expect software to be logical and objective. “People believe in machines being rational," Chandratillake says. "You end up not realizing that actually, what should be meritocratic, isn’t at all. It’s just an encoding of something that wasn’t in the first place.”

When humans make important decisions about hiring, or granting a bank loan, they’re more likely to be questioned about their judgement. There's less reason to question AI because of “this veneer of innovative new tech," he says. "But it’s destined to repeat the errors of the past.”

Today's engineers are also overly focused on building algorithms to solve complex problems, rather than on building a second algorithm to monitor and report on how the first is performing -- a kind of algorithmic watchdog.
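Such a watchdog can be sketched in a few lines: a second routine that audits the first system's logged predictions per demographic group and flags any group whose error rate drifts well above the overall rate. Everything here (the group labels, the audit log, the 5% tolerance) is hypothetical, not taken from Buolamwini's study:

```python
# Minimal sketch of an "algorithmic watchdog": a monitor that reports
# on how another algorithm is performing across demographic groups.
# The data and the 5% disparity tolerance are hypothetical.
from collections import defaultdict

def group_error_rates(records):
    """records: iterable of (group, predicted, actual) tuples."""
    totals, errors = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        errors[group] += predicted != actual
    return {g: errors[g] / totals[g] for g in totals}

def flag_disparities(records, tolerance=0.05):
    """Return groups whose error rate exceeds the overall rate by > tolerance."""
    rates = group_error_rates(records)
    overall = sum(p != a for _, p, a in records) / len(records)
    return [g for g, rate in rates.items() if rate - overall > tolerance]

# Hypothetical audit log of a gender classifier: 1% errors on one group,
# 35% on another -- roughly the kind of gap the study describes.
audit = (
    [("light_male", "male", "male")] * 99
    + [("light_male", "female", "male")] * 1
    + [("dark_female", "female", "female")] * 65
    + [("dark_female", "male", "female")] * 35
)
print(flag_disparities(audit))  # → ['dark_female']
```

The watchdog needs nothing from inside the black box: it only compares logged predictions against ground truth, which is why it can wrap any model.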

“Today the way a lot of AI is configured, is basically as a black box,” he adds. “Neural networks are not good at explaining why they made a decision.”

MIT’s Buolamwini points to the lack of diversity in images and data used to train algorithms.

Fortunately, this is an issue that can be worked on.

After MIT's Buolamwini sent the results of her study to Microsoft, IBM and Face++, IBM responded by replicating her research internally and releasing a new API, according to a conference-goer who attended her presentation on Saturday.

The updated system now classifies darker-skinned females with a success rate of 96.5%.



7 .

What is it you don't understand about Ctrl+V, philistines?

8 .

I'll turn that question back on you.

9 .

I'll turn that question back on you.

10 .


11 .


12 .


13 .

Kisses to everyone, and good night.

14 .


15 .

A trick is something a whore does for money... or cocaine. ...

16 .

I was born in 1994 in Albany, New York. I lived with my parents in Prairieville, New York until I was five. They divorced, and I lived with my dad until I was 14. He had an aide through the state, basically a nanny. He became very attached to the nanny. The state decided it looked like a long abuse case; they were just going to replace her. Because my dad was having relations with her, he decided, no, he’d rather sleep with her instead. He put me and my brother into the foster care system. My mother was informed that we were in the foster care system, and she started the legal process to get us out, which went on for two years.

17 .

Demiurge’s farm is essentially a concentration camp. “Bipedal sheep” is just a code word for human. Demiurge kidnaps people and performs cruel experiments/torture on them, such as forcing them to eat their own limbs and then healing them to test how healing magic works. Apparently, he’s also been conducting breeding experiments by forcing humans to mate with demi-humans. In one instance, it’s stated that he forced people to eat babies that were cooked alive. However, the main purpose of his farm is to harvest scroll materials from humans/demi-humans by skinning them alive.

18 .


19 .


20 .

R has a reputation for getting things done with very little code. If you're a programmer thinking "Here comes the Hello World code," you're in for a surprise.

21 .


22 .

I'd quite like to do a MOOC to learn the basics of statistics with R. Maybe it'll never be of any use to me, especially if I go freelance one day (the engineers would eat me alive), but it can always come in handy.

23 .

“She would rub her genital slit against me,” he says in “Dolphin Lover.” “And if I tried to push her away, she would get very angry with me. One time, when she wanted to masturbate on my foot and I wouldn’t let her, she threw herself on top of me and pushed me down to the 12-foot bottom of the pool.”

mtq dolphins fuck better than human women

24 .


25 .

Radio Sputnik is funded in whole or in part by the Russian government

26 .


27 .


28 .


29 .


30 .


31 .

Human evolution has a vital need to move forward. When it relies solely on the engine of Science, it marginalizes the guide of Belief and advances at random, only to end up in a dead end. The one where the Earthling is currently stuck is called Climate Change.

32 .

Startup wants to upload your brain to the cloud, but has to kill you to do it

33 .

seen on Wikipédia (which once again proves the problem in this milieu, gangrened by Satanism dressed up as paganism)

34 .

on all fours

35 .


36 .


37 .

Get a load of this cancer.

38 .


39 .


40 .

no rage pls

41 .


42 .


43 .


44 .


45 .


46 .


47 .


48 .


49 .

This java.util.logging implementation is enabled by providing certain system properties when starting Java. The Apache Tomcat startup scripts do this for you, but if you are using different tools to run Tomcat (such as jsvc, or running Tomcat from within an IDE), you will need to set them yourself.
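Concretely, the properties in question are the ones Tomcat's own scripts pass to the JVM; a sketch of what to add to your launch command, assuming a standard `$CATALINA_BASE` layout:

```shell
# System properties Tomcat's startup scripts normally set for you.
# When launching Tomcat some other way (jsvc, an IDE, a custom script),
# add them to the JVM invocation yourself.
JAVA_OPTS="$JAVA_OPTS \
  -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager \
  -Djava.util.logging.config.file=$CATALINA_BASE/conf/logging.properties"
```

The first property swaps in Tomcat's per-classloader LogManager (JULI); the second points java.util.logging at Tomcat's logging configuration file.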

50 .

It was formed by combining the performance and security benefits of C++ with the speed of Python.