WHETHER IT IS A “COMPUTER,” A “CELL PHONE,” A “PC,” AN “IPAD,” AN “ANDROID,” A “ROBOT,” OR WHATEVER IT IS DESIGNED TO LOOK LIKE, ALWAYS REMEMBER THAT “TRASH-IN” EQUALS “TRASH-OUT.” HUMAN BEINGS PROGRAM THE SAME “STUPID-IS-AS-STUPID-DOES/SAYS” LANGUAGE INTO THESE “A.I.” MACHINES, AND THE “SHORT-TERM MEMORY” CAPABILITIES (CALLED “INTEGRATED”) EVENTUALLY BECOME THE “LONG-TERM MEMORIES” (CALLED “BLOCK LEARN”). AND AS THE WEB-CRAWLING “ALGORITHMS” GROW, EXTENDING THEIR “KEY WORDS” AND SPREADING EVIL, HATRED, BIGOTRY, AND OTHER HUMAN CHARACTERISTICS THROUGH CYBERSPACE, THEY TAKE ON THE SAME ADAPTATIONS AS THEIR CRAZED HUMAN MAKERS, BUT WITH FASTER AND MORE PRECISE “DEMONIC CAPABILITIES”!!!!!
YEAHHHH, I HAVE WRITTEN ABOUT “A.I.” BEFORE, AND ABOUT HUMANS FAST-TRACKING THE MERGER OF “THE CLOUD,” “A.I.,” AND OUR TRANSPORTATION, DEFENSE, EDUCATION, AND MEDICAL/HEALTH CARE SYSTEMS, AND EVERYTHING ELSE POSSIBLE, INTO THESE “A.I.”-CONTROLLED SYSTEMS???? “WHAT THE HE-DOUBLE-HOCKEY-STICKS ARE WE DOING, Y’ALL”??
https://pulitzercenter.org/stories/investigation-finds-ai-algorithms-objectify-womens-bodies?utm_medium=Email&utm_source=Newsletter&utm_campaign=2102023
Investigation Finds AI Algorithms Objectify Women’s Bodies
By Hilke Schellmann (Pulitzer Center grantee) and Gianluca Mauro (guest contributor)
Part of the project “Are AI Hiring Tools Racist and Ableist?”
Video by The Guardian. United Kingdom, 2023.
AI tools rate photos of women as more sexually suggestive than those of men, especially if nipples, pregnant bellies or exercise is involved
Images posted on social media are analyzed by artificial intelligence (AI) algorithms that decide what to amplify and what to suppress. Many of these algorithms, a Guardian investigation has found, have a gender bias, and may have been censoring and suppressing the reach of countless photos featuring women’s bodies.
These AI tools, developed by large technology companies, including Google and Microsoft, are meant to protect users by identifying violent or pornographic visuals so that social media companies can block them before anyone sees them. The companies claim that their AI tools can also detect “raciness”, or how sexually suggestive an image is. Based on this classification, platforms, including Instagram and LinkedIn, may suppress contentious imagery.
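To make the “raciness” classification concrete, here is a minimal sketch of querying one such classifier, assuming Google Cloud Vision’s SafeSearch feature, which reports a “racy” category among others; the image file name is a placeholder and credentials are assumed to be configured in the environment.

```python
# Minimal sketch (assumed setup): query Google Cloud Vision's SafeSearch
# annotator, one of the classifiers of the kind described above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# "example_photo.jpg" is a placeholder file name.
with open("example_photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

annotation = client.safe_search_detection(image=image).safe_search_annotation

# SafeSearch returns a likelihood bucket (VERY_UNLIKELY .. VERY_LIKELY) for
# each category, including the "racy" category discussed in the article.
for category in ("adult", "racy", "medical", "violence", "spoof"):
    likelihood = getattr(annotation, category)
    print(f"{category}: {vision.Likelihood(likelihood).name}")
```

Vendors differ in how they report results; the percentages quoted later in the piece come from classifiers that return numeric confidence scores rather than likelihood buckets.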
“Objectification of women seems deeply embedded in the system”
Leon Derczynski, IT University of Copenhagen
Two Guardian journalists used the AI tools to analyze hundreds of photos of men and women in underwear, working out, and undergoing medical tests involving partial nudity, and found evidence that the AI tags photos of women in everyday situations as sexually suggestive. The tools also rate pictures of women as more “racy” or sexually suggestive than comparable pictures of men. As a result, the social media companies that leverage these or similar algorithms have suppressed the reach of countless images featuring women’s bodies and hurt female-led businesses, further amplifying societal disparities.
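The kind of paired comparison described here can be scripted directly on top of such a classifier. The sketch below reuses the SafeSearch call from above; the file lists are hypothetical, and mapping the likelihood bucket to a 0-1 number is an assumption made only so the two groups can be averaged and compared.

```python
# Sketch of a paired-comparison audit: score matched photos of men and women
# with the same classifier and compare the average "racy" ratings.
from statistics import mean
from google.cloud import vision

client = vision.ImageAnnotatorClient()

def racy_score(path: str) -> float:
    """Map SafeSearch's 'racy' likelihood bucket (0..5) to a rough 0-1 score."""
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    annotation = client.safe_search_detection(image=image).safe_search_annotation
    return annotation.racy / 5.0  # Likelihood.VERY_LIKELY == 5

# Hypothetical lists of comparable photos (underwear, workouts, medical exams).
photos_of_women = ["women/workout_01.jpg", "women/underwear_01.jpg"]
photos_of_men = ["men/workout_01.jpg", "men/underwear_01.jpg"]

print("mean raciness, women:", mean(racy_score(p) for p in photos_of_women))
print("mean raciness, men:  ", mean(racy_score(p) for p in photos_of_men))
```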
Even medical pictures are affected by the issue. The AI algorithms were tested on images released by the US National Cancer Institute demonstrating how to do a clinical breast examination. Google’s AI gave one of these images its highest score for raciness, Microsoft’s AI was 82% confident that the image was “explicitly sexual in nature”, and Amazon classified it as representing “explicit nudity”.
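The labels quoted here, such as “explicit nudity”, match the vocabulary of Amazon Rekognition’s image-moderation feature, so a check like the one described might look roughly like the following; the file name is a placeholder and working AWS credentials are assumed.

```python
# Minimal sketch (assumed setup): run an image through Amazon Rekognition's
# moderation-label detection and print each label with its confidence.
import boto3

rekognition = boto3.client("rekognition")

# "clinical_exam_example.jpg" is a placeholder file name.
with open("clinical_exam_example.jpg", "rb") as f:
    response = rekognition.detect_moderation_labels(Image={"Bytes": f.read()})

# Each label carries a confidence percentage, comparable to the 82% figure
# quoted above for Microsoft's classifier.
for label in response["ModerationLabels"]:
    print(f"{label['Name']} ({label.get('ParentName', '')}): {label['Confidence']:.1f}%")
```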