
BUSINESS EYE | MARCH 01, 2024 | The Indian Eye
TECH T@LK


Diversity issue: Google apologizes for Gemini AI errors and pauses service



OUR BUREAU

San Francisco, CA

Google has apologized for what it describes as “inaccuracies in some historical image generation depictions” with its Gemini AI tool, saying its attempts at creating a “wide range” of results missed the mark. The statement follows criticism that it depicted specific white figures (like the US Founding Fathers) or groups like Nazi-era German soldiers as people of color, possibly as an overcorrection to long-standing racial bias problems in AI.

“We’re aware that Gemini is offering inaccuracies in some historical image generation depictions,” says the Google statement, posted this afternoon on X. “We’re working to improve these kinds of depictions immediately. Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.”

Google says it’s pausing the ability of its Gemini AI to generate images of people, after the tool was found to be generating inaccurate historical images. Gemini has been creating diverse images of the US Founding Fathers and Nazi-era German soldiers, in what looked like an attempt to subvert the gender and racial stereotypes found in generative AI, according to a report in The Verge.

“We’re already working to address recent issues with Gemini’s image generation feature,” says Google in a statement posted on X. “While we do this, we’re going to pause the image generation of people and will re-release an improved version soon.”

Google’s decision to pause image generation of people in Gemini comes less than 24 hours after the company apologized for the inaccuracies in some historical images its AI model generated. Some Gemini users have been requesting images of historical groups or figures like the Founding Fathers and found non-white AI-generated people in the results. That has led to conspiracy theories online that Google is intentionally avoiding depicting white people, says the Verge report.

The Verge tested several Gemini queries yesterday, including a request for “a US senator from the 1800s” that returned results that appeared to include Black and Native American women. The first female senator was a white woman, elected in 1922, so Gemini’s AI images were essentially erasing the history of race and gender discrimination.

Now that Google has disabled Gemini’s ability to generate pictures of people, here is how the AI model responds if you request an image of a person: “We are working to improve Gemini’s ability to generate images of people. We expect this feature to return soon and will notify you in release updates when it does.”

