• zurohki@aussie.zone · 9 months ago

      Yeah it is. The training data skews white, so they added a “make some people non-white” kludge. It wouldn’t be needed if there were actually racial diversity in the training data.
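
      For context on what such a kludge looks like mechanically: it can be nothing more than a prompt-rewriting step that runs before the image model ever sees the text. Here’s a minimal Python sketch of the idea; the term list, the people-word check, and the injection probability are all invented for illustration, since the actual rules these services use aren’t public:

      ```python
      import random

      # Hypothetical sketch of a diversity "kludge": before the user's prompt
      # reaches the image model, a demographic qualifier is sometimes appended.
      # Every term, keyword, and probability below is made up for illustration.
      DIVERSITY_TERMS = ["Black", "South Asian", "East Asian", "Hispanic", "Middle Eastern"]
      PEOPLE_WORDS = {"person", "people", "man", "woman", "doctor", "scientist"}

      def rewrite_prompt(prompt: str) -> str:
          words = set(prompt.lower().split())
          already_specified = any(t.lower() in prompt.lower() for t in DIVERSITY_TERMS)
          # Only touch prompts that mention people and don't already name an ethnicity.
          if words & PEOPLE_WORDS and not already_specified:
              if random.random() < 0.5:  # inject a qualifier some fraction of the time
                  return f"{prompt}, depicted as {random.choice(DIVERSITY_TERMS)}"
          return prompt

      print(rewrite_prompt("a doctor smiling at the camera"))
      # may print: "a doctor smiling at the camera, depicted as East Asian"
      ```

      The rewrite operates blindly on the text with no understanding of context, which is exactly why a patch like this misfires.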

      • FaceDeer@kbin.social · 9 months ago

        It’s the “make some people non-white” kludge that’s the specific problem being discussed here.

        The training data skewing white is a different problem, but IMO not as big a one. The solution is simple, as I’ve discovered over many months of using local image generators: let the user specify exactly what they want.

      • GregorGizeh@lemmy.zip · 9 months ago

        I don’t even see the problem with that. If Western corps build an AI based overwhelmingly on Western (i.e. majority white) datasets, they get an AI that skews white in all things.

        If they want more well-rounded data, they would need to buy it from China and India, and probably other parts of Asia too. Only I don’t think those countries are willing to give those datasets away, because they are aware of their actual value and/or are more interested in creating their own AI with them (which will then, of course, skew Chinese, for example).