• MisterEspinacas@lemmy.world · 1 year ago

    I mean, law enforcement occasionally uses polygraph tests in their investigations even though that kind of “evidence” isn’t admissible in court. And honestly, what scientific credibility does a polygraph even have? They’ll use whatever they can get their hands on, even if it’s questionable. Some police forces probably even have a psychic consultant or something. It scares me.

    • Soggy@lemmy.world · 1 year ago

      They’ll use it especially if it’s questionable, like handwriting analysis, because the goal is arrests, not correct arrests. Trumped-up, flimsy, circumstantial “evidence” is the best kind when you don’t actually want to do your job.

    • Esp@lemmy.blahaj.zone · 1 year ago

      Yeah, this exact same story has kept coming up for years now, just with different names. Why anyone would think the ineffectiveness and racial bias in these systems either wouldn’t exist or would somehow go away eventually is beyond me. Just expensive, ineffective mass surveillance for the sake of it…

    • steltek@lemm.ee · 1 year ago

      Who remembers the HP computer that was unable to identify black people? One of my favorite “oooph, that’s not a good look” tech fails of all time. At least the people in that video were having a good laugh about it.

      https://www.youtube.com/watch?v=t4DT3tQqgRM

      Holy hell, that was 13 years ago.

      • yA3xAKQMbq@lemm.ee · 1 year ago

        Yeah, but statistics is a b*tch.

        We had a test run of similar technology some years ago at a train station in Berlin, the capital of Germany and the largest city in the EU with 3.8M people.

        The results the government happily touted as a success were devastating. They had a true positive rate of 80% (and this was already cooked since they tested several systems at several locations but only reported the best results), which is really not that good to start with.

        But they were also extremely proud of the false positive rate, which was below 0.1%. That doesn’t sound too bad, does it?

        Well, let’s see…

        True positive means you actually identified the people you were looking for. Now, I don’t know the number of people Berlin’s police is actively looking for, but it’s not that many. And the chances of one of them actually passing through that very station are even worse. And out of those, 20% go undetected. That’s one out of five. Great. If I were a terrorist, I would happily take that chance.

        So now let’s have a look at the false positive rate, which means you incorrectly identified a totally harmless person as a terrorist/infected/whatever. The population for that condition is everyone passing through that station.

        Let’s assume there are 100k people on any given day (which IIRC is roughly half of what that station in Berlin actually has). 0.1% of 100k is 100 people, every day, who are mistakenly reported as „terrorists“. Yay.
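
        A minimal back-of-the-envelope sketch of that arithmetic in Python (the 80% / 0.1% rates and the 100k daily passengers are the figures above; the handful of genuinely wanted people passing through per day is purely an assumption, since that number isn’t known):

        ```python
        # Base-rate sketch. The 80% true-positive rate, 0.1% false-positive rate,
        # and 100k daily passengers come from the comment above; the number of
        # wanted people passing through per day is an assumption for illustration.
        daily_passengers = 100_000       # roughly half the station's real traffic
        wanted_passing_per_day = 5       # pure guess; unknown, but small

        true_positive_rate = 0.80        # quoted hit rate
        false_positive_rate = 0.001      # quoted false-alarm rate (0.1%)

        false_alarms = daily_passengers * false_positive_rate        # innocents flagged
        real_hits = wanted_passing_per_day * true_positive_rate      # wanted people caught
        missed = wanted_passing_per_day * (1 - true_positive_rate)   # wanted people missed

        # Of everyone the system flags, how many are actually wanted?
        precision = real_hits / (real_hits + false_alarms)

        print(f"False alarms per day: {false_alarms:.0f}")
        print(f"Wanted people caught per day: {real_hits:.1f} (missed: {missed:.1f})")
        print(f"Chance a flagged person is actually wanted: {precision:.1%}")
        ```

        Under those assumptions the system produces roughly 100 false alarms a day against about 4 genuine hits, so only around 4% of the people it flags are actually the people being looked for.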

    • P03 Locke@lemmy.dbzer0.com · 1 year ago

      Discrimination is the wrong word. Technology has no morals or sense of justice. It is bias in the data that developers should have accounted for.

      • HardlightCereal@lemmy.world · 1 year ago

        You need to learn some critical race theory. Racist systems turn innocent intentions into racist actions. If a PhD student trains an AI model on only white people because the university only has white students, then that AI model is going to fail black people because black people were already failed by university admissions. Innocent intention plus racist system equals racist action.

  • Blackmist@feddit.uk · 1 year ago

    I’m going to take a wild stab in the dark that all the false positives were black men.

    For the same reason that my Echo Dot (aka Spotify Bitch) will ignore my wife but cheerfully respond to my mumbled requests from three rooms away. If you make all this shit in Silicon Valley, it will work best for people of a similar demographic to those who work there.

    • soviettaters@lemmy.world · 1 year ago

      The white liberals building this technology say they’re all progressive, yet they only surround themselves with people like them and only build products for people like them. A lack of diversity in tech like this means a lack of good testing.

      • Smoogs@lemmy.world · 1 year ago

        Also, AI is taught by its creators. Tech has some of its most well-hidden, bigoted, mid-level white people refusing to critically question their own bias and privilege. There’s a shit ton of that fragile masculinity in the tech industry just hard-coding that bias right into it.

        There was a guy fired from Google for writing a manifesto about how women aren’t ‘wired’ for tech. And that’s just the one who waved his crazy flag out in the open, so that no one in upper management could easily keep ignoring it.

        • BrotherCod@kbin.social · 1 year ago

          While I agree with you 100% that programming can be affected by the programmers’ biases, there’s a much simpler problem that face recognition was having a hard time overcoming. At least when it was a main topic about a decade ago, sensors had a lot of trouble with the low contrast of some black people’s faces. Anyone who’s had a black friend and is a shutterbug will know what kind of problems you run into trying to get a proper exposure without making a black person disappear completely from a photograph. It was just an inherent limitation of the technology they were using.

          The last statistics I read were something like 20 to 30% positive matches, which we know damn well is too low for a workable technology. The success rate on Caucasian and lighter skin tones wasn’t even that great; there was still something like a 60% false positive rate. The software may have gotten better over the past decade, but we all know that whether it did or not, they’re still going to use it.

      • OhNoMoreLemmy@lemmy.ml · 1 year ago

        They’re more libertarian than liberal: anti worker rights, anti consumer rights, and anti taxation.

        The only government spending they’re in favour of is spending and subsidies on tech, e.g. Tesla, SpaceX, and the entire military-industrial complex.

        • RaoulDook@lemmy.world · 1 year ago

          You haven’t read much about Libertarian policy, I see. They are very pro-rights; in fact, that is the core of the party platform. Individual liberty is their chief concern, and I applaud their efforts in fighting for our rights and freedoms.

  • SangriaFerret@lemmy.world · 1 year ago

    Tbf, NOPD don’t arrest many people anyway. There’s a massive cop shortage: only 944 officers for a city of 364,000 with skyrocketing crime rates. Moreover, they’ve been operating under a DOJ consent decree since 2012. They’re overworked, underpaid, and under the thumb of the feds, so in response they simply don’t do shit.

  • P03 Locke@lemmy.dbzer0.com · 1 year ago

    So, why not just write off the technology as unreliable and move on? Even with the atrocious false positive rate, you would still have expected more than 15 hits in 9 months. This tech has got to be expensive, and even the potential ROI, if it ever works at all, just isn’t worth it.

  • DaveNa@lemmy.ml · 1 year ago

    And this is Lemmy, a propaganda platform. That site is cited as news. First source: no link. Second source: another “news website.” Third source: Twitter. Half the article: opinion. OK. I’ll see myself out, thank you very much.