• TinyTimmyTokyo@awful.systems · 19 points · 10 months ago

    You know the doom cult is having an effect when it starts popping up in previously unlikely places. Last month the socialist magazine Jacobin ran an extremely long cover feature on AI doom, which it bought into completely. The author is an effective altruist who interviewed, and took seriously, people like Katja Grace, Dan Hendrycks and Eliezer Yudkowsky.

    I used to be more sanguine about people’s ability to see through this bullshit, but eschatological nonsense seems to tickle something fundamentally flawed in the human psyche. This LessWrong post is a perfect example.

      • YouKnowWhoTheFuckIAM@awful.systems · 11 points · 10 months ago

        I like some people who have written for Jacobin, and sometimes I even enjoy an article here and there, but the magazine as a whole remains utterly unbeaten in the “will walk the length of Manhattan in a ‘GIANT RUBE’ sandwich board for clicks” stakes

      • skillissuer@discuss.tchncs.de · 10 points · 10 months ago

        this reminds me of a plankton organization or something called “blockchain socialism”, where the only thing they took from socialism was the aesthetics (and probably the belief that gay people are fine), but nothing beyond that. they would say “Monero can be used for anti-state purposes, therefore it’s good for leftism” and shit like that

          • self@awful.systems (mod) · 13 points · 10 months ago

            I think I’ve met that guy! they’re the weirdest person I’ve ever seen get bounced from a leftist group under suspicion of being a fed (the weird crypto shit was the straw that broke the camel’s back)

              • self@awful.systems (mod) · 10 points · 10 months ago

                it’s kind of amazing how many financial scams try to appropriate leftist language and motivations to lure in marks, while the actual scheme is one of the most unrepentantly greedy and wasteful things you can do without going to prison (and some of them cross even that line)

      • self@awful.systems (mod) · 10 points · 10 months ago

        after what I’ve heard my local circles say about jacobin (and unfortunately I don’t remember many details — I should see if anybody’s got an article I can share) I’m no longer shocked when I find out they’re platforming and redwashing shitty capitalist mouthpieces

          • self@awful.systems (mod) · 10 points · 10 months ago

            my conflicting urges: to rant about the defense contractors sponsoring RustConf, or the Palantir employee who secretly controls most of the Rust package ecosystem via a transitive dependency (with arbitrary code execution on development machines!) and got a speaker kicked out of RustConf for threatening that position with a replacement for that dependency, or the fact that all the tech I like instantly gets taken over by shitheads as soon as it gets popular (and Nix is looking like it might be next)

            • sinedpick@awful.systems · 7 points · edited · 10 months ago

              More details on the rust thing? I can’t find it by searching the keywords you mentioned, but I must know.

              • self@awful.systems (mod) · 6 points · 10 months ago

                so far the results from various steering committees haven’t been fantastic, to the point where I’ve seen marginalized folks ranting about the outcome on mastodon, which isn’t a great sign. with that said, I’ve also seen a ton of marginalized folks quite happily get into Nix recently, and that’s fantastic — as long as they don’t hit a brick wall in the form of exclusionary social systems set up around contributing to the Nix ecosystem.

                overall these are essentially just general concerns around a few signals I’ve seen and the point Nix is at where it’s rapidly transitioning from a project with an academic focus to one with a more general focus. I’ve already seen many attempts by commercial interests to irrevocably claim parts of the ecosystem, especially in flakes (there have been many attempts to restandardize flakes onto a complex, commercially-controlled standard library, which could result in a similar situation to what we’ve seen with rust)

                Nix itself is still fantastic tech I use everywhere; that’s why I care if folks are excluded from contributing. unfortunately, the commercialization of open source ecosystems and exclusion seem to go hand-in-hand — it’s one of the tactics that corporations use to maintain control over open source projects, while making forks very hard or impossible for anyone without corporate levels of wealth and available labor.

    • mountainriver@awful.systems · 10 points · 10 months ago

      I think funding and repetition are the fundamental building blocks here, rather than the human psyche itself. I have talked with otherwise bright people who have read an article by some journalist (not necessarily a rationalist) who interviewed AI researchers (probably cultists; was it 500 million USD that was pumped into the network?) who take AI doom seriously.

      So you have two layers of people who are, in theory, paid to evaluate and formulate the truth in order to inform readers who don’t know the subject matter. Then add repetition from various directions, and people become convinced that there is definitely something there (propaganda and commercials work the same way). Claiming that it’s all nonsense and cultists appears not to have much effect.

      • jonhendry@awful.systems · 13 points · 10 months ago

        There’s probably some blurring of what “AI doom” means to people. Many might be left thinking “there could be negative effects due to widespread job loss, etc.” without necessarily buying into the weird maximalist AI doom ideas or the “torturing simulated you forever” nonsense.

        And the weirdo cultists probably use that blurring to build support for their cause without revealing the weird shit they actually believe.