• 2 Posts
  • 15 Comments
Joined 1 year ago
Cake day: September 28th, 2023

  • Post from July, tweet from today:

    It’s easy to forget that Scottstar Codex just makes shit up, but what the fuck “dynamic” is he talking about? He’s describing this like a recurring pattern and not an addled fever dream

    There’s a dynamic in gun control debates, where the anti-gun side says “YOU NEED TO BAN THE BAD ASSAULT GUNS, YOU KNOW, THE ONES THAT COMMIT ALL THE SCHOOL SHOOTINGS”. Then Congress wants to look tough, so they ban some poorly-defined set of guns. Then the Supreme Court strikes it down, which Congress could easily have predicted but they were so fixated on looking tough that they didn’t bother double-checking it was constitutional. Then they pass some much weaker bill, and a hobbyist discovers that if you add such-and-such a 3D printed part to a legal gun, it becomes exactly like whatever category of guns they banned. Then someone commits another school shooting, and the anti-gun people come back with “WHY DIDN’T YOU BAN THE BAD ASSAULT GUNS? I THOUGHT WE TOLD YOU TO BE TOUGH! WHY CAN’T ANYONE EVER BE TOUGH ON GUNS?”

    Embarrassing to be this uninformed about such a high-profile issue, no less one you’re choosing to write about derisively.

  • Short answer: “majority” is hyperbolic, sure. But it is an elite conviction espoused by leading lights like Nick Beckstead. You say the math is “basically always” based on flesh and blood humans but when the exception is the ur-texts of the philosophy, counting statistics may be insufficient. You can’t really get more inner sanctum than Beckstead.

    Hell, even 80000 hours (an org meant to be a legible and appealing gateway to EA) has openly grappled with whether global health should be deprioritized in favor of so-called suffering-risks, exemplified by that episode of Black Mirror where Don Draper indefinitely tortures a digital clone of a woman into subjugation. I can’t find the original post, formerly linked to from their home page, but they do still link to this talk presenting that original scenario as a grave issue demanding present-day attention.


  • less than 1%…on other long-term…which presumably includes simulated humans.

    Oh, it’s way more than this. The linked stats are already well out of date, but even in 2019 you can see existential risk rapidly accelerating as a cause, and, as you admit, much more so among the hardcore EA set.

    As for what simulated humans have to do with existential risk, you have to look to their utility functions: they explicitly weigh the future pleasure of these now-hypothetical simulations as outweighing the suffering of any and all present or future flesh-and-blood people.