• 4 Posts
  • 129 Comments
Joined 2 months ago
Cake day: February 11th, 2026




  • When Reddit started, Spez and Aaron made “sock puppet” accounts to make the site seem more active than it was, because you have to have what looks like an active user base to attract more users.

    Now, after Reddit has gone public, they need the appearance of lots of users to attract advertising dollars and keep their stock price high, and there’s no need to operate sock puppets by hand anymore because of LLMs - they can be the sock puppets, and if you have enough of them acting human enough, it doesn’t even matter if some people realise or if one gets called out as a bot.

    This also has the extremely useful benefit of steering society slowly towards the ideology of the billionaire, by having those bots normalise hate in the tsunami of messages they post.

    I think the account you interacted with was an LLM, not even a real troll.




  • That’s not “a very common way to see it”. It’s the way it is.

    Facts are independently measurable; they're the same for everyone, you always get the same data; there are no exceptions. As you said in another comment, objective reality is what remains true regardless of reference frame.

    Opinions are not independently measurable. Once you have a measurement that holds true across reference frames, you have a fact.

    Objective reality is treated as superior to subjective reality because it’s more useful. Subjective reality can be “better” in certain circumstances though, for example as an escape for a mind - abandon your observations of objective reality and replace them with something preferable.

    You have to accept the meanings of words in order to have a meaningful debate about the concepts they carry.




  • Ah-ha, thanks for the update on Docker! Saves me going down that rabbit hole 😅

    On the files on the NAS: yep, that’s by design. My files are across the WAN, not LAN, so I built it to stage remote files locally before transcoding. It currently pulls a file, transcodes it, and moves it wherever you choose for output. This does mean that going over a network is slow, because you have to wait for the staging and cleanup before doing another file. That’s deliberately conservative though; I wanted to avoid saturating networks in case the network operator takes exception to that sort of thing. A secondary benefit is that the disk space required for operations is just twice the size of the source file - very low chance of having to pause a job because the disk monitoring detected there’s no room.
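    For anyone curious, the stage-then-transcode cycle is roughly this shape (a minimal sketch, not the actual code - the names are made up, and the transcode step is a stand-in for whatever encoder the real tool invokes):

    ```python
    import shutil
    import tempfile
    from pathlib import Path


    def transcode(src: Path, dst: Path) -> None:
        """Stand-in for the real transcode step (e.g. an ffmpeg call)."""
        shutil.copy2(src, dst)


    def process_remote_file(remote_path: Path, output_dir: Path) -> Path:
        """Stage a remote file locally, transcode it, move the result to
        output_dir, then clean up. Files are handled one at a time, so
        peak disk use stays around 2x the source size: the staged copy
        plus the transcoded result."""
        output_dir.mkdir(parents=True, exist_ok=True)
        with tempfile.TemporaryDirectory() as staging:
            # 1. Pull the file across the WAN into local staging.
            staged = Path(staging) / remote_path.name
            shutil.copy2(remote_path, staged)

            # 2. Transcode the local copy.
            transcoded = staged.with_name("transcoded_" + staged.name)
            transcode(staged, transcoded)

            # 3. Move the result wherever the user chose for output.
            dest = output_dir / transcoded.name
            shutil.move(str(transcoded), dest)

            # 4. Leaving the `with` block deletes the staging dir, so
            #    the next file starts with the disk space freed.
        return dest
    ```

    Doing the pull, transcode, and cleanup strictly in sequence is what keeps both the network and the disk footprint bounded, at the cost of throughput.
    
    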

    I’ll look at putting in an override that disregards the network and treats remote files as local for you!