there's a post on here that's like "the worst thing a piece of fiction can be is mean", and while I agree with that, I think it being insincere is just as bad. nothing more obnoxious than a story that's constantly sneering and rolling its eyes at its own genre in a bid to seem clever and above it all.
"isn't it so ridiculous how we're singing in this musical?" "isn't the world building in our monster collector so ridiculous?" "aren't the leads in our fairy tale romance such idiots?" "isn't the setup in this horror film so cliched?" okay now say something true and real.
I like when people like a character so much that it transcends even self shipping or kinning and becomes more of a patron saint that you pray to type of deal
"patron saint" stop using catholic figures in a blasphemous way! it's disrespectful to catholics.
you've made me very happy by saying this
you...enjoy being disrespectful to catholics?
Switching to a slightly more zoomed out version of my avatar because the forced circular crop imposed by the latest dashboard update meant the image boundary was running right through the middle of my eyeball on the old one. Let's see if this works better.
Who are you and what have you done with our beloved Prokopetz
Is this any better?
Much worse 👍 !
How about now?
the eye of horus please, victoria
Third try's the charm?
Yeah, that seems to have done the trick. This one's perfect.
Well now I'm not doing it.
Giving it a little kiss.
Learning Magic the Gathering from my partner has been kinda hilarious cuz I find a lot of the cards will be either:
Treznor, the Eternal Flame: *3 paragraphs of text* -> Widely maligned, considered basically useless
Grey Rock: adds 2 mana -> $14,000 per copy, outlawed in 12 countries
You might think your anime opening is cool, but is it “seamlessly put a ‘previously on…’ segment in the MIDDLE of the opening and have it kick ass every time” cool?
Martinaise hates to see me coming.
Harry Du Bois fit check - aka my first test run for Motor City Comic Con in May
I was going to say it's amazing this manages to annihilate both roko's basilisk and pascal's wager in one fell swoop, but on second thought, that's just because roko's basilisk is literally just pascal's wager reskinned for tech guys.
Roko's Basilisk actually does a really good job of demonstrating how ridiculous Pascal's Wager is in my experience
For a community of practice supposedly about dwelling on how easy it is to be wrong about things, and the inherent uncertainties you need to apply to your beliefs because of the intractable nature of our own biases, a lot of the rationalists really fell hook, line, and sinker for an idea that only makes sense if they assume their beliefs about the future of AI are absolutely accurate.
And specifically that a literally unfathomably intelligent entity will definitely parse morality and ethical decision making in exactly the way they do, and make the same ethical calls that a very specific subset of them would, because their ethical frameworks are for sure the most rational form of ethics and therefore anything sufficiently intelligent would see things through their lens.
Related: The moment I dropped out of the rationalist community was when I realized Yudkowsky was claiming that sufficiently "rational" people don't need the scientific peer review process, or similar collective error correction systems, because that's for handling the mistakes caused by people's biases. And anybody who's practiced rationality enough clearly wouldn't need that anymore, because they are fully aware of and capable of compensating for all of their own biases.
So there I was, walking home while reading one of his essays, going "...no? I thought the whole point of learning about our own biases and fallible cognition was that this is why, the whole reason why, we need to emphasize collective error correction. Why we need to make decisions collectively, why we need to empower others to check ourselves, precisely because these irrational parts of human cognition can't ever be fully excised and believing otherwise is the greatest failure mode of rationality you can possibly fall into?"
Then I reread that part of the essay a few times to make sure I was reading it right, sighed deeply, closed it, and resolved to reread a bunch of Rationalism stuff with that in mind to check if I'd picked up any beliefs associated with that idiocy, and to talk about this stuff with a friend for a while. Because trying to clean out your brain all on your own is a fool's game for suckers.