“gluteal clefts”.
I don’t know why I find that so funny.
Now you’re older, how frequently do you think you were right in your comparisons?
A few posts above this one, I saw a post about how German bridges are falling apart, so your comment has done me psychic damage. Man, things feel grim.
You’ve got to be careful with rolling your eyes, because the parallelism of the two eyes means that the eye roll can be twice as powerful ^1
(1) If measured against the silly baseline of a single eyeroll
I read this article a few months ago that I found quite interesting: Original link Unpaywalled link
The manufacturer in this article blames the DEA, and whilst I don’t trust a pharmaceutical company to do anything other than ruthlessly maximise profits, in this case I’m inclined to believe this depiction of the DEA as the overly persnickety bad guy (because I have even less reason to trust the DEA).
The message is “we are heading towards complete climate collapse and The Powers That Be are acting like things are fine”.
I agree, you’ve captured much of why I came away from the article feeling a bit ‘hmmm’.
Something I read somewhere that I found super interesting is that on Windows, when a process completes, the user often gets a notification or popup alerting them to this, whereas on Linux, it’s more normal for there to be no confirmation message when a process finishes. I hadn’t consciously realised this difference until I read that and reflected on how many times I’d had to double-check things when I first started using Linux.
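The DIY fix Linux users tend to reach for is chaining a notifier onto the command itself. A minimal sketch (the `long_task` function is a made-up stand-in; on a desktop you’d typically swap `echo` for `notify-send` from libnotify to get an actual popup):

```shell
# Stand-in for a real long-running job; replace with your own command.
long_task() { sleep 1; }

# && runs on success, || runs on failure, so the two outcomes
# are reported separately once the task exits.
long_task && echo "Task finished" || echo "Task FAILED"
```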
I wonder what would enable people to build their own solutions in this way. I’ve made a few apps and automation things myself, but my “normie” friends who don’t have that level of tech familiarity struggle with whatever out-of-the-box solutions they can find. Poor IT education is a big part of this, and I’ve been wondering a lot about what would need to change for the average “normie” to feel empowered to tinker.
Comments like this remind me of how I felt, as a Brit, to learn about Irish history, way after I left school.
I tried to think of a witty response to your funny joke but I’m apparently too tired for that, so instead I’ll wish you good luck for next week, and the weeks that follow it. Getting a diagnosis as an adult is often cathartic in the short term and liberating in the long term, and in between those points there’s a long period of introspectively untangling a web of messy feelings and possibly internalised ableism. I wish you the strength to endure and to emerge with a better understanding of who you are, regardless of the outcome of the assessment.
I hadn’t thought about it from that angle, thanks for sharing your perspective, it’s really interesting
A tension that I find very interesting is how YouTube creators with a decent but not huge subscriber base (I’ve mainly seen it in video essayists, but that’s just what I watch more of) grapple with the sometimes implicit, sometimes explicit dichotomy of “content” vs “art”, where “content” is what the algorithm wants and what will pay their bills, and “art” is the weird stuff they actually want to make.
It’s nowhere near a full replacement for Spotify, but something that eased my switchover was ListenBrainz for open-source music recommendations. It’s not as good as Spotify’s Discover Weekly playlists (yet!), but the greater transparency is worth it imo. I have the app from F-Droid and it tracks what songs I’m listening to (especially useful if you connect it to a streaming app) and gives recommendations based on that.
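For the curious, the tracking side is pleasantly simple: ListenBrainz’s public API accepts “listens” as plain JSON, POSTed to `https://api.listenbrainz.org/1/submit-listens` with an `Authorization: Token <user token>` header. A rough sketch of building one such payload (the artist/track names are made-up examples, and the network call itself is omitted):

```python
import json
import time

def build_listen(artist, track, listened_at=None):
    """Build a single-listen payload in the shape ListenBrainz expects."""
    return {
        "listen_type": "single",
        "payload": [{
            # Unix timestamp of when the track was played.
            "listened_at": int(listened_at or time.time()),
            "track_metadata": {
                "artist_name": artist,
                "track_name": track,
            },
        }],
    }

# Serialise the request body; you'd POST this with your HTTP client
# of choice alongside the Authorization header.
body = json.dumps(build_listen("Example Artist", "Example Track"))
```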
Yeah, I’m super salty about the hype, because if I had to pick one side or the other, I’d be on team “AI is worthless”. But that’s just because I’d rather try convincing a bunch of skeptics that, when used wisely, AI/ML can be super useful, than try to talk some sense into the AI fanatics. It’s a shame though, because I feel like the longer the bubble takes to pop, the more harm actual AI research will suffer.
Eh, it depends on what we count as “AI”. I’m in a field where machine learning has been a thing for years, and there’s been a huge amount of progress in the last couple of years[1]. However, it’s exhausting that so much is being rebranded as “AI”, because the people holding the purse strings aren’t necessarily the same scientists who are sick of the hype.
[1] I didn’t get into the more computational side of things until 2021 or so, but if I had to point to a catalyst for this progress, I’d point to the transformer architecture outlined in the 2017 paper “Attention Is All You Need” by Google scientists.
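For anyone who hasn’t seen it, the core of that paper is a surprisingly small operation, scaled dot-product attention: each output row is a weighted average of the value rows, with weights from how well each query matches each key. A minimal NumPy sketch (shapes and random inputs are my own illustration):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d)) V, per Vaswani et al. (2017).

    Q, K, V: arrays of shape (seq_len, d).
    Returns (output, attention_weights).
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # pairwise query-key similarity
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each row of `w` sums to 1, so `out` is literally a per-query mixture of the rows of `V`.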
One of my favourite biochemistry tutors at university was also a reverend. We never spoke about the overlap but I’ve read his books since graduating and it’s interesting to see how his faith augments his science and vice versa.
Neat site!