For roughly 20% of Alzheimer's patients, the shingles vaccine may work as a treatment. This has been suspected for a few years now and has recently been backed by confirmatory studies.
While the study was about the shingles vaccine, I wonder whether having gone through a normal case of shingles positively or negatively affects the chances of later developing Alzheimer's.
I assume you have absolutely no clue what it refers to.
Handmade Hero is a long-running YouTube series by Casey Muratori. He builds a game engine from scratch: no cheats, no shortcuts, straight to the metal (from C-ish perspective). You learn how to get things done, fast and efficiently, by actually understanding how computers work.
At some point Casey thought it was a failure and a waste of time, but to his surprise quite a fanbase evolved around it, and it turned out it really helped people go from zero to "hero". The handmade "movement" grew out of that timeline and the aftermath of people thriving on it. My rough definition of the "Handmade" dev mentality would be: ignore the things that seem to make development "easy" (high-level software) and learn the actual thing. You learn what a framebuffer is instead of looking for a drawing API (see the sketch below), and that knowledge carries across contexts.
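To make that concrete, here is a minimal sketch of the framebuffer idea (written in Swift for consistency with the other examples in this thread, rather than the series' C); the `Framebuffer` type and its methods are made up for illustration, not taken from Handmade Hero:

```swift
// The "learn the framebuffer" idea: instead of calling a drawing API, you
// write pixels into a block of memory yourself.
struct Framebuffer {
    let width: Int
    let height: Int
    var pixels: [UInt32]  // one 0xAARRGGBB value per pixel

    init(width: Int, height: Int) {
        self.width = width
        self.height = height
        self.pixels = Array(repeating: 0, count: width * height)
    }

    // Writing a pixel is just an index into linear memory: row * width + column.
    mutating func setPixel(x: Int, y: Int, color: UInt32) {
        guard x >= 0, x < width, y >= 0, y < height else { return }
        pixels[y * width + x] = color
    }

    // A filled rectangle is nothing more than a double loop over that memory.
    mutating func fillRect(x: Int, y: Int, w: Int, h: Int, color: UInt32) {
        for row in y..<(y + h) {
            for col in x..<(x + w) {
                setPixel(x: col, y: row, color: color)
            }
        }
    }
}

var fb = Framebuffer(width: 320, height: 200)
fb.fillRect(x: 10, y: 10, w: 50, h: 30, color: 0xFF_FF0000)  // opaque red
```

Once you understand this, a drawing API stops being magic: it is doing the same memory writes, just behind an abstraction.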
That being said, this foundation doesn't seem to be endorsed by Casey, and its mission goals seem quite shallow, if they exist at all.
> no cheats, no shortcuts, straight to the metal (from C-ish perspective)
Not the person you replied to, but even when I first stumbled onto this (the network, not the game), I was left wondering where the line is drawn.
> You can learn how computers actually work, so you can unleash the full potential of modern systems. You can dig deep into the tech stack and learn what others take for granted.
Just… no libraries? Are modern languages with batteries included ok? What makes a library for C worse than using Python? Is using Python already too bloated? Why is C ok, and why don't I have to bootstrap a compiler first? (E.g. building with Rust is a terrible experience from a performance perspective, even though the resulting software can be really nice and small.)
I'm not even trying to be antagonistic; I simply don't understand. I'm just not willing to accept "you'll know it when you see it" as an answer.
My immediate assumption was that this was a reaction against LLM-assisted or LLM-written software, but I couldn't see any mention of this on the front page.
So maybe ‘handmade’ refers to artisanal, high quality, made with care, etc.
That page seems like it's trying to define Handmade through a bunch of complaints and statements about what it is not.
Still no idea what they actually do, other than maybe this is just some random site about building a community to "make better software".
Software isn't bad because engineers don't care. It's bad because eventually people need to eat, so they need to get paid, which means you have to build something people will pay for. That involves tradeoffs and deadlines, so we take shortcuts and software ends up imperfect.
> the field has become lucrative enough it has attracted people who are interested in the money and not the craft
Yup, exactly.
> I'd use unrealistic to describe Handmade, proud is also accurate and works too
In certain settings, definitely. But even in those corporate settings where it's unrealistic, I'd rather work with one than not. If not applied dogmatically, that corner of the corp has a good chance of being an oasis, though perhaps a fleeting one.
I read that but it doesn't define handmade. It gripes about large frameworks and rewriting in different languages but doesn't say what handmade is or how it addresses anything.
Sure, but as soon as they released their first iteration, they immediately went back to the drawing board and just slapped @MainActor on everything they could because most people really do not care.
Well yes, but that’s because the iOS UI is single threaded, just like every other UI framework under the sun.
It doesn’t mean there isn’t good support for true parallelism in Swift concurrency. It’s super useful to model interactions with isolated actors (e.g. the UI thread and the data it owns) as “asynchronous” from the perspective of other tasks, allowing you to spawn off CPU-heavy operations that can still “talk back” to the UI; they simply have to “await” the calls to the UI actor in case it’s currently executing.
The model works well for both asynchronous tasks (you await the long IO operation, and your executor can go back to doing other things) and concurrent processing (you await any synchronization primitives that require mutual exclusion, etc.).
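A minimal sketch of that pattern, assuming only standard Swift concurrency; `ProgressModel` and `crunchNumbers` are hypothetical names, not from any real codebase:

```swift
// A CPU-heavy task runs off the main actor but can still "talk back" to
// UI-owned state by awaiting calls into it.
@MainActor
final class ProgressModel {
    var completed = 0
    // Runs on the main actor; callers elsewhere must await it.
    func report(_ n: Int) { completed = n }
}

// A plain async function: the runtime executes it off the main actor.
func crunchNumbers(reportingTo model: ProgressModel) async -> Int {
    var total = 0
    for i in 1...1_000_000 {
        total += i % 7  // stand-in for real CPU-heavy work
        if i % 100_000 == 0 {
            // Hopping to the UI actor is just an await: if it's currently
            // executing, this suspends instead of blocking a pool thread.
            await model.report(i)
        }
    }
    return total
}
```

From the main actor you would create the model and kick the work off with something like `Task.detached { _ = await crunchNumbers(reportingTo: model) }`; every `report` call is serialized through the main actor, so the worker never touches UI-owned state directly.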
There are a lot of gripes I have with Swift concurrency, but my memory is about two years old at this point and I know Swift 6 has changed a lot. They’re mainly around the complete breakage you get if you ever call ObjC code that uses GCD, and around how ridiculously easy it is to shoot yourself in the foot with unsafe concurrency primitives (semaphores, etc.) that you don’t even know the code you’re calling is using. But I digress…
Not really true; @MainActor was already part of the initial version of Swift Concurrency. That Apple has yet to complete the needed updates to their frameworks to properly mark up everything is a separate issue.
async let and TaskGroups are not parallelism; they're concurrency. They're usually parallel because the Swift concurrency runtime allows them to be, but there's no guarantee. If the runtime thread pool is heavily loaded and only one core is available, they will only be concurrent, not parallel.
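A short sketch of that distinction; the `expensiveCount` function is a made-up stand-in for real work:

```swift
// Both forms below express concurrency: the child tasks *may* run at the
// same time, but actual parallelism depends on the runtime's thread pool,
// which is roughly sized to the core count.
func expensiveCount(_ key: String) async -> Int {
    key.utf8.count  // pretend this is heavy work
}

// async let: two child tasks that can (but need not) run in parallel.
func fetchPair() async -> (Int, Int) {
    async let a = expensiveCount("a")
    async let b = expensiveCount("b")
    return (await a, await b)
}

// TaskGroup: N child tasks, scheduled onto however many threads are free.
func fetchAll(_ keys: [String]) async -> Int {
    await withTaskGroup(of: Int.self) { group in
        for key in keys {
            group.addTask { await expensiveCount(key) }
        }
        return await group.reduce(0, +)  // results arrive in completion order
    }
}
```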
> If the runtime thread pool is heavily loaded and only one core is available, they will only be concurrent, not parallel
Isn't that always true for thread pool-backed parallelism? If only one core is available for whatever reason, then you may have concurrency, but not parallelism.
> Charlie Brown may have been as popular as any character in all of literature
Was he? Maybe this is true inside the US, but from outside the US I've always viewed the character as a peculiarly American artefact – something I was aware of but never really read or watched. That impression seemed reinforced by most major Charlie Brown titles being tied to other American customs like Halloween and baseball.
Snoopy as a character is popular in Japan, but only as a character design - kind of like Hello Kitty. There is zero awareness of any of the shows or really Charlie Brown himself.
I'm Brazilian, in my mid-40s. When I was a little kid, my best friend used to carry a blanket around, and the neighbors called him "Linus" for years. But I'm confident that was because of the TV show, not the comic strips.
Mac sales are up 12% year over year, making the Mac Apple's fastest-growing hardware category. They're just going to be lower next month (year over year) because the release cycles differ.
> Even many games that support native linux run better under wine.
The same is often true on macOS, too – running games through CrossOver often beats the native port. The reality is that there simply aren't enough professional game devs on Linux and macOS to polish that last 20% that makes all the difference.
I'm not sure what you're talking about. Any app compiled using LLVM 17 (2023) can use SME directly and any app that uses Apple's Accelerate framework automatically takes advantage of SME since iOS 18/macOS 15 last year.
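For what it's worth, here is roughly what that looks like from the app side: a sketch against Accelerate's vDSP API (the specific calls are my choice of example; whether SME is actually engaged is up to Apple's internal dispatch, per the claim above):

```swift
import Accelerate

// Ordinary vDSP calls: Accelerate decides internally which vector/matrix
// hardware to use on the current chip, so the source needs no SME awareness.
let a: [Float] = (0..<1_024).map(Float.init)
let b: [Float] = Array(repeating: 2.0, count: 1_024)

let product = vDSP.multiply(a, b)  // element-wise multiply
let dot = vDSP.dot(a, b)           // dot product
print(product[0], dot)
```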
Benchmarking a processor for "apps written by someone who disregards performance" is something you can do, but it's a bit of a pointless exercise; no processor will ever keep up with developers' ability to write slow code.
Of course. And these are CPU vector instructions, so the saying "The wider the SIMD, the narrower the audience" applies.
But ultimately with a benchmark like Geekbench, you're trusting them to pick a weighting. Geekbench 6 is not any different in that regard to Geekbench 5 – it's not going to directly reflect every app you run.
I was really just pointing out that the idea that "no" apps use SME is wrong and therefore including it does not invalidate anything – it very well could speed up your apps, depending on what you use.
https://www.alzheimers.org.uk/news/2025-11-18/promising-rese...