As infuriating as it may be, it's a good occasion to learn to let go.
Rationally, you're paying way more than the liquid in the glass is worth, so just mentally tell yourself that the haircut is included in the price, and that's fine. If you can't stand the price anyway, change bars.
I'm sure the same individuals will calculate tip to the cent as well.
As someone who worked in specialty foods for years, you get what you get. Flaws and all. That's part of the charm of this industry. This is especially true for small craft breweries. If you insist on accuracy then ask for a can/bottle list. If you insist on consistency then buy macro.
The main effect is that you build things a lot faster.
The secondary effect is that, since you can code a lot faster, you are no longer scared of large refactors, ports to other languages, and all that kind of tedious stuff.
That means that if you design something, ship it, and realize it's bad, you can completely change it.
The "oh, we should have done it differently, but now it would take years to change, so we have to deal with legacy garbage" argument doesn't really make sense anymore.
Like if HN had LLM agents, we wouldn't have had to wait for years for them to solve the "click here for more comments" problem.
And iterating with users is a lot faster.
We, software developers, as a profession took over countless crafts. It started with people doing calculations by hand, then moved on to people typing on typewriters and continued from there. People used to edit films with scissors and scotch tape. People used to set lead type by hand to print news articles. Databases used to be little cards made carefully by people whose job it was to organize and modify them. It's a bit indecent for a developer to complain that LLMs took away the pleasure of molding a clay made of bits, while the robots we helped build took the actual clay out of potters' hands.
And what the author forgets to mention is that we got it good. Oh boy. As a software developer, I can work in any field I want. I started on video compression. I moved to finance. I make games in my spare time. I make plugins for music. And I get to be paid way more than my neighbor who’s a heart surgeon. I can work remotely 100%. I can go to a nice beach in Thailand, work 2 hours in the morning and enjoy the rest of the day, and still make more than the median salary in France, where I live.
The grief is not the loss of the craft alone, it’s the loss of that craft that paid for your house.
As they said: software is eating the world. Well, it is now eating itself. It’s only fair.
The author is right though, human societies need to ask themselves whether they are willing to sacrifice all the crafts on the altar of productivity and convenience.
The Amish decided they didn’t want to. It’s a bit of a weird choice, but it is a choice.
I think this is a bit facetious. Although software had a bunch of localized impacts, it killed only a handful of mainstream professions. People still had to typeset articles, there was just less lead involved. For a writer, switching from a typewriter to a keyboard didn't mean you somehow needed less skill to write.
That said, I'm with you that it's tacky for software engineers to complain about their own hardships. We are some of the wealthiest and most pampered white collar workers out there, and we're not exactly innocent bystanders.
I wrote a comment on this thread. After reading yours I upvoted it and deleted mine: you clearly gave it more thought than I did and expressed my sentiment a lot clearer.
> Misconception 1: specification documents are simpler than the corresponding code
That is simply not true. There is a ton of literature around inherent vs accidental complexity, which in an ideal world should map directly to spec vs code. There are a lot of technicalities in writing code that a spec writer shouldn't need to know about.
Code has to deal with the fact that data is laid out a certain way in RAM and on disk, and accessing it efficiently requires careful implementation.
Code has to deal with exceptions that arise when the messiness of the real world collides with the ideality of code.
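A toy sketch of that gap (hypothetical data, not from any real system): the spec fits in one sentence, while most of the code ends up handling real-world mess the spec never mentions.

```python
def sum_amounts(lines):
    """Spec: return the sum of the amounts, one per line.

    Everything below the first two lines of the body is accidental
    complexity: blank lines, locale quirks, and junk values that the
    one-sentence spec never had to mention.
    """
    total = 0.0
    for raw in lines:
        line = raw.strip()          # stray whitespace happens
        if not line:                # so do blank lines
            continue
        line = line.replace(",", ".")  # "3,50" from a comma-decimal locale
        try:
            total += float(line)
        except ValueError:          # "N/A", typos, etc.
            continue                # or log it; the spec never said
    return total

print(sum_amounts(["3.50", "", "2,25", "N/A", " 1.00 "]))  # 6.75
```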
It half surprises me that this article comes from a Haskell developer. Haskell developers (and more generally people coming from maths) have this ideal view of code where you just need to describe relationships properly, and things will flow from there.
This works fine up to a certain scale, where efficiency becomes a problem.
And yes, it's highly probable that AI is going to be able to deal with all the accidental complexity. That's how I use it anyways.
The iPhone is designed to be a good smartphone, not a good NAS. It is silly to expect anyone to compromise the design of a mass market product to support some esoteric MacGyvering entirely unrelated to the original product.
Should we all expect Toyota to design their ECUs to be used as a NAS?
It's not about "design", because the iPhone is perfectly capable of running arbitrary code, it just refuses to do so if you're not Apple.
The situation is such that the legal owner of the device has less power over it, post-sale, than the company that made it.
That reason alone, the imbalance of power, should be enough to support abolishing those restrictions, preferably by law.
To be clear: this is something that should be beyond market forces, and it should apply to anything that is sold to consumers and can run code. The end goal should be that no user remains less powerful, in terms of code execution and access to content, than the manufacturer.
> It's not about "design", because the iPhone is perfectly capable of running arbitrary code
It is a very intentional UX choice to mitigate malware for users who do not know how to evaluate the legitimacy of software on their own. And studies show that this is a very effective policy, both perceived (e.g. marketing) and real (actual breach statistics).
You can mitigate malware while still allowing for the same level of end-user control as the manufacturer. Look at Windows itself! People getting infected on up-to-date installations is a rarity nowadays, all without draconian lockdown policies.
It took Windows decades to get there, and the reputational harm was already done by then. Android is not doing particularly well, but it has improved significantly.
Of course Apple doesn’t want people to use their device in a way that’s not how they designed it. They’re very anal about the user experience; they don’t want kids to install ArchLinux on their grandparents’ iPhones and have the grandparents complain that their phone is shit. I get that.
Conveniently, the way they designed the phone allows them to charge 30% of every transaction that happens on the device…
But that’s beside the point. The point is that the iPhone is a capable device, one that probably could run macOS, and it’s a waste that we’re not allowed to.
I'm all for antitrust action against the financial trap that is the App Store. But as someone who designs products, I think it's absolutely asinine to require security flaws in a product's primary design to support an unintended repurposing.
I guess I don’t see how allowing some phone owners to root their devices introduces security flaws for those who don’t. Maybe there’s something I’m missing here.
A NAS is just an example; here's a better one: I love to use my old phones as wall-mounted displays and controls for Home Assistant, or as remote music players plugged into some speakers that I can hook into in Music Assistant. Some of my old phones are more than capable of this hardware-wise, but they're locked to older versions of Android and can't run anything built for a newer version, so they end up as e-waste instead.
I think my next phone is going to be a fairphone or something for this reason.
You can do this but you have to remove the battery and hook up the circuitry to external power. This practically turns the phone into a glorified SBC. It may still be worth it since there's more of a mass market for phones than SBCs (and phones come with lots of extra hardware components that can be useful) but it's not that huge of a win.
None of those are even remotely reasonable enough to be a higher priority design criteria than preventing little old ladies from unknowingly installing malware.
That law is perhaps an annoyance for Apple, but it can't cost them billions, can it? I seriously doubt that it would cost Apple more than the several hundred million dollars Meta still needs to funnel in order to get those laws passed in more states.
Plus, Apple gets to be the gatekeeper for Meta and other apps, which can't be good for Meta, and Apple gets to know the age of its users, which in itself is monetizable.
> That law is perhaps an annoyance for Apple, but it can't cost them billions, can it?
The CEO has 24 hours in the day, and if he or she is asked to be deposed (the legal system has that power), it chips away at grand visions. It isn't just money; you can't just stand up a team and be done with it. Everybody will be coming at you.
Expect to see a lot of "Y alleges Apple didn't do enough to protect kids," and the burden will be on Apple to make its executives available.
But didn't Apple fire the first shot with ATT? Apple was never against ads (see ads.apple.com or the numerous ads in the App Store); they were against Facebook's ads.
> Rationally, you're paying way more than the liquid in the glass, so just mentally tell yourself that the haircut is included in the price, and that's fine. If you can't stand the price anyway, change bar.
You're there to chill. Just chill.