
The right way to resolve such bugs is to have abstraction layers that hide this awful hairiness.

If hairiness regarding low-memory conditions, Internet Explorer, and other things all appears in the same function, some abstractions are definitely missing.
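For concreteness, here's a minimal sketch in C of what such a layer can look like; the xalloc name and its abort-on-failure policy are hypothetical choices, not anything from the discussion above:

    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical wrapper: callers never see the low-memory handling.
     * The out-of-memory policy lives here, once, not at every call site. */
    static void *xalloc(size_t n)
    {
        void *p = malloc(n);
        if (p == NULL) {
            fprintf(stderr, "out of memory allocating %zu bytes\n", n);
            abort();
        }
        return p;
    }

    int main(void)
    {
        char *buf = xalloc(4096);  /* call sites stay clean */
        free(buf);
        return 0;
    }

The point is only that the hairiness is contained in one small interface instead of leaking into every function.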

Also, if the reason for these things isn't clear, that's what comments are for.

Rewriting code, in my experience, has had only great results.



This is so absolutely true. What nobody recognizes is that most of the really fundamental systems we run on - Unix system calls, our window managers, our shells, etc. - were designed and implemented in an era when supercomputers couldn't match a Nexus 5 on most computing performance metrics.

They didn't have the luxury of an API a few thousand KB in size loaded into memory to act as a single layer of redirection over the underlying implementation. They worked in the confines of kilobytes of memory, not gigabytes.
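To make "single redirection" concrete, here's a hypothetical sketch in C; a table of function pointers is the classic form of that layer, and every name below is made up for illustration:

    #include <stdio.h>

    /* One indirect call and a pointer per operation: trivial today,
     * a real expense back when RAM was counted in kilobytes. */
    struct storage_ops {
        int (*open)(const char *name);
        int (*write)(int handle, const void *buf, unsigned len);
    };

    static int ram_open(const char *name) { (void)name; return 1; }
    static int ram_write(int h, const void *b, unsigned n)
    {
        (void)h; (void)b;
        return (int)n;
    }

    static const struct storage_ops ram_backend = { ram_open, ram_write };

    int main(void)
    {
        const struct storage_ops *ops = &ram_backend;  /* swap backends freely */
        int h = ops->open("scratch");
        ops->write(h, "hello", 5);
        printf("wrote via the indirection layer\n");
        return 0;
    }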

We can afford to be generic, to build extensible, runtime-programmable interfaces and runtime-evaluation dynamism, because we have the performance to spare. But our core APIs are still written like it's 1980.


However, this line of reasoning also leads to slow web applications that make a high-powered workstation feel like a slow 386 from 15 years ago.


My only concern is that over the years I've seen everything go in cycles. Maybe 8 MB is not a lot of RAM for a computer nowadays, but a few years ago it was a lot for a router. I like my OpenWRT router that runs Linux. Before that, the same could be said of mobile computers with GSM modules (cellphones).

I'd like to think that Linux will continue to run on machines with at most 4 MB of RAM and little processing capacity, because if history keeps repeating itself we're going to keep inventing new devices with those constraints.


But our core APIs are still written like it's 1980.

Exactly what level of abstraction is right for the "core APIs"? Something actually has to send a series of bytes to be written to the disk, even if a bunch of serialization and encoding abstractions are built on top of that. If those abstractions are the "core", what's the less abstract stuff that's inside them?
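A minimal sketch of that layering in C (the function names here are made up): stack as many encodings as you like, the bottom layer is still "send these bytes":

    #include <stdint.h>
    #include <stdio.h>

    /* Bottom layer: raw bytes to a stream. */
    static int write_bytes(FILE *f, const void *buf, size_t len)
    {
        return fwrite(buf, 1, len, f) == len ? 0 : -1;
    }

    /* One abstraction up: a fixed little-endian integer encoding,
     * which decomposes into plain bytes for the layer below. */
    static int write_u32_le(FILE *f, uint32_t v)
    {
        unsigned char b[4] = {
            (unsigned char)(v),
            (unsigned char)(v >> 8),
            (unsigned char)(v >> 16),
            (unsigned char)(v >> 24),
        };
        return write_bytes(f, b, sizeof b);
    }

    int main(void)
    {
        FILE *f = fopen("out.bin", "wb");
        if (f == NULL)
            return 1;
        int rc = write_u32_le(f, 1980);  /* encoded on top, bytes underneath */
        fclose(f);
        return rc;
    }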


Rewriting can be great as long as you take all the things Joel notes into account and decide it is still the right decision for your project.


Then you end up with an "abstraction layer" like autoconf, and the "cure" is MUCH worse than the disease.


autoconf is operating within difficult constraints (only /bin/sh and make are assumed to be installed), and uses shell scripts that write shell scripts that write shell scripts.

A) autoconf is not at all similar to an API layer that hides hairiness in its domain

B) autoconf becoming terrible does not mean that portability abstractions are necessarily terrible
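For what it's worth, a portability abstraction can be as small as a single shim. Here's a hypothetical sleep_ms() in C, assuming only the standard Win32 Sleep() and POSIX nanosleep() calls; callers never touch the #ifdef:

    #include <stdio.h>

    /* One #ifdef in one place, instead of generated scripts. */
    #ifdef _WIN32
    #include <windows.h>
    static void sleep_ms(unsigned ms) { Sleep(ms); }
    #else
    #include <time.h>
    static void sleep_ms(unsigned ms)
    {
        struct timespec ts = { ms / 1000, (long)(ms % 1000) * 1000000L };
        nanosleep(&ts, NULL);
    }
    #endif

    int main(void)
    {
        puts("sleeping 100 ms...");
        sleep_ms(100);
        puts("done");
        return 0;
    }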



