Ask HN: What tools and practices have helped you work better as a developer?
21 points by kartm on Nov 3, 2024 | hide | past | favorite | 52 comments
As title says. Comment below with any stories or observations on how productivity (or the lack of it) shaped your work!

PS: I'm working on a thesis on productivity in software development. If you have a few minutes, please share your experiences & what works (and what doesn't) in my survey: https://forms.gle/4B9GAtXD1nahwvn48



My JetBrains IDE is, bar none, the biggest productivity boost for me. Even after VS Code arrived (and got better, much better, over the years!), JetBrains still has so many built-in features that I find it hard to code without them, and I happily pay for my own subscription.

Aside from that, I think the thing that's helped me the most is simply over-commenting, everywhere and all the time. I leave a quick note for other devs (and myself) for every helper func, even if it looks trivial to me. And for anything complex, I try to leave detailed line-by-line comments that any junior dev can pick up. This not only helps others pick it up, it helps me myself a few weeks from now, and it also helps prevent runtime issues through what is basically as-you-go "rubber ducky" debugging, forcing me to verbalize my rationale for writing something a certain way.

And of course ChatGPT has been a moderately big help. It's not quite a replacement for another experienced dev, but it's certainly taken over 90% of my Stackoverflow and Google usage – if only for its much better natural language parsing abilities.

With these tools in place, the code mostly just writes itself, as long as I'm given time to focus and not forced to join pointless meetings and scrum planning sessions. I can only be productive if I can sit down and focus without pointless rituals where my input isn't even needed or asked for.


I'll second a good IDE. I have nothing against code editors but I don't understand the point of having tons of different tools/extensions when a good IDE will have all of that included.

I love code editors for scripts and light projects... but I can't imagine trying to work on larger ones without an IDE.


JetBrains is best for Java. For JavaScript it has literally been broken for years.

https://youtrack.jetbrains.com/issue/WEB-31686/Autocomplete-...


I use it every day for JavaScript/TypeScript and it works great. It's definitely not just for Java; many, many languages are supported (https://www.jetbrains.com/ides/).

The bug you linked to seems rather esoteric (someone didn't want to manually type "import {blah} from package"...?). If you start to type that import, it'll autocomplete the rest for you. And once it's imported, it's very very good at traversing library files and types and bringing you to the definitions (VScode can do this too).

Does it have bugs? I'm sure. And it has its own absurd annoyances (like https://youtrack.jetbrains.com/issue/IJPL-60969/Show-path-in...) that might never get fixed. But in general it works great.


When you type import {blah} from package, blah will not autocomplete. That is my point. Every other editor works in this case. And npm link is quite a common case.


Why don't they just add it to package.json? I don't understand why this is a big deal at all. It'd be weird to me if a repo started to use random libraries that I didn't explicitly add to its manifest, even if I npm linked them... but I guess that's just me?

If it's important to you, I guess this is not the right IDE for ya!


It is in package.json


What? The repro instructions say to leave it out.

Anyway, sorry, I hope they figure it out, but it's not something I've ever had trouble with. If it's an issue for you too, you might wanna write their support or just don't use the IDE.


What language do you write in?


99% TypeScript/JS/React. Very, very occasionally Ruby or PHP.

Edit: Oh, but it's also great for parsing non-programming languages with good support for HTML, XML, Markdown, various SQLs, GraphQL, JSON, YAML, misc config file formats, container or CI/CD definitions, etc.


A small piece of advice: review your changes in a good GUI diff tool immediately before committing anything to your repo. Lately I use GitHub for Windows as the tool, but that's unimportant as long as the tool clearly shows all changes to all source files which will be committed.

It's an easy thing to do, but IMO it helps software quality tremendously. 60% of the time I just read my changes and commit them as they are. In 25% of commits I spot something minor, like typos in comments, and fix it. Most importantly, for the remaining 15% I notice some important issue which needs to be fixed.

Reading git log later, or delegating to other people, doesn't work because of timing. I commit my changes immediately after making them, which means I still remember what the code in question does, and why I changed what I did.
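For a plain-CLI version of the same habit, a sketch (any GUI diff tool works equally well; the throwaway repo is just so the commands run anywhere):

```shell
# Set up a disposable repo to demonstrate pre-commit review.
repo=$(mktemp -d) && cd "$repo" && git init -q
git config user.email you@example.com && git config user.name you

echo "hello" > notes.txt
git add notes.txt

git diff --staged   # show only what is staged for the next commit
git diff --stat     # quick summary of remaining unstaged changes
# `git add -p` (interactive) additionally lets you review and stage hunk by hunk.
```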


I've been doing that for 20+ years or so. A good diff tool with editing capabilities is the best Swiss Army knife.

Indeed, just before each commit I often find extra newlines, debug prints, incorrect comments, or leftover commented-out experiments, all easy enough to repair with a decent diff tool.


Git. Back in the 1980s, I would use PKZIP to archive all my Turbo Pascal source code into a file and put it on a numbered floppy disk to back up my source code.

SVN was a step forward, but Git is the bee's knees. I've used it to keep a workstation without networking up to date and synced with GitHub. The whole ecosystem is amazing.


It's funny, and common, to hear new-learners complain that git's too complex. It's so much better than all the alternatives, and 99% of the time it's incredibly simple to interact with.

Definitely feels a little weird to be the one saying 'back in my day, we had to walk 2 miles up hill, both ways, to commit our code. and lord help you if you needed to submit a patch.'


Git is terrible for real-world development, and it's a damn shame that the industry has standardized on it. For something like the kernel, which is managed as a series of patches, it's fine, but so many real-world projects depend on large amounts of binary assets or data. Where is the most logical place to put these? The repo.

Guess what you CAN'T do efficiently with Git :)

So a lot of industrial Git users have to do these contortions involving S3 buckets, etc., or else reinvent their own bespoke versions of Git (like Microsoft GitVFS) in order to stand up a working tree on a fresh machine. Plus those external dependencies need to be kept track of, updated, and the updates kept track of.

We used to have an industrial-strength VCS that could handle source code, binary data, and huge repositories of both very efficiently: Perforce. Which is kind of on private-equity life support now.


The problem of versioning binaries is more fundamental, not tied to the tool.

Binary assets themselves don't belong under traditional source control because they can't be meaningfully diffed. That is why git LFS stores them separately and only versions their hashes.
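Concretely, what LFS adds to the repo itself is tiny: running git lfs track writes a filter rule into .gitattributes, and the tracked files are replaced in history by small pointer files. A sketch of the resulting line (the *.png pattern is illustrative):

```text
# .gitattributes, as written by `git lfs track "*.png"`
*.png filter=lfs diff=lfs merge=lfs -text
```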


Although I think it's extremely unlikely, I almost wish Subversion would have a resurgence. I like git, personally, but I've seen so many people struggle with it, and the problems of putting large/binary assets into the repository are real. Yes, there's LFS now, but (as far as I know) you then need to make up-front decisions about what to store directly and what to store indirectly as LFS objects.


Subversion was much better than anything that existed before, but a fundamental problem is that it treats branches and tags, which are conceptually different, the same way. This results in branches not tracking their history well, and tags behaving like moving targets.


Learning to type properly. Even if you only bump up your typing speed by 10 wpm, over the lifetime of a software engineer it's worth it.


Learning Vim key bindings and using them regularly

Studying topics from first principles instead of corporate documentation

Learning the history of programming and computing

Moving beyond competency and becoming highly proficient at the command line

Understanding git backwards and forwards

Being able to explain every concept in computing from the lens of the broader field of engineering

Being able to write regular expressions without consulting too many resources

Effectively composing Unix programs to get a larger task done

Reducing every task to the absolute basics, to demonstrate a “proof-of-concept”

* I’m not claiming to have mastered any or all of these, but they do serve as my “North Star” when figuring out which direction to advance in this career
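Two of the items above (regex fluency and composing Unix programs) combine naturally. A hedged sketch, assuming only standard POSIX tools, of the classic word-frequency pipeline:

```shell
# Top 3 most frequent words in some input text,
# built by composing small single-purpose Unix programs.
printf 'the cat sat on the mat the end\n' \
  | tr ' ' '\n' \
  | grep -E '^[a-z]+$' \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -n 3
```

The first line of output here is "3 the"; swap the printf for cat on a real file to analyze anything.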


Can you please give some insight into "Studying topics from first principles instead of corporate documentation"? What does studying from first principles look like?


An example of this could be Postman or your favorite Git GUI.

You can also learn HTTP networking and curl, which are the principles underlying Postman, and learn Git's commit graph instead of relying on a Git GUI tool.

Next time you're confronted with a problem, instead of starting with "let's solve this with library X or framework Y," begin by asking "what are we trying to solve?" and then explore multiple options to solve that problem.

A technique I often employ is to first ask what the complexity of the problem is, then evaluate the complexity of each potential solution. For instance, the goal might be implementing automatic deployments, but the proposed solution can become way more complex than the originally stated goal.

The technique often still leads you to popular solutions, but it can also point you toward simpler, less conventional alternatives, sometimes even eliminating problems by sidestepping them completely.
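On the Postman/curl point: what those tools build under the hood is just text. A sketch (hypothetical host and path) of the raw HTTP/1.1 request, printed rather than sent:

```shell
# A complete HTTP GET: request line, headers, and the blank line
# that ends the header section. `curl -v <url>` shows the same thing.
printf 'GET /users/42 HTTP/1.1\r\nHost: api.example.com\r\nAccept: application/json\r\n\r\n'
```

Once you can read and write this by hand, Postman becomes a convenience rather than a mystery.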


I still don't get it. I don't care about reducing complexity in general; I care about reducing complexity that will be my problem or my team's problem. And even then, it's not my top priority: getting rid of potential manual error is more important.

A pencil will always be simpler than all of Keep's infrastructure, but it requires vastly more skill and attention than a cloud notes app, to the point where its primary benefit seems to be as a form of mental exercise.


Thank you : )


Sure. I've been doing a lot of full-stack web development lately, and I found that reading chapters from the textbook "Computer Networks" by Andrew Tanenbaum as a primary source was far more enlightening about how networks operate than reading any of the hottest frameworks' documentation on how they work under the hood. Corporate documentation tends to be more about giving off the vibe of competence than about being actually informative. Vendors want to make their tool/framework look easy to learn by offering a false sense of understanding, instead of the deep understanding provided by textbooks. There are exceptions, but they seem to be growing rarer. Compare the man pages for old CLI tools with the documentation for frameworks like Next.js, which tends to gloss over important implementation details that might make you a better overall developer (instead of a framework wrangler).

On the side, I'm trying to enhance my knowledge of embedded systems as well, and picked up the book "Making Embedded Systems" by Elecia White. I have learned so much in just the first few chapters, even though I worked on an embedded software project for a full year recently. This has served me well, because previously my understanding of embedded systems was more at the level of an Arduino hobbyist than an embedded developer. Arduino is fantastic for getting your feet wet, but it hides away a lot of slightly complicated details (that are not actually that hard to learn with a little effort).

I would basically say: always prefer lengthy and somewhat dry textbooks over superficial and meme-heavy blog posts (or worse, YouTube videos). Learn the subject as if you only have one shot in your career to learn it, and have to build a permanent mental representation of it that can last decades. Entertaining corporate docs, blog posts, and videos are like a sugar-heavy diet for the mind: you'll feel energized at first, but eventually become sick from a lack of deep understanding of anything.

In a twist of irony, here is a blog post that explains this idea reasonably well - https://archive.ph/XSPRr

Here's a canonical blog post (again, the irony) on why understanding subjects deeply is important - https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-a...


The better "framework wranglers" usually eventually learn the "leaks" as part of the abstraction's API, and learn to be comfortable not trusting anything.

TCP can fail, but so can my own totally deterministic code I only think I understand. Or more likely, I trip and fall and break the Ethernet cable and the whole thing fails.

My job is to make the final result reliable anyway, even knowing the thoughts in my own head are not trustworthy. Even if I knew anything about proof theory I could still make a mistake with a mathematical proof, and simplicity can't stop it. Left and right are fairly simple concepts and I've mixed them up so many times.

A lot of devs think in terms of trust and quality, and are so quick to assume entire subsystems are perfectly fine, whether it's because it's a high quality thing from a trusted vendor, or because it's simple and they did it themselves and feel they understand it.

It's just like in real life, how people seem totally sure they won't trip and fall or drop what they're holding. I'm used to understanding my own hands as unpredictable, so I guess that's why I like tech.

It's less about avoiding mistakes and more about predicting and planning for mistakes.


> It's less about avoiding mistakes and more about predicting and planning for mistakes.

Totally agreed


Thank you, good sir : ) for the explanation.


That seems like stuff that would increase productivity for a computer scientist or someone working on very novel algorithms, but it might not be quite as relevant to an integrator dev at a CRUD shop.


I interpreted the original question as asking how to be good at this career of software engineering. If someone wants to hack some Legos together and log off at 5pm every day, then don't do anything I said, but also don't expect the exponential productivity growth enjoyed by devs who take their careers seriously. And subsequently, don't expect to have a job you enjoy in this industry for very long.

For the record, my day job is in fact more of an “integrator” than a computer scientist, but I still find value in every learning principle I laid out.


With the growth of AI, you might be right: more and more jobs are going to require deeper knowledge. It's kind of scary when I'm using Codeium and it writes exactly what I would have written if I had no time constraints at all, because it means a lot of what I do is incredibly predictable.

But there's still a lot of skill and possibility to take your career seriously with hacking together Legos, if you want to make your Lego thing reliable and maintainable, and I don't think they're exactly the same skills as someone like a compiler writer or suckless dev needs.

I'm not dealing directly with data regularly, I don't need to awk things, but I do need to know about the oddities of high level frameworks. Knowing awk might help somewhere, but it's not as essential as knowing CSS browser compatibility issues.

A lot of the time I don't even make decisions based on technical factors, they're made for me by social factors and I can either make it work or fail.

I've been at places where git branches were out of the question; the uphill battle to convince everyone to learn them just wasn't going to happen.


You’re right, I probably phrased that poorly. There are multiple ways to “take a career seriously”, and the one I described is just one way. I do think it’s the appropriate way for the software engineer individual contributor path, specifically


I started coding by just dropping print statements into my code to debug it. In college I was introduced to gdb and I never looked back. I won't code without a debugger now; it's such a waste of time dropping print statements into code and re-running. I'm always surprised when I learn about some language I'm interested in and find out that the standard way to write code in it is in a text editor without an LSP implementation, with print debugging everywhere. I don't even start trying a language if I can't set breakpoints somewhere.


I find that being able to do either is best. Sometimes debuggers are amazing, sometimes they are not. It depends on what environment you are working on, what layer of the stack the problem resides in, and how complex the info is that you need to debug the problem. Being able to choose multiple paths to get information is better than locking down on one "always the right answer" technique.


If you're debugging an embedded app, or interacting with a realtime embedded device, a debugger often doesn't work because of memory or speed limitations. Then short print statements do wonders.

They can also help you figure out the order in which pieces of code across the codebase are executed. In my experience, you can very easily lose track of that when using a debugger.


No need to fetishize tools. Sometimes a debugger is useful; in other cases you are better served by logging/metrics/traces, and yes, even print statements.

For example, using print statements may encourage you to build a better model of the code: formulate a hypothesis, then test it. Debugging encourages a more local, reactive (not proactive) view of the runtime.
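That hypothesis-testing style of print debugging can be as lightweight as a gated debug helper, so probes can be toggled without editing them back out. A sketch (the debug function name is made up for illustration):

```shell
# Print debug output to stderr only when the DEBUG env var is set.
debug() { [ -n "${DEBUG:-}" ] && echo "DEBUG: $*" >&2 || true; }

value=41
debug "before increment: value=$value"   # silent unless DEBUG is set
value=$((value + 1))
debug "after increment: value=$value"
echo "$value"
```

Run the script normally for clean output, or with DEBUG=1 to test a hypothesis about where the state goes wrong.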


Idk what fetishizing is supposed to mean in this context; I'm saying that a debugger is a vastly more efficient tool than rerunning the program over and over.


I meant: do not put too much emphasis on any single tool. If all you know is the debugger, try other instruments. You might find they are "vastly more efficient" (in certain contexts).


Support for breakpoint style debugging is a feature of the runtime, though coordination with the language is important. Some environments just can't support it whatever language you use.


True, and thankfully I don't need to operate in such environments. Others have brought up embedded, which is an industry I don't work in.


REPL driven development. I was doing it before Clojurists were talking about it. It's the most efficient way I know to build large systems -- by composing smaller pieces whose API and behavior you are deeply familiar with through interactive fiddling.

Relatedly, learning to use Emacs with power and precision has been a boon to me, though the Emacs skill ceiling is absurdly, absurdly high and there's so much I've yet to learn. Programming the editor to do repetitive work for me, not just using packages and customizations, is key.

Literate programming has been a tool in my toolkit for documenting my thoughts as I solve an interesting problem. These days I use org-mode, but noweb has been useful in the past.


I've gotten considerably better at debugging complex issues after resolving to always put in the work to find out what is really happening: build in the observability, export the data, write whatever analysis tools you may need. I wasted so much time earlier in my career on blind guesswork.


Measuring is knowing, indeed. You can throw out 90 percent of your hypotheses about what is going on with a few well-placed data probes.


Use https://github.com/Aider-AI/aider: clone any repo and use aider for Q&A about the code, to add features to the repo, and to do experiments with the code. It's a very interactive way to learn.


I use grep.app to see how things are used in practice, especially minimally-documented or complex things like property-based testing macros. More recently ChatGPT helps a lot with this, but I still like grep.app for looking into production code.


How does it stack up against GitHub's own search?


If you're searching across a bunch of open-source repos, Grep.app can be super handy—it’s fast, supports regex well, and lets you filter by language and license. But if you’re focused on a specific GitHub repo, GitHub’s native search is probably better. It ties in with commit history, issues, and PRs, and the new symbol search is great for navigating large codebases. Basically, Grep.app is good for broad searches across projects, while GitHub search is stronger for digging deep into one project.


For me, rescuetime.com and Hubstaff have helped in terms of productivity. I've also written a couple of tools myself over the years to make myself more productive. Vim also comes to mind. There are several ways to be more productive; one just has to care enough.


Start the day standing up and try not to sit all day. Some days are better than others.


That’s worse than sitting all day.


I saw a study recently that claimed that, but idgaf, I love it. My body feels great standing 75% of the day.


Drawing.



