> They used Stack Overflow to make their case, and report that user engagement went down after the release of ChatGPT. Could it not be the case that SO is less adept at finding related/duplicate questions than ChatGPT? Given the latter's facility with language, I would expect it to be. So I looked at the paper to see if they accounted for that, and found this.
The moderation team, and the community in general, on Stack Overflow is so toxic that I'm not even sure you could control for that effect well enough to arrive at this conclusion. I would argue people are leaving because it's easier to ask ChatGPT your question than to be flamed and banned for asking how to do something. A half-right answer from ChatGPT is better than getting marked duplicate and closed because the moderation team can't detect nuance.
I have an anecdotal story from a friend corroborating this. He was rebuked and met with an unwelcoming response on Stack Overflow when asking questions while trying to learn web development, and found ChatGPT a great teacher in comparison.
I'm a very experienced developer. Sometimes I have questions that are easier to ask a person than to dig through miles of documentation for. I could easily frame a question correctly to get a response, but even I feel extremely unwelcome there. I can't imagine being a junior.
It is unwise to leave the detection of nuance to volunteer moderators. Rather, highlight the nuance in your question (or answer) yourself. That strategy has always landed me high-value answers to my questions.
I've arrived from Google on "closed" questions I needed answers for; on XY questions, where I had question X but not the secret question Y; on people trying to give XY answers to extremely simple and straightforward X-and-only-X questions; and on questions unanswered for years despite several upvotes and me-toos.
But shallow questions, where the official docs are too raw or missing a few specifics? Stack Overflow was good for that before ChatGPT.