> Sure. With infinite energy, anything’s possible - we can prevent all bugs. The problem is, we don’t have infinite energy.
You don't need infinite energy, but it is a significant undertaking. UB bugs seem to decay with age: every X years, the number of problems caused by UB roughly halves.
> I can only recall one serious concurrency bug I’ve ever had (a data race) which took some debugging time to track down, but it wasn’t in Go. YMMV.
That's what's nasty about the UB in data races: it's not trivial to find and it's a pain in the neck to reproduce. So even taking you at your word, not having had issues with data races isn't the same as "I wrote code free of data races".
> As for the “billion dollar mistake”, I understand the argument in the context of C or C++, but not in the context of Go.
Even without any chance of UB, you're adding an implicit `Type | null` to every piece of code that uses a nullable value without handling the null case. Each time you forget to handle it, you cause either UB or a null pointer error, and the place where it manifests is different from where it was introduced.
Furthermore, in his talk (https://youtu.be/ybrQvs4x0Ps?t=1682), Tony Hoare mentions that trying to avoid null in a language like Java or C# that permits it also causes people to waste time working around it.
> The middle network layer will report the error to the user, I fix it, and move on, a $0 mistake.
Sure, but that's not a $0 mistake. It's {time to fix * hourly rate}. Even if you're doing it for OSS, you could have been spending that time on something else.
> Everything is a trade off.
Sure. But trading away "preventing errors" for "ease of use" is not something I want in any language I use. Null/nil are about as good a concept today as silently skipping errors.
> Even without any chance of UB, you're adding an implicit `Type | null` to every piece of code that uses a nullable value without handling the null case. Each time you forget to handle it, you cause either UB or a null pointer error, and the place where it manifests is different from where it was introduced.
> Furthermore, in his talk (https://youtu.be/ybrQvs4x0Ps?t=1682), Tony Hoare mentions that trying to avoid null in a language like Java or C# that permits it also causes people to waste time working around it.
It's an insignificant amount of time in my experience, which makes this irrelevant in practice. I've never had to "work around them"; this is either a theoretical argument or just sloppy programming.
> Sure, but that's not a $0 mistake. It's {time to fix * hourly rate}. Even if you're doing it for OSS, you could have been spending that time on something else.
Oh, come on, you’re smart enough to understand that a few minutes every few months may not be exactly $0, but it’s such an insignificant value that it can be treated as $0.
> Sure. But trading away "preventing errors" for "ease of use" is not something I want in any language I use.
Well, you can choose your trade offs, and you can let others choose theirs. So I guess you use Ada for everything? Or maybe you use Coq to prove everything? Of course you don't; you're also making trade offs, just different ones.
> Null/nil are about as good concepts today as silently skipping errors.
You exaggerate again. I do not agree with that framing, and my opinion is backed by practical experience.
> Oh, come on, you’re smart enough to understand that a few minutes every few months may not be exactly $0, but it’s such an insignificant value that it can be treated as $0.
You're smart enough to know that small costs, summed over many people and a long time (circa 50 years), aggregate into massive numbers.
> Well, you can choose your trade offs, and you can let others choose theirs. So I guess you use Ada for everything? Or maybe you use Coq to prove everything? Of course you don't; you're also making trade offs, just different ones.
Ada doesn't give you full memory safety; I think you need SPARK (the Ada subset) for that. I can't find as much teaching material on it, but it's definitely on my radar. Also, I'm more of a Lean guy myself, but it serves a different purpose than Rust, i.e. proving things.