mathew
@mathew@universeodon.com

@alcinnz@floss.social @bkim@mastodon.social It’s important to distinguish between null and the zero value. Unfortunately, C-like languages obscure the difference by doing #define NULL 0. Having zero values for most types is OK (pointers are an exception, but pointers cause a lot of other problems). It’s null that tends to ruin language safety. If an integer X can be zero, that’s fine; you still have an integer you can always increment. If an integer can be null, you have a problem.
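To illustrate the point, here’s a minimal C sketch (the variable names are just for illustration): a zero-valued integer is still a perfectly usable integer, while a null pointer has to be guarded before every use because NULL is conventionally defined as 0 (or (void *)0), which is exactly the blurring described above.

```c
#include <stdio.h>
#include <stddef.h>

int main(void) {
    /* Zero value: just another integer, always safe to increment. */
    int x = 0;
    x++;

    /* Null pointer: NULL is typically 0 under the hood, but the value
       is unusable until checked; dereferencing it is undefined behavior. */
    int *p = NULL;
    if (p != NULL)
        (*p)++;   /* never reached here; without the check it would be UB */

    printf("x = %d, p = %p\n", x, (void *)p);
    return 0;
}
```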