After five years, C3 is switching from unsigned to signed sizes by default, having found that unsigned sizes cause subtle, hard-to-detect bugs that safety rules cannot fully prevent.
Key Takeaways
The root cause of classic C bugs such as infinite loops and broken comparisons is not a set of isolated quirks; it traces back to unsigned being the default type for sizes.
Unsigned sizes force pervasive signed/unsigned casts throughout indexing code; C3’s attempt to minimize casts instead created silent promotion surprises with / and %.
Ring buffer offset math with unsigned wrapping is silently wrong in common patterns; no compiler rule can reliably catch it.
Go and Java both chose signed sizes deliberately; since Go targets systems programming, its choice in particular is a meaningful data point for C3’s decision.
Signed overflow produces obviously wrong negative numbers; unsigned overflow produces plausible-looking but incorrect values, making unsigned bugs harder to spot.