29 points by cremer 1 day ago | 6 comments
stuaxo 1 hour ago
Not sure if this was written with AI assistance or not, but I've become allergic to linguistic triples because LLMs use them so much; reading "Same code. Same input. Different answer" makes me not want to read the rest.
jan_Inkepa 1 hour ago
gobdovan 48 minutes ago
I think the problem is the mismatch between the intended evocative tone of tricolon crescens and the triviality of a 'computer quirk' in the grand scheme of things. "Use figures of speech, but don't sling them around like monkey shit" - my literature teacher.
thecaio 29 minutes ago
Same. I now tend to simply abandon a piece of writing when I see those telltale signs.
tyilo 20 minutes ago
If the original code was written in Rust, then I don't think the Rust compiler is allowed to do any of these "optimizations" of rewriting floating point expressions.
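For what it's worth, the contraction in question is opt-in in Rust: rustc leaves `a * b - c * d` alone, but calling `f64::mul_add` yourself rounds differently and can change a sign. A small sketch (constructed input, not from the post):

```rust
// Plain evaluation: both products round to the same f64, so the
// difference is exactly 0.0.
fn det_plain(x: f64) -> f64 {
    x * x - x * x
}

// Fused multiply-add rounds only once, so it exposes the rounding
// error of x*x instead of cancelling it. Here that residue is 2^-60.
fn det_fma(x: f64) -> f64 {
    x.mul_add(x, -(x * x))
}

fn main() {
    // x*x = 1 + 2^-29 + 2^-60 exactly, which does not fit in 53 bits.
    let x = 1.0 + 2.0f64.powi(-30);
    println!("{}", det_plain(x)); // 0.0
    println!("{}", det_fma(x) > 0.0); // true: same expression, different sign class
}
```

So a "is this zero or positive" test can flip purely on whether an FMA was used, which is exactly the kind of rewrite Rust refuses to do silently.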
sampo 47 minutes ago
I wish the blog would reveal the values of the 3 floats that make their

    cross_sign(A, B, C)

give different results on different platforms.
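The post's actual inputs aren't given, but triples where the predicate's answer depends on how a platform rounds are easy to construct. A hypothetical reconstruction of `cross_sign` (not the author's code), with f32 standing in for "a platform that rounds earlier":

```rust
// Sign of the cross product of AB and AC (the orient2d predicate):
// 1 if A->B->C turns left, -1 if right, 0 if collinear.
fn cross_sign(ax: f64, ay: f64, bx: f64, by: f64, cx: f64, cy: f64) -> i32 {
    let det = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax);
    if det > 0.0 { 1 } else if det < 0.0 { -1 } else { 0 }
}

// Same predicate evaluated in f32.
fn cross_sign_f32(ax: f32, ay: f32, bx: f32, by: f32, cx: f32, cy: f32) -> i32 {
    let det = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax);
    if det > 0.0 { 1 } else if det < 0.0 { -1 } else { 0 }
}

fn main() {
    // True determinant is exactly 1.0, but 100000001.0 is not representable
    // in f32 and rounds down to 100000000.0, so f32 sees collinear points.
    println!("{}", cross_sign(0.0, 0.0, 1.0, 1.0, 1e8, 100000001.0)); // 1
    println!("{}", cross_sign_f32(0.0, 0.0, 1.0, 1.0, 1e8, 100000001.0)); // 0
}
```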
jmalicki 1 day ago
I love seeing a Shewchuk citation outside of my ML background of learning conjugate gradient! He is truly a great educator!
cremer 1 day ago
his predicates paper opens with "Computational geometers despise floating-point arithmetic", same trick as the CG title: write the sentence a frustrated reader would write, then answer it. if you like those, the Triangle paper is the third one in the same key
adampunk 1 day ago
Makes you wish everyone agreed on extended precision!
cremer 1 day ago
careful what you wish for: x87's extended precision is what caused the original bug. 80 bits in registers, 64 after a spill to memory, so the value depended on register pressure

ARM and WASM dropped it for a reason, and more bits would not help anyway: a sign is one bit, and any rounding step that disagrees in the last bit flips it
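a sketch of that last point (constructed example, not from the post): integer-valued points whose exact determinant is -1, but where the correctly rounded f64 products cancel to exactly 0. more float bits just move the boundary; Shewchuk's predicates escalate to exact arithmetic instead

```rust
// f64 predicate with A at the origin, so the subtractions are exact
// and all the error comes from the two products.
fn orient_f64(bx: f64, by: f64, cx: f64, cy: f64) -> i32 {
    let det = bx * cy - by * cx;
    if det > 0.0 { 1 } else if det < 0.0 { -1 } else { 0 }
}

// Exact reference using 128-bit integers.
fn orient_exact(bx: i128, by: i128, cx: i128, cy: i128) -> i32 {
    (bx * cy - by * cx).signum() as i32
}

fn main() {
    // (2^27+1)(2^27-1) = 2^54 - 1 needs 54 bits, so f64 rounds it up to
    // 2^54, which then cancels exactly against 2^27 * 2^27.
    let p = 1i128 << 27;
    println!("{}", orient_f64((p + 1) as f64, p as f64, p as f64, (p - 1) as f64)); // 0
    println!("{}", orient_exact(p + 1, p, p, p - 1)); // -1
}
```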
