it's probably okay to stop making programming languages where "double" is a type name because that is total nonsense
-
-
Replying to @eevee @JadenGeller
Why all the fuss? All these numeric types and how to deal with them are defined in IEEE standards + libs since 1965 or so.
-
That is great for languages that existed in 1965. Newer languages could have put in the effort to pick better names.
-
very critical software like aircraft control, factories, reactors, etc. Any confusion or changes to 50 years of tested IEEE libs endangers this >
-
you may notice we are specifically talking about /new/ programming languages
2 replies 0 retweets 2 likes -
>> regardless of what kind of programming language is used. Besides, every technical engineer knows what they are.
-
js calls a 64-bit float a Number
lua calls it a number
python calls it a float
ruby calls it a Float
rust calls it an f64
-
Frankly: Number is even worse for something with very specific behaviour.
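As a concrete illustration of that "very specific behaviour", here is a minimal Swift sketch using `Double`, which is the same IEEE 754 binary64 type that JS names `Number`:

```swift
// IEEE 754 binary64 behaviour that a generic name like `Number` glosses over.
let sum: Double = 0.1 + 0.2
print(sum == 0.3)        // false: neither 0.1 nor 0.2 is exactly representable
print(sum)               // 0.30000000000000004

let big: Double = 9_007_199_254_740_992   // 2^53
print(big + 1 == big)    // true: whole numbers above 2^53 can no longer be told apart
```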
-
-
Replying to @andreasdotorg @eevee
Yeah, I think `Number` should be reserved for a rational type. Even then, I think `Number` should maybe be a typealias for the actual name.
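A minimal Swift sketch of that idea, assuming a hypothetical exact `Rational` type and a hypothetical `Binary64` name (Swift already ships `Float64` as an alias for `Double`); this illustrates the tweet's proposal, not any existing language's design:

```swift
// Sketch of the proposal: the concrete type carries a name that says what it is,
// and `Number` is reserved for (or aliased to) something with friendlier semantics.

// Hypothetical exact rational type (numerator / denominator).
struct Rational {
    var numerator: Int
    var denominator: Int
}

// Hypothetical naming scheme: the IEEE 754 binary64 type is named for what it is,
// and `Number` is just a type alias that points at a real name.
typealias Binary64 = Double    // Swift's stdlib already offers `Float64` for this
typealias Number   = Rational  // or `= Binary64`, if `Number` must stay a float
```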