More like in base 1010 or base 10
More like in base 10 or base 10
Exactly! And don’t forget about hexadecimal aka base 10
Yeah, always bothered me that we don’t refer to them by their highest digit. That would make them unambiguous.
It’s not like “base 0” was getting used anyways
Then you would need a unique symbol for every possible number
But are there many scenarios where you don’t already need that anyways, just for writing out the digits of a number in the given base?
I mean, I can imagine a scenario where you might talk about base 420 on a theoretical level, without explicitly counting up to 418, 419, 420 (as e.g. Ϡ, Ϣ, 10). But honestly, you could still refer to that as “Base 419” and it would be fairly obvious what you mean, since you are using multiple digits rather than just one. I guess you could also write it as “Base 419₉” (so with a subscript 9 to represent what we normally call “Base 10”), if you want to be precise about it.
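The whole joke rests on one fact: any base b, written in its own digits, comes out as “10”, because b = 1·b + 0. A minimal Python sketch (the helper name `to_base` is my own, not from the thread):

```python
def to_base(n, base):
    """Render a non-negative integer n in the given base,
    as a list of digit values (most significant first)."""
    if n == 0:
        return [0]
    digits = []
    while n:
        digits.append(n % base)   # least significant digit first
        n //= base
    return digits[::-1]

# Any base b, written in base b, is always [1, 0], i.e. "10":
for b in (2, 10, 16, 420):
    assert to_base(b, b) == [1, 0]

# And the largest single digit of base 420 is the value 419,
# which is why "Base 419" (highest digit) would be unambiguous:
assert to_base(419, 420) == [419]
assert to_base(420, 420) == [1, 0]
```

This is why no positional base can name itself with a single digit: the digits only go up to b − 1, so b itself always needs two of them.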
Yeah I doubt non-programmers would catch that.