The NaN isn’t thrown. It’s just silently put into the result. And in this case it’s completely unintelligible. Why would an operation between two strings result in a number?
"Hello" - "world" is an obvious programmer mistake. The interpreter knows that this is not something anyone will ever do on purpose, so it should not silently handle it.
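In JavaScript, for example, this is exactly what happens: the subtraction coerces both strings to numbers, quietly produces NaN, and the NaN then flows into any later computation:

```javascript
// Both operands are coerced to numbers; neither string is numeric,
// so the result is NaN rather than a thrown error.
const result = "Hello" - "world";
console.log(result);               // NaN
console.log(Number.isNaN(result)); // true

// No exception is ever raised; the NaN silently propagates.
const downstream = result + 10;
console.log(downstream);           // NaN
```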
The main problem here is downward coercion. Coercion should only go towards the more permissive type, never towards the more restrictive type.
Coercing a number to a string makes sense, because each number has a representation as a string, so "hello" + 1 makes intuitive sense.
Coercing a string to a number makes no sense, because not every string has a representation as a number (in fact, most strings don’t). "hello" - 1 makes no sense at all. So converting a string to a number should be done by an explicit cast or a conversion function. Using - with a string should always result in a thrown error/exception.
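A sketch of that distinction in JavaScript (`toNumberStrict` is a made-up helper name, not a built-in; it shows what an explicit, checked conversion that throws could look like):

```javascript
// Upward coercion: number -> string always has a sensible result.
console.log("hello" + 1);   // "hello1"

// Downward coercion: string -> number usually has no sensible result.
console.log("hello" - 1);   // NaN (silently)

// A hypothetical explicit conversion that throws instead of producing NaN.
function toNumberStrict(value) {
  const n = Number(value);
  if (Number.isNaN(n)) {
    throw new TypeError(`Cannot convert ${JSON.stringify(value)} to a number`);
  }
  return n;
}

console.log(toNumberStrict("42") - 1); // 41
try {
  toNumberStrict("hello");
} catch (e) {
  console.log(e.message); // Cannot convert "hello" to a number
}
```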
The interpreter knows that this is not something anyone will ever do on purpose, so it should not silently handle it.
You’re basically rejecting the whole idea of NaN. I may even agree that it should always throw an error instead, but… I found a good explanation by someone:
NaN is the number which results from math operations which make no sense
And the above example fits that.
"hello" - 1 makes no sense at all.
Yeah, but actually there can be many interpretations of what someone would mean by that. Shift the character code of the last symbol, or search for “1” and remove it from the string. The important thing is that it’s not obvious what the person who wrote it really wants, without additional input.
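Both readings really are implementable, which is exactly the problem. A sketch in JavaScript, where both helper names are invented for illustration:

```javascript
// Interpretation 1: "s - n" shifts the character code of the last symbol.
function decrementLastChar(s, n) {
  const shifted = s.charCodeAt(s.length - 1) - n;
  return s.slice(0, -1) + String.fromCharCode(shifted);
}

// Interpretation 2: "s - x" removes every occurrence of String(x) from s.
function removeSubstring(s, x) {
  return s.split(String(x)).join("");
}

console.log(decrementLastChar("hello", 1)); // "helln"
console.log(removeSubstring("h1ello", 1));  // "hello"
```

Neither is more “correct” than the other, which is why an explicit function call beats an overloaded operator here.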
Anyway, your original suggestion was about discrepancy between + and - functionality. I only pointed out that it’s natural when dealing with various data types.
Maybe that is one of the reasons why some languages (PHP and Perl, for example) use . instead of + for string concatenation.