- cross-posted to:
- programmerhumor@lemmy.ml
javascript is to web developers what powerpoint is to sales people
It makes sense though
It does to some degree.
- `"11"` is a string, `1` is an int. Because strings can be added (`+`), convert the int to a string and combine: `"11" + "1" = "111"`
- `"11"` is a string, `1` is an int. Because strings can't be subtracted (`-`), convert the string to an int and combine: `11 - 1 = 10`

I'm not into JS so I don't know how it takes priority. Ints can be added too, so I guess it's basing it on the first variable which is compatible with the operator: in the first case the string, in the second case the int.

If this is how it works, it makes sense. But IMO it's a case of the designers being so preoccupied with whether or not they could, they didn't stop to think if they should.
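The two bullet points above describe standard JavaScript semantics and can be checked directly in any JS engine:

```javascript
// `+` with a string operand concatenates; `-` always coerces both sides to numbers.
const a = "11" + 1; // the number 1 is coerced to "1", then concatenated
const b = "11" - 1; // the string "11" is coerced to 11, then subtracted

console.log(a, typeof a); // "111" string
console.log(b, typeof b); // 10 number
```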
… It does?
This here is my absolute favorite way to diss someone. Send them a Wikipedia link and bam!
It's because `+` is two different operators, overloaded based on the operand types (if either side is a string, it concatenates), while `-` is only a numeric operator and coerces left and right operands to numeric. But frankly, if you're still using `+` for math or string concatenation in 2025, you're doing it wrong.

I know nothing about JavaScript, what is wrong with using `+` for math? Perhaps naively, I'd say it looks suited for the job.
The native arithmetic operators are prone to floating point rounding errors
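That claim is easy to demonstrate with the classic IEEE 754 example (standard JS behavior, not specific to any library):

```javascript
// Binary floating point cannot represent 0.1 or 0.2 exactly,
// so the sum picks up a rounding error.
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false

// Common workaround: compare within a small tolerance instead.
console.log(Math.abs((0.1 + 0.2) - 0.3) < Number.EPSILON); // true
```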
The correct way to do it is to load a 500mb library that has an add function in it.
Point taken but the one I use is only ~200k for the whole package, ~11k for the actual file that gets loaded
It’s much better to make your own function that uses bitwise operations to do addition.
```javascript
function add(a, b) {
  while (b !== 0) {
    // Calculate carry
    let carry = a & b;
    // Sum without carry
    a = a ^ b;
    // Shift carry to the left
    b = carry << 1;
  }
  return a;
}
```
(For certain definitions of better.)
It’s my favorite language too, but I also find this hilarious.
Javascript is a dogshit language that everyone is stuck with. The best that we can hope for is the likes of typescript take the edge off of it. Even though it’s like smearing marzipan over a turd.
JS should never have left the browser. Now you can use this thing for the backend and it's just awful.
If you’re consciously and intentionally using JavaScript like that, I don’t want to be friends with you.
Imagine doing math with strings and then blaming the language not yourself
The problem is consistency.
The risk is when it happens unintentionally. The language is bad for hiding such errors by being overly ‘helpful’ in assuming intent.
Sure, but at this point it’s your own fault if you don’t use Typescript to keep these issues from happening.
“Use a different language” is a common defense of javascript, but kind of a weird one.
Not really, considering Typescript only adds static types to JS. It’s not a different language, it’s an extension.
Since it needs to be compiled to JavaScript in order to be used, I kind of consider it a different language. Yes, it’s a strict superset of JavaScript, but that makes it different.
That’s your prerogative, but it honestly doesn’t make sense. Typescript adds almost no functionality to JS (and the few pieces it adds are now considered mistakes that shouldn’t be used anymore). It only focuses on adding typing information, and in the future you’ll be able to run TS that doesn’t use those few added features as JS (see the proposal).
You can also add the TS types as comments in your JS code, which IMO shows that it’s not a different language.
So, just don’t use JavaScript?
That’s also my understanding: “Javascript is great because you can use other languages and then transpile them to JS.”
Oh man machine language is so good, literally the best actually
I wouldn’t use raw JS for anything new, yes. Typescript however is an excellent language.
Type of “not a number” is number
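For reference, this is real, spec-mandated behavior: NaN is an IEEE 754 floating-point value, so its type is `"number"`:

```javascript
console.log(typeof NaN);        // "number" — "not a number" is a number
console.log(NaN === NaN);       // false — NaN never equals anything, including itself
console.log(Number.isNaN(NaN)); // true — the reliable way to test for it
```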
Obligatory link to wat? video
That is just the tip of the iceberg:
so plus coerces into string if not number, was that so hard?
Haha that’s a great site. But I think the C example is actually reasonable behaviour.
Ugh, like… I get why it outputs like that, but I also absolutely hate that it outputs like that.
Not just javascript: https://www.destroyallsoftware.com/talks/wat
Oh wow, that’s upsetting
F#? What? We can’t curse on the internet? Self censorship at dictator levels here. /s
Heck, I need to learn some new languages apparently. Here I was expecting an angry "CS0029: Cannot implicitly convert type 'string' to 'int'"!
To start off… Using arithmetic operators on strings in combination with integers is a pure skill issue. Let's disregard this.

If you were to use `+` where one part is a string, it's natural to assume string appending is desired, since `+` is commonly used as a function for this. On the other hand, `-` is never used for any string operation. Therefore, it's safe to assume that it relates to actual arithmetic, and any strings should therefore be converted to numerical values.
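If you want the intent to be unambiguous either way, the standard `Number` and `String` conversion functions make it explicit instead of leaving it to the operator:

```javascript
// Explicit conversions instead of relying on which operator coerces what:
const sum = Number("11") + 1; // arithmetic intended
const cat = "11" + String(1); // concatenation intended

console.log(sum); // 12
console.log(cat); // "111"
```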
This is an issue with untyped languages. If you don’t like it, use typescript. End of story.
Instead of trying to make it work, javascript could just say “error.” Being untyped doesn’t mean you can’t have error messages.
I think it's less about the type system, and more about the lack of a separate compilation step.
With a compilation step, you can have error messages that developers see, but users don’t. (Hopefully, these errors enable the developers to reduce the errors that users see, and just generally improve the UX, but that’s NOT guaranteed.)
Without a compilation step, you have to assign some semantics to whatever random source string your interpreter gets. And, while you can certainly make that an error, that would rarely be helpful for the user. JS instead made the choice to, as much as possible, avoid error semantics in favor of silent coercions, conversions, and conflations in order to make every attempt to not “error-out” on the user.
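A few examples of that design choice, where many languages would raise an error but JS silently produces a value:

```javascript
console.log("5" * "4");     // 20 — both strings coerce to numbers
console.log(1 + undefined); // NaN — undefined coerces to NaN
console.log([] + {});       // "[object Object]" — both coerce to strings
console.log("abc" - 1);     // NaN — a failed numeric coercion yields NaN, not an error
```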
It would be a very painful decade indeed to now change the semantics for some JS source text.
Purescript is a great option. Typescript is okay. You could also introduce a JS-to-JS “compilation” step that DID reject (or at least warn the developer) for source text that “should” be given an error semantic, but I don’t know an “off-the-shelf” approach for that – other than JSLint.
This is fair enough from an idealistic view. In practice, you don’t want your entire website to shit itself because of a potentially insignificant error.
Look! I bought this for free on capybaras website, there’s a glitch!
capybara: at least it didn’t throw an error.
/ jk 😁
Use typescript if you’re paranoid about this
I'd rather have my website shit itself than have silent, difficult-to-find errors.
Use typescript
This is exactly why it should throw an error, to make it incredibly obvious something isn’t working correctly so it can be fixed. Otherwise you have wrong logic leading to hard to notice and hard to debug problems in your code
Use typescript
No. I don’t want to transpile. I don’t want a bundle. I want a simple site that works in the browser. I want to serve it as a static site. I don’t want a build step. I don’t want node_modules. I want to code using the language targeted for the platform without any other nonsense.
Javascript is cancer. Fucking left pad?! How the fuck did we let that happen? What is this insane fucking compulsion to have libraries for two lines of code? To need configuration after configuration just to run fucking hello world with types and linting?
No, fuck Typescript. Microsoft owns enough. They own where you store your code. They own your IDE. They might own your operating system. Too much in one place. They don’t need to own the language I use, too.
“Let’s use a proprietary improvement to fix the standard that should have not sucked in the first place” is why we can’t have nice things.
No.
In practice runtime errors are a bitch to find and fix.
Fair enough. This is why people prefer typescript
This is my favorite language: GHC Haskell
GHC Haskell:
```
GHCi> length (2, "foo")
1
```
Wait, now I need to know why.
* some time later *
I went to check why the hell this happened. It looks like the pair (`(,)`) is defined as an instance of `Foldable`, for some reason, which is the class used by functions like `foldl` and `foldr`. Meanwhile, tuples of higher order (triples, quadruples, …) are not instances of `Foldable`.

The weirdest part is that, if you try to use a pair as a `Foldable`, you only get the second value, for some reason… Here is an example:

```
ghci> foldl (\acc x -> x:acc) [] (1,2)
[2]
```

This makes it so that the returned length is 1.
I don't even know Haskell but it seems like `(,)` would be an instance of boob.
`(.)` is a valid expression in Haskell. Normally it is the prefix form of the infix operator `.` that does function composition:

```
(.) (2*) (1+) 3
= ((2*) . (1+)) 3
= 2 * (1 + 3)
= 8
```

But the most common use of the word "boob" in my experience in Haskell is the "boobs operator": `(.)(.)`. Its usage in Haskell is limited (tho valid), but its appearance in racy ASCII art predates even the first versions of Haskell.

The pioneers of ASCII art in the 70s and 80s are the unsung heroes of porn.
Oddly enough, in Haskell (as defined by the report), length is monomorphic, so it just doesn’t work on tuples (type error).
Due to the way kinds (types of types) work in Haskell, `Foldable` instances can only operate over (i.e. `length` only counts) elements of the last/final type argument. So, for `(,)` it only counts the second part, which is always there exactly once. If you provided a `Foldable` for `(,,)`, it would also have a length of 1.
Oh we’ve hit an issue that’s solved by another language or we could make another framework
If you mix types like that, it’s your own fault
Lol. In a dynamically typed language? I will do this always, that’s why I am using it
You can have a dynamic language that is strongly typed to disallow stuff like this. Like Python for example
Aand what is your point?
BS. A language shouldn't have operators that allow nonsensical operations like string concatenation when one operand is not a string.
It’s not nonsensical, implicit type coercion is a feature of JavaScript, it’s perfectly logical and predictable.
JavaScript is a filthy beast, it’s not the right tool for every job, but it’s not nonsensical.
When you follow a string with a `+`, it concatenates it with the next value (converted to a string if needed). This makes sense, and it's a very standard convention in most languages.

Applying arithmetic to a string would be nonsensical, which they don't do.
You are entitled to your opinion. Implicit conversion to string is not a feature in most languages, for good reasons.
Sure. And you’re entitled to yours. But words have meaning and this isn’t MY OPINION, it’s objective reality. It follows strict rules for predictable output, it is not nonsensical.
You’re entitled to think it’s nonsense, and you’d be wrong. You don’t have to like implicit type coercion, but it’s popular and in many languages for good reason…
| Language | Implicit Coercion Example |
| --- | --- |
| JavaScript | `'5' - 1` → `4` |
| PHP | `'5' + 1` → `6` |
| Perl | `'5' + 1` → `6` |
| Bash | `$(( '5' + 1 ))` → `6` |
| Lua | `"5" + 1` → `6` |
| R | `"5" + 1` → `6` |
| MATLAB | `'5' + 1` → `54` (ASCII math) |
| SQL (MySQL) | `'5' + 1` → `6` |
| Visual Basic | `'5' + 1` → `6` |
| TypeScript | `'5' - 1` → `4` |
| Tcl | `"5" + 1` → `6` |
| Awk | `'5' + 1` → `6` |
| PowerShell | `'5' + 1` → `6` |
| ColdFusion | `'5' + 1` → `6` |
| VBScript | `'5' + 1` → `6` |
| ActionScript | `'5' - 1` → `4` |
| Objective-J | `'5' - 1` → `4` |
| Excel Formula | `"5" + 1` → `6` |
| PostScript | `(5) 1 add` → `6` |

I think JavaScript is filthy, I'm at home with C#, but I understand and don't fear ITC.
Also, you contradicted yourself right there. Not a single one of your examples does string concatenation for these types. It's only JS.
- In https://lemm.ee/comment/20947041 they claimed “implicit type coercion” and showed many examples; they did NOT claim “string concatenation”.
- However, that was in reply to https://lemmy.world/comment/17473361 which was talking about “implicit conversion to string” which is a specific type of “implicit type coercion”; NONE of the examples given involved a conversion to string.
- But also, that was in reply to https://lemm.ee/comment/20939144 which only mentions “implicit type coercion” in general.
So, I think probably everyone in the thread is “correct”, but you are actually talking past one another.
I think the JS behavior is a bad design choice, but it is well documented and consistent across implementations.
Read the thread again, it seems you slipped somewhere. This was all about the claim that implicit conversion to string somehow could make sense.
C# is filthy. But it explains where you got your warped idea of righteousness.
Especially that `+` and `-` act differently. If `+` does string concatenation, `-` should also do some string action, or throw an error in this situation.
> `-` should also do some string action
Like what kind of string action?
“Hello” + " world" is what everyone can understand. Switch with “-” and it becomes pointless.
Hence the "or throw an error".
If you try what I wrote it will throw a NaN. I was asking about the first part of the proposal.
The NaN isn't thrown. It's just silently put into the result. And in this case it's completely unintelligible. Why would an operation between two strings result in a number?

`"Hello" - "world"` is an obvious programmer mistake. The interpreter knows that this is not something anyone will ever do on purpose, so it should not silently handle it.

The main problem here is downward coercion. Coercion should only go towards the more permissive type, never towards the more restrictive type.
Coercing a number to a string makes sense, because each number has a representation as a string, so `"hello" + 1` makes intuitive sense.

Coercing a string to a number makes no sense, because not every string has a representation as a number (in fact, most strings don't). `"hello" - 1` makes no sense at all. So converting a string to a number should be done by an explicit cast or a conversion function. Using `-` with a string should always result in a thrown error/exception.

> The interpreter knows that this is not something anyone will ever do on purpose, so it should not silently handle it.
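The asymmetry being argued here is easy to see in plain JS (standard behavior, shown for illustration):

```javascript
console.log(String(123));     // "123" — every number has a string representation
console.log(Number("hello")); // NaN — most strings have no numeric representation
console.log("hello" - 1);     // NaN — JS coerces and fails silently instead of throwing
```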
You basically defied the whole NaN thing. I may even agree that it should always throw an error instead, but… Found a good explanation by someone:

> NaN is the number which results from math operations which make no sense

And the above example fits that.
"hello" - 1
makes no sense at all.Yeah but actually there can be many interpretations of what someone would mean by that. Increase the bytecode of the last symbol, or search for “1” and wipe it from string. The important thing is that it’s not obvious what a person who wrote that wants really, without additional input.
Anyway, your original suggestion was about the discrepancy between `+` and `-` functionality. I only pointed out that it's natural when dealing with various data types.
Maybe it is one of the reasons why some languages use . instead of + for strings.
That’s the case in many languages, pretty much in all that don’t have a separate string concatenation operator.
Yeah, and almost all languages I know would then throw an exception when you try to use `-` with a string. And if they offer multiple operators that take a string and a number, they only ever perform string operations with them and never cast the string to a number type to do math operations with it.

(E.g. some languages have `+` for string concatenation and `*` to repeat the same string X times, so e.g. `"ab" * 2 => "abab"`. It's a terrible idea to have `+` perform a string operation while `-` performs a math operation.)

Sure, but then your issue is with type coercion, not operator overloading.
Because there’s in fact no operator overloading happening, true, but that’s mostly an under-the-hood topic.
It should not happen no matter why it does happen under the hood.
Operator overloading for `string - string` is wrong, and type coercion to implicitly cast this to `int(string) - int(string)` is just as wrong.

There is operator overloading happening: the `+` operator has a different meaning depending on the types involved. Your issue however seems to be with the type coercion, not the operator overloading.

> It should not happen no matter why it does happen under the hood.

If you don't want it to happen, either use a different language or ensure you don't run into this case (e.g. by using Typescript). It's an unfortunate fact that this does happen, and it will never be removed due to backwards compatibility.