- cross-posted to:
- programmerhumor@lemmy.ml
Without one, the runtime system must assign some semantics to the source code, no matter how erroneous it is.
That’s just not true; as the comment above points out, Python also has no separate compilation step and yet it did not adopt this philosophy. Interpreted languages were common before JavaScript; in fact, most LISP variants are interpreted, and LISP is older than C.
Moreover, even JavaScript does sometimes throw errors, because sometimes code is simply not valid syntactically, or has no valid semantics even in a language as permissive as JavaScript.
So Eich et al. absolutely could have made more things invalid, despite the risk that end-users would see the resulting error.
Python also has no separate compilation step and yet it did not adopt this philosophy
Yes. It did. It didn’t assign exactly the same semantics, but it DOES assign a run-time semantic to `min()`.

I’m addressing the bit that I quoted, saying that an interpreted language “must” have valid semantics for all code. I’m not specifically addressing whether or not JavaScript is right in this particular case of `min()`.

…but also, what are you talking about? Python throws a TypeError if you call `min()` with no arguments.
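For the record, the disputed behavior is easy to check. A minimal sketch (Python shown; the JavaScript comparison is noted in a comment, since `Math.min()` there takes the opposite approach):

```python
# Python: calling min() with no arguments raises a TypeError at run time.
# The interpreter refuses to assign a semantic to the call, even though
# Python, like JavaScript, has no separate compilation step.
try:
    min()
except TypeError as e:
    print("TypeError:", e)  # exact message varies by Python version

# By contrast, JavaScript's Math.min() with no arguments returns Infinity
# (the identity element for min) rather than throwing.
```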