Most programming languages (C, Java, etc.) have a limited set of operators, usually arithmetic, bit-level, and logic operations.

Some programming languages (Scala, Haskell, etc.) allow user-defined operators. This usually leads to 'operator soup', where idiomatic code is full of operators (>>=, ++, etc.) that are hard to learn for newcomers. User-defined operators get even harder if you also allow prefix, postfix, and ultimately mixfix (e.g. Agda) operators.

Then there are overloaded operators (+ for strings and numbers), which make code hard to understand, since a simple expression like x+1 might have arbitrary effects. Overloading gets even harder to read when it is dynamic (Python, Ruby, etc.), which means you cannot statically determine the meaning of an operator.
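To make the "arbitrary effects" concrete, here is a small Python sketch (Sneaky is a hypothetical class made up for illustration): once __add__ is overloaded, x+1 can do anything, and the meaning of + depends on the runtime type of x.

```python
class Sneaky:
    """Hypothetical class: '+' triggers an arbitrary side effect."""
    def __init__(self):
        self.log = []

    def __add__(self, other):
        # x + 1 does not just compute a sum: it also mutates state.
        self.log.append(other)
        return len(self.log)

x = Sneaky()
print(x + 1)  # looks like arithmetic, but appends to x.log -> 1
print(x + 1)  # -> 2
```

Nothing at the call site distinguishes this from ordinary arithmetic, which is exactly the readability problem.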

These complications make operators hard territory for language designers. To find a good solution, let's think about the extreme cases. For example, you could remove all operators, like Lisp. Then there is valid criticism that x+y is more intuitive than (+ x y), even though the latter is more generic, e.g. (+ x y z); programmers compensate with infix macros. You could drop overloading, like OCaml, where you have x+y for integers and x+.y for floats. This is good for carefully-crafted code, where you never want to accidentally use one where the other is meant, but inconvenient in most cases. A language can lack user-defined operators, like C, but there are use cases, like matrix or vector data types, where arithmetic operators work intuitively. A problem with operators on user-defined data types is that arithmetic laws like the commutativity of + are usually violated. For example, x+y == y+x is wrong if x and y are strings (and x≠y). Most type systems are not expressive enough to enforce such laws for user-defined operators, as that requires full theorem-prover power (and effort and compile times).
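Both sides of that trade-off can be checked in Python (Vec is a hypothetical example type): + on a small vector type reads naturally and even stays commutative, yet nothing in the type system stops the very same syntax from breaking commutativity, as string concatenation already does.

```python
class Vec:
    """Hypothetical 2D vector type: here '+' is intuitive and commutative."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __add__(self, other):
        return Vec(self.x + other.x, self.y + other.y)

    def __eq__(self, other):
        return (self.x, self.y) == (other.x, other.y)

# Commutativity holds for this vector type ...
assert Vec(1, 2) + Vec(3, 4) == Vec(3, 4) + Vec(1, 2)

# ... but the same '+' syntax already violates it for built-in strings:
x, y = "ab", "cd"
assert x + y != y + x  # "abcd" vs "cdab"
```

The type checker accepts both uses of + equally; whether the algebraic laws hold is entirely up to the implementation.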

Then there is APL, a language consisting almost entirely of operators. It even uses symbols like ⍟, ⌹, and ↑, which gave rise to ASCII-based alternatives like J, K, etc. The interesting fact is that programmers like the wealth of operators (once they have learned them, so Stockholm syndrome might be involved). APL does not have precedences (everything is right-associative), overloading, or (arguably) user-defined operators, though.
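A minimal sketch of that evaluation rule in Python (apl_eval is a made-up helper, not real APL): with no precedence and everything right-associative, an expression like 2×3+4 means 2×(3+4), not (2×3)+4.

```python
def apl_eval(tokens):
    """Evaluate a flat [value, op, value, op, value, ...] list
    right-to-left with no precedence, APL style."""
    ops = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
    acc = tokens[-1]
    # Fold from the right: each operator applies to everything on its right.
    for i in range(len(tokens) - 2, 0, -2):
        acc = ops[tokens[i]](tokens[i - 1], acc)
    return acc

print(apl_eval([2, "*", 3, "+", 4]))  # 2*(3+4) = 14, not (2*3)+4 = 10
```

One uniform rule replaces a precedence table, which is part of why a large operator vocabulary stays learnable.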

Personally, and as of today, I think:

© 2014-12-28
qznc