Most programming languages (C, Java, etc.) have a limited set of operators: usually arithmetic, bit-level, and logic operations.
Some programming languages (Scala, Haskell, etc.) allow user-defined operators.
This usually leads to 'operator soup', where idiomatic code is full of operators (`>>=`, `++`, etc.), which are hard to learn for newbies.
User-defined operators get even harder if you also allow prefix and postfix and ultimately mixfix (e.g. Agda) operators.
Then there are overloaded operators (`+` for strings and numbers), which make code hard to understand, since simple expressions like `x+1` might have arbitrary effects.
Overloading gets even harder to read when it is dynamic (Python, Ruby, etc.), which means you cannot statically determine the meaning of an operator.
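To make this concrete, a small Python sketch (the `bump` function and `Chatty` class are made up): the meaning of `+` is only determined at run time, and a user-defined `__add__` can run arbitrary code.

```python
def bump(x):
    # The meaning of `x + x` depends entirely on the runtime type of x:
    return x + x

print(bump(2))       # numeric addition
print(bump("ab"))    # string concatenation
print(bump([1, 2]))  # list concatenation

class Chatty:
    # A deliberately pathological overload: `+` runs arbitrary code.
    def __add__(self, other):
        print("side effect!")
        return other

Chatty() + 1  # an innocent-looking x + 1 that writes to stdout
```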
These complications make operators a hard territory for language designers.
To find a good solution, let's think about the extreme cases.
For example, you could remove all operators, like Lisp.
Then there is valid criticism that `x+y` is more intuitive than `(+ x y)`, even though the latter is more generic, e.g. `(+ x y z)`.
Lisp programmers compensate with infix macros.
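The genericity argument can be sketched in Python (`plus` is a made-up helper): a prefix function takes any number of arguments, while an infix operator is stuck with exactly two.

```python
from functools import reduce
import operator

def plus(*args):
    # What (+ x y z) buys you in Lisp: one name, any arity.
    return reduce(operator.add, args)

print(plus(1, 2))     # like x + y
print(plus(1, 2, 3))  # no nested x + y + z needed
```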
You could drop overloading, like OCaml, where you have `x+y` for integers and `x+.y` for floats.
This is good for carefully-crafted code, where you never want to accidentally use one or the other, but inconvenient in most cases.
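A rough Python imitation of OCaml's split (the function names are mine): one addition per type, so an accidental int/float mix fails loudly instead of silently coercing.

```python
def add_int(x, y):
    # Counterpart of OCaml's `+`, defined only on integers.
    if not (type(x) is int and type(y) is int):
        raise TypeError("add_int expects two ints")
    return x + y

def add_float(x, y):
    # Counterpart of OCaml's `+.`, defined only on floats.
    if not (type(x) is float and type(y) is float):
        raise TypeError("add_float expects two floats")
    return x + y

print(add_int(2, 3))        # fine
print(add_float(2.0, 3.0))  # fine
# add_int(2, 3.0) raises, much as `2 + 3.0` is a type error in OCaml
```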
A language can lack user-defined operators, like C, but there are use cases, like matrix or vector data types, where arithmetic operators work intuitively.
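For example, a minimal Python sketch (`Vec2` is a made-up type) where overloading `+` feels right because it behaves like ordinary arithmetic:

```python
class Vec2:
    # Componentwise addition: + on vectors obeys the usual laws.
    def __init__(self, x, y):
        self.x, self.y = x, y

    def __add__(self, other):
        return Vec2(self.x + other.x, self.y + other.y)

    def __eq__(self, other):
        return (self.x, self.y) == (other.x, other.y)

a, b = Vec2(1, 2), Vec2(3, 4)
print(a + b == Vec2(4, 6))  # componentwise sum
print(a + b == b + a)       # commutative, like + on numbers
```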
A problem with user-defined data types is that arithmetic laws like commutativity of `+` are usually violated.
For example, `x+y == y+x` is wrong if `x` and `y` are strings (and `x≠y`).
Most type systems are not expressive enough to enforce such laws for user-defined operators, as that requires full theorem-prover power (and the effort and compile times that come with it).
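The string case in Python:

```python
x, y = "ab", "cd"
print(x + y)  # 'abcd'
print(y + x)  # 'cdab'
assert x + y != y + x  # + on strings is not commutative
assert 3 + 4 == 4 + 3  # while + on numbers is
```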
Then there is APL, a language consisting only of operators. It even uses symbols like ⍟, ⌹, and ↑, which gave rise to ASCII-friendly alternatives like J, K, etc. The interesting fact is that programmers like the wealth of operators (once they have learned them, so Stockholm Syndrome might be involved). APL has no precedences (everything is right-associative), no overloading, and (arguably) no user-defined operators, though.
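To make precedence-free evaluation concrete, a toy Python evaluator over flat token lists (both function names are mine): one folds APL-style from the right, the other from the left.

```python
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def eval_rtl(tokens):
    # APL-style: no precedence, the rightmost operator binds first.
    acc = tokens[-1]
    for i in range(len(tokens) - 2, 0, -2):
        acc = OPS[tokens[i]](tokens[i - 1], acc)
    return acc

def eval_ltr(tokens):
    # The opposite convention: no precedence, strictly left-to-right.
    acc = tokens[0]
    for i in range(1, len(tokens) - 1, 2):
        acc = OPS[tokens[i]](acc, tokens[i + 1])
    return acc

print(eval_rtl([2, "*", 3, "+", 4]))  # 2*(3+4) = 14
print(eval_ltr([2, "+", 3, "*", 4]))  # (2+3)*4 = 20
```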
Personally and today, I think:
- Precedences are not that important, because they are hard to memorize and parens are just safer. I could live with `2+3*4==20` or `2*3+4==14`.
- Static overloading is fine. Using `+` with all kinds of numbers, vectors, and matrices is just too convenient.
- Dynamic overloading is not ok. Static program analysis is too valuable to make it harder, even for operators.
- Overloading must preserve guarantees like commutativity, so no `+` for string concatenation.
- User-defined operators are not a good idea, but having lots of operators is. So give me operators like `??` or `&/$` or `⌹`, but they must have a single clear meaning everywhere.
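As an illustration of 'a single clear meaning everywhere': `??` is null-coalescing in C#, Swift, and TypeScript. Its semantics in a Python sketch (`coalesce` is a made-up name; Python has no such operator):

```python
def coalesce(x, default):
    # x ?? default: return x unless it is None/null.
    return default if x is None else x

print(coalesce(None, 7))  # 7
print(coalesce(0, 7))     # 0 -- unlike `or`, falsy non-None values survive
```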