"Looking more like math" is not so vapid. As Leslie Lamport says, programming is hard because thinking is hard. Fortunately, we long ago invented and mostly debugged something to make thinking easier: Mathematics. But, it turns that not only is mathematics a great tool for thinking, it's arguably the best such tool we've invented so far.
The people who object to making programming look more like math seem to think that the goal is to make programming serve mathematics, but the actual result is that (just as with physics) math ends up in better service to programming.
Looking more like math _in itself_ is vapid. Sure, you can make cases for why looking like math is good. But by itself it's vapid.
> Fortunately, we long ago invented and mostly debugged something to make thinking easier: Mathematics.
Don't conflate mathematical notation with math itself. The great thing about mathematics is the analytical insights, not how those insights are written down.
Is it really the case that we "invented and mostly debugged something to make thinking easier" in the notational sense? Math notation, like most notation, seems as arbitrary as any other invented notation. Was there really some rigour behind it which you can use to claim that it makes "thinking easier"? The question itself of "making thinking easier" through notation is beyond the scope of mathematics. But there are of course interdisciplinary mathematicians.
Math notation is all over the damned place. It's like certain areas of math are allergic to standardized notation or to minimizing complexity. Parameters to "functions" go after the function name, or in subscripts, or superscripts, or in positional slots around symbols like Σ.
Let's not even get into the annoying tendency to use single-letter variable names.
I certainly agree that notation is important. However, what I'm talking about is tweaks at the smallest scale—just the glyph or glyphs you use to represent a single symbol. I don't think there's that much value in deviating from ASCII characters or character pairs in order to use a more mathy glyph.
I'm a fan of most of the other math notation programming languages use: parentheses for grouping, infix operators, function call syntax, etc.
I think there is a lot of room to improve the visual display of code, but I don't think much of it is at the token level.
> I think there is a lot of room to improve the visual display of code, but I don't think much of it is at the token level.
I absolutely agree - the power of mathematics isn't so much that people find it easier to read ∊ for set membership testing, or ⊥ for nil/undefined, it's that one can encode new concepts concisely in new notation. For example: div, grad and curl [1]. What's needed isn't more maths symbols in code, it's more flexible syntax markup.
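A small taste of this is already possible in languages with operator overloading. As a minimal sketch (the `Interval` class here is a hypothetical example, not from the original discussion), Python lets you bind the `in` operator to a new meaning so that membership testing reads like the ∊ of set notation:

```python
class Interval:
    """A closed interval [lo, hi] with set membership via `in`."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __contains__(self, x):
        # Makes `x in Interval(a, b)` read like x ∊ [a, b].
        return self.lo <= x <= self.hi

unit = Interval(0.0, 1.0)
print(0.5 in unit)  # → True
print(2.0 in unit)  # → False
```

This only reuses existing operators rather than introducing genuinely new ones, which is exactly the limitation the comment above is pointing at: truly flexible syntax markup would let you define notation like div, grad, and curl, not just repurpose a fixed set of symbols.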
You are describing the Platonic view of mathematics [1]. The nature of math is far from a settled question among philosophers of mathematics, but both Platonists and non-Platonists agree that good notation is a crucial factor in our ability to actually do math [2].
Anecdotally, it's easier for me to think in terms of operators rather than functions or methods, even though they're all the same thing.
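That equivalence is easy to make concrete. In Python, for instance, the infix operator, the corresponding dunder method, and the stdlib `operator` function all denote the same operation; only the notation differs:

```python
import operator

a, b = 3, 4

# Three notations, one operation: infix operator, method call,
# and ordinary function call all compute the same sum.
assert a + b == a.__add__(b) == operator.add(a, b)
print(a + b)  # → 7
```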
Being able to represent the same things in different ways is simply captured by the syntax/semantics dichotomy from disciplines like programming language theory. There is no need to invoke philosophy.
Programmers care about syntax, too, though it's not like we've collectively figured it out yet. Still, I wouldn't be surprised if it turned out that we were further ahead than mathematicians in some ways. :o)