
Difficulties, if any, perceived or real, arising in connection with notation are usually incomparably smaller than those presented by the subject itself. (Personally, I only wish mathematical notation were better integrated with software in general and programming languages in particular.)



Not true at all. There are several times I've attempted to read through a textbook only to be stopped by notation, because something was referenced prior to being introduced or because the notation was overloaded with multiple meanings.

I have consistently run into "perceived or real" confusing mathematical notation as an impediment to learning, in a way that programming languages have never, ever caused me.

Does no one else feel this way? I can't be alone, and like the others responding to you have said, your claim does not seem substantiated.


A lot of people say this, but if I were to sit next to you and tell you what the notation means whenever you asked, would you instantly understand the concept? I don't think so. You still need to do work on your own with the concept before you can get to any understanding.

Say, you come across this symbol: ζ(s)

And you asked me, what does it mean?

Oh, that's probably the Riemann zeta function (it could be other things, but that's its most famous meaning).

Would you be like, ah, now I understand it all!

I sincerely doubt just knowing the English words for the symbols (and all mathematical notation is just shorthand for English, or whatever other natural language; it can all be read out loud) would get you that much further along. I could give you one of several possible equivalent definitions, but you still wouldn't really know much of anything. I could start explaining some things about primes and how they relate to the Riemann zeta function, and you'd still be in the dark. I'd have to talk about the prime number theorem and estimates and complex analysis and a whole bunch of other things before the Riemann hypothesis even started making sense. In fact, people have written entire books just trying to explain in the most accessible way possible what that one symbol is:

https://www.wstein.org/rh/
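
To make that concrete, here is one of those equivalent definitions written out: the standard Dirichlet series, together with Euler's product over the primes, both valid for Re(s) > 1.

    \[
      \zeta(s) = \sum_{n=1}^{\infty} \frac{1}{n^{s}}
               = \prod_{p\,\mathrm{prime}} \frac{1}{1 - p^{-s}},
      \qquad \mathrm{Re}(s) > 1
    \]

All the interesting parts (the analytic continuation to the rest of the complex plane, the zeros, the Riemann hypothesis itself) live beyond this one-line definition, which is rather the point.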

I believe what's going on is that a lot of people who find notation to be the first stumbling block are trying to learn mathematics in isolation, without the benefit of a mathematical community to talk to. You pick up a book, see a bunch of weird symbols, blame the notation, and put the book down. But people who train for mathematics specifically rarely learn that way. Mathematics is a social activity: you talk to other people, you read the symbols out loud to each other at chalkboards or side by side as you scribble notes on paper or a napkin. You talk to each other a lot, pronouncing the symbols as much as you write them down.

The notation is a very superficial obstacle. Notation is very incidental, malleable, almost irrelevant (or as mathematicians might say, non-canonical or not natural). If notation is your biggest obstacle, there's a lot more underneath that you probably don't have enough experience with.


I think I agree with you.

I would say this trouble does come up in programming, though, in the form of DSLs. Anyone who's worked with Rails knows the pain of having to go look up the non-obvious syntax for some specific kind of file that you haven't touched in months.

Maybe you could say this is a problem with large software systems in general - the problems they solve often aren't that complicated (especially if you've worked on other similar software), but learning all the names and patterns and existing features creates a high barrier to entry.


> Not true at all. There are several times I've attempted to read through a textbook only to be stopped by notation, because something was referenced prior to being introduced or because the notation was overloaded with multiple meanings.

Are you saying that it was the notation alone which caused you to completely stop reading the book? I can't say that I share that experience. There are certainly times where I don't fully understand a proof or a lemma due to uncertainty about the notation, but that's rarely a _blocker_. I just skip it, continue reading, and come back to it later.

I agree with the GP here: The main thing which blocks _me_ on a book is the subject itself and how the author presents it. The problem is rarely the notation by itself, but the fact that the author uses concepts which I don't know about.

EDIT: It should also be pointed out that the GP didn't say that notation was _never_ a problem, only that it's a much _smaller_ problem than "those presented by the subject itself". The big question about your experience is whether you would have been unblocked if the notation had been explained in English, or whether that would just uncover that the text uses concepts which haven't been fully explained to you.


You are making the huge mistake of assuming that if you hadn't given up when facing difficulty with the notation, the rest would have been easy and you would have had no trouble understanding the concepts!

Don't worry: you are not alone in making that huge mistake; plenty of people do. People who don't give up when confused about notation usually learn quickly that it is the concepts, and the relations between them, that require careful thought (and also, usually, that the notation has advantages they didn't realize when they first found it confusing).


You are in no position to assume rytill 'gives up when confused about notation'.


I thought that was what he or she meant by "there are several times I've attempted to read through a textbook only to be stopped by notation". I wouldn't have said what I said otherwise.


There are some cases where this is true (the most infamous being Roman numerals, where the notation makes even simple arithmetic difficult - XI * VI = LXVI???), but in general notation is just something you learn once and mostly remember, as long as you understand the concepts.

However, I fully agree that reading a text that uses unfamiliar notation without introducing it is almost futile. Even simply not knowing the names of the operators makes it impossible to read the formulas fluently in your mind: when you encounter 'a + b' you can read it in your mind as 'a plus b', but for something like 'a <<=> b', even given the definition `a <<=> b = 2*a+b`, reading the text becomes a mental chore.


Incidentally, the example of Roman numeral multiplication you chose is actually fairly easy. Just break it up as (XI×V) + (XI×I). Since X×V = L and I×V = V (Romans would have had these memorised, just as we memorise multiplication tables for all the decimal digits), the first product is LV and the second is XI, and they add to give LXVI by simply interleaving the letters.


That example works well because you don't have to do V×V. You can do XVI × VI = (X + V + I) × (V + I) = (X×V) + (X×I) + (V×V) + (V×I) + (I×V) + (I×I) = L + X + XXV + V + V + I = LXXXVVVI = LXXXXVI, which "simplifies" to XCVI, but it's already a good bit messier.
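
If you want to sanity-check these expansions, here's a minimal sketch in C (hypothetical helper functions, purely for illustration) that does it by converting to positional notation first, which is exactly the point: the difficulty lives in the notation, not in the arithmetic.

    #include <stdio.h>

    /* Value of a single Roman digit. */
    static int value(char c) {
        switch (c) {
            case 'I': return 1;    case 'V': return 5;
            case 'X': return 10;   case 'L': return 50;
            case 'C': return 100;  case 'D': return 500;
            case 'M': return 1000; default:  return 0;
        }
    }

    /* Subtractive parsing: a smaller digit before a larger one
       (as in "IV" or "XC") is subtracted instead of added. */
    static int roman_to_int(const char *s) {
        int total = 0;
        for (; *s; s++) {
            int v = value(*s);
            if (s[1] && value(s[1]) > v) total -= v;
            else total += v;
        }
        return total;
    }

    int main(void) {
        printf("XI  * VI = %d\n", roman_to_int("XI")  * roman_to_int("VI"));  /* 66 */
        printf("XVI * VI = %d\n", roman_to_int("XVI") * roman_to_int("VI"));  /* 96 */
        return 0;
    }

The two products come out to 66 and 96, matching LXVI and XCVI above.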


Now imagine being able to 'recompile' a mathematical text as an AST for the derivation, so that you could dive down and see where all the terms and definitions were coming from, instead of having to try to search a PDF for some symbol you can't ctrl-F for.


I have seen math books that have "tables of notation", with a list of all the notations and where they're defined in the text. That helps. (Also, I suspect that the sort of author who would include such a thing is one that's thinking more carefully about notation.)


> I have consistently run into "perceived or real" confusing mathematical notation as an impediment to learning, in a way that programming languages have never, ever caused me.

The C pointer syntax can be confusing even to experienced C programmers.
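
For example (a made-up snippet, but standard C): declarations that mix arrays, pointers, and function types have to be read inside-out, and they routinely trip up even veterans.

    #include <stdio.h>

    static int add(int a, int b) { return a + b; }
    static int sub(int a, int b) { return a - b; }

    int main(void) {
        /* fp: an array of 2 pointers to functions that take
           (int, int) and return int. Read the declaration
           inside-out, right to left. */
        int (*fp[2])(int, int) = { add, sub };
        printf("%d %d\n", fp[0](3, 4), fp[1](3, 4)); /* prints: 7 -1 */
        return 0;
    }

Reading the declaration out loud ("fp is an array of two pointers to functions taking two ints and returning int") is much the same exercise as pronouncing mathematical symbols.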


I think differential geometry may present the closest exception to this: not only is the notation often incredibly dense and subtle (e.g. the spacing between indices when raising and lowering them), but everyone also seems to have their own personal favorite take on any given notation.


That's a strong assertion. You're implying that difficulties with the subject itself rarely have anything to do with difficulty communicating through some notation.

I'm not disagreeing, but I'm curious how you would back up that assertion.


It’s an experimental fact. (I guess you have to trust me on this one.)


> with the subject itself

That's the thing, though: you can never grapple with the subject itself, only with representations of it. These are a mix of feelings/images/movements inside your head and mechanical manipulations of the notation; furthermore, the notation itself influences our internal model/feeling of the subject.


I am not saying that with enough dedication one couldn’t screw up the notation completely.


Mathematical notation finds elegance in brevity. That's why there's such an enormous alphabet of symbols representing important concepts.

If programming languages strive for the same elegance via notation, you end up with something like Perl.



