❤ Bram ❤ @bram

Every time I look at something written in mathematical notation I can't stop myself from thinking "this is exactly what we do NOT want to do in programming: one-letter variables, super super dense code, no comments, symbols/infix operators instead of obvious function names"

And the result speaks for itself: you need to **study math** to understand it.

When you need to do that for programming, you are just reading shitty code not meant to be shared with humans.

Math is just shitty code.


@bram But Shitty Code That Works™

And besides, math is beautiful
Even if it's super hard to understand xD

@bram I confess that I rather see programming as malformed mathematics.

@mona even from the readability and "I'm working with other humans who need to easily understand what I'm communicating" point of view?

@bram yes! mathematical symbols (well, a good deal of them) are lucid to me. code is not.

@mona @bram if it's a language I know, I'm the opposite, but I'm hoping my partner can shed some light on math for me.

@bram I must thank you, in fact, for assisting me toward this understanding of myself. the symbols and manipulations of algebra and trigonometry and calculus are almost recreational to me, and I have more than once wished that programming was more like math homework.

@mona oh, with pleasure :)

You can take a look at Haskell; if you look at it closely, it's programming with math constructs.

@bram and then there is LaTeX: all the short and concise but barely readable mathematical notations are actually written as a very long mess of even less readable stuff

@Thib LaTeX is a great example of "how not to make the syntax of your programming language" >.>

@wxcafe @Thib from a language design stance it's ... quite bad :/

But as a tool it's great, yes (though there are a ton of things that aren't good in it)

@wxcafe @Thib yeah, it's clearly not really fair to compare it to today's languages, but it's still a pretty hopeless syntax ...

@wxcafe @Thib Haskell is precisely all the bad practices of math stuffed into a programming language -_-

@wxcafe @Thib OCaml is the same, but French :p

@wxcafe @bram @Thib so people who badmouth OCaml are welcome on social.wxcafe.net? Well, well 🙄

@luluberlu @wxcafe @Thib well, I do badmouth furries, and you're just the same :p

@bram @luluberlu @wxcafe @Thib Furries are more widespread, though :D

@bram @efacxw no, you can't use pretty mathematical notation, it doesn't handle Unicode

@thib Wait, but there was a ∀ in one of your source files, wasn't that OCaml?
And you can use it in Haskell????

@bram @wxcafe

@Doshirae @bram @efacxw nope, the bulk of the sources is Coq, which is a different language from OCaml, and which does handle Unicode (even if the parser is funky).
Use what in Haskell?

@Doshirae yeah, the extension for Coq files is .v

Uh, I don't know whether you can use ∀ in Haskell, but it wouldn't surprise me?
(I've done very little Haskell)

@efacxw @bram yeah, but it's 2018, come on.
I should look at the things that sit on top of LaTeX, like melt, or the things that are supposed to replace it, like… uh… are there any?

@Thib @bram i always feel a bit skeptical when reading mathematical notation in a programming paper, in that the idea needs to be validated in two different domains, only one of which i'm really very qualified to operate in. the vocabulary of math tracks across so many different fields, i tend to feel overwhelmed pretty quickly with all i need to unpack in a few symbols.

@bram that's only if it's in fully reduced form.

you can convert any expression into a series of addition and subtraction.

@UberGeek that's the case for the most commonly used form, which is the one I encounter all the time

@bram
Even well-written and documented code cannot be understood without study.

Love,
Someone for whom calculus makes more sense than Python

@DialMforMara @bram but this is just not true! i could easily read and understand a fairly complex python script long before i learned python. meanwhile i had many years of mandatory math education through grade school, and still don't understand any even somewhat complex math

@bram mathematics is a social community and people in that community spend years working with others getting to learn the subtle nuances of all that dense stuff. It’s meant to be shared with others and there’s a reason for the dense code, but it takes a conceptual shift to be fluent in it.

@noflashphotography and I think that's the problem: it has never been built to be accessible or easy to understand, and that has never been (is very rarely?) questioned.

Apart from one exception, I've only seen the question "how to teach math better" and never "how to make math easier to teach".

That and the fact that sociologically math is used in the educational system to exclude the "not good enough" students (which is nearly always the poor).

@bram yes/no. I can speak from experience that learning advanced math is a slow-going process, not because of the notation but because there’s so much to learn and people usually have to do double or triple learning: concepts, notation, and problem-solving strategy. If you have one or two of those down, notation usually isn’t an issue if you put in the time.

@bram notation is questioned all the time. There is always a decades- or even centuries-long process of crafting and whittling down notation. The problem is that notation for competent practitioners isn’t always what’s best for beginners.

@noflashphotography on that point I really have the feeling that you get the same problem as with old programming languages: once people get used to something, they don't realise it's actually bad. In the programming field we were lucky to have a lot of different programming languages that helped refine better ideas/notations/abstractions and prove that older programming languages were actually worse at that, something that you don't have in math (from what I know)

@bram Notation has been changed a lot in many fields. Look into the precursors of tensors and matrices that popped up in the 1860s till the 1920s. Same goes for differential equations and their representations. Differential geometry (language of general relativity) has changed substantially
since Einstein's time.

@noflashphotography sure (I had a math teacher who liked to teach us some of that stuff), but I find it hard to compare to the hundreds of different programming languages we've already had in less than a century :/

@bram I dunno. They fit into several broad categories in terms of mindset.

@bram @shel one of the first things i do when i have to use a colleague's code is refactor it to make it readable (and modifiable ;)

@meena @shel on the projects I work on we try to keep a coherent code quality and notation to avoid having to do that (but I often end up doing that on other projects' code if it's bad)

@meena @bram @shel I've tried this, and usually it's a total waste of time. To refactor the code, you have to understand what its intentions are; that requires intense study of the code.

The problem is, you can refactor the code to make things "readable," but the structure of the code may now be in a *worse* state of being able to communicate intent, actually making it harder to *comprehend*, despite being easier to read.

Raymond Hettinger has a video on when it's OK to ignore conventions.

@vertigo @bram @shel hence why i said "modifiable"

making code more readable is easy
making it more modifiable is harder — even harder than just making it more understandable.

@bram Or programming is just shitty math; depends on your vantage point 😏

@bram And since notation is often sloppy and there is no type or syntax checker, you often can't even deduce what a piece of math means without checking every single definition.
Compare this to a line of code like:
for request in requests() {
    parse(request)
}
You don't have to know the underlying data types and you can grep through the code because it's just text, so you can easily make high level modifications without worrying about the lower levels.

@grainloom yes, I'm 100% on that with you.

It's not as good, but I only really understood the math "sum" notation once I had read it in Python ("sum(x for x in list)" as opposed to the big sigma thingy)
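
A minimal sketch of that comparison in Python, assuming a plain list of numbers (the names values and total are illustrative, not from the thread):

values = [3, 1, 4, 1, 5]

# math writes "sum the x_i over the list" with a capital sigma: Σ x_i
# the same idea spelled out as a loop:
total = 0
for x in values:
    total += x

# the compact form, closest to the Σ notation:
total_compact = sum(x for x in values)

assert total == total_compact == 14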

@bram I'll be honest I had no idea which greek letter "the sum symbol" was >.>
That's the other thing, it's impossible to search for symbols without a ton of special tooling.

@bram
I think, when we design things, we need to think about the appropriate shape of the learning curve.

Math notation & basically all programming languages have a learning curve that starts in the middle, goes up very steeply, and then gradually falls. Everybody on one side of the hump sees it as impossible and everybody on the other side of the hump sees it as easy, because all the learning is packed into a couple units of intense study.

@bram
UX design best-practices for GUIs act like a flat curve is normal, but actually it's more like a middling starting point, a sudden drop, and then difficulty rises to infinity (because things that are simple are easy but things that are complicated are impossible).

@enkiv2 oh, thanks for your explanation, that's very interesting; I don't have enough knowledge in UI/UX :/

And yeah, it would be great to reach that for programming languages, but that seems quite hard since programming ... is hard :/

Just a point on programming languages: once you get past the "programming is a difficult task" part, there is a lot of variety in it; python/scheme/basic are way, way easier to learn than things like C++/vhdl/haskell, for example.

@enkiv2
@bram I love the graphs, but still I can't help disagreeing with the legend. The Y-axis sounds more like effort relative to the task complexity than plain absolute effort.

@kaiyou @bram
I considered the Y axis to be absolute effort taking into account the habits formed and concepts learned earlier in the curve. It would be much steeper if you jumped into the right-hand side in your first outing.

@enkiv2 @bram learning curves also depend at least partially on the learner

@a_breakin_glass @bram
A little, but it's not like we can't generalize.

Like, APL is super frontloaded. Likewise, assembly language. You have to legit study to do anything in it.

You don't have to study much to use unix shell (and stuff is discoverable via man pages and such).

You have to do little more than learn to use the mouse to use a lot of GUIs. But, the complicated stuff is impossible (while elsewhere it's just hard)

@enkiv2 @bram honestly, I'd say APL is less frontloaded than assembly language, and ASCII APL-esque languages (J, K) for that matter, primarily because it at least uses symbols rather than very low-level operations or cryptic combinations of ASCII chars

@bram @enkiv2 unix and shell are easier, but it's still hard if you're a complete newb. the fact that you *really* don't have that much in the way of portable tools doesn't help

@enkiv2 @bram and GUIs aren't necessarily so impossible, it's just an artefact of how modern ones are made

@a_breakin_glass @bram
Yeah, GUIs as a general type (as opposed to *all popular WIMP GUI implementations since 1980*) are theoretically capable of all of this shit, and just don't.

@a_breakin_glass @bram
APL and assembly are both in that sweet spot of being a lot more scary-looking than actually hard.

(Assembly is tedious but not *hard*. You can write working assembly code, as a noob, just from an afternoon leafing through a reference book, which is not remotely true of C or python or other ostensibly easier languages because there's no structure or concepts to learn.)

Nevertheless, you need the book. For shell, you can start with ls & man.

@bram Complex code generally needs studying to understand, in my experience.

@bram Oh God yes, especially the *overloading the meaning of the one letter variable names* just because you're too cool to, idk, GIVE ALL YOUR SYMBOLS FULL NAMES. Even after you've literally MADE THE FONT FACE MEANINGFUL, you've still run out.

And you can't possibly make a small improvement by introducing a new variable called 'Tau' meaning '2 * Pi', although it simplifies equations ENORMOUSLY, because in Relativity Tau means 'Proper Time'.

and then subscripts/superscripts, BRRRRR
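
As a quick illustration of that Tau point (standard identities, sketched here with $\tau = 2\pi$; none of this is from the thread):

full turn: $2\pi\,\mathrm{rad} = \tau\,\mathrm{rad}$
circumference: $C = 2\pi r = \tau r$
periodicity: $\sin(\theta + 2\pi) = \sin(\theta + \tau) = \sin\theta$

...but in relativity $\tau$ already names proper time, e.g. $\mathrm{d}\tau^2 = \mathrm{d}t^2 - \mathrm{d}x^2$ (with $c = 1$), hence the name clash.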

@rysiek @bram

I know APL exists

And I know it needed an entirely separate keyboard, and some people in the 1970s thought that was a 'good idea' and not 'one of the most remarkably bad ideas ever', but that was back before there was such a thing AS a 'standard keyboard', since it was the IBM 3270 terminal and the IBM PC that created that standard.

But I've heard the set operations stuff was pretty great!

And J is a slightly saner update, I think?

@rysiek @natecull (I like this video so much, I sometimes send it to friends to blow their minds :p)

@bram
Speaking as a mathematician, all I have to say is ... Ouch.

@jhertzli oh sorry, I hope it didn't hurt you personally :/

@bram
Well... Someone has to remind us to be understandable.

@bram I was hesitant to engage because your remark is flame baiting but here I am.

Did you intuit programming? Did you not **study** programming? Can you write portable assembly? Can you write C++ that works across GNU, MS, and Intel compilers? Do you know that our fastest linear algebra routines are written in an effectively dead language?

Your remark offends me as a mathematician. I found beauty and meaning in math. I have yet to find beauty and meaning in code.

@davidk01 my wording was indeed written under strong emotions and was probably not the best way to say that.

And it indeed seems to have failed to communicate what frustrates me: it's not mathematics itself but the notations/languages (and practices) used to communicate it that I really dislike, and which, from a programming language design and programming practice point of view (maintainability and accessibility), are a compilation of what is regarded as bad practices.

@davidk01 the same way that in our communities we generally regard languages like c++, javascript or vhdl as badly designed programming languages (often with bad practices used in the community), while other languages like scheme or python generally fall into the other category.

@davidk01 (oh and I've had a lot of maths during my education and I did "computer science" studies which are filled with both (more) maths and programming but yeah, I indeed have a bias today for programming)

@bram This has occurred to me before as well, and I wholeheartedly agree.
--Math-degreed programmer