Schnee (schnee) wrote,

Lessons (not) learned

An interesting tidbit from the TeXhax mailing list …

Context: an inquiry about TeX packages for writing chemistry papers that would work with plain TeX, without requiring LaTeX; a pointer to was provided, with the note that unfortunately this didn't allow filtering out LaTeX-only packages. Another user then suggested "using the source" (of the HTML page, that is), to wit:

$ wget -O - 2>/dev/null \
    | tidy -n -i -asxml 2>/dev/null - \
    | lxprintf -e 'a[contains(@href,"/pkg/")]' "\n" @href - \
    | while read uri; do
          wget -O - $uri 2>/dev/null \
              | tidy -n -i -asxml 2>/dev/null - \
              | lxprintf -e 'td[.="Sources"]' "%s\n" 'following-sibling::td/a/code' -
      done \
    | grep -v latex


And another user then replied, writing:

Sometimes I just want to weep. There can be no doubt, based even on just the evidence above, that the Unix operating system is a very powerful tool, and the simple fact that one can identify all packages that do not have the string "LaTeX" (presumably case-insensitive) in their CTAN path is a clear demonstration of that fact. And yet the entire thing is gibberish. It could be Mayan, for all I know. I could stare at it for the rest of my life and still not have the slightest idea how it works. Why oh why oh why does someone not come up with a command-line interpreter (or as I fear you would call it, "a shell") that uses English verbs as its commands and English nouns/adjectives/adverbs/etc as its qualifiers? How on earth is anyone expected to know what "-i -o" implies, especially as what it implies is almost certainly a function of the command to which it is applied? And why can one not apply 2>/dev/null distributively, such that it applies to all commands in the sequence rather than having to be spelled out in full for each?
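As it happens, the redirection complaint does have an answer: POSIX shells let you group several commands with braces (or a subshell) and attach a single redirection to the whole group, so `2>/dev/null` need not be repeated per command. A minimal sketch:

```shell
# One redirection covering several commands: group them with { ...; }
# (or ( ... ) for a subshell) and redirect the group's stderr once.
{
    echo "kept"              # written to stdout, survives
    echo "discarded" >&2     # written to stderr, silenced by the group redirection
} 2>/dev/null
# prints only "kept"
```

The same trick applies to a pipeline, e.g. `{ wget ... | tidy ... | grep -v latex; } 2>/dev/null` — though silencing every stage's stderr wholesale does make debugging harder, which is presumably why the quoted one-liner redirects each command individually.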

Algol-68 showed the world how programming languages should look, feel and behave; VAX/VMS did the same for operating systems. But the world was too stupid to see their strengths, and abandoned them in favour of C and Unix. As I wrote before, I weep.

I'm inclined to agree re: the "gibberish" part (not necessarily that you couldn't learn it if you stared at it for the rest of your life), but what I found most intriguing were the remarks re: Algol-68 and VMS. Not having used either, I'm curious how they did better than C and Unix, and what lessons we could have learned, but chose not to.

Disclaimer: I like Unix, and I think the problem here isn't just the obscure syntax but also the simple fact that you need to extract links from an HTML document, rather than parse a plaintext file. I'm no friend of C, though, which I think is a bit dishonest: it masquerades as a high-level language, even though it's anything but. (And (modern) C compilers are actively looking for ways to stab you in the back at every opportunity, though that's the fault of those compilers' designers and programmers more than the language proper.)

Tags: algol, c, latex, lessons not learned, unix, vms
