That was a fun read, and timely for me since I’ve been dabbling in many programming languages recently (it’s a silly bucket list item for me).
Obviously the article is based on the author’s experience, and they note that explicitly, but here are some languages that blew my mind, in addition to those already in the article:
Forth (it’s so simple that I feel like I could implement most of it without any preparation)
K or J (array programming languages make a lot of frequent patterns into first-class syntax)
Also, if you don’t mind esoteric languages, I highly recommend:
Subleq (a Turing-complete instruction set… in a single instruction!)
Piet (a two-dimensional, graphical language that has similarities to Forth)
And I’d give Common Lisp an honorable mention for its condition+restarts system, as well as its REPL-driven workflow.
If you like Forth and haven’t tried Factor yet, you might find it lots of fun!
Agreed! Factor also has good documentation and a large community (as far as concatenative languages go). Others worth mentioning are PostScript, RetroForth, Min, and Uxntal.
Dunno whether the author is reading these replies, but in case you are: I’m reading this on a 27” monitor at a comfortable zoom level, and more than half of the vertical space is taken up by the AI slop image. Why? It’s an immediate turn-off for what is otherwise a fun article.
For me, Clojure deserves mention for many reasons. Perhaps the most notable is this: no other language I’ve seen (a) emphasizes the many flavors of concurrency so clearly and (b) puts them easily within reach via a standard library.
same
It was my second or third language, after Java and roughly concurrent with C, so Clojure got to introduce me to a lot of things: Lisp macros, dynamic typing, lazy evaluation, REPL/Emacs-as-IDE, immutable data structures, zippers, etc. Also, I have recently stumbled upon TypeDB, which looks very much like a new generation of Prolog, with a relational database-like API.
Neat! I haven’t heard of TypeDB yet; I’ll have to check it out. I got into CozoDB, which uses a Datalog dialect; development is molasses, but it’s cool so far.
It’s not a language that blew my mind, but the most peculiar language I’ve ever seen is a weird one called Chaos (github) (the site is dead now so I need to link the archive). It was billed as a language where conditionals can occur only once, at the end of a function. The author never truly clarifies why this is desirable, nor do they ever give an example of what this methodology solves. I’ve also never seen any way to actually define your own types, or really do anything useful in it. I have no idea if this is intentional, like in Erlang, or just a complete oversight. Again, it’s never acknowledged in the documentation.
Orca, which is a two-dimensional live-coding music synthesis language.
https://100r.co/site/orca.html
Pure Data, a very powerful live reactive music language, was also a mind expanding experience for me.
https://puredata.info/
I would include Unison in any such list! (Content-addressable functions.)
The latest discussion of Unison on Lobsters: https://lobste.rs/s/pwdhjv/unison_programming_language
Under Java, David Teller lists the following:
Really? OK – that sounds interesting… just how IS finally implemented in the JVM? Does anyone know what he means here?
I believe most JVMs use a second return register that they branch on after calls to jump to catch and cleanup code. VMKit originally used the Itanium unwind model, and it was far too slow in Java, where exceptions are common.
The way I’d like to implement it is to return either a value or an exception object in the return register and use the carry flag as the discriminator, so each call (to a possibly throwing function) is a call followed by a branch on carry. Most CPUs statically predict branch-on-carry as not-taken and so this is basically free (tiny increase in I-cache usage) in the no-exception case, but also fairly cheap (pipeline bubble from incorrect speculation) if an exception is thrown.
On ARM, I remember the convention was to use the overflow flag for this; e.g. http://www.riscos.com/support/developers/prm/errors.html says:
“If no error occurred, and the desired action was performed, the SWI routine will clear the ARM’s V (overflow) flag on exit. If an error did occur, the SWI routine will set V on exit. Furthermore, R0 will contain a pointer to an error block, which is described below.”
This is the BSD system call calling convention as well. Return values are either integer values (or, in a very small number of cases such as mmap, pointers) or errno values. If the carry flag is set, they’re errno values. After the syscall / SWI instruction, you can branch on carry to the routine that sets errno.
On MIPS and RISC-V, I think a second return register is used to get the same thing, but that doesn’t play as nicely with the branch predictor (and precludes exposing system calls that return two things or errno).
Linux instead requires that all system calls return negative errno values or non-negative success values.
Hazel, with its typed holes and structural editor, and Egison, which takes pattern matching to extremes. I also think Austral is a cool language: a small one with linear types and capabilities. As an honorable mention, Excel spreadsheets can be great fun! Excel is a Turing-incomplete, functionally mostly pure, array dataflow language with a ridiculously optimized concurrent runtime!
Good article. As you mention at the end, the order in which you learn languages creates a strong impression of (or bias toward) the concepts that initially “wow” you. Pascal is a good first language, and I think it’s still taught in many schools in Europe today. But if you then learn Python or Java, you’ll no doubt still be wowed by some enhancements, like JIT compilation, but most of the concepts will feel repetitive, and you’re left thinking “these are just the same wheels re-invented; there’s nothing to be wowed about!”
What a lovely read this was. I’m not familiar with Coq or Rooster and would love to give them a try.