It’s funny how the author gets closer and closer to enlightenment and then decides to abandon the road to full macro hygiene in favor of this weird quirk of Janet, which shouldn’t really work in the first place. The author’s initial notion of macros as code-to-code transformers was actually correct.
The thing is that in a Lisp-1, you either automatically unquote to fully-qualified symbols (as Clojure does), which sidesteps the problem for 90% of cases, or you don’t expand to “bare” symbols. The way you use gensym to avoid capture is just scratching the surface: if you recall, gensym generates a symbol that is not equal to any other symbol, yet it prints as a regular symbol (and reading it back after printing it doesn’t yield the same symbol). With hygienic macros, Scheme does something similar to all symbols.

For example, in explicit renaming macros, the following construction renames x, let and + to some gensym-like form that prevents capture:

(define-syntax foo
  (er-macro-transformer
    (lambda (expr rename compare)
      (let ((%x (rename 'x))
            (%let (rename 'let))
            (%+ (rename '+))
            (arg (cadr expr)))
        `(,%let ((,%x 1))
           (,%+ ,%x ,arg))))))

(let ((+ -)   ;; Attempt to capture + and make it refer to something else
      (x 10)) ;; Just a local x that happens to be used inside the macro as well
  (foo x)) => 11

The idea here is that rename takes a “bare” symbol and constructs a symbol on the fly (a la gensym) that looks the same, but resolves to what the “bare” symbol would resolve to at the place where the macro is defined, rather than where it is used.

Yes, er-macros are quite ugly to read, but improvements have been made since explicit renaming macros were invented that don’t require such laborious manual renaming (like syntax-case and implicit renaming macros). And of course syntax-rules does this all automatically.
If you’re really interested in how this works under the hood, read “macros that work” by Clinger and Rees and/or the later paper “Hygienic macros through explicit renaming” by Clinger.
So, as far as I understand the trick in Janet, it’s because Janet macros expand to Janet data structures, which could include either symbols, or the functions that those symbols resolve to?
I don’t know enough about Janet specifics to say much about that. Perhaps it works because it’s closer to an interpreter than a compiler, so at the point where a macro is expanded it already has all the things it refers to ready at hand?
For example, in an ahead of time compiler which supports cross compilation this trick would not be possible - the runtime value which the macro expands to might be defined in a library that only runs on the target architecture.
In the Scheme world there’s been a lot of research into phase separation, to avoid exactly such issues - compile-time macros that rely on procedures that ought to only be available at runtime are a recipe for disaster. A typical example is macros which call procedures that manipulate state. If run in an interpreter, the runtime state will be visible in the macro and manipulated by it. But if compiled ahead of time, the runtime state is manipulated inside the running compiler’s evaluator (if that makes sense to you), but once the resulting program is run, there’s a fresh runtime which has none of that state.
It is very easy to paint yourself into a corner (or at least get very confused) if phases are not properly separated. See for example this blog post from “A Pythonista’s adventures in Scheme land” about phase separation.
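To make that concrete, here is a tiny sketch of the hazard in Clojure syntax (my own illustration, not from the thread): the side effect happens at macroexpansion time, inside whatever process is doing the compiling.

;; Hypothetical example: a macro that consults and mutates state while expanding.
(def registry (atom []))

(defmacro register! [sym]
  (swap! registry conj sym)   ; runs at macroexpansion time
  `'~sym)

;; In an interpreter or REPL, the expansion-time swap! and later runtime reads of
;; @registry see the same atom; under ahead-of-time compilation the compiled
;; program starts over with a fresh, empty registry.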
I mean, Janet is definitely closer to an interpreter, and I’m not aware of a cross-compiler story for it, and it doesn’t AOT so much.
What Janet does do is create and marshal environments/images a lot, and bundle executables out of that, and writing code like that does mean being aware of what code runs before the bundling, and what runs after.
This is a fascinating read! I had no idea this was possible.
However, I would caution against generalizing here to lisps more broadly; the ability to embed a function value directly in a macroexpansion seems to be a quirk of CL and Janet as far as I can tell; even other lisps sharing close ancestry with CL like Emacs Lisp don’t support it.
Turns out I had made a typo and it does in fact work in Clojure.
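For reference, the kind of expansion being discussed looks roughly like this in Clojure (a sketch of mine, evaluated at a REPL, with clojure.core/inc as a stand-in):

;; Unquote the function value itself, not the symbol inc, into the expansion.
(defmacro call-inc [x]
  `(~inc ~x))

(call-inc 5)
;; => 6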
However, the rationale for doing it does not really apply in Clojure since the macro system integrates smoothly with the namespace system, and backquote fully-qualifies all symbols by default with the namespace in which the intended function is found, so while it’s possible to use this technique, it’s a solution for a problem that doesn’t exist; introducing shadowed names in the context of the macro caller cannot cause the macroexpansion to resolve to the wrong function.
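For example, at a REPL in a namespace named foobar, syntax-quote resolves every symbol on your behalf:

(ns foobar)

`(let [x 1] (+ x 1))
;; => (clojure.core/let [foobar/x 1] (clojure.core/+ foobar/x 1))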
Or is the implicit namespace-qualification a solution to a problem that doesn’t exist? :)
Common Lisp does the same thing, actually — maybe Clojure copied this from Common Lisp (?). It is a totally valid solution, but (at least in Common Lisp; not sure if Clojure does something more clever) you can still run into issues if your macros are defined in your own package, reference functions in that same package, and are also expanded in that same package — everything is in the same namespace. Which like… yeah then you oughtta know what your macros look like, I guess. But “lexically scoped macros” or whatever work regardless of the namespace structure.
(Also, strong caveat: I have no idea what I’m actually talking about and am basing those statements on what I read in On Lisp and have never written production lisp in my life.)
Yeah, this doesn’t happen at all in Clojure. Even if you’re referencing something from the current namespace it gets fully expanded into an unambiguous reference in the quoted form. It’s basically impossible to write an unhygienic macro in Clojure unintentionally.
It has its weird issues, though. You can unintentionally write a macro that doesn’t want to expand due to hygiene errors:

(ns foobar)
(def x 10)
;; ...Perhaps a lot of code...
(defmacro foo [arg]
  `(let [x 1]
     (+ x 1)))

If you try to use foo, it will complain that the x in the let bindings is not a “simple symbol” (because it gets expanded to (let [foobar/x 1] (+ foobar/x 1)), which is thankfully not valid). And fair enough, you will hit this issue as soon as you try to use the macro, so it should be relatively easy to debug.

Also, the system breaks down when you’re trying to write macro-writing macros. Something like this simply fails with the same error, that foo is not a “simple symbol”.
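A hypothetical reconstruction of the shape being described (not the original code from the thread):

(defmacro make-foo []
  `(defmacro foo [arg#]     ; foo becomes foobar/foo inside the syntax-quote
     (list '+ arg# 1)))

(make-foo)
;; fails: foobar/foo is not a simple symbol, the same complaint as above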
The same happens if you change make-foo to accept the name of the macro but still use quasiquotation (not exactly sure why that is, though). The only thing that seems to work is if you convert the let to a manual list building exercise, but even that breaks down as soon as you try to pass in identifiers as arguments.
You can unintentionally write a macro that doesn’t want to expand due to hygiene errors

That’s kind of the whole point; you made an error (bound a symbol without gensym) and the compiler flagged it as such. Much better than an accidental symbol capture.
Something like this simply fails with the same error, that foo is not a “simple symbol”

Yeah, because it’s anaphoric. The entire system is designed around getting you to avoid this. (Though you can fight it if you are very persistent.) The correct way to write that kind of macro is to accept the name as an argument (as you did in the second version), but your second version is much uglier than it needs to be because you dropped quasiquote unnecessarily.
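Presumably something along these lines (my own sketch of the suggested shape, not the commenter’s exact code):

(defmacro make-foo [name]
  `(defmacro ~name [arg#]   ; the name arrives as an argument, so it stays
     (list '+ arg# 1)))     ; a plain, unqualified symbol in the expansion

(make-foo foo)
(foo 5)
;; => 6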
Thanks for explaining how to make this work, I stand corrected!
That’s an elegant solution to hygiene. I might have to give this Clojure language a try, it sounds pretty great!
Are there other Lisps that work this way, or is Clojure unique in this regard?
Both Clojure’s and Common Lisp’s macro systems seem like a huge kludge after learning syntax-case.
Fennel works similarly in that it prevents you from using quoted symbols as identifiers without gensym/auto-gensym. However, it does not tie directly into the namespace system (because Fennel is designed to avoid globals and its modules are very different from Clojure namespaces anyway) but works entirely lexically instead, so if you want a value from a module, your macroexpansion has to locally require the module.
https://fennel-lang.org/macros
What happens in Janet if you rebind the injected variable to a different value? It seems to me that this shouldn’t work in the general case. Also, I don’t see how this could work if you inject a variable which is declared later in the file.
Janet inlines values; you can’t redefine something that isn’t specifically a var - if it is a var, it is accessed via indirection.
Perhaps I have missed the point completely, but why should one use a macro for this? Why not a function? (with a lambda parameter)?
A lambda can be slightly less efficient (depending on the compiler) but mostly it screws up the indentation real bad. Not a huge deal really and doing it without a macro would be fine too.
I had trouble following all this (you’ve read the Common Lisp spec way more closely than I ever bothered to), but you might be interested in John Shutt’s Kernel language. To avoid unhygienic macros, Kernel basically outlaws quasiquote and unquote and constructs all macros out of list, cons and so on. Which has the same effect as unquoting everything. A hyperstatic system where symbols in macros always expand to their binding at definition time, never to be overridden. Implying among other things that you can never use functions before defining them.

There’s a lot I love about Kernel (it provides a uniform theory integrating functions and macros and intermediate beasts) but the obsession with hygiene is not one of them. I took a lot of inspiration from Kernel in my Lisp with first-class macros, but I went all the way in the other direction and supported only macros with quasiquote and unquote. You can define symbols in any order in Wart, and override any symbols at any time, including things like if and cons. The only things you can’t override are things that look like punctuation: parens, quote, quasiquote, unquote, unquote-splice, and a special symbol @ for apply, analogous to unquote-splice. Wart is even smart enough to support apply on macros, something Kernel couldn’t do, as long as your macros are defined out of quasiquote and unquote. I find this to be a sort of indirect sign that it gets closer to the essence of macros by decoupling them into their component pieces like Kernel did, but without complecting them with concerns of hygiene.

(Bel also doesn’t care about hygienic macros and claims to support fully first-class apply on macros. Though I don’t understand how Bel’s macroexpand works in spite of some effort in that direction.)

It’s easy to write unhygienic macros without quasiquote. Does Kernel also outlaw constructing symbols?

No, it looks like page 165 of the Kernel spec does provide string->symbol.

Doesn’t that seem like a big loophole that would make it easy to be unhygienic?
Depends on what you’re protecting against. Macros are fundamentally a convenience. As I understand the dialectic around hygienic macros, the goal is always just to add guardrails to the convenient path, not to make the guardrails mandatory. Most such systems deliberately provide escape hatches for things like anaphoric macros. So I don’t think I’ve ever heard someone say hygiene needs to be an ironclad guarantee.
Honestly I agree with the inclusion of escape hatches if they are unlikely to be hit accidentally; I’m just surprised that the Kernel developers also agree, since they took such a severe move as to disallow quasiquote altogether.
So I don’t think I’ve ever heard someone say hygiene needs to be an ironclad guarantee.

I don’t want to put words in people’s mouths, but I’m pretty sure this is the stance of most Racket devs.
Not true, because Scheme’s syntax-rules explicitly provides an escape hatch for literals, which can be used to violate hygiene in a deliberate manner. Racket implements syntax-rules.

On the other hand, you’re absolutely right that they don’t make it easy. I have no idea what to make of anaphoric macros like this one from the anaphoric package.

Racket doesn’t forbid string->symbol either, it just provides it with some type-safe scaffolding called syntax objects. We can definitely agree that makes it more difficult to use. But the ‘loophole’ does continue to exist.

I’m not aware of any macro in Common Lisp that cannot be implemented in Racket (modulo differences in the runtimes like Lisp-1 vs Lisp-2, property lists, etc.). It just gets arbitrarily gnarly.
Thanks for the clarification. I have attempted several times to understand Racket macros but never really succeeded because it’s just so much more complicated compared to the systems I’m familiar with.
Yeah, I’m totally with you. They make it so hard that macros are used a lot less in the Scheme world. If you’re looking to understand macros, I’d recommend a Lisp that’s not a Scheme. I cut my teeth on them using Arc Lisp, which was a great experience even though Arc is a pretty thin veneer over Racket.
Have you read Fear of Macros? Also there is Macros and Languages in Racket which takes a more exercise based approach.
At least twice.
Nowadays when I need a Racket macro I just show up in #racket and say “boy, this sure is easy to write using defmacro, too bad hygienic macros are so confusing” and someone will be like “they’re not confusing! all you have to do is $BLACK_MAGIC” and then boom; I have the macro I need.
Kernel does not avoid unhygienic macros, whereas Scheme R6RS syntax-case makes it more difficult to write unhygienic macros but still possible. It is possible to write unhygienic code with Kernel, such as defining define-macro without using, or needing, quasiquote et al.

Kernel basically outlaws quasiquote and unquote

Kernel does not outlaw the quasiquote and unquote semantics. There is $quote, and unquote is merely (eval symbol env), whereas quasiquote is just a reader trick inside Scheme (also see [0]).

and constructs all macros out of list, cons and so on.

Yes and no. Scheme macros, and even CL macros, are meant as a) a hook into the compiler to speed things up, e.g. compose or Clojure’s =>, or b) a way to change the prefix-based evaluation strategy in order to build so-called Domain Specific Languages such as records, e.g. SRFI-9.

Kernel eliminates the need to think “is this a macro or is this a procedure”; instead everything is an operative, and it is up to the interpreter or compiler to figure out what can be compiled ahead of time or not. This is slightly more general than “everything is a macro”, at least because an operative has access to the dynamic scope.
Based on your comment description, Wart is re-inventing Kernel or something like that (without formal description unlike John Shutt).
re apply for macros: read page 67 at https://ftp.cs.wpi.edu/pub/techreports/pdf/05-07.pdf
[0] https://github.com/cisco/ChezScheme/blob/main/s/syntax.ss#L7644
Page 67 of the Kernel Report says macros don’t need apply because they don’t evaluate their arguments. I think that’s wrong, because macros can evaluate their arguments when unquoted. Indeed, most macro args are evaluated eventually, using unquote, in the caller’s environment. Most of the value of macros lies in selectively turning off eval for just the odd arg. And macros are most of the use of fexprs, as far as I’ve been able to glean.
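In Clojure syntax, for illustration (my example, not from the thread): a macro can evaluate one copy of an argument via unquote, in the caller’s environment, while leaving another copy unevaluated.

(defmacro check [expr]
  `(let [v# ~expr]             ; this copy is evaluated at the call site
     (println '~expr "=>" v#)  ; this copy stays an unevaluated form
     v#))

(check (+ 1 2))
;; prints (+ 1 2) => 3 and returns 3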
Kernel eliminates the need to think “is this a macro or is this a procedure”

Yes, that’s the goal. But it doesn’t happen for apply. I kept running into situations where I had to think about whether the variable was a macro. Often, within the body of a higher-order function/macro, I just didn’t know. So the apply restriction spread through my codebase until I figured this out.

I spent some time trying to find a clean example where I use @ on macros in Wart. Unfortunately this capability is baked into Wart so deeply (and Wart is so slow, suffering from the combinatorial explosion of every fexpr-based Lisp) that it’s hard to explain. But Wart provides the capability to cleanly extend even fundamental operations like if and def and mac, and all these use the higher-order functions on macros deep inside their implementations.

For example, here’s a definition where I override the pre-existing with macro to add new behavior when it’s called with (with table ...): https://github.com/akkartik/wart/blob/main/054table.wart#L54

The backtick syntax it uses there is defined in https://github.com/akkartik/wart/blob/main/047generic.wart, which defines these advanced forms for defining functions and macros.

That file overrides this basic definition of mac: https://github.com/akkartik/wart/blob/main/040.wart#L30

Which is defined in terms of mac!: https://github.com/akkartik/wart/blob/main/040.wart#L1

When I remove apply for macros, this definition no longer runs, for reasons I can’t easily describe.

As a simpler example that doesn’t use apply for macros, here’s where I extend the primitive two-branch if to support multiple branches: https://github.com/akkartik/wart/blob/main/045check.wart#L1

Based on your comment description, Wart is re-inventing Kernel or something like that (without formal description unlike John Shutt).

I would like to think I reimplemented the core idea of Kernel ($vau) while decoupling it from considerations of hygiene. And fixed apply in the process. Because my solution to apply can’t work in hygienic Kernel.

I’m not making any claim of novelty here. I was very much inspired by the Kernel dissertation. But I found the rest of its language spec... warty :D
Promoting solely unhygienic macros is similar, as far as I understand, to promoting “formal proofs of code are useless”, or something similar about ACID or any other kind of guarantee a piece of software might provide.

Both Scheme and Kernel offer the ability to bypass the default hygienic behavior, and hence promote, first, a path of least surprise (and of fewer hard-to-find bugs), while still allowing the second path (i.e. probably shooting yourself in the foot at some point).
At least for me, the value of Lisp is in its late-bound nature during the prototyping phase, so usability is the top priority. Compromising usability with more complicated macro syntax (resulting in far fewer people defining macros, as happens in the Scheme world) for better properties for mature programs seems a poor trade-off. And yes, I don’t use formal methods while prototyping either.
Syntax rules are not much more complicated to use than define-macro, ref: https://www.gnu.org/software/guile/manual/html_node/Syntax-Rules.html
The only drawback of hygienic macros that I know about is that they are more difficult to implement than define-macro, but then again I don’t know everything about macros.
ref: https://gitlab.com/nieper/unsyntax/
We’ll have to agree to disagree about syntax-rules. Just elsewhere on this thread there’s someone describing their various attempts to unsuccessfully use macros in Scheme. I have had the same experience. It’s not just the syntax of syntax-rules: Scheme is pervasively designed (like Kernel) with hygiene in mind. It makes for a very rigid language, with things like the phase separation rules, that is the antithesis of the sort of “sketching” I like to use Lisp for.

This is definitely a funky but cool feature of Janet - functions are values that evaluate to themselves, so they just work when unquoted in Janet macros.
Most lisp-like languages work this way. Symbols evaluate to the value of a corresponding variable, and lists perform a function call; everything else evaluates to itself. Have you ever wondered why you must quote symbols if you would like to refer to them literally, but you do not have to quote numbers or strings?
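A quick Clojure illustration of those evaluation rules:

'foo      ;; symbols must be quoted to be treated as data  => foo
(+ 1 2)   ;; lists are calls                                => 3
42        ;; numbers evaluate to themselves                 => 42
"hi"      ;; so do strings                                  => "hi"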
Most lisps work this way at runtime but it’s unusual to see it work this way at compile time because you might have a macro which is compiled on one architecture but needs to run on another one; embedding the compile-time value is only safe if you do it in a completely portable way, and it’s often not possible to make that kind of portability guarantee unless you’re working with functions that only compile to bytecode.
you might have a macro which is compiled on one architecture but needs to run on another one

That is highly unusual; generally everything, including the compiler, runs in a single lisp image on a single platform. I will also note the same issue applies to any other type of object: if the compilation and evaluation environments are distinct, then you will also need to provide a mapping of packages, symbols, cons cells, …
However, the issue of cross-compilation does apply to bootstrapping. I suggest taking a look at the following papers:
SBCL: a Sanely-Bootstrappable Common Lisp
Bootstrapping Common Lisp using Common Lisp
I didn’t understand everything in the article, yet I learned a lot. Janet is the first language in which I am writing macros with great success, from simple web helpers to beasts generating whole modules, and I’m loving it.
This series is getting better and better. Thanks!
I understood nothing in the article and know nothing about LISP, but I can tell a lot of effort went into it anyway. Nice job.
Yes, this is a nice little trick. I used something similar in my loop implementation in scheme (which does not have packages): I did not want to expose subordinate functions to the outside world, so I dumped them in a scope which was only visible to the macro-expander, and placed references to them directly in the expanded code.
Someone on the orange site mentioned syntactic closures as a means to hygiene that is more principled w.r.t. this problem and now I’m curious if anyone knows about a lisp with this feature?
IIRC they’re in MIT Scheme
Also in Chibi.