kscarlet 2 days ago

It's easy to just stick them together, but to me (who writes too much Lisp for my own health) this is unsatisfactory.

The dream: just as macros can be seen as a (staged) extension mechanism for the Lisp evaluator, there should be an extension mechanism for the static type system, which lets me define new types, define new syntax (like Haskell do-notation) that makes use of the typing environment and the expected type of the current context (return-type polymorphism), and so on.

The reality: very few environments figure this out. In Coalton, Lisp macros do work, but only at the level of untyped S-expressions. A Lisp macro can't know the types of the variables in the lexical environment, or the expected type of its own context. But it quite possibly works fine for the "typescript-like" use case you described.

The problem I see: the H-M type system isn't designed with extensibility in mind, and it's hopeless to make it extensible. A more technical explanation of why it's hard to integrate with Lisp macros is that H-M relies on a unification-based inference stage whose execution flow is very different from macro expansion.
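
For concreteness, here is a toy unifier (a minimal sketch, nothing like Coalton's actual implementation; it omits the occurs check and substitution application). Inference solves equations like these over the already-expanded program as a whole, whereas macro expansion walks the syntax outside-in before any such constraints exist:

    ;; Type variables are symbols whose names start with "?".
    (defun type-var-p (term)
      (and (symbolp term)
           (plusp (length (symbol-name term)))
           (char= (char (symbol-name term) 0) #\?)))

    (defun unify (a b subst)
      "Extend SUBST so that A and B become equal, or return :FAIL."
      (cond ((eq subst :fail) :fail)
            ((equal a b) subst)
            ((type-var-p a) (bind-var a b subst))
            ((type-var-p b) (bind-var b a subst))
            ((and (consp a) (consp b))
             (unify (cdr a) (cdr b) (unify (car a) (car b) subst)))
            (t :fail)))

    (defun bind-var (var term subst)
      (let ((existing (assoc var subst)))
        (if existing
            (unify (cdr existing) term subst)
            (acons var term subst))))

    ;; (unify '(-> ?a Integer) '(-> String ?b) '())
    ;; => ((?b . Integer) (?a . String))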

Possible solution: there's no fundamental reason why a static type system can't have something as powerful as Lisp macros. But first of all you would need an extensible type system, which still seems to be an open research problem. I think bidirectional typing is promising -- though it's so different from H-M at a fundamental level that I think it's hopeless to retrofit into Coalton.
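
To illustrate what "bidirectional" buys you, here is a toy sketch (made-up type names, nothing to do with how Coalton works): inference mode synthesizes a type, checking mode pushes an expected type inward, so the expected type of the current context becomes an explicit value that syntax extensions could in principle inspect:

    (defun infer (term env)
      "Synthesize a type for TERM."
      (cond ((integerp term) 'Integer)
            ((symbolp term) (or (cdr (assoc term env))
                                (error "unbound variable ~a" term)))
            (t (error "cannot infer a type for ~a" term))))

    (defun check (term expected env)
      "Check TERM against an EXPECTED type flowing in from the context."
      (if (and (consp term) (eq (car term) 'lambda))
          ;; The expected arrow type flows inward: exactly the
          ;; "expected type of the current context" a macro could use.
          (destructuring-bind (arrow arg result) expected
            (declare (ignore arrow))
            (check (third term) result
                   (acons (first (second term)) arg env))
            expected)
          (let ((actual (infer term env)))
            (if (equal actual expected)
                expected
                (error "expected ~a but got ~a" expected actual)))))

    ;; (check '(lambda (x) x) '(-> Integer Integer) '())
    ;; => (-> Integer Integer)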

Xmd5a 2 days ago | parent | next [-]

There are several ways to look at Lisp macros, but I think the most fundamental aspect is that they are hooks that allow one to run code at compile time in a dynamic compilation environment. In other words, with Lisp macros comes compile-time state. This is a big deal. Imagine the following piece of code:

    (serveMQTT ...)
    (serveMQTT ...)
    (serveMQTT ...)
    (serveMQTTDoc ...)
With serveMQTT as a macro, we could store each MQTT topic, its accepted args, etc. into a compile-time data structure. serveMQTTDoc, also a macro, would fetch the contents of this data structure, generate the documentation string for all MQTT topics at compile time, and return this string at run time when queried.
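
A rough sketch of that idea in Common Lisp (register-mqtt-handler and the other names are hypothetical): the two macros share a table that exists at expansion time:

    (eval-when (:compile-toplevel :load-toplevel :execute)
      (defvar *mqtt-topics* '()
        "Compile-time registry of (topic . args) entries."))

    (defmacro serve-mqtt (topic args &body handler)
      ;; Record the topic at expansion time, then expand into the usual
      ;; run-time registration call (REGISTER-MQTT-HANDLER is made up).
      (pushnew (cons topic args) *mqtt-topics* :test #'equal)
      `(register-mqtt-handler ,topic (lambda ,args ,@handler)))

    (defmacro serve-mqtt-doc ()
      ;; The documentation string is computed once, at expansion time;
      ;; the expansion is just that string, returned at run time.
      (format nil "~{~a~^~%~}"
              (mapcar (lambda (entry)
                        (format nil "~a ~a" (car entry) (cdr entry)))
                      (reverse *mqtt-topics*))))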

Compile-time state and a language that is also its own macro-language. That'd be a nice thing to have.

kazinator a day ago | parent [-]

So you might think, but macro-expansion as it is implemented in mainstream Lisp dialects is separate from compilation and isn't informed by compilation.

A form to be compiled is first subject to expansion, and then the expanded result, now devoid of macros, is compiled.

Implementations that have multiple modes of processing, such as both a compiler and interpreter, share the macro expander between them.

There is redundancy between macro expansion and compilation. The macro expander has to understand and properly traverse special forms, and build up an environment structure that follows lexical scopes. Then the compiler has to do the same. Of course, the compiler won't have macro content in its environment stacks (lexical macros); those are gone.
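
You can see the separation directly (my-when, ready, and launch are made-up names): the expander turns the macro call into plain special forms, and that expanded code is all the compiler ever sees:

    (defmacro my-when (test &body body)
      `(if ,test (progn ,@body) nil))

    (macroexpand-1 '(my-when ready (launch)))
    ;; => (IF READY (PROGN (LAUNCH)) NIL)
    ;; The compiler receives the IF form; the macro definition (and any
    ;; lexical MACROLET bindings) are already gone by that point.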

JonChesterfield 2 days ago | parent | prev [-]

What's the objection to putting type annotations on the macro, and refusing to compile code where the arguments to the macro don't match the type?

kscarlet 2 days ago | parent | next [-]

There's no objection, but I think this solves a different problem than the one described above, and it's possibly a more complicated matter in itself than it may look. First, are you describing a way to typecheck macros so that they can't generate ill-typed code? The type systems that do this are active research (I think it requires modal typing?). Also, this does not increase the expressive power of untyped Lisp macros -- it just makes them safer, whereas I was proposing that their expressive power is unsatisfactory and should be enhanced via a sort of reflection. That would only make them even harder to type!

reikonomusha 2 days ago | parent | prev [-]

Since a macro is a definition of syntax, I think you'd essentially need something like typing judgments to show how the typed elements of the syntax relate to one another, so that the type checker (e.g., a typical Hindley-Milner unifier) can use those rules. These are usually written as the fraction-looking things that show up in PLT papers. This is, as GP says, essentially extending the type system, which is a task fraught with peril (people write entire papers about type system extensions and their soundness, confluence, etc.).
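
For instance, a rule for a hypothetical typed "if" syntax might be written like this (the LaTeX that renders as one of those fractions):

    \frac{\Gamma \vdash c : \mathrm{Bool} \qquad
          \Gamma \vdash e_1 : \tau \qquad
          \Gamma \vdash e_2 : \tau}
         {\Gamma \vdash (\mathtt{if}\ c\ e_1\ e_2) : \tau}

Read: if the condition has type Bool and both branches share a type tau in environment Gamma, the whole form has type tau. Each new piece of typed syntax a macro introduces would need rules of this shape for the checker to consume.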

JonChesterfield 2 days ago | parent [-]

Depends on your point of view really. I'd define a macro as a function with slightly weird evaluation rules.

If you want to write a macro where any call to it that typechecks produces code that itself typechecks, you have to deal with eval of S-exprs, which is roughly a recursive typecheck on partial information; that sounds tractable to me.