meindnoch 6 days ago

  @on user
  > onAskAboutConvoLang() -> (
      if(??? (+ boolean /m last:3 task:Inspecting message)
          Did the user ask about Convo-Lang in their last message
      ???) then (
  
          @ragForMsg public/learn-convo
          ??? (+ respond /m task:Generating response about Convo-Lang)
              Answer the user's question using the following information about Convo-Lang
          ???
      )
  )
  
  > user
Who in their right mind would come up with such a "syntax"? An LLM?
convo-lang 5 days ago

Sometimes I feel like an LLM. It takes a little getting used to, but that is the same for any new language. And the Convo-Lang syntax highlighter helps too.

The triple question marks (???) are used to enclose natural language that is evaluated by the LLM, and the whole construct is considered an inline-prompt since it is evaluated inline within a function / tool call. I wanted there to be a very clear delineation between the deterministic code that is executed by the Convo-Lang interpreter and the natural language that is evaluated by the LLM. I also wanted there to be as little need for escape characters as possible.

The content in the parentheses following the triple question marks is the header of the inline-prompt and consists of modifiers that control the context and response format of the LLM.

Here is a breakdown of the header of the first inline-prompt: (+ boolean /m last:3 task:Inspecting message)

----

- modifier: +

- name: Continue conversation

- description: Includes all previous messages of the current conversation as context

----

- modifier: /m

- name: Moderator Tag

- description: Wraps the content of the prompt in a <moderator> XML tag and injects instructions into the system prompt describing how to handle moderator tags (see the sketch after this breakdown)

----

- modifier: last:{number}

- name: Select Last

- description: Discards all but the last {number} messages of the current conversation (here, the last three) when used with the (+) modifier

----

- modifier: task:{string}

- name: Task Description

- description: Used by UI components to display a message to the user describing what the LLM is doing.

----
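
Roughly speaking, and as a simplified sketch (the injected system instructions are left out), the /m modifier means the content of that first inline-prompt reaches the LLM wrapped like this:

  <moderator>
  Did the user ask about Convo-Lang in their last message
  </moderator>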

Here is a link to the Convo-Lang docs for inline-prompts - https://learn.convo-lang.ai/#inline-prompts
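
As a smaller self-contained sketch (hypothetical function name and prompt, not taken from the docs), an inline-prompt that only looks at the user's last message and responds without RAG could look something like this:

  @on user
  > onAskAboutPricing() -> (
      if(??? (+ boolean last:1 task:Inspecting message)
          Did the user ask about pricing in their last message
      ???) then (
          ??? (+ respond task:Generating pricing response)
              Answer the user's question using what you know about our pricing
          ???
      )
  )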

lnenad 6 days ago

I have to agree, it looks wild; even the simpler examples don't feel ergonomic.

ljm 6 days ago

… I think I’ll just stick with pydantic AI for now