Neh-Thalggu: Language Oriented Programming for Vibe Coders

Recently I’ve been a) doing a lot of vibe-coding, b) struggling with failures of AI assistants, c) thinking about how this whole thing can be made to work better.

It’s clear to me that while AI assistance and asking for things in natural language is a wonderful experience (it certainly keeps me more engaged and in a “flow state” than just looking at code), it’s also unreliable and error-prone. And, frankly, quite a waste of resources.

The conclusion I’ve come to is that much of the time, the AI is helping us by writing unnecessary boilerplate. The stuff which is genuinely new, and genuinely specific to us, is something we still need to specify, and ideally would be able to specify in a concise, specialist, formal language. But the AI helps because it can fill in a lot of the generic infrastructure around our core requirements that we normally still have to write and continue to deal with.

The success of AI coding is really an indictment of the current state of programming language design: we still haven’t managed to squeeze enough of the redundant and unnecessary complexity out of our languages.

So I went back to thinking about Domain Specific Languages, and particularly the Racket language’s notion of “language oriented programming”. LOP, or building our systems out of many specialised little languages, is attractive in principle, but usually hard to slot into our tool stacks and workflows.

Then it occurred to me that the rise of AI assistance, of vibe-coding tools like Cursor, and particularly of the Model Context Protocol (MCP), is the perfect opportunity to make that easier.

MCP was intended to let AI agents query databases, interact with the browser, and so on. But you could also put your DSL compiler behind that interface and have an AI agent call it whenever it needs to expand a snippet of DSL into a larger piece of code.

And so I present Neh-Thalggu: a DSL-via-MCP server written in Clojure.

Clojure is, of course, a perfect language for writing DSL parsers and doing code generation.

Neh-Thalggu is a small server you run on your local machine, waiting to be contacted either by a human through its web interface or by an AI agent over MCP.

DSLs are implemented as Clojure scripts which are loaded dynamically. You don’t have to recompile the server to add a new one. Just write a couple of files in an appropriate format and drop them into the “plugins” directory.
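
To give a flavour of the shape of a plugin, here is a simplified, hypothetical sketch. It is not the exact format Neh-Thalggu uses; the `compile-snippet` name and the toy greeter syntax are made up for illustration. The essence is a description (which the AI agent can read) plus a pure function from a DSL snippet string to generated code:

```clojure
;; Hypothetical sketch of a DSL plugin -- the real Neh-Thalggu plugin
;; format may differ. A plugin is basically a description plus a pure
;; function from a DSL snippet string to generated code.

(ns plugins.greeter)

(def description
  "Expands a one-line greeter spec, e.g. `greet Alice in French`,
   into a Clojure function that prints the greeting.")

(defn compile-snippet
  "Turn a DSL snippet into target-language code (here, Clojure)."
  [snippet]
  (let [[_ who lang] (re-matches #"greet (\w+) in (\w+)" snippet)
        hello        ({"French" "Bonjour" "Spanish" "Hola"} lang "Hello")]
    (str "(defn greet-" who " []\n"
         "  (println \"" hello ", " who "!\"))")))
```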

DSLs then present via MCP as “Tools”, so that the AI assistant can read their descriptions and see how to use them. A human coder, working in, say, Cursor and chatting with Claude, can drop a snippet written in the DSL into the chat window and ask for it to be expanded. The AI delegates this to the server, where it gets expanded by a deterministic process running locally (more reliable and less energy-demanding than having the LLM do it itself). The AI agent can then slot the generated code back into the codebase. Your DSL can also provide a second, rough linting function called “eyeball”, which checks that the AI’s final integration of the expanded code didn’t screw anything up.
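
The “eyeball” check can be similarly small. Here is a hypothetical sketch of one to go with the toy greeter plugin above: a few cheap, deterministic checks over the code the AI ended up with, returning warnings rather than trying to be a full linter:

```clojure
;; Hypothetical sketch of an "eyeball" check for the toy greeter plugin:
;; a rough, deterministic sanity pass over the code the AI produced after
;; integrating the expanded snippet. Returns a vector of warning strings.
(defn eyeball
  [integrated-code]
  (cond-> []
    (not (re-find #"\(defn greet-" integrated-code))
    (conj "Expected a generated greet-* function but didn't find one.")

    (not= (count (filter #{\(} integrated-code))
          (count (filter #{\)} integrated-code)))
    (conj "Unbalanced parentheses -- the integration may have mangled the code.")))
```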

Although this server and method are likely to be of most interest to Clojurians, you can create DSLs that compile to any language, and the tool could be used in any development environment or project that would benefit from DSLs to reduce workload and increase reliability. I have a tiny layout language, for example, that just expands into Jinja2 templates for use in a Python web project. I’m working on some others that will expand into larger collections of objects in some music software I’m writing in Haxe.
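
The heart of that layout plugin is not much more than a string transformation. The sketch below is simplified and uses a made-up mini-syntax rather than the actual language, but it gives the idea of how a few lines of Clojure can expand a compact spec into a Jinja2 block grid:

```clojure
(ns plugins.layout
  (:require [clojure.string :as str]))

;; Simplified sketch with a made-up syntax: a spec like
;;   "page: header | content sidebar | footer"
;; expands into a Jinja2 block grid, one row per |-separated group.
(defn compile-snippet [spec]
  (let [[_ page body] (re-matches #"(\w+):\s*(.*)" spec)
        rows          (map str/trim (str/split body #"\|"))]
    (str "<div class=\"" page "\">\n"
         (apply str
                (for [row rows]
                  (str "  <div class=\"row\">\n"
                       (apply str
                              (for [cell (str/split row #"\s+")]
                                (str "    <div class=\"col\">{% block " cell
                                     " %}{% endblock %}</div>\n")))
                       "  </div>\n")))
         "</div>\n")))
```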

In fact, apart from the convention that the main tool function is called “compile”, you can drop any string-to-string transformation you can write in Clojure into Neh-Thalggu and have it immediately available to any AI agents in the vicinity.
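
For example, a “plugin” that isn’t really a language at all, just a handy transformation (a hypothetical example, not one of the bundled plugins):

```clojure
;; Hypothetical example: not a DSL, just a string-to-string utility
;; exposed the same way. It turns a CSV header line like
;; "First Name,Email" into a vector of Clojure keywords.
(ns plugins.csv-headers
  (:require [clojure.string :as str]))

(defn compile-snippet [header-line]
  (str "["
       (str/join " "
                 (for [col (str/split header-line #",")]
                   (str ":" (-> col
                                str/trim
                                str/lower-case
                                (str/replace #"\s+" "-")))))
       "]"))

;; (compile-snippet "First Name,Email") => "[:first-name :email]"
```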

Obviously these are early days. The world is just starting to learn about AI-assisted coding, and I’m just starting to experiment with what I can do with Neh-Thalggu. But I think it’s clear that the future is going to be about finding the most practical ways to integrate and combine genAI with formal specifications and constraints, and with good tools to complement “vibe-coding” practice. Now that MCP is a standard, it’s easy to integrate them. And Clojure is a great language for writing all these small tools and DSLs that will act as scaffolding to help genAI build correctly.
