It's been a while since I last published anything on Habr, about ten years, and today is the day to share my small open-source project.
The project, called Gaunt Sloth Assistant, is a CLI AI client built with TypeScript (LangChain.js), distributed via npm, and works on Linux, Windows, and Mac. You are in full control of the prompts: writing your own system prompt is encouraged, though a default one is provided as well.
GitHub: https://github.com/andruhon/gaunt-sloth-assistant
NPM: https://www.npmjs.com/package/gaunt-sloth-assistant
Gaunt Sloth currently ships with dependencies that let it use a simple JSON configuration for VertexAI, Anthropic, Groq, DeepSeek, and OpenAI (and consequently any other provider speaking the OpenAI format, e.g., Inception). Hypothetically, it should work with any model supported by LangChain; there is even a package available for Yandex, which I have never tried, but I think it should work if you install the package and provide a JS config. Ollama? It might work; I have never tried it, but I would appreciate it if someone shared their experience.
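For a feel of what the JSON configuration looks like, here is a hypothetical sketch of a config file in the project root (the file name and field names are illustrative and based on my reading of the README, so check the project documentation for the current schema and exact model identifiers):

```json
{
  "llm": {
    "type": "groq",
    "model": "llama-3.3-70b-versatile"
  }
}
```

The idea is that the provider and model are the only required choices to get started; everything else falls back to defaults.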
The Gaunt Sloth is a versatile tool with a range of helpful capabilities:

- Reviews pull requests (e.g., 42) and matches them against requirements from a Jira or GitHub issue (e.g., 12).
- Reviews local diffs.
- Provides an interactive chat session.
- Reads and writes code on the filesystem.
Of course, it has MCP and OAuth support, so you can connect to a remote MCP server such as Jira and create and edit stories like a boss.
It also has a tiny feature that can log time against a Jira issue when it finishes reviewing a PR. This is not documented yet, but you can find the config example in the release notes or ask me in the comments (as far as I know, Jira MCP can't do that).
Apart from that, you can supply simple local AI tools in LangChainJS tool format, as simple as this:
```typescript
import { tool } from "@langchain/core/tools";
import { z } from "zod";

const multiply = tool(
  ({ a, b }: { a: number; b: number }): number => a * b,
  {
    name: "multiply",
    description: "Multiply two numbers",
    schema: z.object({ a: z.number(), b: z.number() }),
  }
);
```
It is heavily config- and guidelines-driven. I keep a separate config in each project, setting it up for me and providing the necessary guidelines, so the AI does not screw up from lack of information.
Also, I have a number of non-coding projects: a separate one for Jira with detailed instructions on how to work with it, and another for writing.
Why, when there is X and Y?
Some months ago, I was looking for a CLI assistant based on LangChain.js/LangGraph.js and didn't find many. Curiosity was one factor among others.
The initial intent was to build a tool I could pipe a diff into, sending that diff along with guidelines to the AI; but over time it evolved, new features were built, and potentially it can be used as a coding agent.
E.g., you type `gth code`, tell it "implement requirements.md", and it reads the requirements file and goes coding.
Gemini CLI, Claude Code? They were not officially released when I started, and I didn't know they were in development.
Aider, Goose? Sure, they are good and probably better options, but it would be harder to contribute your own features to them.
So what?
There are more features I would like to build than I have time for.
Contributors are welcome.
Trying it out and creating an issue, or sharing feedback in Discussions is also a contribution; a PR would be even better.