Side Project Unwrapped: Anvilope!
May 30, 2025
#project #anvilope #rustlang #llm #ai
I’m certainly past due for blogging about this side project I’ve been working on. I keep vague-posting about it on Mastodon, but people probably have no clue wtf I’m talking about. So let’s remedy that!
Enter Anvilope
Anvilope is a simple email daemon. It sits around listening on an IMAP connection and categorizes incoming messages using a self-hosted LLM. Using an LLM allows categories to be defined in natural language, and it will hopefully be complementary to, and not a replacement for, existing tools like traditional spam filters and regex rules.
The project lives on SourceHut here:
https://sr.ht/~dvshkn/anvilope
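The listen → classify → store cycle described above can be sketched roughly like this. To be clear, every name and body here is a stand-in, not Anvilope's actual API: the real daemon blocks on IMAP IDLE, renders a Jinja prompt, calls Ollama, and records the result in SQLite.

```rust
// Illustrative stand-ins for the real IMAP/LLM/SQLite pieces.
fn wait_for_new_mail() -> Vec<String> {
    // Stand-in for blocking on IMAP IDLE until the server reports new mail.
    vec!["Subject: 20% off adventure bikes!".to_string()]
}

fn classify(message: &str) -> String {
    // Stand-in for rendering a prompt and asking the LLM for a category.
    if message.to_lowercase().contains("off") {
        "marketing".to_string()
    } else {
        "other".to_string()
    }
}

fn store_class(message: &str, class: &str) {
    // Stand-in for updating the message's class in SQLite.
    println!("updating class of {:?} to {:?}", message, class);
}

fn main() {
    // One pass of the cycle; the real daemon loops on this forever.
    for msg in wait_for_new_mail() {
        let class = classify(&msg);
        store_class(&msg, &class);
    }
}
```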
Basic Info
- Rust, btw
- SQLite for storage
- Ollama for inference and model distribution
- Jinja templates for prompts
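Since prompts are Jinja templates, a classification prompt might look something like the fragment below. This is purely illustrative, not Anvilope's real template; the only grounded detail is the requested JSON shape, which matches the `class`/`reasoning` fields visible in the log output later in this post.

```jinja
You are an email classifier. Assign the message below to exactly one of
these categories, each defined in plain language:
{% for cat in categories %}
- {{ cat.name }}: {{ cat.description }}
{% endfor %}

Subject: {{ subject }}
From: {{ sender }}
Body:
{{ body }}

Reply with JSON only: {"class": "<category>", "reasoning": "<one sentence>"}
```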
My motivations for this project are two-fold. First, like many children of the net I have some old inboxes that are overgrown, and it would be great to keep them more tidy. Second, I’ve been looking for more ways to improve at Rust, and writing a daemon sounds like a classic rite of passage.
In terms of functionality, Anvilope has become pretty good at passively listening for email, but it does not write any changes back to accounts yet. Below is a list of TODOs I would like to finish before declaring a 0.1 version.
Chores / Not Implemented
- Copy or move messages to specific folders after categorizing
- Make the database leaner
- Logging cleanup
- Prompt injection testing
- Basic security improvements
Finally, I’m a big believer in posting screenshots of things one has built, which is awkward in this case because Anvilope really only produces log output in its current state. That’s still better than nothing, though, so here is some not-entirely-staged output of Anvilope satisfactorily identifying a marketing email:
...
[2025-05-28T23:05:20Z DEBUG reqwest::connect] starting new connection: http://127.0.0.1:11434/
[2025-05-28T23:17:48Z DEBUG anvilope::llm] OllamaGenerateResponse {
response: "{\"class\": \"marketing\", \"reasoning\": \"The email message is advertising a product (Honda's adventure-bike lineup) that...
created_at: "2025-05-28T23:17:48.44245113Z",
prompt_eval_count: 4177,
eval_count: 67,
}
[2025-05-28T23:17:48Z INFO anvilope] message classified event
[2025-05-28T23:17:48Z INFO anvilope] updating class of message_id=210 to "marketing"
[2025-05-28T23:17:48Z DEBUG anvilope::llm] LLM job processing finished
[2025-05-28T23:25:20Z INFO anvilope] IDLE has timed out! restarting...
[2025-05-28T23:25:20Z INFO anvilope::mail::session] entering idle state
...
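The `response` field in that log entry is a JSON string the daemon has to dig the category out of. As a minimal sketch of that step, here is a std-only version; `extract_class` is a hypothetical name, and the real code presumably uses a proper JSON parser rather than string searching.

```rust
// Naive sketch: pull the "class" value out of the model's JSON reply.
// Illustrative only; a real implementation should parse the JSON properly.
fn extract_class(response: &str) -> Option<String> {
    let key = "\"class\":";
    let start = response.find(key)? + key.len();
    // Skip whitespace and the opening quote, then read up to the closing quote.
    let rest = response[start..].trim_start().strip_prefix('"')?;
    let end = rest.find('"')?;
    Some(rest[..end].to_string())
}

fn main() {
    let reply = r#"{"class": "marketing", "reasoning": "The email is advertising a product."}"#;
    // → Some("marketing")
    println!("classified as: {:?}", extract_class(reply));
}
```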
FHQ: Frequently Hallucinated Questions
These are questions that I have either already asked myself or that I imagine people would ask if the project were further along. It’s basically me talking to myself :-)
Does Anvilope require expensive hardware?
Today the answer is no, and I hope it mostly stays that way. I’m intentionally developing and testing with modest specs because this sort of text classification doesn’t seem like it should require a behemoth LLM with expensive hardware. My testbed where I run Anvilope against my own emails is just a spare mini PC I had lying around.
Testbed Specs
- Arch Linux, btw
- Mistral 7B v0.3
- AMD Ryzen 5 3550H
- 16 GB RAM
The testbed is only using CPU inference, and the memory bandwidth is almost definitely not great. Moderately long context windows bring this setup to a crawl, which is admittedly pretty annoying. We’ll see if I can manage anything clever with sliding windows.
Which model works best with Anvilope?
Truthfully, I don’t know! I have been happy with Mistral 7B so far, considering that my prompts are probably more suspect than the model itself. However, it is entirely possible that a smaller model would still perform adequately. To answer this question rigorously I would need to create an email categorization benchmark, which is a lot of work.
Can I use this with XYZ company’s inference API?
No, I’m intentionally not including support for cloud inference APIs (OpenAI, Anthropic, etc). Keeping all of Anvilope’s functionality self-hostable is an important virtue of this project due to the sensitive nature of email. It’s possible that in the future more backends in addition to Ollama might be supported, but they need to be self-hostable.
Why not use XYZ mail app instead?
Quite simply, I don’t want to use their app. I know that Apple and Notion have rolled out similar functionality, but I don’t want to corral my different email accounts all into one app. Tackling categorization through IMAP means that it will work with virtually all apps.
Have you considered switching to Python from Rust to speed up development?
I ask myself this question roughly once per week. I probably could have hammered out something good enough for my needs in Python in a week or two, but a lot of stuff would have been hard-coded, and I’d inevitably be left digging through a pile of scripts when something broke. Instead I want to build myself something a bit nicer that feels well put together.
Have you considered vibe coding to speed up development?
One of my personal goals with this project is to learn async Rust, so having an LLM write all the code would be counterproductive. Also, as it stands today, prolonged vibe coding tends to produce hard-to-maintain code and raises the risk of security vulnerabilities, which I don’t want.
Are you ever going to update the README?
I’ll probably condense a lot of this post into the README soonish. Install from source instructions will come after I smooth out some of the initial setup.
Will Anvilope have a mascot?
Wow! What a very unexpected question that I definitely have not pondered ahead of time. I suppose if the project had to have a mascot it might be an antelope—some sort of Anvilope Antelope.