Introducing SLOP: A Simple Language Open Protocol for AI Interaction
This post introduces the SLOP project, a community-driven effort to standardize AI API interactions. I'm Russell (fxhp), and I'm excited to share how this project came together and how you can get involved!
A Community Effort Kicked Off by Nathan
First, a huge shoutout to Nathan from agnt.gg for spearheading the SLOP project. Nathan organized the initial repository and rallied the community through his Discord server. His vision laid the groundwork for what SLOP has become today. You can find both the main repository and join the vibrant Discord community here: https://i-love-slop.com?ref=GXSKWN. If you're passionate about open AI standards, this is the place to connect and contribute!
What is unturf.?
Before diving into SLOP, let’s talk about unturf., a loose-knit community dedicated to fostering open collaboration. While the main page https://www.unturf.com describes a possible vision for the future, the real action happens in the community-driven subdomains and community spaces. We host free AI services like ai.unturf.com and support projects like SLOP with infrastructure and a spirit of openness. It’s a group of innovators, folks like me and you, working to make software as a service and AI accessible to everyone.
What is SLOP?
SLOP stands for Simple Language Open Protocol, a lightweight, RESTful pattern for interacting with AI services. Think of it as a universal handshake for AI APIs, making it easy to integrate models, tools, and resources across platforms. SLOP defines a set of well-known core endpoints so that organizations can interoperate.
Ideally, a SLOP server will be created in every programming language and placed into the examples directory of the main repo. The core endpoints are:
- /chat: Talk to AI models.
- /tools: Use predefined utilities (e.g., calculator, greeter).
- This is where you can call public or private tools or algorithms to work on public or private data and return a result. A tool can be written in ANY programming language and does not need to be written in the same language as your SLOP server. As long as your SLOP server knows how to dispatch the process and return a result, the tool can be used in agentic flows! Yes, this includes that gnarly ffmpeg bash script you still use sometimes.
- /memory: Store and retrieve key-value pairs with a full CRUD lifecycle (create, read, update, delete).
- /models: List the models (and eventually their types) available upstream of the SLOP server.
- RFC in development: this may move to /info/models (HTML) or /info/models.[html,xml,json,yaml,etc]. In my implementation, a Python routine dynamically determines the list of models the upstream servers can provide. People have asked us to also track each model's type, for example embeddings, image, video, or various hybrid enums. Right now it's a simple list returned as JSON.
- /resources: Manage structured data or knowledge bases.
- Supports PUT to create new records (but currently no delete). We are currently drafting an RFC for the ability to fetch a list of resources by prefix.
- /info: RFC in development
- /pay: Simulate transactions (for fun or future monetization).
It’s all about simplicity: HTTP requests with JSON payloads, no fuss, no proprietary lock-in. SLOP lets any AI service—from big providers to homebrew setups—speak the same language. The community is still refining the spec, but it’s already proving its worth.
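To make the /tools idea concrete: because a tool only needs to be something the server can dispatch and collect a result from, a SLOP server can run tools written in any language as subprocesses. Here is a minimal sketch in Python; the `TOOLS` registry and the shell-based greeter are purely illustrative, not part of the spec:

```python
import os
import subprocess

# Hypothetical tool registry: each tool is just a command line, so the
# tool itself can be written in any language (bash, C, whatever).
TOOLS = {
    # "greeter" is implemented as a shell one-liner for illustration.
    "greeter": ["sh", "-c", 'echo "Hello, $TOOL_ARG!"'],
}

def run_tool(name: str, arg: str) -> str:
    """Dispatch a registered tool as a subprocess and return its stdout."""
    cmd = TOOLS[name]
    out = subprocess.run(
        cmd,
        capture_output=True,
        text=True,
        env={**os.environ, "TOOL_ARG": arg},  # pass the argument via env
        check=True,
    )
    return out.stdout.strip()

print(run_tool("greeter", "SLOP"))  # prints: Hello, SLOP!
```

A real /tools endpoint would wrap `run_tool` in an HTTP handler and return the result as JSON, but the dispatch idea is the same.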
My First Python Implementation
I’ve donated the first full Python implementation of SLOP to the community, written in minimal Flask. This server, running at https://slop.unturf.com/openapi, is in alpha but fully functional. It supports all the SLOP endpoints and connects to 14 open-source text models, thanks to unturf’s infrastructure and partners like naptha.ai!
I plan to create minimal SLOP servers for all the major Python frameworks: Flask, Pyramid, FastAPI, etc.
You can fork the public domain code here: https://git.unturf.com/engineering/unturf/slop.unturf.com. At the moment, the server uses a file-based memory store (data/memory.json, data/resources.json) and is deployed via uWSGI for production readiness.
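The file-based store is simpler than it sounds: a JSON file acting as a key-value dictionary. The sketch below shows the general idea behind a data/memory.json-style store; the class and method names are illustrative, and the real server's code may differ:

```python
import json
from pathlib import Path

class MemoryStore:
    """Minimal sketch of a file-backed /memory store (illustrative only)."""

    def __init__(self, path: str):
        self.path = Path(path)
        if not self.path.exists():
            self.path.write_text("{}")  # start with an empty JSON object

    def _load(self) -> dict:
        return json.loads(self.path.read_text())

    def _save(self, data: dict) -> None:
        self.path.write_text(json.dumps(data, indent=2))

    # CRUD lifecycle: create/update, read, delete.
    def set(self, key: str, value) -> None:
        data = self._load()
        data[key] = value
        self._save(data)

    def get(self, key: str):
        return self._load().get(key)

    def delete(self, key: str) -> None:
        data = self._load()
        data.pop(key, None)
        self._save(data)
```

Rereading the file on every call keeps the sketch obviously correct; a production server would add locking or caching.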
Here’s a quick example of chatting with it using curl:
curl -X POST \
-H "Content-Type: application/json" \
-d '{"messages": [{"role": "user", "content": "Hello, SLOP!"}], "model": "adamo1139/Hermes-3-Llama-3.1-8B-FP8-Dynamic"}' \
https://slop.unturf.com/chat
The response will come back with an AI-generated message—try it out!
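The same request can be built with nothing but the Python standard library. A quick sketch; the `build_chat_request` helper is my own illustrative wrapper, not part of the SLOP spec:

```python
import json
import urllib.request

def build_chat_request(content: str, model: str) -> urllib.request.Request:
    """Build a SLOP /chat POST request with a JSON payload."""
    payload = {
        "messages": [{"role": "user", "content": content}],
        "model": model,
    }
    return urllib.request.Request(
        "https://slop.unturf.com/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(
    "Hello, SLOP!", "adamo1139/Hermes-3-Llama-3.1-8B-FP8-Dynamic"
)
# To actually send it (requires network access):
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read()))
```

Any HTTP client in any language works the same way; that is the whole point of the protocol.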
Note: as of this writing, the naptha model inference server node2 is having issues, so the following command returns a CloudFlare error. We know about it, and it was reported upstream to help find the root cause.
# currently an error, reported & trying to diagnose.
curl -X POST \
-H "Content-Type: application/json" \
-d '{"messages": [{"role": "user", "content": "Hello, SLOP!"}], "model": "NousResearch/Hermes-3-Llama-3.1-8B"}' \
https://slop.unturf.com/chat
Optional Streamlit Frontend
To make SLOP even more accessible, I’ve also provided an optional Streamlit frontend, live at https://streamlit.slop.unturf.com (also in alpha). This web interface lets you interact with the SLOP server without writing code. You can:
- Chat with models in a conversational UI.
- Use tools like the calculator or greeter.
- Manage memory entries.
- Explore and update resources (including the new prefix search and PUT features!).
The frontend is open-source too—find it in the main repo—and it’s a great way to demo SLOP to non-technical users. It’s built to handle long-running chat requests (up to 300 seconds), so even big completions work smoothly.
Get Involved!
SLOP is a community project, and we need your help to grow it. Here’s how you can dive in:
Fork my SLOP
If you like Python, request an account on our GitLab server or fork this repo: https://git.unturf.com/engineering/unturf/slop.unturf.com. Start experimenting, add new tools, or adapt it to your needs.
Test the Server:
Use the Swagger docs, cURL, or an HTTP client in your programming language of choice to interact directly with my SLOP server at https://slop.unturf.com/openapi. Report issues or suggest features to include.
Join the Community:
Connect with Nathan and the crew at https://i-love-slop.com?ref=GXSKWN. The Discord is buzzing with ideas, so jump in from time to time to help shape SLOP’s future.
Try the Optional Streamlit Frontend:
Play with https://streamlit.slop.unturf.com and let us know how it can improve.
If you are a web dev or designer feel free to come up with new applications or frontends that use our standard community powered endpoints!
SLOP is about democratizing AI interaction. Whether you’re a developer, a tinkerer, or just curious, there’s a place for you here. Let’s build something simple, open, and powerful together!
What's Next?
Stateless HTTP for the win. ; )
Our biggest open-source competitor is MCP. We aim to be more open, transparent, organic, and grassroots, built from the bottom up rather than dropped from the ivory tower. If this vibes with you, build a SLOP integration whenever you reach for an MCP integration, so we can grow both communities and learn from each other.