I created minimal apps for each framework, all returning {"Hello": "World"} as JSON. They ran on Uvicorn with a single worker for baseline performance. Tests hit http://127.0.0.1:8000 with wrk (15s, 4 threads, 64 connections).
**Blacksheep**

```python
from blacksheep import Application, get, json

app = Application()

@get("/")
async def home():
    return json({"Hello": "World"})
```

**Starlette**

```python
from starlette.applications import Starlette
from starlette.responses import JSONResponse
from starlette.routing import Route

async def homepage(request):
    return JSONResponse({'Hello': 'World'})

app = Starlette(routes=[Route('/', homepage)])
```

**MicroPie**

```python
from micropie import App

class Root(App):
    async def index(self):
        return {"Hello": "World"}

app = Root()
```

**Quart**

```python
from quart import Quart

app = Quart(__name__)

@app.route('/')
async def greet():
    return {"Hello": "World"}
```

**FastAPI**

```python
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
async def read_root():
    return {"Hello": "World"}
```

I ran multiple tests and kept the best run (highest Requests/sec) for each framework. The table shows the results, ordered fastest to slowest by Requests/sec:
| Framework | Requests/sec | Avg Latency (ms) | Max Latency (ms) | Transfer/sec (MB) | Total Requests | Data Read (MB) |
|---|---|---|---|---|---|---|
| Blacksheep | 23041.66 | 2.82 | 88.84 | 3.12 | 345832 | 46.83 |
| Starlette | 21615.41 | 3.00 | 90.34 | 2.93 | 324374 | 43.93 |
| MicroPie | 18519.02 | 3.53 | 105.00 | 2.84 | 277960 | 42.68 |
| FastAPI | 8899.40 | 7.22 | 56.09 | 1.21 | 133542 | 18.08 |
| Quart | 8601.40 | 7.52 | 117.99 | 1.17 | 129089 | 17.60 |
MicroPie (18,519.02 req/s, 3.53 ms latency) trails Blacksheep and Starlette but outperforms Quart and FastAPI, two frameworks that also prioritize clean code. By design, MicroPie serializes dictionaries to JSON at runtime (using orjson if available) and sets headers like `Content-Type` automatically. This keeps handler code simple: you return a plain dictionary (`{"Hello": "World"}`) instead of a response object like Blacksheep's `json()` or Starlette's `JSONResponse`. The cost is runtime type checks and serialization on every request, and supporting multiple response types (strings, bytes, iterables) adds further latency.
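To make that trade-off concrete, here is a minimal sketch of this kind of runtime dispatch. The names (`render`, `_dumps`) are illustrative, not MicroPie's actual internals; the point is the per-request `isinstance` checks and serialization:

```python
import json

try:
    import orjson  # optional fast JSON backend

    def _dumps(obj):
        return orjson.dumps(obj)  # orjson returns bytes directly
except ImportError:
    def _dumps(obj):
        return json.dumps(obj).encode()

def render(result):
    """Dispatch on the handler's return type at runtime:
    dicts become JSON, strings and bytes pass through."""
    if isinstance(result, dict):
        return _dumps(result), "application/json"
    if isinstance(result, str):
        return result.encode(), "text/plain; charset=utf-8"
    if isinstance(result, (bytes, bytearray)):
        return bytes(result), "application/octet-stream"
    raise TypeError(f"unsupported response type: {type(result)!r}")

body, content_type = render({"Hello": "World"})
```

Every request pays for the type checks and the serialization call, which is exactly the overhead the benchmark numbers reflect.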
In contrast:

- Blacksheep's `json()` function optimizes JSON serialization and headers.
- Starlette's `JSONResponse` class streamlines serialization.
- FastAPI's declarative syntax and OpenAPI features add overhead, even in this minimal example.
- Quart's Flask-inspired API and JSON serialization slow it down.
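The per-request serialization cost discussed above is easy to see in isolation. Here is a stdlib-only sketch (timings are machine-dependent; only the relative gap matters) comparing serializing the body on every call against bytes serialized once up front:

```python
import json
import timeit

payload = {"Hello": "World"}

# Cost of serializing the body on every request...
per_request = timeit.timeit(lambda: json.dumps(payload).encode(),
                            number=100_000)

# ...versus shipping bytes that were serialized once up front.
cached = json.dumps(payload).encode()
precomputed = timeit.timeit(lambda: cached, number=100_000)

print(f"serialize each time: {per_request:.4f}s per 100k calls")
print(f"precomputed bytes:   {precomputed:.4f}s per 100k calls")
```

Even for a tiny payload, repeated serialization is measurably slower than returning cached bytes, which is one reason frameworks with optimized response paths pull ahead at high request rates.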
MicroPie's leaner design, with optional orjson and minimal middleware, delivers more than double the throughput of FastAPI and Quart at roughly half the latency (3.53 ms vs. 7.22–7.52 ms), striking a better balance between simplicity and performance.
MicroPie suits small to medium apps where clean code matters. FastAPI excels for large APIs with typing and documentation. Blacksheep and Starlette are best when speed is critical, despite more boilerplate.
- Blacksheep leads (23,041.66 req/s, 2.82 ms), the best choice for high-throughput workloads.
- Starlette follows (21,615.41 req/s, 3.00 ms), balancing speed and flexibility.
- MicroPie (18,519.02 req/s) beats Quart and FastAPI, offering clean code with strong performance, roughly 14–20% slower than Starlette and Blacksheep.
- FastAPI and Quart (~8,600–8,900 req/s) lag due to feature overhead.
- Test Conditions: a single Uvicorn worker shows baseline performance; adding workers may improve absolute numbers, but the rankings should hold.