This is a simple benchmark for Python async frameworks. Almost all of the frameworks are ASGI-compatible (aiohttp and tornado are the exceptions at the moment).

The objective of the benchmark is not to test deployment options (e.g. uvicorn vs. hypercorn) or databases (ORMs, drivers), but to test the frameworks themselves. The benchmark checks request parsing (body, headers, form data, query strings), routing, and responses.
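For reference, "ASGI-compatible" means the framework can be served by any ASGI server. The sketch below is a minimal raw ASGI application (not part of the benchmark) that illustrates the interface these frameworks build on:

```python
# Minimal raw ASGI application, shown only to illustrate the interface
# that the ASGI-compatible frameworks in this benchmark build on.
async def app(scope, receive, send):
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"Hello, world!"})
```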

Read more about the benchmark: The Methodic


Accept a request and return an HTML response with a custom dynamic header

The test simulates just a single HTML response.
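As an illustration (not the benchmark's own code), a handler for this scenario might look like the following Starlette sketch; the route path and the `x-time` header name are arbitrary choices here:

```python
import time

from starlette.applications import Starlette
from starlette.responses import HTMLResponse
from starlette.routing import Route


async def html(request):
    # Return a small HTML body plus a header value computed per request.
    return HTMLResponse("<b>HTML OK</b>", headers={"x-time": str(time.time())})


app = Starlette(routes=[Route("/html", html)])
```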

Sorted by max req/s

| Framework | Requests/sec | Latency 50% (ms) | Latency 75% (ms) | Latency Avg (ms) |
| :--- | ---: | ---: | ---: | ---: |
| blacksheep 1.2.0 | 18312 | 2.96 | 4.47 | 3.47 |
| muffin 0.86.0 | 16350 | 3.37 | 4.98 | 3.90 |
| falcon 3.0.1 | 15212 | 3.43 | 5.50 | 4.18 |
| baize 0.12.1 | 13746 | 3.92 | 5.95 | 4.63 |
| starlette 0.16.0 | 13505 | 3.84 | 6.12 | 4.71 |
| emmett 2.3.2 | 13085 | 3.96 | 6.34 | 4.86 |
| fastapi 0.70.0 | 9570 | 5.17 | 8.86 | 6.66 |
| sanic 21.9.1 | 8684 | 5.77 | 9.57 | 7.41 |
| aiohttp 3.8.0 | 7292 | 8.57 | 8.71 | 8.78 |
| quart 0.15.1 | 3423 | 19.27 | 19.86 | 18.68 |
| tornado 6.1 | 3318 | 18.93 | 19.11 | 19.29 |
| django 3.2.9 | 1883 | 33.61 | 36.74 | 34.28 |

Parse path params, query string, JSON body and return a JSON response

The test simulates a simple JSON REST API endpoint.
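A hypothetical Starlette handler covering the same ground (path params, query string and JSON body in, JSON out); the URL layout and field names are illustrative only, not taken from the benchmark:

```python
from starlette.applications import Starlette
from starlette.responses import JSONResponse
from starlette.routing import Route


async def api(request):
    body = await request.json()  # parse the JSON request body
    return JSONResponse({
        "params": request.path_params,        # e.g. {"user": "1", "record": "2"}
        "query": dict(request.query_params),  # query-string arguments
        "data": body,
    })


app = Starlette(routes=[Route("/api/{user}/{record}", api, methods=["PUT"])])
```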

Sorted by max req/s

| Framework | Requests/sec | Latency 50% (ms) | Latency 75% (ms) | Latency Avg (ms) |
| :--- | ---: | ---: | ---: | ---: |
| blacksheep 1.2.0 | 10490 | 4.77 | 8.20 | 6.08 |
| muffin 0.86.0 | 10346 | 4.83 | 8.20 | 6.15 |
| falcon 3.0.1 | 9659 | 7.04 | 8.24 | 6.63 |
| starlette 0.16.0 | 8011 | 6.20 | 10.63 | 7.96 |
| sanic 21.9.1 | 7433 | 6.59 | 11.35 | 8.63 |
| emmett 2.3.2 | 6797 | 9.51 | 11.57 | 9.53 |
| baize 0.12.1 | 6488 | 10.07 | 10.32 | 9.86 |
| fastapi 0.70.0 | 6062 | 10.68 | 11.54 | 10.55 |
| aiohttp 3.8.0 | 4658 | 13.49 | 13.94 | 13.74 |
| tornado 6.1 | 2860 | 22.26 | 22.41 | 22.38 |
| quart 0.15.1 | 2158 | 29.07 | 29.61 | 29.65 |
| django 3.2.9 | 1581 | 38.48 | 43.17 | 40.44 |

Parse an uploaded file, store it on disk and return a text response

The test simulates multipart form data processing and working with files.
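A sketch of such a handler in Starlette (parsing multipart form data in Starlette requires the python-multipart package); the upload directory and the "file" form field name are assumptions for illustration, not the benchmark's code:

```python
from pathlib import Path

from starlette.applications import Starlette
from starlette.responses import PlainTextResponse
from starlette.routing import Route

UPLOAD_DIR = Path("uploads")  # hypothetical target directory
UPLOAD_DIR.mkdir(exist_ok=True)


async def upload(request):
    form = await request.form()                # parse multipart/form-data
    uploaded = form["file"]                    # "file" is an assumed field name
    target = UPLOAD_DIR / uploaded.filename
    target.write_bytes(await uploaded.read())  # store the upload on disk
    return PlainTextResponse(str(target))


app = Starlette(routes=[Route("/upload", upload, methods=["POST"])])
```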

Sorted by max req/s

| Framework | Requests/sec | Latency 50% (ms) | Latency 75% (ms) | Latency Avg (ms) |
| :--- | ---: | ---: | ---: | ---: |
| blacksheep 1.2.0 | 5699 | 8.80 | 15.31 | 11.27 |
| muffin 0.86.0 | 4398 | 11.24 | 20.03 | 14.58 |
| sanic 21.9.1 | 3982 | 12.29 | 21.48 | 16.09 |
| falcon 3.0.1 | 3471 | 18.04 | 22.77 | 18.59 |
| baize 0.12.1 | 2702 | 22.74 | 25.60 | 23.67 |
| starlette 0.16.0 | 2435 | 26.58 | 35.24 | 26.27 |
| aiohttp 3.8.0 | 2253 | 28.21 | 28.34 | 28.49 |
| fastapi 0.70.0 | 2178 | 30.26 | 39.16 | 29.35 |
| tornado 6.1 | 2121 | 30.05 | 30.16 | 30.17 |
| quart 0.15.1 | 1775 | 35.51 | 36.27 | 36.15 |
| emmett 2.3.2 | 1455 | 40.99 | 48.11 | 44.10 |
| django 3.2.9 | 996 | 63.49 | 70.47 | 64.26 |

Composite stats

Combined results across all three benchmarks

Sorted by completed requests

| Framework | Requests completed | Avg Latency 50% (ms) | Avg Latency 75% (ms) | Avg Latency (ms) |
| :--- | ---: | ---: | ---: | ---: |
| blacksheep 1.2.0 | 517515 | 5.51 | 9.33 | 6.94 |
| muffin 0.86.0 | 466410 | 6.48 | 11.07 | 8.21 |
| falcon 3.0.1 | 425130 | 9.5 | 12.17 | 9.8 |
| starlette 0.16.0 | 359265 | 12.21 | 17.33 | 12.98 |
| baize 0.12.1 | 344040 | 12.24 | 13.96 | 12.72 |
| emmett 2.3.2 | 320055 | 18.15 | 22.01 | 19.5 |
| sanic 21.9.1 | 301485 | 8.22 | 14.13 | 10.71 |
| fastapi 0.70.0 | 267150 | 15.37 | 19.85 | 15.52 |
| aiohttp 3.8.0 | 213045 | 16.76 | 17.0 | 17.0 |
| tornado 6.1 | 124485 | 23.75 | 23.89 | 23.95 |
| quart 0.15.1 | 110340 | 27.95 | 28.58 | 28.16 |
| django 3.2.9 | 66900 | 45.19 | 50.13 | 46.33 |