This is a simple benchmark for Python async frameworks. Almost all of the frameworks are ASGI-compatible (aiohttp and tornado are the exceptions at the moment).
The objective of the benchmark is not to test deployment (uvicorn vs. hypercorn, etc.) or databases (ORMs, drivers), but to test the frameworks themselves. The benchmark checks request parsing (body, headers, form data, query strings), routing, and responses.
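For illustration, the kind of application each framework implements looks roughly like the following. This is a minimal sketch written against the raw ASGI interface; the endpoint paths and payload shapes are assumptions for illustration, not the benchmark's actual code:

```python
import json
from urllib.parse import parse_qs


async def app(scope, receive, send):
    """A toy ASGI endpoint touching the operations the benchmark measures."""
    assert scope["type"] == "http"
    path = scope["path"]  # routing happens on the path

    if path == "/html":
        # plain response test: return a small HTML payload
        body, ctype = b"<b>HTML OK</b>", b"text/html"
    elif path == "/upload":
        # body/form-data test: read the full request body
        body = b""
        while True:
            message = await receive()
            body += message.get("body", b"")
            if not message.get("more_body"):
                break
        body, ctype = b"OK", b"text/plain"
    else:
        # parsing test: decode the query string and headers, echo them as JSON
        params = parse_qs(scope["query_string"].decode())
        headers = {k.decode(): v.decode() for k, v in scope["headers"]}
        body = json.dumps({"params": params, "headers": headers}).encode()
        ctype = b"application/json"

    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", ctype)],
    })
    await send({"type": "http.response.body", "body": body})
```

Each framework's benchmark app implements equivalent handlers using that framework's own request/response API, so the numbers compare framework overhead rather than server or database performance.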
- Read about the benchmark: The Methodic
- Check complete results for the latest benchmark here: Results (2022-03-14)
## Combined results

Sorted by the sum of completed requests (see the aggregation sketch after the table).
Framework | Requests completed | Avg Latency 50% (ms) | Avg Latency 75% (ms) | Avg Latency (ms) |
---|---|---|---|---|
blacksheep 1.2.5 | 519825 | 5.46 | 9.49 | 6.96 |
sanic 21.12.1 | 470400 | 7.37 | 9.88 | 7.57 |
muffin 0.87.0 | 469725 | 6.34 | 11.19 | 8.14 |
falcon 3.0.1 | 436800 | 7.58 | 13.19 | 9.7 |
starlette 0.17.1 | 365490 | 9.94 | 17.77 | 12.9 |
baize 0.15.0 | 349425 | 11.85 | 13.64 | 12.29 |
emmett 2.4.5 | 328275 | 18.18 | 22.85 | 19.69 |
fastapi 0.75.0 | 255615 | 12.48 | 22.29 | 16.11 |
aiohttp 3.8.1 | 209310 | 17.23 | 17.47 | 17.31 |
tornado 6.1 | 121185 | 24.53 | 24.73 | 24.59 |
quart 0.16.3 | 109755 | 28.36 | 29.24 | 28.37 |
django 4.0.3 | 38610 | 71.15 | 75.81 | 76.2 |
More details: Results (2022-03-14)
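The combined ranking sorts frameworks by the sum of requests completed across the individual tests. A minimal sketch of that aggregation (the scenario names and per-scenario split are invented for illustration; only the totals match the table above):

```python
# Hypothetical per-scenario results: framework -> requests completed in
# each test. The split is made up; only the totals match the table above.
results = {
    "blacksheep": {"html": 200000, "upload": 180000, "api": 139825},
    "sanic": {"html": 170000, "upload": 160000, "api": 140400},
}

# Combined score: sum of completed requests over all scenarios,
# sorted descending. This is the ordering used in the combined table.
ranking = sorted(
    results.items(),
    key=lambda item: sum(item[1].values()),
    reverse=True,
)

for name, scores in ranking:
    print(f"{name}: {sum(scores.values())}")
```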
## Archive

- Results (2022-03-14)
- Results (2021-12-27)
- Results (2021-11-02)
- Results (2021-10-21)
- Results (2021-09-01)
- Results (2021-08-02)
- Results (2021-07-06)
- Results (2021-06-22)
- Results (2021-06-14)
- Results (2021-06-07)
- Results (2021-05-31)
- Results (2021-05-17)
- Results (2021-05-10)
- Results (2021-05-06)
- Results (2021-04-30)
- Results (2021-04-29)
- Results (2021-04-28)