uvicorn-gunicorn-docker | Docker image with Uvicorn | Continuous Deployment library
kandi X-RAY | uvicorn-gunicorn-docker Summary
Python web applications running with Uvicorn (using the "ASGI" specification for asynchronous Python web applications) have shown some of the best performance, as measured by third-party benchmarks. The achievable performance is on par with (and in many cases superior to) Go and Node.js frameworks. This image includes an auto-tuning mechanism that starts a number of worker processes based on the available CPU cores. That way you can just add your code and get high performance automatically, which is useful in simple deployments.
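The core-based auto-tuning can be sketched roughly like this. This is a hypothetical simplification, not the image's actual startup script; the idea of a `WEB_CONCURRENCY`-style override and a workers-per-core multiplier mirrors the image's documented behavior:

```python
import multiprocessing
from typing import Optional

def compute_workers(workers_per_core: float = 1.0,
                    web_concurrency: Optional[str] = None,
                    max_workers: Optional[int] = None) -> int:
    """Choose a Gunicorn worker count from the available CPU cores.

    An explicit WEB_CONCURRENCY-style override wins; otherwise use
    cores * workers_per_core, clamped to at least 2 (and to max_workers
    when one is given).
    """
    if web_concurrency:  # e.g. the env var WEB_CONCURRENCY="8"
        return int(web_concurrency)
    cores = multiprocessing.cpu_count()
    workers = max(int(cores * workers_per_core), 2)
    if max_workers is not None:
        workers = min(workers, max_workers)
    return workers
```

Clamping to a minimum of two workers keeps the server responsive even on single-core machines, at the cost of some memory.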
Community Discussions
Trending Discussions on uvicorn-gunicorn-docker
QUESTION
Let's say I have 4 load-balanced Python API processes that calculate the factorial of a number.
Let's say factorial returns a Pydantic object or deeply nested dict. I do not want to use Redis for caching because nested dict/list serialization is expensive. So I use the LRU function cache.
Problem: 4 LRU caches exist for each process. When I clear the cache it clears for only 1 process (whichever catches the request).
- I want to share the LRU cache between all 4 processes. Would a custom decorator using multiprocessing.shared_memory be possible?
- If that is not possible, I want to at least clear the cache of all processes. Using a multiprocessing Queue or Listener/Client blocks the API functionality, as I have to run a `while True` loop.
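For context, the per-process cache in question looks roughly like this (`factorial` is the example function from the question; the exact code is not shown in the source, so this is an illustrative sketch):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def factorial(n: int) -> int:
    """Toy factorial with a per-process memo cache."""
    return 1 if n < 2 else n * factorial(n - 1)

# factorial.cache_clear() empties the cache only in the process that
# executes it -- the other load-balanced workers keep their entries,
# which is exactly the problem described above.
```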
Multiple Python API processes running this code:
...

ANSWER

Answered 2022-Jan-21 at 13:50

My problem was not the slow JSON/pickle serialization. It was slow value retrieval from Redis storage due to larger cached variable size.
Here's a simple solution to share data between Python processes. This solution converts your Python variables to bytes (pickles them) so if slow serialization is your problem, this might not be for you.
Server (separate Python process that holds the cache):
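The answer's server code is elided in the source. As an illustration of the same pattern, a minimal cache server built on `multiprocessing.connection` might look like the following; the address, auth key, and the `(op, key, value)` wire protocol are all assumptions, not the answerer's actual code:

```python
from multiprocessing.connection import Client, Listener

ADDRESS = ("localhost", 6000)   # assumed host/port
AUTHKEY = b"cache-secret"       # assumed shared secret

def serve_cache() -> None:
    """Own the cache in one process; serve one pickled request per connection."""
    cache = {}
    with Listener(ADDRESS, authkey=AUTHKEY) as listener:
        while True:                          # the blocking loop lives here,
            with listener.accept() as conn:  # not inside the API workers
                op, key, value = conn.recv()
                if op == "get":
                    conn.send(cache.get(key))
                elif op == "set":
                    cache[key] = value
                    conn.send(True)
                elif op == "clear":
                    cache.clear()
                    conn.send(True)

def cache_request(op: str, key=None, value=None):
    """Client helper each API worker can call; values are pickled in transit."""
    with Client(ADDRESS, authkey=AUTHKEY) as conn:
        conn.send((op, key, value))
        return conn.recv()
```

A clear issued by any one worker then empties the single shared cache, at the cost of pickling every value on each round trip, which matches the serialization caveat above.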
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install uvicorn-gunicorn-docker
uvicorn-gunicorn-docker is a Docker base image rather than a pip-installable Python library, so you do not install it with pip. You will need Docker installed; you then reference the image in your own Dockerfile and add your application code on top of it.
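A minimal Dockerfile using the image as a base might look like this; the tag shown is one of the project's published Python-version tags and the `/app` layout follows the image's documented convention, but treat both as assumptions to verify against the project README:

```dockerfile
# Hypothetical minimal setup: base image tag and app path are assumptions
FROM tiangolo/uvicorn-gunicorn:python3.9
COPY ./app /app
```

The image's startup script then launches Gunicorn with Uvicorn workers against the application it finds under `/app`.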