fastapi-pagination | FastAPI pagination 📖
kandi X-RAY | fastapi-pagination Summary
FastAPI pagination
Top functions reviewed by kandi - BETA
- Paginate a query
- Reset context manager
- Create a new page with the given items
- Perform pagination with the given parameters
- Add pagination routes to parent
- Update the route
- Create a dependency function
- Create a context manager for pagination
- Return a CursorRawParams object
- Decode a cursor
- Initialize the database
- Return user data
- Create a new CursorPage instance
- Encode a Cursor
- Render a page of users
- Create a new limit offset page
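Taken together, the functions above implement limit/offset and cursor paging. As a rough in-memory illustration of the limit/offset variant (a simplified stand-in, not the library's actual `Page` or `paginate` implementation):

```python
from dataclasses import dataclass
from typing import Generic, List, Sequence, TypeVar

T = TypeVar("T")

@dataclass
class Page(Generic[T]):
    # Mirrors the shape of a limit/offset page: the items plus metadata.
    items: List[T]
    total: int
    limit: int
    offset: int

def paginate(items: Sequence[T], limit: int = 10, offset: int = 0) -> Page[T]:
    """Create a new page with the given items (in-memory sketch)."""
    return Page(
        items=list(items[offset:offset + limit]),
        total=len(items),
        limit=limit,
        offset=offset,
    )

page = paginate(list(range(25)), limit=10, offset=20)
print(page.items)  # the last, partial page: [20, 21, 22, 23, 24]
```

The real library wires this shape into FastAPI via response models and dependency-injected parameters, which is what the route- and dependency-related functions in the list above are for.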
fastapi-pagination Key Features
fastapi-pagination Examples and Code Snippets
Community Discussions
Trending Discussions on fastapi-pagination
QUESTION
I have a simple REST API for a book store, created with FastAPI and MongoDB as the backend (I have used Motor as the library instead of PyMongo). I have a GET endpoint to get all the books in the database, which also supports query strings (for example: a user can search for books with a single author or with a genre type, etc.).
Below is the corresponding code for this endpoint:
routers.py
ANSWER
Answered 2021-May-17 at 16:45
There is no right or wrong answer to such a question. A lot depends on the technology stack that you use, as well as the context that you have, considering also the future directions of both the software you wrote and the software you use (Mongo).
Answering your questions:
It depends on the load you have to manage and the dev stack you use. Usually it is done at the database level, since retrieving the first 110 records and dropping the first 100 is quite wasteful and resource-consuming (the database will do it for you).
To me it seems pretty simple to do via fastapi: just add the parameters limit: int = 10 and skip: int = 0 to your get function and use them in the filtering function of your database. FastAPI will check the data types for you, while you could check that limit is not negative or above, say, 100.
It says that there is no silver bullet, and that the skip function of Mongo does not perform well. Thus he believes that the second option is better, just for performance. If you have billions and billions of documents (e.g. Amazon), it may be the case to use something different, though by the time your website has grown that much, I guess you'll have the money to pay an entire team of experts to sort things out and possibly develop your own database.
Concluding, the limit and skip approach is the most common one. It is usually done at the database level, in order to reduce the amount of work for the application and the bandwidth.
Mongo is not very efficient at skipping and limiting results. If your database has, say, a million documents, then I don't think you'll even notice. You could even use a relational database for such a workload. You can always benchmark the options you have and choose the most appropriate one.
I don't know much about Mongo, but I know that, generally, indexes can help with limiting and skipping records (documents in this case), though I'm not sure whether that holds for Mongo as well.
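If the better-performing "second option" mentioned above is keyset (range) pagination, as is common in such comparisons, the idea is to filter on an indexed field instead of skipping N documents: in Mongo terms, something like collection.find({"_id": {"$gt": last_id}}).sort("_id").limit(size). A small in-memory sketch of the idea (the function name and data are illustrative):

```python
from bisect import bisect_right
from typing import List, Optional, Tuple

def keyset_page(ids: List[int], after: Optional[int], size: int) -> Tuple[List[int], Optional[int]]:
    """Return one page of ids greater than `after`, plus the cursor for the next page.

    `ids` must be sorted (it stands in for an indexed _id field); bisect
    finds the start position in O(log n), like an index seek, instead of
    scanning and discarding `skip` documents the way Mongo's skip() does.
    """
    start = bisect_right(ids, after) if after is not None else 0
    page = ids[start:start + size]
    next_cursor = page[-1] if len(page) == size else None
    return page, next_cursor

ids = list(range(1, 26))
page1, cursor = keyset_page(ids, None, 10)    # ids 1..10, cursor 10
page2, cursor = keyset_page(ids, cursor, 10)  # ids 11..20, cursor 20
```

Each response hands the client an opaque cursor (here just the last id) instead of a page number, which is also what the CursorPage / cursor encode-decode functions listed earlier provide.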
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install fastapi-pagination
Supported integrations:
- gino
- databases
- ormar
- orm
- tortoise
- django
- piccolo
- sqlmodel
- motor
- mongoengine