openai-manager | OpenAI requests by balancing prompts
kandi X-RAY | openai-manager Summary
openai-manager is a Python library. It has no reported bugs or vulnerabilities, a build file is available, and it has low support. You can install it with 'pip install openai-manager' or download it from GitHub or PyPI.
Speed up your OpenAI requests by balancing prompts to multiple API keys.
Support
openai-manager has a low-activity ecosystem.
It has 7 stars, 1 fork, and 2 watchers.
It had no major release in the last 12 months.
openai-manager has no issues reported. There are no pull requests.
It has a neutral sentiment in the developer community.
The latest version of openai-manager is 1.0.0.
Quality
openai-manager has no bugs reported.
Security
openai-manager has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
License
openai-manager does not have a standard license declared.
Check the repository for any license declaration and review the terms closely.
Without a license, all rights are reserved, and you cannot use the library in your applications.
Reuse
openai-manager releases are available to install and integrate.
A deployable package is available on PyPI.
Build file is available. You can build the component from source.
Installation instructions, examples and code snippets are available.
Top functions reviewed by kandi - BETA
kandi's functional review helps you automatically verify the functionalities of the libraries and avoid rework.
Currently covering the most popular Java, JavaScript and Python libraries. See a Sample of openai-manager
openai-manager Key Features
No Key Features are available at this moment for openai-manager.
openai-manager Examples and Code Snippets
No Code Snippets are available at this moment for openai-manager.
Community Discussions
No Community Discussions are available at this moment for openai-manager. Refer to the Stack Overflow page for discussions.
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install openai-manager
Install openai-manager from PyPI, then prepare your OpenAI credentials in environment variables.
Install openai-manager from PyPI:

    pip install openai-manager
Prepare your OpenAI credentials in environment variables: any envvar beginning with OPENAI_API_KEY will be used to initialize the manager. The best practice for loading your API keys is to prepare a .env file like:

    OPENAI_API_KEY_1=sk-Nxo******
    OPENAI_API_KEY_2=sk-TG2******
    OPENAI_API_KEY_3=sk-Kpt******
    # You can set a global proxy for all api_keys
    OPENAI_API_PROXY=http://127.0.0.1:7890
    # You can also append a proxy to each api_key.
    # Make sure the indices match.
    OPENAI_API_PROXY_1=http://127.0.0.1:7890
    OPENAI_API_PROXY_2=http://127.0.0.1:7890
    OPENAI_API_PROXY_3=http://127.0.0.1:7890

Then load your environment variables before running any scripts:

    export $(grep -v '^#' .env | xargs)

YAML config file: you can add more fine-grained restrictions on each API key if you know the rate limit for each key in advance. See example_config.yml for details.

    import openai_manager
    openai_manager.append_auth_from_config(config_path='example_config.yml')
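If you would rather load the .env file from Python than export it in your shell, a minimal sketch using the third-party python-dotenv package could look like the following. python-dotenv is an assumption here, not a dependency of openai-manager, and openai_manager is assumed to pick up the OPENAI_API_KEY_* variables from the environment when it initializes.

    # Minimal sketch (assumes `pip install python-dotenv`):
    # populate os.environ from the .env file before openai_manager initializes,
    # so that OPENAI_API_KEY_* and OPENAI_API_PROXY_* are visible to it.
    from dotenv import load_dotenv

    load_dotenv(".env")

    import openai_manager  # assumed to read OPENAI_API_KEY_* from the environment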
Run this minimal example to see how to boost your OpenAI completions (more interfaces coming!):

    import openai as official_openai
    import openai_manager

    @timeit
    def test_official_separate():
        for i in range(10):
            prompt = "Once upon a time, "
            response = official_openai.Completion.create(
                model="code-davinci-002",
                prompt=prompt,
                max_tokens=20,
            )
            print("Answer {}: {}".format(i, response["choices"][0]["text"]))

    @timeit
    def test_manager():
        prompt = "Once upon a time, "
        prompts = [prompt] * 10
        responses = openai_manager.Completion.create(
            model="code-davinci-002",
            prompt=prompts,
            max_tokens=20,
        )
        assert len(responses) == 10
        for i, response in enumerate(responses):
            print("Answer {}: {}".format(i, response["choices"][0]["text"]))
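The example above uses a @timeit decorator that is not shown; a minimal sketch of such a timing decorator (a plain wrapper assumed here, not part of openai-manager or the OpenAI SDK) could be:

    import functools
    import time

    def timeit(func):
        """Print how long the wrapped function takes to run."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = func(*args, **kwargs)
            print("{} took {:.2f}s".format(func.__name__, time.perf_counter() - start))
            return result
        return wrapper

With the keys configured, test_manager() should spread the 10 prompts across the registered API keys, which is where its speedup over test_official_separate() comes from.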
Support
For any new features, suggestions, and bugs, create an issue on GitHub.
If you have any questions, check or ask them on the Stack Overflow community page.