llm | An ecosystem of Rust libraries for working with large language models
kandi X-RAY | llm Summary
llm is a Rust library. It has no reported bugs or vulnerabilities, it carries a permissive license, and it has medium support. You can download it from GitHub.
llm is an ecosystem of Rust libraries for working with large language models, built on top of the fast, efficient GGML library for machine learning.

The primary entrypoint for developers is the llm crate, which wraps llm-base and the supported model crates. Documentation for the released version is available on Docs.rs.

For end users, there is a CLI application, llm-cli, which provides a convenient interface for interacting with supported models. Text generation can be done as a one-off based on a prompt, or interactively through REPL or chat modes. The CLI can also be used to serialize (print) decoded models, quantize GGML files, or compute the perplexity of a model. It can be downloaded from the latest GitHub release or installed from crates.io.

llm is powered by the ggml tensor library and aims to bring the robustness and ease of use of Rust to the world of large language models. At present, inference runs only on the CPU, but we hope to support GPU inference in the future through alternate backends.
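For reference, the perplexity the CLI reports is the standard language-model definition, not anything llm-specific: the exponential of the average negative log-likelihood of the evaluated tokens,

    \mathrm{perplexity}(x_1, \dots, x_N) = \exp\!\left(-\frac{1}{N} \sum_{i=1}^{N} \ln p(x_i \mid x_1, \dots, x_{i-1})\right)

so lower values indicate that the model finds the text more predictable.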
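On the library side, here is a condensed sketch of one-off generation with the llm crate, patterned on the project's README example from the 0.1.x series. It assumes the llm and rand crates as dependencies, a LLaMA-architecture GGML file at a placeholder path, and version-specific API names (load_progress_callback_stdout, InferenceRequest), so check Docs.rs for the exact signatures of the release you use:

    use std::io::Write;

    use llm::Model; // trait providing start_session

    fn main() {
        // Load a GGML model from disk (the path is a placeholder).
        let llama = llm::load::<llm::models::Llama>(
            std::path::Path::new("/path/to/ggml-model.bin"),
            Default::default(),                 // llm::ModelParameters
            llm::load_progress_callback_stdout, // reports loading progress
        )
        .unwrap_or_else(|err| panic!("Failed to load model: {err}"));

        // Start an inference session and stream generated tokens to stdout.
        let mut session = llama.start_session(Default::default());
        let res = session.infer::<std::convert::Infallible>(
            &llama,
            &mut rand::thread_rng(), // randomness for sampling
            &llm::InferenceRequest {
                prompt: "Rust is a cool programming language because",
                ..Default::default()
            },
            &mut Default::default(), // llm::OutputRequest
            |t| {
                // Print each token as it is generated.
                print!("{t}");
                std::io::stdout().flush().unwrap();
                Ok(())
            },
        );

        match res {
            Ok(stats) => println!("\n\nInference stats:\n{stats}"),
            Err(err) => println!("\n{err}"),
        }
    }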
Support
Quality
Security
License
Reuse
Support
llm has a moderately active ecosystem.
It has 3,968 stars, 196 forks, and 34 watchers.
It had no major release in the last 12 months.
There are 40 open issues and 110 closed issues; on average, issues are closed in 12 days. There are 4 open pull requests and 0 closed pull requests.
It has a neutral sentiment in the developer community.
The latest version of llm is v0.1.1.
Quality
llm has no bugs reported.
Security
llm has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
License
llm is licensed under the Apache-2.0 License. This is a permissive license.
Permissive licenses have the fewest restrictions, and you can use them in most projects.
Reuse
llm releases are available to install and integrate.
Installation instructions are available below. Examples and code snippets are available.
Top functions reviewed by kandi - BETA
kandi's functional review helps you automatically verify the functionalities of libraries and avoid rework. It currently covers the most popular Java, JavaScript, and Python libraries, so no reviewed functions are available for llm yet.
llm Key Features
No Key Features are available at this moment for llm.
llm Examples and Code Snippets
No Code Snippets are available at this moment for llm.
Community Discussions
No Community Discussions are available at this moment for llm. Refer to the Stack Overflow page for discussions.
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install llm
You can download it from the latest GitHub release, or install the CLI from crates.io.
Rust is installed and managed by the rustup tool. Rust has a six-week rapid release process and supports a great number of platforms, so there are many builds of Rust available at any time. Please refer to rust-lang.org for more information.
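A typical setup on a Unix-like system, assuming the CLI crate is published as llm-cli (per the crates.io mention above; check the project README for the release you are targeting):

    # Install Rust via rustup's official installer script.
    curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
    # Install the llm CLI from crates.io.
    cargo install llm-cli

After this, the CLI binary should be on your PATH for the one-off generation, REPL/chat, quantize, and perplexity workflows described above.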
Support
The llm Dockerfile is in the utils directory; the NixOS flake manifest and lockfile are in the project root.