llama-rs | Run LLaMA inference on CPU, with Rust 🦀🚀🦙

 by rustformers · Rust · Version: current · License: Apache-2.0

kandi X-RAY | llama-rs Summary

llama-rs is a Rust library. It has no reported bugs or vulnerabilities, carries a permissive license, and has medium support. You can download it from GitHub.

Run LLaMA inference on CPU, with Rust 🦙

            Support

              llama-rs has a medium-activity ecosystem.
              It has 2830 stars, 133 forks, and 27 watchers.
              It has had no major release in the last 6 months.
              There are 27 open issues and 57 closed issues. On average, issues are closed in 8 days. There are 4 open pull requests and 0 closed pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of llama-rs is current.

            Quality

              llama-rs has no bugs reported.

            Security

              llama-rs has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.

            License

              llama-rs is licensed under the Apache-2.0 License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

            Reuse

              llama-rs releases are not available; you will need to build it from source and install it yourself.
              Installation instructions, examples, and code snippets are available.


            llama-rs Key Features

            No Key Features are available at this moment for llama-rs.

            llama-rs Examples and Code Snippets

            No Code Snippets are available at this moment for llama-rs.

            Community Discussions

            No Community Discussions are available at this moment for llama-rs. Refer to the Stack Overflow page for discussions.

            Community Discussions and Code Snippets may contain sources from the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install llama-rs

            Make sure you have Rust 1.65.0 or above and a C toolchain[^1] set up. llama-rs is a Rust library, while llama-cli is a CLI application that wraps llama-rs and offers basic inference capabilities. The following instructions explain how to build llama-cli. NOTE: For best results, build and run in release mode; debug builds will be very slow.
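            The build steps above can be sketched as follows. This is a minimal sketch only: the final invocation, including its flag names, is an assumption and is not taken from the repository — consult the project README or the binary's `--help` output for the actual interface.

```shell
# Clone the repository and enter it.
git clone https://github.com/rustformers/llama-rs.git
cd llama-rs

# Build in release mode -- debug builds are very slow for inference.
cargo build --release

# Hypothetical example run: the flag names (-m for a model path, -p for
# a prompt) are assumptions; check `llama-cli --help` for the real interface.
./target/release/llama-cli -m /path/to/model.bin -p "Write a haiku about Rust"
```

            These commands assume a working Rust 1.65.0+ toolchain and network access; the model file itself must be obtained separately.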

            Support

            For new features, suggestions, and bug reports, create an issue on GitHub. If you have questions, check and ask on the Stack Overflow community page.
            CLONE
          • HTTPS

            https://github.com/rustformers/llama-rs.git

          • CLI

            gh repo clone rustformers/llama-rs

          • SSH

            git@github.com:rustformers/llama-rs.git

