GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023)

 by THUDM | Python | Version: Current | License: Apache-2.0

kandi X-RAY | GLM-130B Summary

GLM-130B is a Python library. It has no reported bugs or vulnerabilities, provides a build file, carries a permissive license, and has medium community support. You can download it from GitHub.

Blog • Download Model • 🪧 Demo • Email • Paper [ICLR 2023]. Google Group (Updates), WeChat Group, or Slack channel (Discussions).

            Support

              GLM-130B has a moderately active ecosystem.
              It has 6,264 stars, 478 forks, and 83 watchers.
              It has had no major release in the last 6 months.
              There are 87 open issues and 76 closed issues. On average, issues are closed in 3 days. There are 6 open pull requests and 0 closed pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of GLM-130B is current.

            Quality

              GLM-130B has no bugs reported.

            Security

              GLM-130B has no reported vulnerabilities, and neither do its dependent libraries.

            License

              GLM-130B is licensed under the Apache-2.0 License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

            Reuse

              GLM-130B releases are not available; you will need to build and install it from source.
              A build file is available, so the component can be built from source.
              Installation instructions, examples, and code snippets are available.

            GLM-130B Key Features

            No Key Features are available at this moment for GLM-130B.

            GLM-130B Examples and Code Snippets

            No Code Snippets are available at this moment for GLM-130B.

            Community Discussions

            No Community Discussions are available at this moment for GLM-130B. Refer to the Stack Overflow page for discussions.

            Community Discussions and Code Snippets may contain sources from the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install GLM-130B

            Supported hardware configurations include 8 * RTX 3090, 4 * RTX 3090, and 8 * RTX 2080 Ti. An A100 (40G * 8) server is recommended, as all reported GLM-130B evaluation results (~30 tasks) can easily be reproduced with a single A100 server in about half a day. With INT8/INT4 quantization, efficient inference on a single server with 4 * RTX 3090 (24G) is possible; see Quantization of GLM-130B for details. Combining quantization and weight offloading, GLM-130B can also run inference on servers with even less GPU memory; see Low-Resource Inference for details.
            Python 3.9+ / CUDA 11+ / PyTorch 1.10+ / DeepSpeed 0.6+ / Apex (installation with CUDA and C++ extensions is required, see here)
            SwissArmyTransformer>=0.2.11 is required for quantization.
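
            As a rough, illustrative sanity check of these requirements, the Python sketch below estimates the weight-only memory footprint of 130 billion parameters at FP16/INT8/INT4 and prints the versions of the listed prerequisites. It is not part of the GLM-130B repository; the package names and version thresholds come from the list above, and the bytes-per-parameter figures are standard assumptions that ignore activation and KV-cache overhead.

            # check_glm130b_env.py -- illustrative sketch, not part of GLM-130B
            import sys
            import importlib.metadata as md

            # Rough weight-only memory footprint of 130e9 parameters per precision.
            N_PARAMS = 130e9
            for precision, nbytes in [("FP16", 2.0), ("INT8", 1.0), ("INT4", 0.5)]:
                gib = N_PARAMS * nbytes / 1024**3
                print(f"{precision} weights: ~{gib:.0f} GiB")
            # FP16 ~242 GiB -> an 8 * A100 (40G) server (320 GB total), as recommended above
            # INT4 ~61 GiB  -> fits on 4 * RTX 3090 (24G) = 96 GB, matching the quantized setup above

            # Software prerequisites from the list above.
            print("Python:", sys.version.split()[0], "(need 3.9+)")
            for pkg, minimum in [("torch", "1.10"), ("deepspeed", "0.6"),
                                 ("SwissArmyTransformer", "0.2.11")]:
                try:
                    print(f"{pkg}: {md.version(pkg)} (need >= {minimum})")
                except md.PackageNotFoundError:
                    print(f"{pkg}: not installed (need >= {minimum})")

            try:
                import torch
                print("CUDA available:", torch.cuda.is_available(),
                      "| CUDA version:", torch.version.cuda)  # need CUDA 11+
            except ImportError:
                print("PyTorch not installed; CUDA check skipped.")

            Running a check like this before installation shows whether a given server's total GPU memory can hold the quantized weights at all, before attempting the heavier DeepSpeed/Apex setup.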

            Support

            For new features, suggestions, and bug reports, create an issue on GitHub. If you have questions, check and ask on the Stack Overflow community page.

            CLONE
          • HTTPS

            https://github.com/THUDM/GLM-130B.git

          • CLI

            gh repo clone THUDM/GLM-130B

          • SSH

            git@github.com:THUDM/GLM-130B.git
