spacy-transformers | 🛸 Use pretrained transformers like BERT, XLNet and GPT-2 | Natural Language Processing library

 by explosion | Python | Version: v1.2.3 | License: MIT

kandi X-RAY | spacy-transformers Summary

spacy-transformers is a Python library typically used in Artificial Intelligence, Natural Language Processing, PyTorch, and BERT applications. spacy-transformers has no reported bugs or vulnerabilities, has a build file available, has a Permissive License, and has medium support. You can download it from GitHub.

🛸 Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy

            kandi-support Support

              spacy-transformers has a medium active ecosystem.
              It has 1243 star(s) with 160 fork(s). There are 31 watchers for this library.
              It had no major release in the last 12 months.
              spacy-transformers has no issues reported. There is 1 open pull request and 0 closed requests.
              It has a neutral sentiment in the developer community.
              The latest version of spacy-transformers is v1.2.3.

            kandi-Quality Quality

              spacy-transformers has 0 bugs and 0 code smells.

            kandi-Security Security

              spacy-transformers has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
              spacy-transformers code analysis shows 0 unresolved vulnerabilities.
              There are 0 security hotspots that need review.

            kandi-License License

              spacy-transformers is licensed under the MIT License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

            kandi-Reuse Reuse

              spacy-transformers releases are available to install and integrate.
              Build file is available. You can build the component from source.
              Installation instructions, examples and code snippets are available.
              spacy-transformers saves you 1528 person hours of effort in developing the same functionality from scratch.
              It has 1955 lines of code, 162 functions and 30 files.
              It has high code complexity. Code complexity directly impacts maintainability of the code.

            Top functions reviewed by kandi - BETA

            kandi has reviewed spacy-transformers and discovered the below as its top functions. This is intended to give you an instant insight into spacy-transformers' implemented functionality and help you decide if it suits your requirements.
            • Create a TransformerV3vec model
            • Convert tensors to arrays
            • Forward transformer computation
            • Calculate the alignment of tokens
            • Verify inputs are correct
            • Transpose a list
            • Reads the configuration from a byte string
            • Create a temporary directory
            • Convert the model to a bytes object
            • Create a TransformerV2vec model
            • Creates a model for the transformer
            • Transformer for Transformer
            • Find listeners for this component
            • Adds a listener to the model
            • Replace listener configuration with listener cfg
            • Deserialize Transformer data

            spacy-transformers Key Features

            No Key Features are available at this moment for spacy-transformers.

            spacy-transformers Examples and Code Snippets

            Finetune BERT Embeddings with spaCy and Rasa - Updates
            Python | Lines of Code: 25 | License: No License
            pipeline:
             - name: HFTransformersNLP
               model_name: "bert"
               model_weights: "PATH_TO_YOUR_FINETUNED_MODEL_DIRECTORY"
               cache_dir: "PATH_TO_SOME_CACHE_FOLDER"
             - name: LanguageModelFeaturizer
             - name: DIETClassifier
               random_seed: 42
               intent_classification: True
            [components.transformer.model]
            @architectures = "ginza-transformers.TransformerModel.v1"
            name = "megagonlabs/transformers-ud-japanese-electra-base-discriminator"
            
            [components.transformer.model.tokenizer_config]
            use_fast = false
            tokenizer_class = "sud  
            [components.transformer]
            factory = "transformer_custom"
            
            [components.transformer.model]
            name = "megagonlabs/transformers-ud-japanese-electra-base-ginza"
              

            Community Discussions

            QUESTION

            How to resume training in spacy transformers for NER
            Asked 2022-Jan-20 at 07:21

            I have created a spaCy transformer model for named entity recognition. Last time I trained it until it reached 90% accuracy, and I also have a model-best directory from which I can load my trained model for predictions. But now I have some more data samples and I wish to resume training this spaCy transformer. I saw that we can do it by changing the config.cfg, but I'm clueless about what to change.

            This is my config.cfg after running python -m spacy init fill-config ./base_config.cfg ./config.cfg:

            ...

            ANSWER

            Answered 2022-Jan-20 at 07:21

            The vectors setting is not related to the transformer or what you're trying to do.

            In the new config, you want to use the source option to load the components from the existing pipeline. You would modify the [component] blocks to contain only the source setting and no other settings:
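
            A rough sketch of what those blocks might look like in config.cfg (the path and the set of components are hypothetical and depend on your pipeline):

            [components.transformer]
            source = "path/to/model-best"

            [components.ner]
            source = "path/to/model-best"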

            Source https://stackoverflow.com/questions/70772641

            QUESTION

            ERROR: Cannot install en-core-web-trf because these package versions have conflicting dependencies
            Asked 2022-Jan-16 at 12:55

            I use the following commands (from the spaCy website) to install spaCy and en_core_web_trf under Windows 10 Home 64-bit; however, I have encountered problems while running the last (third) command.

            ...

            ANSWER

            Answered 2022-Jan-15 at 21:24

            try:
            pip uninstall spacy-transformers 1.1.4
            pip uninstall spacy-transformers 1.1.3
            pip uninstall spacy-transformers 1.1.2
            then execute:
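
            For reference, a typical clean reinstall under spaCy v3 (a hedged sketch, not necessarily the commands the original answer continued with) is:

            pip install -U pip setuptools wheel
            pip install -U spacy[transformers]
            python -m spacy download en_core_web_trf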

            Source https://stackoverflow.com/questions/70725486

            QUESTION

            How to export SpaCy model with multiple components
            Asked 2021-Dec-21 at 10:15

            I'm trying to build a SpaCy pipeline using multiple components. My current pipeline has only two components at the moment: an entity ruler and a custom component.

            The way I build it is like this:

            ...

            ANSWER

            Answered 2021-Dec-21 at 10:15

            At the place where you load the model, you need to have access to the code that defined the custom component. So if your file that defines the custom component is custom.py, you can put import custom at the top of the file where you're loading your pipeline and it should work.
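
            A minimal sketch of that setup (file, component, and pipeline names here are hypothetical):

            # custom.py - defines and registers the custom component
            from spacy.language import Language

            @Language.component("my_component")
            def my_component(doc):
                # placeholder logic; a real component would modify or annotate the Doc
                return doc

            # loading script
            import spacy
            import custom  # registers "my_component" before the pipeline is loaded

            nlp = spacy.load("./my_pipeline")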

            Also see the docs on saving and loading custom components.

            Source https://stackoverflow.com/questions/70432899

            QUESTION

            How to use existing huggingface-transformers model into spacy?
            Asked 2021-Oct-29 at 03:58

            I'm here to ask you guys if it is possible to use an existing trained huggingface-transformers model with spacy.

            My first naive attempt was to load it via spacy.load('bert-base-uncased'), it didn't work because spacy demands a certain structure, which is understandable.

            Now I'm trying to figure out how to use the spacy-transformers library to load the model, create the spacy structure, and use it from that point as a normal spacy-aware model.

            I don't know if it is even possible, as I couldn't find anything regarding the subject. I've tried to read the documentation, but all guides, examples, and posts I found start from a spaCy-structured model like spacy/en_core_web_sm. But how was that model created in the first place? I can't believe everyone has to train everything again with spaCy.

            Can I get some help from you?

            Thanks.

            ...

            ANSWER

            Answered 2021-Oct-29 at 03:58

            What you do is add a Transformer component to your pipeline and give the name of your HuggingFace model as a parameter to that. This is covered in the docs, though people do have trouble finding it. It's important to understand that a Transformer is only one piece of a spaCy pipeline, and you should understand how it all fits together.

            To pull from the docs, this is how you specify a custom model in a config:
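
            A sketch along those lines (using "bert-base-uncased" as an example model name; the architecture version depends on your installed spacy-transformers release):

            [components.transformer]
            factory = "transformer"

            [components.transformer.model]
            @architectures = "spacy-transformers.TransformerModel.v3"
            name = "bert-base-uncased"

            [components.transformer.model.tokenizer_config]
            use_fast = true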

            Source https://stackoverflow.com/questions/69738938

            QUESTION

            Could not find function 'spacy-transformers.TransformerModel.v3' in function registry 'architectures'
            Asked 2021-Oct-28 at 14:51

            I was trying to create a custom NER model. I used the spaCy library to create the model, and this line of code creates the config file from the base_config file. My code is:

            ...

            ANSWER

            Answered 2021-Oct-24 at 10:22

            This happened because spaCy recently had a new update (3.1), and the base_config file has the architecture specified as "spacy-transformers.TransformerModel.v3". Change it to "spacy-transformers.TransformerModel.v1".
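
            In base_config.cfg the relevant block would then read (the model name shown is just an illustration):

            [components.transformer.model]
            @architectures = "spacy-transformers.TransformerModel.v1"
            name = "roberta-base"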

            Source https://stackoverflow.com/questions/69694277

            QUESTION

            Can't load spacy en_core_web_trf
            Asked 2021-Oct-01 at 23:49

            As the guide itself says, I've installed it with (conda environment)

            ...

            ANSWER

            Answered 2021-Oct-01 at 23:49

            Are you sure you did install spacy-transformers? After installing spacy?

            I am using pip: pip install spacy-transformers and I have no problems loading the en_core_web_trf.
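
            A quick way to check that setup (a minimal sketch):

            pip install spacy-transformers
            python -m spacy download en_core_web_trf

            Then, in Python:

            import spacy
            nlp = spacy.load("en_core_web_trf")
            doc = nlp("spaCy with transformers is working.")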

            Source https://stackoverflow.com/questions/69406767

            QUESTION

            Spacy-Transformers: Access GPT-2?
            Asked 2021-Aug-28 at 05:16

            I'm using Spacy-Transformers to build some NLP models.

            The Spacy-Transformers docs say:

            spacy-transformers

            spaCy pipelines for pretrained BERT, XLNet and GPT-2

            The sample code on that page shows:

            ...

            ANSWER

            Answered 2021-Aug-28 at 05:16

            The en_core_web_trf uses a specific Transformers model, but you can specify arbitrary ones using the TransformerModel wrapper class from spacy-transformers. See the docs for that. An example config:
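
            A hedged sketch of such a config, using the "gpt2" model from the Hugging Face Hub (the architecture version depends on your spacy-transformers release):

            [components.transformer]
            factory = "transformer"

            [components.transformer.model]
            @architectures = "spacy-transformers.TransformerModel.v3"
            name = "gpt2"

            [components.transformer.model.tokenizer_config]
            use_fast = true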

            Source https://stackoverflow.com/questions/68946827

            QUESTION

            An error when building a custom model using spaCy
            Asked 2021-Jul-23 at 12:48
            Issue

            Following the official instructions, I'm trying to add an extra training dataset and train a model in a local CPU environment.

            I haven't changed the content of the base_config.cfg and config.cfg files.

            How can I fix these errors to build a model and evaluate it?

            Error

            I'm not sure whether the first one is actually an issue, and I have no idea how to fill in the config.cfg file.

            1. The config.cfg file was empty even after executing the code in the "Procedure so far" section below.

            2. The error message was shown when executing the train command.

            ...

            ANSWER

            Answered 2021-Jul-23 at 08:19

            It looks like you double-pasted the config or something? From the errors you'll note that it says you have two [paths] sections. About halfway through your file there's a comment like this:

            Source https://stackoverflow.com/questions/68495699

            QUESTION

            catalogue.RegistryError: [E893] Could not find function 'spacy.copy_from_base_model.v1' in function registry 'callbacks'
            Asked 2021-May-03 at 17:08

            I'm a new spaCy user, and I'm trying to run the ner_demo_update project. I got this error: catalogue.RegistryError: [E893] Could not find function 'spacy.copy_from_base_model.v1' in function registry 'callbacks'. If you're using a custom function, make sure the code is available. If the function is provided by a third-party package, e.g. spacy-transformers, make sure the package is installed in your environment. I'd like to know if someone has faced the same issue.

            ...

            ANSWER

            Answered 2021-May-03 at 17:08

            copy_from_base_model.v1 is a new function, introduced in spaCy v3.0.6. Are you perhaps running an older version of spaCy? If so, can you try updating it? This will likely resolve your error.
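
            A minimal way to check and update the installation (assuming pip):

            python -m spacy info        # shows the installed spaCy version
            pip install -U spacy
            python -m spacy validate    # checks installed pipelines against the new version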

            See also: https://github.com/explosion/spaCy/discussions/7985

            Source https://stackoverflow.com/questions/67370416

            QUESTION

            How to use LanguageDetector() from spacy_langdetect package?
            Asked 2021-Mar-20 at 23:11

            I'm trying to use the spacy_langdetect package and the only example code I can find is (https://spacy.io/universe/project/spacy-langdetect):

            ...

            ANSWER

            Answered 2021-Mar-20 at 23:11

            With spaCy v3.0, for components that are not built in, such as LanguageDetector, you will have to wrap the component in a registered factory function before adding it to the nlp pipe. In your example, you can do the following:
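
            A sketch of that wrapping under spaCy v3 (assuming spacy_langdetect is installed; the factory name is arbitrary):

            import spacy
            from spacy.language import Language
            from spacy_langdetect import LanguageDetector

            @Language.factory("language_detector")
            def create_language_detector(nlp, name):
                # wrap the third-party component in a registered factory
                return LanguageDetector()

            nlp = spacy.load("en_core_web_sm")
            nlp.add_pipe("language_detector", last=True)

            doc = nlp("This is an English sentence.")
            print(doc._.language)  # e.g. {'language': 'en', 'score': 0.99...}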

            Source https://stackoverflow.com/questions/66712753

            Community Discussions and Code Snippets include sources from the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install spacy-transformers

            Installing the package from pip will automatically install all dependencies, including PyTorch and spaCy. Make sure you install this package before you install the models. Also note that this package requires Python 3.6+, PyTorch v1.5+ and spaCy v3.0+. For GPU installation, find your CUDA version using nvcc --version and add the version in brackets, e.g. spacy[transformers,cuda92] for CUDA9.2 or spacy[transformers,cuda100] for CUDA10.0.
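
            A minimal install, following the note above (the CUDA variant is only needed for GPU; CUDA 10.0 is shown purely as an example):

            # CPU
            pip install spacy-transformers

            # GPU, e.g. CUDA 10.0
            pip install spacy[transformers,cuda100]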

            Support

            ⚠️ Important note: This package has been extensively refactored to take advantage of spaCy v3.0. Previous versions that were built for spaCy v2.x worked considerably differently. Please see previous tagged versions of this README for documentation on prior versions.
            Find more information at https://github.com/explosion/spacy-transformers.

            CLONE
          • HTTPS

            https://github.com/explosion/spacy-transformers.git

          • GitHub CLI

            gh repo clone explosion/spacy-transformers

          • SSH

            git@github.com:explosion/spacy-transformers.git



            Consider Popular Natural Language Processing Libraries

          • transformers by huggingface
          • funNLP by fighting41love
          • bert by google-research
          • jieba by fxsjy
          • Python by geekcomputers

            Try Top Libraries by explosion

          • spaCy by explosion (Python)
          • thinc by explosion (Python)
          • spacy-course by explosion (Python)
          • sense2vec by explosion (Python)
          • spacy-models by explosion (Python)