BERT-keras | Keras implementation of BERT with pre-trained weights | Machine Learning library
kandi X-RAY | BERT-keras Summary
Status: Archive (code is provided as-is, no updates expected).
Top functions reviewed by kandi - BETA
- Generate sentences from text corpus
- Return a copy of the given sentence
- Create a sentence batch from a list of tokens
- Create a TaskDataBatch from a list of sentences
- Evaluate the gradient
- Clean the keras module
- Compute TPU compatible variables
- Replace keras
- Remove keras modules from sys modules
- Calculate the attention layer
- Multihead attention layer (see the sketch after this list)
- Return the shape of x
- Reshape x
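The attention-related entries above refer to the Transformer computation at the core of BERT. The following is a minimal, illustrative sketch of scaled dot-product attention written against TensorFlow 2 / Keras; it is not the BERT-keras implementation, and the function name, shapes, and toy usage are assumptions made for illustration only.

```python
# Illustrative sketch only: scaled dot-product attention, the building block
# behind a multi-head attention layer. Not taken from the BERT-keras source.
import tensorflow as tf

def scaled_dot_product_attention(q, k, v, mask=None):
    """q, k, v: tensors of shape (batch, heads, seq_len, depth)."""
    scores = tf.matmul(q, k, transpose_b=True)        # (batch, heads, seq, seq)
    depth = tf.cast(tf.shape(k)[-1], tf.float32)
    scores = scores / tf.sqrt(depth)                  # scale by sqrt(d_k)
    if mask is not None:
        scores += (1.0 - mask) * -1e9                 # suppress masked positions
    weights = tf.nn.softmax(scores, axis=-1)          # attention distribution
    return tf.matmul(weights, v)                      # weighted sum of values

# Toy usage: batch of 2, 4 heads, sequence length 5, head depth 8
q = k = v = tf.random.normal((2, 4, 5, 8))
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (2, 4, 5, 8)
```

A full multi-head layer would additionally project the inputs into per-head queries, keys, and values and concatenate the head outputs, which is what the indexed functions in this repository appear to cover.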
BERT-keras Key Features
BERT-keras Examples and Code Snippets
Community Discussions
Trending Discussions on BERT-keras
QUESTION
I have the multilingual BERT model from Google, and I have a lot of text data in my language (Korean). I want BERT to produce better vectors for texts in this language, so I want to additionally train BERT on that text corpus, much like continuing to train a w2v model that was already trained on some data. Is this possible with BERT?
There are plenty of examples of "fine-tuning" BERT on specific tasks, including the original one from Google, where you can train BERT further on your own data. But as far as I understand it (I might be wrong), this is done inside a task-specific model (for a classification task, for example), so the extra training happens at the same time as training the classifier (??)
What I want is to train BERT further on its own and then get fixed vectors for my data, not to build it into some task-specific model; just get vector representations for my data (using the get_features function) like they do here. I only need to train the BERT model additionally on more data in the specific language.
I would be endlessly grateful for any suggestions/links on how to train the BERT model further (preferably in TensorFlow). Thank you.
...ANSWER
Answered 2019-Sep-30 at 13:34
The transformers package provides code for using and fine-tuning most of the currently popular pre-trained Transformers, including BERT, XLNet, GPT-2, and more. You can easily load the model and continue training.
You can get the multilingual BERT model:
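The original snippet is not shown here; the following is a minimal sketch of how loading the multilingual checkpoint with the Hugging Face transformers package could look. The specific model name and the masked-LM head are assumptions for illustration, not part of the original answer.

```python
# Sketch only: load multilingual BERT with Hugging Face transformers and
# prepare it for further masked-language-model training.
# "bert-base-multilingual-cased" is the public multilingual checkpoint;
# the example sentence and forward pass are purely illustrative.
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
model = BertForMaskedLM.from_pretrained("bert-base-multilingual-cased")

# Tokenize a Korean sentence and run a forward pass; the same model object can
# then be trained further on your own corpus with a standard masked-LM objective.
inputs = tokenizer("안녕하세요, BERT를 더 학습시키고 싶습니다.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, seq_len, vocab_size)
```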
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install BERT-keras
You can use BERT-keras like any standard Python library. You will need a development environment consisting of a Python distribution (including header files), a compiler, pip, and git. Make sure that pip, setuptools, and wheel are up to date. When using pip, it is generally recommended to install packages in a virtual environment to avoid making changes to the system Python.