bert-for-tf2 | Keras TensorFlow 2.0 implementation | Machine Learning library
kandi X-RAY | bert-for-tf2 Summary
A Keras TensorFlow 2.0 implementation of BERT, ALBERT and adapter-BERT.
Top functions reviewed by kandi - BETA
- Load weights for the given Bert model
- Map name to tfhub albert variable name
- Check whether a model is a TF Hub model
- Checks if the given ckpt_path exists
- Call the layer
- Create attention mask
- Tokenize text
- Convert input to unicode
- Split text into tokens
- Connects the model
- Get activation function
- Connects the transformer
- Load vocabulary
- Call the model
- Call the attention layer
- Convert ids to tokens
- Convert an item or sequence of items into a list
- Get the version number
- Encode the ids in text
- Encode text into word pieces
- Ensure text is printable
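Several of the functions listed above (tokenize, create an attention mask, convert ids to tokens) are ordinary input-preparation steps for BERT-style models. A minimal, library-independent sketch of the padding and masking part (function names here are illustrative, not bert-for-tf2's actual API):

```python
def pad_ids(ids, max_seq_len, pad_id=0):
    """Truncate or right-pad a list of token ids to max_seq_len."""
    return (ids + [pad_id] * max_seq_len)[:max_seq_len]

def attention_mask(padded_ids, pad_id=0):
    """1 for real tokens, 0 for padding -- the usual BERT attention mask."""
    return [0 if t == pad_id else 1 for t in padded_ids]

# Example with typical BERT special tokens ([CLS]=101, [SEP]=102):
ids = pad_ids([101, 2023, 2003, 102], max_seq_len=8)
mask = attention_mask(ids)
# ids  -> [101, 2023, 2003, 102, 0, 0, 0, 0]
# mask -> [1, 1, 1, 1, 0, 0, 0, 0]
```

The mask is what "Create attention mask" in the list above produces: it lets the transformer attend only to real tokens, not padding.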
bert-for-tf2 Key Features
bert-for-tf2 Examples and Code Snippets
Community Discussions
Trending Discussions on bert-for-tf2
QUESTION
I am having a hard time converting albert (more specifically, albert_base model) to tflite. Here is my code defining my model using bert-for-tf2 (https://github.com/kpe/bert-for-tf2) <- thanks for this great implementation by the way...
...ANSWER
Answered 2019-Dec-11 at 07:47Funny that I had been struggling with this problem for hours, but I solved it right after posting the question...
So the solution is: use tensorflow version 1.15.0! Using tensorflow 2 seems to cause the issue.
However, I still cannot convert the model to tflite, since tflite does not support the 'IdentityN' op yet. I don't think I can write a custom op myself, so I think I should just wait for a tflite update....
QUESTION
I wanted to fine-tune Albert_base with further mlm task, but I realized there is no pretrained ckpt file provided for albert-base. So my plan was to convert the saved_model(or model loaded from tf-hub) to checkpoint myself, and then pretrain albert-base using the code provided (https://github.com/google-research/ALBERT/blob/master/run_pretraining.py).
Before further pretraining, to check whether the conversion to ckpt was successful, I re-converted the ckpt file to saved_model format, and loaded it as keras layer using bert-for-tf2 (https://github.com/kpe/bert-for-tf2/tree/master/bert) However, when I loaded the re-converted albert_base, its embeddings were different from those from the one loaded from the original albert_base.
Here is how I converted the original saved_model to ckpt, and then back to saved_model. (I used tf version 1.15.0 on colab)
...ANSWER
Answered 2019-Dec-18 at 06:15Problem Solved! So the problem WAS actually the difference in tensor names. So I changed the names of tensors in the checkpoints using the following code (https://gist.github.com/batzner/7c24802dd9c5e15870b4b56e22135c96).
Just need to change 'module/bert/....' to 'bert/....' and it's all good.
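The renaming done by the linked gist boils down to stripping the `module/` prefix from every variable name before re-saving the checkpoint (the re-save itself uses `tf.train.Saver` under TF 1.15, as in the gist). A minimal, framework-free sketch of just the name mapping (the function name is illustrative):

```python
def strip_module_prefix(var_name):
    """Map a TF Hub export name like 'module/bert/embeddings/...'
    to the stock-checkpoint name 'bert/embeddings/...'."""
    prefix = "module/"
    if var_name.startswith(prefix):
        return var_name[len(prefix):]
    return var_name

# e.g. strip_module_prefix("module/bert/pooler/dense/kernel")
#      -> "bert/pooler/dense/kernel"
```

Applying this mapping to every variable in the checkpoint makes the tensor names match what bert-for-tf2 expects when loading stock weights.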
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install bert-for-tf2
You can use bert-for-tf2 like any standard Python library. You will need a development environment with a Python distribution (including header files), a compiler, pip, and git installed. Make sure your pip, setuptools, and wheel are up to date. When using pip, it is generally recommended to install packages in a virtual environment to avoid changes to the system Python.
Support