
tensor2tensor | deep learning models and datasets designed to make deep learning more accessible | Machine Learning library

by tensorflow | Python Version: v1.15.7 | License: Apache-2.0


kandi X-RAY | tensor2tensor Summary

tensor2tensor is a Python library typically used in Institutions, Learning, Education, Artificial Intelligence, Machine Learning, Deep Learning, PyTorch, and TensorFlow applications. tensor2tensor has no reported bugs or vulnerabilities, has a build file available, carries a permissive license, and has high support. You can install it with 'pip install tensor2tensor' or download it from GitHub or PyPI.
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.

Support

  • tensor2tensor has a highly active ecosystem.
  • It has 11369 stars, 2907 forks, and 453 watchers.
  • It had no major release in the last 12 months.
  • There are 551 open issues and 667 closed issues. On average, issues are closed in 39 days. There are 9 open pull requests and 0 closed pull requests.
  • It has a negative sentiment in the developer community.
  • The latest version of tensor2tensor is v1.15.7.

Quality

  • tensor2tensor has 0 bugs and 0 code smells.

Security

  • tensor2tensor has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
  • tensor2tensor code analysis shows 0 unresolved vulnerabilities.
  • There are 0 security hotspots that need review.

License

  • tensor2tensor is licensed under the Apache-2.0 License. This license is Permissive.
  • Permissive licenses have the least restrictions, and you can use them in most projects.

Reuse

  • tensor2tensor releases are available to install and integrate.
  • Deployable package is available in PyPI.
  • Build file is available. You can build the component from source.
  • Installation instructions, examples and code snippets are available.
  • tensor2tensor saves you 78413 person hours of effort in developing the same functionality from scratch.
  • It has 86910 lines of code, 6563 functions and 471 files.
  • It has medium code complexity. Code complexity directly impacts maintainability of the code.
Top functions reviewed by kandi - BETA

kandi has reviewed tensor2tensor and discovered the below as its top functions. This is intended to give you an instant insight into tensor2tensor implemented functionality, and help decide if they suit your requirements.

  • Perform beam search.
  • Evolved Transformer decoder.
  • Multihead attention (see the sketch after this list).
  • Basic hyperparameter set.
  • Input function.
  • Perform multihead attention.
  • Multi-layer transformer.
  • Transformer.
  • Define a wrapper for collect.
  • Apply a convolutional layer.
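
These building blocks can also be called directly from Python. The following is a minimal sketch, not part of the kandi review, assuming the TF 1.x stack that tensor2tensor v1.15 targets and the argument names used by common_attention.multihead_attention in recent releases:

# Hedged sketch: self-attention over a toy batch using the multihead
# attention primitive listed above. Argument names follow
# tensor2tensor.layers.common_attention and may differ between releases.
import tensorflow as tf
from tensor2tensor.layers import common_attention

x = tf.random.normal([8, 16, 64])                      # [batch, length, hidden]
bias = common_attention.attention_bias_ignore_padding(
    tf.zeros([8, 16]))                                 # no padded positions
y = common_attention.multihead_attention(
    query_antecedent=x,
    memory_antecedent=None,                            # None means self-attention
    bias=bias,
    total_key_depth=64,
    total_value_depth=64,
    output_depth=64,
    num_heads=4,
    dropout_rate=0.0)
print(y.shape)                                         # (8, 16, 64)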

tensor2tensor Key Features

Many state-of-the-art and baseline models are built in, and new models can be added easily (open an issue or pull request!); a registration sketch follows this list.

Many datasets across modalities - text, audio, image - available for generation and use, and new ones can be added easily (open an issue or pull request for public datasets!).

Models can be used with any dataset and input mode (or even multiple); all modality-specific processing (e.g. embedding lookups for text tokens) is done with bottom and top transformations, which are specified per-feature in the model.

Support for multi-GPU machines and synchronous (1 master, many workers) and asynchronous (independent workers synchronizing through a parameter server) distributed training.

Easily swap amongst datasets and models by command-line flag with the data generation script t2t-datagen and the training script t2t-trainer.

Train on Google Cloud ML and Cloud TPUs.
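
As referenced above, adding a model typically means subclassing T2TModel and registering it: the per-feature bottom and top modality transformations are supplied by the framework, so the body only maps hidden states to hidden states. A minimal sketch, assuming the registry and T2TModel APIs from recent 1.x releases; MyTinyModel is a hypothetical name and the single dense layer is illustrative only:

# Hedged sketch of registering a custom model with the tensor2tensor registry.
import tensorflow as tf
from tensor2tensor.layers import common_layers
from tensor2tensor.utils import registry, t2t_model


@registry.register_model
class MyTinyModel(t2t_model.T2TModel):
  """body() receives already-embedded inputs (the per-feature `bottom`
  transformation) and returns hidden states for the `top` transformation."""

  def body(self, features):
    x = features["inputs"]
    return common_layers.dense(x, self.hparams.hidden_size,
                               activation=tf.nn.relu)

Once registered, the class should be selectable from the command line as --model=my_tiny_model, since the registry snake-cases class names by default.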

Quick Start

pip install tensor2tensor && t2t-trainer \
  --generate_data \
  --data_dir=~/t2t_data \
  --output_dir=~/t2t_train/mnist \
  --problem=image_mnist \
  --model=shake_shake \
  --hparams_set=shake_shake_quick \
  --train_steps=1000 \
  --eval_steps=100
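
The --generate_data flag above downloads and preprocesses MNIST before training. As a rough sketch of the programmatic equivalent, assuming the problems helper module and Problem.generate_data API used by t2t-datagen, the same data can be produced from Python:

# Hedged sketch: generate the quick-start dataset without the t2t-datagen CLI.
import os
from tensor2tensor import problems

data_dir = os.path.expanduser("~/t2t_data")
tmp_dir = "/tmp/t2t_datagen"
os.makedirs(data_dir, exist_ok=True)
os.makedirs(tmp_dir, exist_ok=True)

mnist = problems.problem("image_mnist")   # look up the registered problem
mnist.generate_data(data_dir, tmp_dir)    # writes TFRecords into data_dir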

Walkthrough

pip install tensor2tensor

# See what problems, models, and hyperparameter sets are available.
# You can easily swap between them (and add new ones).
t2t-trainer --registry_help

PROBLEM=translate_ende_wmt32k
MODEL=transformer
HPARAMS=transformer_base_single_gpu

DATA_DIR=$HOME/t2t_data
TMP_DIR=/tmp/t2t_datagen
TRAIN_DIR=$HOME/t2t_train/$PROBLEM/$MODEL-$HPARAMS

mkdir -p $DATA_DIR $TMP_DIR $TRAIN_DIR

# Generate data
t2t-datagen \
  --data_dir=$DATA_DIR \
  --tmp_dir=$TMP_DIR \
  --problem=$PROBLEM

# Train
# *  If you run out of memory, add --hparams='batch_size=1024'.
t2t-trainer \
  --data_dir=$DATA_DIR \
  --problem=$PROBLEM \
  --model=$MODEL \
  --hparams_set=$HPARAMS \
  --output_dir=$TRAIN_DIR

# Decode

DECODE_FILE=$DATA_DIR/decode_this.txt
echo "Hello world" >> $DECODE_FILE
echo "Goodbye world" >> $DECODE_FILE
echo -e 'Hallo Welt\nAuf Wiedersehen Welt' > ref-translation.de

BEAM_SIZE=4
ALPHA=0.6

t2t-decoder \
  --data_dir=$DATA_DIR \
  --problem=$PROBLEM \
  --model=$MODEL \
  --hparams_set=$HPARAMS \
  --output_dir=$TRAIN_DIR \
  --decode_hparams="beam_size=$BEAM_SIZE,alpha=$ALPHA" \
  --decode_from_file=$DECODE_FILE \
  --decode_to_file=translation.en

# See the translations
cat translation.en

# Evaluate the BLEU score
# Note: Report this BLEU score in papers, not the internal approx_bleu metric.
t2t-bleu --translation=translation.en --reference=ref-translation.de
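
The registry queried by t2t-trainer --registry_help can also be inspected from Python. A minimal sketch, assuming the registry helpers available in recent releases; importing the models and problems packages is what registers the built-ins:

# Hedged sketch: list what --registry_help prints, programmatically.
from tensor2tensor import models, problems   # importing registers built-ins
from tensor2tensor.utils import registry

print(len(registry.list_models()), "models, e.g.", registry.list_models()[:3])
print(len(problems.available()), "problems")
print(len(registry.list_hparams()), "hparams sets")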

Installation

# Assumes tensorflow or tensorflow-gpu installed
pip install tensor2tensor

# Installs with tensorflow-gpu requirement
pip install tensor2tensor[tensorflow_gpu]

# Installs with tensorflow (cpu) requirement
pip install tensor2tensor[tensorflow]
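
A quick post-install sanity check, sketched here on the assumption that the problems helper module ships with the installed version:

# Hedged sketch: confirm the package imports and built-in problems register.
import tensorflow as tf
from tensor2tensor import problems

print("TensorFlow:", tf.__version__)
print("Registered problems:", len(problems.available()))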

Run on FloydHub

# Test the quick-start on a Workspace's Terminal with this command
t2t-trainer \
  --generate_data \
  --data_dir=./t2t_data \
  --output_dir=./t2t_train/mnist \
  --problem=image_mnist \
  --model=shake_shake \
  --hparams_set=shake_shake_quick \
  --train_steps=1000 \
  --eval_steps=100

Papers

@article{tensor2tensor,
  author    = {Ashish Vaswani and Samy Bengio and Eugene Brevdo and
    Francois Chollet and Aidan N. Gomez and Stephan Gouws and Llion Jones and
    \L{}ukasz Kaiser and Nal Kalchbrenner and Niki Parmar and Ryan Sepassi and
    Noam Shazeer and Jakob Uszkoreit},
  title     = {Tensor2Tensor for Neural Machine Translation},
  journal   = {CoRR},
  volume    = {abs/1803.07416},
  year      = {2018},
  url       = {http://arxiv.org/abs/1803.07416},
}

Can't import TensorFlow 2.2.0rc2 in Google Colab when installed from setup.py

# Executes the cell in bash mode
%%bash

if [ ! -d "/content/deep-deblurring/" ]; 
    then 
        git clone https://github.com/ElPapi42/deep-deblurring;
        cd deep-deblurring/
    else 
        cd deep-deblurring/; 
        git pull; 
fi;

git checkout development
cd ..

pip uninstall -y tensorflow tensor2tensor tensorboard tensorboardcolab tensorflow-datasets tensorflow-estimator tensorflow-gan tensorflow-hub tensorflow-metadata tensorflow-privacy tensorflow-probability

pip install colab-env
pip install --upgrade grpcio

pip install -r "/content/deep-deblurring/requirements.txt"

cd deep-deblurring/
python setup.py install
cd ..
-----------------------
# [1] Remove the Colab TensorFlow import hook so the locally installed
#     TensorFlow is picked up (filter on `hook`, not an undefined `h`).
import sys
sys.meta_path[:] = [hook for hook in sys.meta_path
                    if hook.__class__.__name__ != '_TensorflowImportHook']

# [2] Verify the import now succeeds.
import tensorflow as tf
print(tf.__version__)

Community Discussions

Trending Discussions on tensor2tensor
  • Using TensorFlow with GPU taking a long time for loading library related to CUDA
  • Incomplete installation of the RASA package with the issue: FileNotFoundError: [Errno 2] No such file or directory: 'HISTORY.rst'
  • Getting ModuleNotFoundError only if debug mode is enabled
  • Can't import TensorFlow 2.2.0rc2 in Google Colab when installed from setup.py

QUESTION

Using TensorFlow with GPU taking a long time for loading library related to CUDA

Asked 2021-Jun-15 at 13:04

Machine Setting:

  • GPU: GeForce RTX 3060

  • Driver Version: 460.73.01

  • CUDA Driver Version: 11.2

  • Tensorflow: tensorflow-gpu 1.14.0

  • CUDA Runtime Version: 10.0

  • cudnn: 7.4.1

Note:

  1. The CUDA Runtime and cuDNN versions fit the guide from the official TensorFlow documentation.
  2. I've also tried tensorflow-gpu 2.0, with the same problem.

Problem:

I am using TensorFlow for an object detection task. The program gets stuck at

2021-06-05 12:16:54.099778: I tensorflow/stream_executor/platform/default/dso_loader.cc:42] Successfully opened dynamic library libcublas.so.10

for several minutes, and then gets stuck at the next loading step,

2021-06-05 12:21:22.212818: I tensorflow/stream_executor/platform/default/dso_loader.cc:42] Successfully opened dynamic library libcudnn.so.7

for an even longer time. You may check log.txt for details.

After waiting for around 30 minutes, the program starts running and works well.

However, whenever the program invokes self.session.run(...), it loads the same two CUDA-related libraries (libcublas and libcudnn) again, which is time-consuming and annoying.

I am not sure where the problem comes from or how to resolve it. Could anyone help?

Discussion Issue on Github

===================================

Update

With @talonmies's help, the problem was resolved by rebuilding the environment with correctly matched GPU, CUDA, cuDNN, and TensorFlow versions. Now it works smoothly.

ANSWER

Answered 2021-Jun-15 at 13:04

Generally, you can observe this behavior if there is any incompatibility between the TF, CUDA, and cuDNN versions.

For the GeForce RTX 3060, support starts from CUDA 11.x. Once you upgrade to TF 2.4 or TF 2.5, your issue will be resolved.

For the benefit of the community, the answer includes the tested build configurations table and the CUDA Support Matrix (images not reproduced here).
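
Not part of the original answer, but a small sketch for verifying such a setup after upgrading, assuming TF 2.4+ where tf.sysconfig.get_build_info() reports the bundled CUDA/cuDNN versions:

# Hedged sketch: check that TF sees the GPU and report its CUDA/cuDNN build.
import tensorflow as tf

print("TF:", tf.__version__)
print("GPUs:", tf.config.list_physical_devices("GPU"))
info = tf.sysconfig.get_build_info()   # available in recent TF 2.x releases
print("CUDA:", info.get("cuda_version"), "cuDNN:", info.get("cudnn_version"))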

Source https://stackoverflow.com/questions/67847219

Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.

Vulnerabilities

No vulnerabilities reported

Install tensor2tensor

This IPython notebook explains T2T and runs in your browser using a free VM from Google; no installation is needed. Alternatively, the one-command version in the Quick Start section above installs T2T, downloads MNIST, trains a model, and evaluates it.

Support

For any new features, suggestions, and bugs, create an issue on GitHub. If you have questions, check and ask on the Stack Overflow community page.
