AdaBound | An optimizer that trains as fast as Adam and as good as SGD | Machine Learning library

by Luolc · Python Version: 0.0.5 · License: Apache-2.0

kandi X-RAY | AdaBound Summary

AdaBound is a Python library typically used in Artificial Intelligence, Machine Learning, and Deep Learning applications, including PyTorch, TensorFlow, and Keras workflows. AdaBound has no known bugs or vulnerabilities, has a build file available, has a permissive license, and has medium support. You can install it with 'pip install adabound' or download it from GitHub or PyPI.

An optimizer that trains as fast as Adam and as good as SGD, for developing state-of-the-art deep learning models on a wide variety of popular tasks in the fields of CV, NLP, etc. Based on Luo et al. (2019), Adaptive Gradient Methods with Dynamic Bound of Learning Rate, in Proc. of ICLR 2019.
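The dynamic bound at the heart of the method can be sketched in a few lines. The schedule below follows the clipping functions described in the paper (an illustrative, self-contained sketch, not the library's code): the allowed step-size interval starts wide, giving Adam-like behavior, and collapses toward final_lr, giving SGD-like behavior.

```python
def adabound_bounds(step, final_lr=0.1, gamma=1e-3):
    """Lower/upper clipping bounds on the per-parameter step size.

    `gamma` controls how fast the bounds converge; the names follow
    the paper's schedule, but treat this as a sketch, not the
    library's implementation.
    """
    lower = final_lr * (1 - 1 / (gamma * step + 1))
    upper = final_lr * (1 + 1 / (gamma * step))
    return lower, upper

# Early on the interval is wide (Adam-like freedom); after many steps
# both bounds approach final_lr (SGD-like behavior).
early = adabound_bounds(1)         # very wide interval
late = adabound_bounds(1_000_000)  # both bounds close to final_lr = 0.1
```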

kandi-support: Support

AdaBound has a medium-activity ecosystem.
It has 2,898 stars, 327 forks, and 76 watchers.
It has had no major release in the last 12 months.
There are 18 open issues, and 7 issues have been closed. On average, issues are closed in 14 days. There are no open pull requests.
It has a neutral sentiment in the developer community.
The latest version of AdaBound is 0.0.5.

kandi-Quality: Quality

              AdaBound has 0 bugs and 0 code smells.

kandi-Security: Security

              AdaBound has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
              AdaBound code analysis shows 0 unresolved vulnerabilities.
              There are 0 security hotspots that need review.

kandi-License: License

              AdaBound is licensed under the Apache-2.0 License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

kandi-Reuse: Reuse

              AdaBound releases are available to install and integrate.
              Deployable package is available in PyPI.
              Build file is available. You can build the component from source.
              Installation instructions, examples and code snippets are available.
              AdaBound saves you 183 person hours of effort in developing the same functionality from scratch.
              It has 452 lines of code, 38 functions and 7 files.
              It has medium code complexity. Code complexity directly impacts maintainability of the code.

            Top functions reviewed by kandi - BETA

kandi has reviewed AdaBound and identified the functions below as its top functions. This overview is intended to give you instant insight into AdaBound's implemented functionality and help you decide whether it suits your requirements.
            • Test the loss function
            • Define a densenet
            • Shortcut for ResNet
            • Build the dataset
            • Train the network
            • Create an optimizer
            • Get argument parser
            • Build the model
• Return the name of the checkpoint (CKPT) file
• Load a model from a checkpoint
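Several of these functions ("Create an optimizer", "Train the network") revolve around the AdaBound update rule itself. Below is a minimal single-parameter sketch of that rule, assuming the Adam-style moments and the clipping schedule from the paper; hyperparameter names follow the paper, but this is an illustration, not the library's actual code.

```python
import math

def adabound_step(w, grad, state, lr=1e-3, final_lr=0.1,
                  betas=(0.9, 0.999), gamma=1e-3, eps=1e-8):
    """One AdaBound update for a single scalar parameter (sketch).

    Adam-style first/second moments, but the per-step learning rate is
    clipped into [lower, upper] bounds that converge toward final_lr.
    """
    b1, b2 = betas
    state["t"] += 1
    t = state["t"]
    # exponential moving averages of the gradient and squared gradient
    state["m"] = b1 * state["m"] + (1 - b1) * grad
    state["v"] = b2 * state["v"] + (1 - b2) * grad * grad
    # bias-corrected Adam step size
    step_size = lr * math.sqrt(1 - b2 ** t) / (1 - b1 ** t)
    # dynamic bounds: wide early (Adam-like), tight late (SGD-like)
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))
    eta = step_size / (math.sqrt(state["v"]) + eps)
    eta = min(max(eta, lower), upper)  # the AdaBound clip
    return w - eta * state["m"]

# Minimize f(w) = (w - 3)**2 starting from w = 0.
state = {"t": 0, "m": 0.0, "v": 0.0}
w = 0.0
for _ in range(2000):
    w = adabound_step(w, 2 * (w - 3), state)
```

Because the lower bound rises over time, the optimizer cannot stall with a vanishing adaptive step; because the upper bound falls, it cannot blow up on a tiny second moment.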

            AdaBound Key Features

            No Key Features are available at this moment for AdaBound.

            AdaBound Examples and Code Snippets

            Action Recognition with pytorch,Training
Python · Lines of Code: 28 · License: No License
            model: resnet18
            msc: False            # if you use temporal multi-scale input
            
            class_weight: True    # if you use class weight to calculate cross entropy or not
            writer_flag: True     # if you use tensorboardx or not
            
            n_classes: 400
            batch_size: 32
            inp  
            AdaBound in Tensorflow,Usage
Python · Lines of Code: 19 · License: Permissive (Apache-2.0)
# learning_rate can be either a scalar or a tensor

# use the exclude_from_weight_decay feature
# if you want to selectively disable updating weight-decayed weights
            
            optimizer = AdaBoundOptimizer(
                learning_rate=1e-3,
                final_lr=1e-1,
                beta_1=0.9,
                 
            Imporve-lenet
Python · Lines of Code: 9 · License: No License
            model=Net(num_classes=9).to('cuda')
            #model=MobileNetV2(n_class=9).to('cuda')
            #model=resnet50(num_class=9,pretrained=False).to('cuda')
            print(model)
            print("cuda:0")
            
            model=Net(num_classes=9).to('cpu')
            #model = MobileNetV2(n_class=9).to("cpu")
            #model=re  

            Community Discussions

            QUESTION

            How to solve the issue of fluctuations in validation accuracy?
            Asked 2020-Jan-04 at 12:47

I tried using SGD, Adadelta, AdaBound, and Adam. All of them give me fluctuations in validation accuracy. I tried all the activation functions in Keras, but I'm still getting fluctuations in val_acc.
            Training samples: 1352
            Validation Samples: 339
            Validation Accuracy

            ...

            ANSWER

            Answered 2020-Jan-04 at 12:47

Your model may be too sensitive to noise; see this answer.

Based on the answer in the link and what I see from your model, your network may be too deep for the amount of data you have (large model and not enough data ==> overfitting ==> noise sensitivity). I suggest using a simpler model as a sanity check.

The learning rate could also be a possible reason (as stated by Neb). You are using SGD's default learning rate (0.01), which may be too high. Try 1e-3 or below.

            Source https://stackoverflow.com/questions/59590327
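The effect the answer describes can be illustrated with a toy simulation (a hypothetical setup, not the questioner's model): noisy gradient descent on a simple quadratic, where a larger learning rate amplifies the step-to-step jitter, the analogue of a fluctuating validation curve.

```python
import random

def fluctuation(lr, steps=1000, noise=0.5, seed=0):
    """Mean absolute step-to-step change of the iterate on f(w) = w**2
    with noisy gradients: a crude proxy for a jittery validation curve."""
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(steps):
        grad = 2 * w + rng.gauss(0, noise)  # true gradient + noise
        delta = lr * grad
        w -= delta
        total += abs(delta)
    return total / steps

# The same noisy gradients, but a 10x smaller learning rate gives
# roughly 10x less step-to-step jitter.
jitter_default = fluctuation(0.01)   # the SGD default lr from the answer
jitter_small = fluctuation(0.001)
```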

Community Discussions and Code Snippets include sources from the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install AdaBound

AdaBound requires Python 3.6.0 or later. Currently only the PyTorch version is provided; AdaBound for TensorFlow is coming soon.

            Support

Find more information at: Website, Demos

            Install
          • PyPI

            pip install adabound

          • CLONE
          • HTTPS

            https://github.com/Luolc/AdaBound.git

          • CLI

            gh repo clone Luolc/AdaBound

          • sshUrl

            git@github.com:Luolc/AdaBound.git
