BackPropNetwork | Backpropagation using Stochastic gradient descent

by kipgparker | C# | Version: Current | License: MIT

kandi X-RAY | BackPropNetwork Summary

BackPropNetwork is a C# library. It has no bugs, no reported vulnerabilities, a permissive license, and low support. You can download it from GitHub.

Backpropagation using Stochastic gradient descent
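As a rough illustration of the technique the library implements, the sketch below shows one stochastic-gradient-descent update for a single sigmoid unit with a squared-error loss. It is a minimal, self-contained example of the general idea, not BackPropNetwork's actual API; all names and values in it are hypothetical.

    using System;

    // Minimal sketch of one stochastic-gradient-descent step for a single sigmoid
    // unit: forward pass, backpropagated error, weight update on one example.
    // Illustrative only; this is not BackPropNetwork's actual API.
    class SgdSketch
    {
        static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));

        static void Main()
        {
            var rng = new Random(0);
            double[] w = { rng.NextDouble() - 0.5, rng.NextDouble() - 0.5, rng.NextDouble() - 0.5 };
            double bias = 0.0;
            double learningRate = 0.1;

            // One (hypothetical) training example.
            double[] x = { 0.2, 0.7, 0.1 };
            double target = 1.0;

            // Forward pass.
            double z = bias;
            for (int i = 0; i < w.Length; i++) z += w[i] * x[i];
            double output = Sigmoid(z);

            // Backward pass: dLoss/dz for squared error through the sigmoid.
            double delta = (output - target) * output * (1.0 - output);

            // Stochastic gradient descent update using this single example.
            for (int i = 0; i < w.Length; i++) w[i] -= learningRate * delta * x[i];
            bias -= learningRate * delta;

            Console.WriteLine($"output = {output:F4}, delta = {delta:F4}");
        }
    }

In full stochastic gradient descent, this per-example update is repeated over shuffled training examples for many epochs, and backpropagation applies the same chain rule layer by layer through the network.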

Support

              BackPropNetwork has a low active ecosystem.
              It has 26 star(s) with 20 fork(s). There are 2 watchers for this library.
              It had no major release in the last 6 months.
There is 1 open issue and 1 has been closed. On average, issues are closed in 100 days. There are no pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of BackPropNetwork is current.

Quality

              BackPropNetwork has 0 bugs and 0 code smells.

Security

              BackPropNetwork has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
              BackPropNetwork code analysis shows 0 unresolved vulnerabilities.
              There are 0 security hotspots that need review.

License

              BackPropNetwork is licensed under the MIT License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

Reuse

              BackPropNetwork releases are not available. You will need to build from source code and install.

            Top functions reviewed by kandi - BETA

            kandi's functional review helps you automatically verify the functionalities of the libraries and avoid rework.
Currently covering the most popular Java, JavaScript and Python libraries.

            BackPropNetwork Key Features

            No Key Features are available at this moment for BackPropNetwork.

            BackPropNetwork Examples and Code Snippets

            No Code Snippets are available at this moment for BackPropNetwork.

            Community Discussions

            Trending Discussions on BackPropNetwork

            QUESTION

            Why Is My Machine Learning Algorithm Getting Stuck?
            Asked 2020-Jun-02 at 09:04

So I am hitting a wall with my C# machine learning project. I am attempting to train an algorithm to recognize numbers. Since this is only an exercise, I have an image set of 200 numbers (20 each for 0 to 9). Obviously, if I wanted a properly trained algorithm I would use a more robust training set, but this is just an exercise to see if I can get it working in the first place. I can get it up to 60% accuracy, but not past that. I have been doing some research into activation functions and, from what I understand, LeakyReLU is the function I should be using. However, if I use the LeakyReLU function across the board then it doesn't learn anything, and I'm not sure how to use LeakyReLU as an output activation function. Using sigmoid or tanh as an output activation function makes more sense to me. Here is a block of code that creates the array that feeds the backpropagation:

            ...
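For reference, the activation functions the question compares could look like this in C#. This is an illustrative sketch, not the asker's code, and the 0.01 LeakyReLU slope is an assumed typical value.

    using System;

    // Common activation functions discussed in the question (illustrative only).
    static class Activations
    {
        // LeakyReLU: passes positive inputs through, scales negatives by a small slope.
        public static double LeakyRelu(double x, double slope = 0.01) => x > 0 ? x : slope * x;

        // Sigmoid: squashes any input into (0, 1); a common choice for output layers.
        public static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));

        // Tanh: squashes any input into (-1, 1).
        public static double Tanh(double x) => Math.Tanh(x);

        static void Main()
        {
            double x = -2.0;
            Console.WriteLine($"LeakyReLU: {LeakyRelu(x)}, Sigmoid: {Sigmoid(x):F4}, Tanh: {Tanh(x):F4}");
        }
    }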

            ANSWER

            Answered 2020-Jun-02 at 09:04

Added the answer for future visitors:

• Try converting the grayscale values from the 0-255 interval to the 0-1 interval: just divide each pixel by 255 (see the sketch after this list). The fact that LeakyReLU performed better than sigmoid or tanh is because the values are too large, large in the sense that they get mistreated by tanh and sigmoid and rounded to integers by the computer.

• Look carefully at how the neural network weights are initialised if you intend to use tanh or sigmoid.

            • Since this is a classification problem, I recommend you use a softmax activation function in your output layer.
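The first and third suggestions could be sketched in C# roughly as follows. This is an illustrative sketch only, with hypothetical names and values; it is not the asker's code or part of BackPropNetwork.

    using System;
    using System.Linq;

    // Sketch of two of the suggestions: scale 0-255 grayscale pixels into [0, 1],
    // and use softmax on the output layer of a 10-class digit classifier.
    class PreprocessAndSoftmax
    {
        // Normalize raw grayscale bytes (0-255) to the [0, 1] interval.
        static double[] Normalize(byte[] pixels) => pixels.Select(p => p / 255.0).ToArray();

        // Softmax: turns raw output scores into a probability distribution over
        // the classes. Subtracting the maximum keeps Exp() numerically stable.
        static double[] Softmax(double[] scores)
        {
            double max = scores.Max();
            double[] exps = scores.Select(s => Math.Exp(s - max)).ToArray();
            double sum = exps.Sum();
            return exps.Select(e => e / sum).ToArray();
        }

        static void Main()
        {
            byte[] rawPixels = { 0, 128, 255 };           // hypothetical pixel values
            double[] inputs = Normalize(rawPixels);       // -> 0.000, 0.502, 1.000

            double[] outputScores = { 1.2, 0.3, -0.5, 2.1, 0.0, 0.7, -1.3, 0.4, 0.9, -0.2 };
            double[] probabilities = Softmax(outputScores);
            Console.WriteLine($"Predicted digit: {Array.IndexOf(probabilities, probabilities.Max())}");
            Console.WriteLine($"Normalized pixels: {string.Join(", ", inputs.Select(v => v.ToString("F3")))}");
        }
    }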

After preprocessing the data, @JMC0352 got only 88% accuracy.

The reason you are getting only 88% is that a plain neural network (alone) is not well suited for image recognition; convolutional neural networks are used for that. To understand the problem intuitively, you can picture a raw neural network as trying to make sense of all the pixels together, whereas a conv net makes sense of relatively close pixels.
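To make that intuition concrete: a convolutional layer combines only a small neighbourhood of nearby pixels at each position, whereas a fully connected layer mixes every pixel with every other one. Below is a minimal sketch of a single 3x3 convolution step (illustrative only, with hypothetical values; not part of BackPropNetwork).

    using System;

    // A fully connected layer weights every pixel against every unit; a convolution
    // instead slides a small kernel over nearby pixels only. Minimal 3x3 example.
    class ConvSketch
    {
        // Apply a 3x3 kernel to the neighbourhood whose top-left corner is (row, col).
        static double ConvolveAt(double[,] image, double[,] kernel, int row, int col)
        {
            double sum = 0.0;
            for (int i = 0; i < 3; i++)
                for (int j = 0; j < 3; j++)
                    sum += image[row + i, col + j] * kernel[i, j];   // only nearby pixels
            return sum;
        }

        static void Main()
        {
            // A tiny 4x4 "image" and an edge-detecting kernel (hypothetical values).
            double[,] image =
            {
                { 0.0, 0.1, 0.2, 0.3 },
                { 0.1, 0.9, 0.9, 0.2 },
                { 0.2, 0.9, 0.9, 0.1 },
                { 0.3, 0.2, 0.1, 0.0 }
            };
            double[,] kernel =
            {
                { -1, -1, -1 },
                { -1,  8, -1 },
                { -1, -1, -1 }
            };
            Console.WriteLine(ConvolveAt(image, kernel, 0, 0));
        }
    }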

            Source https://stackoverflow.com/questions/62139387

Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install BackPropNetwork

            You can download it from GitHub.

            Support

For any new features, suggestions, and bugs, create an issue on GitHub. If you have any questions, check for existing answers and ask on Stack Overflow.

            CLONE
          • HTTPS

            https://github.com/kipgparker/BackPropNetwork.git

          • CLI

            gh repo clone kipgparker/BackPropNetwork

• SSH

            git@github.com:kipgparker/BackPropNetwork.git
