vowpal_wabbit | Vowpal Wabbit is a machine learning system which pushes the frontier of machine learning with techniques such as online, hashing, allreduce, reductions, learning2search, active, and interactive learning | Machine Learning library

by VowpalWabbit · C++ · Version: 9.8.0 · License: Non-SPDX

kandi X-RAY | vowpal_wabbit Summary

vowpal_wabbit is a C++ library typically used in Artificial Intelligence, Machine Learning, and Deep Learning applications, often alongside PyTorch or TensorFlow. vowpal_wabbit has no bugs, it has no vulnerabilities, and it has medium support. However, vowpal_wabbit has a Non-SPDX license. You can download it from GitHub.

This is the Vowpal Wabbit fast online learning code.

kandi-Support Support

              vowpal_wabbit has a medium active ecosystem.
              It has 8230 star(s) with 1954 fork(s). There are 351 watchers for this library.
              It had no major release in the last 12 months.
There are 122 open issues and 1122 have been closed. On average, issues are closed in 105 days. There are 5 open pull requests and 0 closed ones.
              It has a neutral sentiment in the developer community.
The latest version of vowpal_wabbit is 9.8.0.

            kandi-Quality Quality

              vowpal_wabbit has no bugs reported.

            kandi-Security Security

              vowpal_wabbit has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.

            kandi-License License

              vowpal_wabbit has a Non-SPDX License.
A Non-SPDX license can be an open-source license that is not SPDX-compliant, or a non-open-source license; you need to review it closely before use.

            kandi-Reuse Reuse

              vowpal_wabbit releases are available to install and integrate.
              Installation instructions are available. Examples and code snippets are not available.


            vowpal_wabbit Key Features

            No Key Features are available at this moment for vowpal_wabbit.

            vowpal_wabbit Examples and Code Snippets

Unable to build Vowpal Wabbit in CentOS 7
Python · Lines of Code: 28 · License: Strong Copyleft (CC BY-SA 4.0)
            ======= Boost ========
            cd boost_1_68_0/
            echo "using gcc : : /usr/bin/g++73 ; " >> tools/build/src/user-config.jam
            echo "using python : 3.6 : /usr/bin/python3.6 : /usr/include/python3.6m ; " >> tools/build/src/user-config.jam
            
            .
            $ cd /usr/lib/x86_64-linux-gnu/
$ sudo ln -s libboost_python-py35.so libboost_python3.so
            $ sudo ln -s libboost_python-py35.a libboost_python3.a
            
            (vowpal wabbit) contextual bandit dealing with new context
Python · Lines of Code: 5 · License: Strong Copyleft (CC BY-SA 4.0)
            vw.learn('1:-2:0.5 | my_feature_name:5')
            vw.learn('1:2:0.5 | my_feature_name:15')
            
            1:-2:0.5 |A my_feature_name:5 |B yet_another_feature:4
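
For context, here is the same flow end-to-end via the vowpalwabbit Python package, as a sketch using the Workspace API of recent releases; the feature values and costs are illustrative, not from the original answer:

from vowpalwabbit import Workspace

# Two-arm contextual bandit; --quiet suppresses progress output
vw = Workspace("--cb 2 --quiet")

# Label format: action:cost:probability, then the context features
vw.learn("1:-2:0.5 | my_feature_name:5")
vw.learn("2:2:0.5 | my_feature_name:15")

# Ask the learned policy which arm to play for a new context
print(vw.predict("| my_feature_name:10"))

vw.finish()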
            
            Can't install vowpalwabbit via pip in Windows
Python · Lines of Code: 2 · License: Strong Copyleft (CC BY-SA 4.0)
            pip install vowpalwabbit
            
            Can't install Vowpalwabbit using pip on Windows 10
Python · Lines of Code: 10 · License: Strong Copyleft (CC BY-SA 4.0)
            vcpkg install zlib:x64-windows
            vcpkg install boost-system:x64-windows
            vcpkg install boost-program-options:x64-windows
            vcpkg install boost-test:x64-windows
            vcpkg install boost-align:x64-windows
            vcpkg install boost-foreach:x64-windows
            vcpkg 
            How to interpret --audit and --invert_hash output for vowpal wabbit --rank model?
Python · Lines of Code: 31 · License: Strong Copyleft (CC BY-SA 4.0)
            4 |u john |i hammer
            ...
            
            $ echo "4 |u john |i hammer" | vw --lrq ui5 --invert_hash model.readable.txt
            
            $ cat model.readable.txt
            Version 8.6.1
            Id 
            Min label:0
            Max label:4
            bits:18
            lda:0
            0 ngram
            How to install Vowpal Wabbit Python bindings under Travis CI?
Python · Lines of Code: 6 · License: Strong Copyleft (CC BY-SA 4.0)
            install:
            - sudo apt-get install libboost-program-options-dev libboost-python-dev zlib1g-dev
            - sudo ln -sf /usr/lib/x86_64-linux-gnu/libboost_python-py35.a /usr/lib/x86_64-linux-gnu/libboost_python.a
            - sudo ln -sf /usr/lib/x86_64-linux-gnu/

            Community Discussions

            QUESTION

            Is it possible to run Vowpal Wabbit daemon with file socket instead of using a port?
            Asked 2020-Oct-30 at 16:08

The official VW documentation has an example of how to run Vowpal Wabbit in daemon mode: https://github.com/VowpalWabbit/vowpal_wabbit/wiki/Daemon-example. But it seems that VW always binds the daemon to a particular port (26542 by default).

Is it possible to use a file socket instead of a TCP socket for Vowpal Wabbit daemon mode?

            ...

            ANSWER

            Answered 2020-Oct-30 at 16:08

Right now vowpalwabbit explicitly uses TCP sockets in daemon mode, so the only thing you can define is the port you wish to use via --port.

            PS: I'm a maintainer for this project and we would be more than happy to accept a patch with this if you need it, as we currently don't have plans to support this feature. Here is the related TCP code: https://github.com/VowpalWabbit/vowpal_wabbit/blob/master/vowpalwabbit/parser.cc#L208
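
For reference, a minimal sketch of the current TCP-only flow, in the spirit of the wiki's daemon example (the input line is illustrative):

vw --daemon --port 26542 --quiet

echo "1 | feature_a:1 feature_b:0.5" | nc localhost 26542

The second command, run from another shell, sends one example over TCP and prints the prediction the daemon returns.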

            Source https://stackoverflow.com/questions/64537355

            QUESTION

            Vowpal Wabbit: specifying the learning rate schedule
            Asked 2020-Oct-30 at 12:29

            I'm looking at VW's docs for update rule options, and I'm confused about the equation that specifies the learning rate schedule using the parameters initial_t, power_t, and decay_learning_rate.

Based on the equation below this line in the docs:

    specify the learning rate schedule whose generic form

if initial_t is equal to zero (which is the default), it seems that the learning rate will always be zero, for all timesteps and epochs. Is this right?

            Also, what would happen if both initial_t and power_t are set to zero? I tried initializing a VW with those settings and it didn't complain.

            ...

            ANSWER

            Answered 2020-Oct-30 at 12:29

if initial_t is equal to zero (which is the default), it seems that the learning rate will always be zero, for all timesteps and epochs. Is this right?

            initial_t is set to zero by default. By default the initial learning rate will not use initial_t to calculate its value but will start off at its default value, which is 0.5.

Per the documentation, the flags adaptive, normalized, and invariant are on by default. If any of them is specified explicitly, the others are turned off. If you turn on the invariant flag (so we are not using normalized or adaptive), the initial learning rate is calculated using the initial_t and power_t values, and the default initial_t is then one instead of zero.

            If initial_t is explicitly set to zero combined with the invariant flag being set, then yes, the learning rate will also be zero.

            Also, what would happen if both initial_t and power_t are set to zero? I tried initializing a VW with those settings and it didn't complain.

If the initial learning rate is calculated using initial_t and power_t and both are explicitly set to zero, C++ evaluates powf(0, 0) to 1, resulting in the learning rate being set to its default value, which can be specified by --learning_rate.

            If you are running vowpalwabbit via the command line, you should be able to see what these values are set to:
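
To make the two corner cases concrete, here is a small Python sketch of the schedule's generic form as I read it from the docs (a paraphrase for illustration, not VW's actual implementation):

def learning_rate(t, epoch, lr=0.5, decay=1.0, initial_t=0.0, power_t=0.5):
    # eta_t = lr * decay^epoch * (initial_t / (initial_t + t))^power_t
    return lr * (decay ** epoch) * (initial_t / (initial_t + t)) ** power_t

# initial_t = 0 with power_t = 0.5: the ratio is 0 for every t > 0, so eta_t = 0
print(learning_rate(t=10, epoch=0))                # 0.0

# initial_t = 0 and power_t = 0: 0**0 evaluates to 1 (as powf(0, 0) does in C++),
# so the rate falls back to the default --learning_rate of 0.5
print(learning_rate(t=10, epoch=0, power_t=0.0))   # 0.5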

            Source https://stackoverflow.com/questions/64392044

            QUESTION

            Can we specify which algorithm to use (e.g., decision tree, SVM, ensemble, NNs) in Vowpal Wabbit? Or, does Automl select the algorithm itself?
            Asked 2020-Aug-12 at 06:42

I am trying to read the documentation of Vowpal Wabbit, and it doesn't specify how to select specific learning algorithms (not loss functions) like SVM, NN, decision trees, etc. How does one select a specific learning algorithm?

Or does it select the algorithm itself depending on the problem type (regression/classification), like an AutoML or low-code ML library?

There are some blogs showing how to use neural networks with the --nn option, but that isn't part of the documentation. Is this because VW doesn't focus on specific algorithms, as noted above? If so, what is Vowpal Wabbit in essence?

            ...

            ANSWER

            Answered 2020-Aug-04 at 09:12

Vowpal Wabbit is based on online learning (SGD-like updates, but there is also --bfgs if you really need batch optimization) and (machine learning) reductions. See some of the tutorials or papers to understand the idea of reductions. Many VW papers are also about contextual bandits, which are implemented as a reduction to cost-sensitive one-against-all (OAA) classification (which is further reduced to regression). See a simple intro to reductions or a simple example of how binary classification is reduced to regression.

As far as I know, Vowpal Wabbit does not support decision trees or ensembles, but see --boosting and --bootstrap. It does not support SVM, but see --loss_function hinge (hinge loss is one of the two key concepts of SVM) and --ksvm. It does not support NN, but --nn (and related options) provides very limited support simulating a single hidden layer (feed-forward with a tanh activation function), which can be added into the reduction stack.
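
To make this concrete, the options named above are ordinary command-line flags; a few illustrative invocations (data.txt is a hypothetical input file):

vw -d data.txt --boosting 10          # online boosting over 10 weak learners
vw -d data.txt --bootstrap 5          # bagging-style ensemble of 5 bootstrapped copies
vw -d data.txt --loss_function hinge  # SVM-style hinge loss
vw -d data.txt --nn 10                # one hidden layer with 10 tanh units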

            Source https://stackoverflow.com/questions/63182360

            QUESTION

How does `vw --audit` internally compute the weights of the features?
            Asked 2020-Jun-16 at 15:49

In vowpalwabbit there is an option --audit that prints the weights of the features.

            If we have a vw contextual bandit model with four arms, how is this feature weight created?

From what I understand, vowpalwabbit tries to fit one linear model to each arm.

So if weights were calculated using an average across all the arms, they would correlate with getting a reward generally, instead of indicating which features make the model pick one arm over another.

I am interested in knowing how they are calculated so I can interpret the results obtained. I tried searching the GitHub repository but could not find anything meaningful.

            ...

            ANSWER

            Answered 2020-Jun-16 at 15:49

I am interested in knowing how they are calculated so I can interpret the results obtained.

            Unfortunately knowing the first does not lead to knowing the second.

            Your question is concerned with contextual bandits, but it is important to note that interpreting model parameters is an issue that also occurs in supervised learning. Machine learning has made progress recently (i.e., my lifetime) largely by focusing concern on quality of predictions rather than meaningfulness of model parameters. In a blog post, Phoebe Wong outlines the issue while being entertaining.

The bottom line is that our models are not causal, so you simply cannot conclude that "the weight of feature X for arm A is large" means that if I were to intervene in the system and increase this feature's value, I would get more reward for playing arm A.

            We are currently working on tools for model inspection that leverage techniques such as permutation importance that will help you answer questions like "if I were to stop using a particular feature how would the frequency of playing each arm change for the trained policy". We're hoping that is helpful information.
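
Permutation importance itself is easy to illustrate outside of VW; here is a generic, model-agnostic Python sketch (not a VW API), where model is any object with a predict method and score is a metric such as accuracy:

import numpy as np

def permutation_importance(model, X, y, score, n_repeats=10, seed=0):
    """Drop in score after shuffling each column; a bigger drop means a more important feature."""
    rng = np.random.default_rng(seed)
    baseline = score(y, model.predict(X))
    importances = []
    for j in range(X.shape[1]):
        original = X[:, j].copy()
        drops = []
        for _ in range(n_repeats):
            X[:, j] = rng.permutation(X[:, j])
            drops.append(baseline - score(y, model.predict(X)))
        X[:, j] = original  # restore the column before moving on
        importances.append(float(np.mean(drops)))
    return importances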

            Having said all that, let me try to answer your original question ...

In vowpalwabbit there is an option --audit that prints the weights of the features.

            If we have a vw contextual bandit model with four arms, how is this feature weight created?

The format is documented here. Assuming you are using --cb (not --cb_adf), there is a fixed number of arms, and the offset field will increment over the arms. So, for an example like the hypothetical one sketched below:
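
The feature names here are invented purely for illustration (they are not from the original answer), but the shape of such a run would be:

echo "2:1:0.5 | user_age:25 hour:14" | vw --cb 4 --audit

Each audit line pairs a feature with its hash index and current weight, and the offset field increments over the four arms as described above.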

            Source https://stackoverflow.com/questions/62325466

            QUESTION

            Container exits even though it was started using -dit
            Asked 2018-Feb-22 at 08:45

I have learnt that if I want a container to stay alive even though no process runs in the foreground (basically, to make bash run in the foreground), I have to use docker run -dit.

Oddly enough, I am launching a container exactly that way, but it exits immediately.

            What am I doing wrong?

            The image is this. I start it using docker run --name my_container -dit islandsound/vowpal_wabbit and I want it to keep running until I stop it.

            ...

            ANSWER

            Answered 2018-Feb-22 at 08:45

A Docker container always requires a process to be running in the foreground, otherwise the container will exit. No combination of options will change that.

The options -dit control Docker and how it sets up the process, but it's purely the ENTRYPOINT and CMD in the Dockerfile (or overrides on the command line) that control whether the container stays running.

• -d detaches your screen from the container and allows it to run in the background. This doesn't keep a process in the container running, though.
• -i keeps standard input open, sometimes required for processes that expect interactive keyboard input.
• -t assigns a pseudo-TTY to the process, like your terminal.

            docker run -dit ubuntu is a trick often used on images that run interactive shells like bash to keep them running in the background so you can attach or exec things in them.

            From the image description it looks like you need to supply the options --daemon --foreground when running the image to keep the process running in the foreground.
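
So, per that description, something like the following should keep the container alive (the container name is arbitrary, and --daemon --foreground are passed through to vw):

docker run --name my_container -dit islandsound/vowpal_wabbit --daemon --foreground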

            Source https://stackoverflow.com/questions/48922436

            QUESTION

            Vowpal wabbit with Java / JNI using the linux precompiled binaries
            Asked 2018-Feb-10 at 22:38

I was testing vowpal_wabbit for the first time. Basically, I'm going to execute it in a Docker container, so I downloaded a precompiled version as suggested on the official download page, and apparently it worked as expected inside my container.

It turns out that I intend to use Java and invoke Vowpal Wabbit (the option, although not easy, is to use JNI, and later check out a wrapper for it created by the Indeed team). However, since I'm using a precompiled binary, when I try to load the native library it won't be found (because I didn't compile/make anything), right?

            Given that:

            ...

            ANSWER

            Answered 2018-Feb-10 at 22:38

            The problem is the use of autogen.sh.

            autogen.sh is provided only as a last-resort for possibly unfamiliar/unsupported environments where the provided Makefile (and child directory */Makefiles) may not work.

Among other things, autogen.sh calls GNU automake, which overwrites the Makefiles (based on probing the environment).

            To generate the JNI, you need to run make java with the original Makefiles that come with the source.

If you already overwrote the Makefiles, no worries! You can easily restore the originals by typing:
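
If the source tree is a git clone (a sketch under that assumption), one way to restore the overwritten Makefiles and rebuild the JNI target is:

git checkout -- Makefile '*/Makefile'   # discard automake's overwrites
make java                               # build the JNI bindings with the original Makefiles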

            Source https://stackoverflow.com/questions/48669098

            QUESTION

            Vowpal Wabbit Matrix Factorization on one label
            Asked 2017-Jun-23 at 16:59

What I'm after is a recommender system for the web, something like "related products". Based on the items a user has bought, I want to find related items based on what other users have bought. I've followed the MovieLens tutorial (https://github.com/JohnLangford/vowpal_wabbit/wiki/Matrix-factorization-example) for making a recommender system.

            In the example above the users gave the movies a score (1-5). The model can then predict the score a user will give a specific item.

My data, on the other hand, only knows what the user likes. I don't know what they dislike or how much they like something. So I've tried sending 1 as the value for all my entries, but that only gives me a model that returns 1 on every prediction.

Any ideas on how I can structure my data so that I can receive a prediction, between 0 and 1, of how likely the user is to like an item?

            Example data:

            ...

            ANSWER

            Answered 2017-Jun-23 at 16:59
            Short answer to the question:

            To get a prediction resembling "probabilities" you could use --loss_function logistic --link logistic. Be aware that in this single-label setting your probabilities risk tending to 1.0 quickly (i.e. become meaningless).

            Additional notes:
• Working with a single label is problematic in the sense that there's no separation in the goal: eventually the learner will peg all predictions to 1.0. To counter that, it is recommended to use --noconstant, use strong regularization, decrease the learning rate, avoid multiple passes, etc. (IOW: anything that avoids over-fitting to the single label).
• Even better: add examples where the user hasn't bought/clicked. They should be plentiful, and they will make your model much more robust and meaningful.
• There's a better implementation of matrix factorization in vw (much faster and lighter on IO for big models). Check the --lrq option and the full demo under demo/movielens in the source tree; a minimal command combining these suggestions is sketched after this list.
• You should pass the training set directly to vw to avoid a "useless use of cat".
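
A minimal sketch combining the suggestions above (the file names and the u/i namespace letters are hypothetical, and rank 7 is arbitrary):

vw -d train.vw --lrq ui7 --loss_function logistic --link logistic \
   --noconstant --l2 1e-6 -l 0.02 -b 24 -p predictions.txt

Here --lrq ui7 learns rank-7 latent factors for interactions between the u and i namespaces, in the spirit of the demo/movielens example.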

            Source https://stackoverflow.com/questions/44702005

            QUESTION

            Categorical Features in Vowpal Wabbit
            Asked 2017-May-30 at 11:06

This link says that currently all feature labels must be followed by a float. But when I enter -1 3 |context day:Monday in this validator, it accepts it, with day as a feature whose value is Monday.

Further, if I can provide strings as values for a feature, how can I provide values which contain spaces? For example, -1 3 |context day:Monday name:A B keeps only A as the value of name and treats B as another feature. But in fact I want to assign the feature name the value "A B".

            ...

            ANSWER

            Answered 2017-May-30 at 11:06

            all feature labels must be followed by a float

            Yes, but if no colon and float is provided, the default feature value is 1.0.

But when I enter -1 3 |context day:Monday in this validator, it accepts it

The validator is only approximate and has not been kept up to date for several years. I am not aware of any VW base learner that would allow non-float feature values.

A solution to your problem is to escape the spaces in your categorical feature values with underscores, and to convert a categorical feature with N values into N binary features (in the end it's the same). For example: -1 3 |context day_Monday name_A_B
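
A small sketch of that encoding in Python (the helper name is mine, purely illustrative):

def to_vw_line(label, namespace, features):
    # {"day": "Monday", "name": "A B"} -> "day_Monday name_A_B"
    tokens = [f"{k}_{v.replace(' ', '_')}" for k, v in features.items()]
    return f"{label} |{namespace} " + " ".join(tokens)

print(to_vw_line("-1 3", "context", {"day": "Monday", "name": "A B"}))
# -1 3 |context day_Monday name_A_B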

            Source https://stackoverflow.com/questions/43868131

Community Discussions and Code Snippets contain sources from the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install vowpal_wabbit

For the most up-to-date instructions for getting started on Windows, macOS, or Linux, please see the wiki. This includes:
            Installing with a package manager
            Dependencies
            Building
            Tutorial

            Support

For any new features, suggestions, and bugs, create an issue on GitHub. If you have any questions, check and ask on Stack Overflow.
            CLONE
          • HTTPS

            https://github.com/VowpalWabbit/vowpal_wabbit.git

          • CLI

            gh repo clone VowpalWabbit/vowpal_wabbit

          • sshUrl

            git@github.com:VowpalWabbit/vowpal_wabbit.git
