tensorly | TensorLy: Tensor Learning in Python | Machine Learning library

by tensorly | Python Version: 0.8.1 | License: Non-SPDX

kandi X-RAY | tensorly Summary

tensorly is a Python library typically used in Artificial Intelligence, Machine Learning, PyTorch, TensorFlow, and NumPy applications. tensorly has no bugs, no reported vulnerabilities, a build file available, and high support. However, tensorly has a Non-SPDX license. You can install it with 'pip install tensorly' or download it from GitHub or PyPI.

TensorLy: Tensor Learning in Python.

Support

tensorly has a highly active ecosystem.
It has 1,387 stars, 271 forks, and 44 watchers.
It had no major release in the last 12 months.
There are 51 open issues and 183 closed issues. On average, issues are closed in 289 days. There are 5 open pull requests and 0 closed pull requests.
It has a positive sentiment in the developer community.
The latest version of tensorly is 0.8.1.

Quality

              tensorly has 0 bugs and 0 code smells.

Security

              tensorly has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
              tensorly code analysis shows 0 unresolved vulnerabilities.
              There are 0 security hotspots that need review.

License

              tensorly has a Non-SPDX License.
A Non-SPDX license can be an open-source license that simply lacks an SPDX identifier, or it can be a non-open-source license; you need to review it closely before use.

Reuse

tensorly releases are available to install and integrate.
A deployable package is available on PyPI.
A build file is available, so you can build the component from source.
It has 11,484 lines of code, 838 functions, and 149 files.
It has high code complexity, which directly impacts the maintainability of the code.

            Top functions reviewed by kandi - BETA

kandi has reviewed tensorly and discovered the below as its top functions. This is intended to give you an instant insight into the functionality tensorly implements, and to help you decide if it suits your requirements.
• Calculate parafac.
• Validate constraints.
• Constrain a tensor.
• Perform a cross product of the input tensor.
• Compute the parafac model.
• Calculate the non-negative Tucker Hessian.
• Calculate non-negative parafac HALS.
• Calculate the HALS decomposition problem.
• Generate an RST directive.
• Import the phantom module.

            tensorly Key Features

            No Key Features are available at this moment for tensorly.

            tensorly Examples and Code Snippets

Install guide: To use sparse matrices
Python · 4 lines of code · License: Permissive (MIT)
$ git clone https://github.com/pydata/sparse/ && cd sparse
            $ pip install .
            
$ git clone https://github.com/jcrist/tensorly.git tensorly-sparse && cd tensorly-sparse
            $ git checkout sparse-take-2
              
Optimizing Deep Neural Network: Requirements
Python · 2 lines of code · No license
            pip install keras
            pip install tensorly
              
prgds: Dependencies
Python · 2 lines of code · No license
            conda install -c anaconda gcc
            
            conda install -c conda-forge gsl
              
            partial tucker decomposition
Python · 45 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            import mxnet as mx
            import numpy as np
            import tensorly as tl
            import matplotlib.pyplot as plt
            import tensorly.decomposition
            
            # Load data
            mnist = mx.test_utils.get_mnist()
            train_data = mnist['train_data'][:,0]
            
            
            err = np.zeros([28,28]) # here
            Specific tensor decomposition
Python · 18 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            import tensorly as tl
            from tensorly import random
            from tensorly.decomposition import partial_tucker
            
            size = 10
            order = 3
            shape = (size, )*order
            tensor = random.random_tensor(shape)
            
core, factors = partial_tucker(tensor, modes=[0, 1], rank=[2, 2])  # modes/ranks are illustrative; the original snippet is truncated here
            Nonnegative tensor decomposition example using Tensorly
Python · 15 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            import tensorly as tl
            import numpy as np
            
            X = tl.tensor(np.random.random((10, 11, 12)))
            
            from tensorly.decomposition import robust_pca
            
            D, E = robust_pca(X)
            
from tensorly.decomposition import non_negative_parafac  # the original snippet is truncated here
            Multiplication of two-dimensional and three-dimensional tensors
Python · 11 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            import tensorly as tl
            import numpy as np
            tl.set_backend('tensorflow')
            
            k = 2; m = 3; n = 5; h = 4
            
            A = tl.tensor(np.random.random((m, n)))
            B = tl.tensor(np.random.random((k, n, h)))
            
            res = tl.tenalg.contract(A, 1, B, 1)
            
            Python analogue of "ktensor" function from matlab
Python · 18 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            import numpy as np
            import tensorly as tl
            tensor = tl.tensor([[ 0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.,  0.],
                                [ 0.,  0.,  0.,  0.,  1.,  1.,  1.,  1.,  0.,  0.,  0.,  0.],
                                [ 0.,  0.,  0.
            Kronecker product source code in TensorLy
Python · 19 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            a = [[1, 2, 3],
                 [2, 3, 4],
                 [3, 4, 5]]
            b = [[6, 7]]
            
            [[[[1], [2], [3]]],
             [[[2], [3], [4]]],
             [[[3], [4], [5]]]]
            
            [[[[6, 7]]]]
            
[[[[1*6, 1*7], [2*6, 2*7], [3*6, 3*7]]],
 [[[2*6, 2*7], [3*6, 3*7], [4*6, 4*7]]],
 [[[3*6, 3*7], [4*6, 4*7], [5*6, 5*7]]]]
Truncated SVD decomposition of a PyTorch tensor without transferring to CPU
Python · 5 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            import tensorly as tl
            tl.set_backend('pytorch')
            
            U, S, V = tl.truncated_svd(matrix, n_eigenvecs=10)
            

            Community Discussions

            QUESTION

            partial tucker decomposition
            Asked 2021-Dec-28 at 21:06

I want to apply a partial Tucker decomposition algorithm to reduce the MNIST image tensor dataset of shape (60000, 28, 28), in order to preserve its features when applying another machine learning algorithm afterwards, such as an SVM. I have this code that reduces the second and third dimensions of the tensor:

            ...

            ANSWER

            Answered 2021-Dec-28 at 21:05

So if you look at the source code for tensorly linked here, you can see that the documentation for the function in question, partial_tucker, says:
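The quoted docstring is cut off in this snippet. As a rough sketch of the usage the answer points to (not the asker's original code, and assuming the 0.7-era partial_tucker API, whose argument order and return format have shifted between TensorLy versions), one can decompose only the image modes while leaving the sample mode untouched:

import numpy as np
import tensorly as tl
from tensorly.decomposition import partial_tucker

# Hypothetical stand-in for the (60000, 28, 28) MNIST tensor (smaller here for speed)
X = tl.tensor(np.random.random((100, 28, 28)))

# Decompose only modes 1 and 2 (the image axes); keyword arguments avoid
# depending on the positional order, which changed between versions
core, factors = partial_tucker(X, modes=[1, 2], rank=[10, 10])

print(core.shape)  # (100, 10, 10): each 28x28 image compressed to a 10x10 core slice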

            Source https://stackoverflow.com/questions/70466992

            QUESTION

            Python tensorly parafac returning ValueError
            Asked 2021-May-07 at 09:33

            I have a set of manufactured data (generated from an explicit mathematical function) stored in a 3-dimensional tensor called A. When I try to run parafac, I receive the following:

            ...

            ANSWER

            Answered 2021-May-07 at 09:33

In the latest version of TensorLy, parafac returns a CPTensor that acts as a tuple (weights, factors): in addition to the factors of the decomposition, you also get a vector of weights. This is because the CP decomposition expresses the original tensor as a weighted sum of rank-1 tensors.

            In other words, if you are using the latest version of TensorLy, your code should be either:
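The answer's code options are not included in this excerpt; as a minimal sketch under the same assumptions, with a hypothetical random tensor standing in for A, the two unpacking styles look like this:

import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

# Hypothetical stand-in for the 3-dimensional tensor A from the question
A = tl.tensor(np.random.random((4, 5, 6)))

# Option 1: unpack the returned CPTensor as a (weights, factors) pair
weights, factors = parafac(A, rank=3)

# Option 2: keep the CPTensor and index it like a tuple
cp_tensor = parafac(A, rank=3)
factors = cp_tensor[1]  # the list of factor matrices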

            Source https://stackoverflow.com/questions/67415972

            QUESTION

            How do I design a test for the partial_tucker function from tensorly?
            Asked 2020-Apr-23 at 14:05

I am trying to design a test to verify that the partial_tucker function from tensorly works as I expect. In other words, I want to design an input for the partial_tucker function along with its associated expected output.

So, what I have tried to do is to take an initial random tensor A (of order 4), compute its "low rank" Tucker decomposition by hand, then reconstruct a tensor of the same shape as the initial tensor, say A_tilde. I think the A_tilde tensor is then the "low rank approximation" of the initial tensor A. Am I correct?

Then I would like to use the partial_tucker function on that A_tilde tensor, and I expect the result to be the same as the Tucker decomposition that I computed by hand. That is not the case, so I guess my handcrafted Tucker decomposition is wrong. If so, why?

            ...

            ANSWER

            Answered 2020-Apr-23 at 14:05

You are implicitly making a lot of assumptions here: you are assuming, for instance, that you can just trim a rank-R decomposition to get the rank-(R-1) decomposition. This is in general not true. Also, note that the Tucker decomposition you are using is not just Higher-Order SVD (HO-SVD). Rather, HO-SVD is used for initialization and is followed by Higher-Order Orthogonal Iteration (HOOI).

You are also assuming that the low-rank decomposition is unique for any given rank, which would allow you to compare the factors of the decomposition directly. This is also not the case (even in the matrix case, and with strong constraints such as orthonormality, you would still have sign indeterminacy).

Instead, you could, for example, check the relative reconstruction error. I suggest you have a look at the tests in TensorLy. There are lots of good references on this if you are starting with tensors: for instance, the seminal work by Kolda and Bader, and for Tucker in particular, the work by De Lathauwer et al. (e.g. on the best low-rank approximation of tensors).
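As a rough sketch of that kind of test (a hypothetical order-4 tensor with arbitrary ranks, again assuming the 0.7-era partial_tucker API and using keyword arguments to stay compatible across signature changes), one could check the relative reconstruction error like this:

import numpy as np
import tensorly as tl
from tensorly.decomposition import partial_tucker
from tensorly.tenalg import multi_mode_dot

# Hypothetical order-4 test tensor
A = tl.tensor(np.random.random((6, 7, 8, 9)))

# Decompose two of the four modes, then reconstruct along those same modes
core, factors = partial_tucker(A, modes=[2, 3], rank=[4, 4])
A_tilde = multi_mode_dot(core, factors, modes=[2, 3])

# Compare by relative reconstruction error rather than factor-by-factor equality
rel_error = tl.norm(A - A_tilde) / tl.norm(A)
print(float(rel_error))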

            Source https://stackoverflow.com/questions/61027712

            QUESTION

            tensorly.kruskal_to_tensor() method explanation
            Asked 2020-Mar-27 at 13:27

I'm trying to understand the tl.kruskal_to_tensor() method in the tensorly package. From the webpage, I understand that it takes as input a list of matrices and produces the tensor whose CP decomposition is given by those matrices. Is that right?

            But I saw the following code:

            ...

            ANSWER

            Answered 2020-Mar-27 at 13:27

            This version of kruskal_to_tensor is documented in the dev version of the API.

            The np.ones corresponds to the weight of the Kruskal tensor: a Kruskal tensor expresses a tensor as a weighted sum of rank one tensors (outer product of vectors, collected as the columns of the factor matrices). In your case, the weights of the sum are all ones and accumulated in this vector of ones.

Note that you could normalize the factors of your Kruskal tensor and absorb their magnitudes into these weights.
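A minimal sketch of that relationship, using hypothetical small factor matrices and the newer name cp_to_tensor (which replaced kruskal_to_tensor in more recent TensorLy releases):

import numpy as np
import tensorly as tl

# Two hypothetical factor matrices describing a rank-2 tensor of shape (3, 4)
factors = [np.random.random((3, 2)), np.random.random((4, 2))]
weights = np.ones(2)  # unit weights: a plain (unweighted) sum of the rank-1 terms

# Reconstruct the full tensor as the weighted sum of outer products of the factor columns
full = tl.cp_to_tensor((weights, factors))
print(full.shape)  # (3, 4)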

            Source https://stackoverflow.com/questions/60876383

Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install tensorly

            You can install using 'pip install tensorly' or download it from GitHub, PyPI.
You can use tensorly like any standard Python library. You will need a development environment consisting of a Python distribution (including header files), a compiler, pip, and git. Make sure that your pip, setuptools, and wheel are up to date. When using pip, it is generally recommended to install packages in a virtual environment to avoid changes to the system.
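As a quick post-install sanity check (a minimal sketch; the default backend is NumPy):

import numpy as np
import tensorly as tl

# Confirm the active backend and exercise a basic operation
print(tl.get_backend())            # 'numpy' by default
t = tl.tensor(np.arange(24.0).reshape((2, 3, 4)))
print(tl.unfold(t, mode=0).shape)  # (2, 12): mode-0 unfolding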

            Support

For any new features, suggestions, or bugs, create an issue on GitHub. If you have any questions, ask them on the Stack Overflow community page.

            Install
          • PyPI

            pip install tensorly

          • CLONE
          • HTTPS

            https://github.com/tensorly/tensorly.git

          • CLI

            gh repo clone tensorly/tensorly

          • sshUrl

            git@github.com:tensorly/tensorly.git
