clearml | Magical CI/CD to streamline your ML workflow | Machine Learning library

 by   allegroai Python Version: 1.15.1 License: Apache-2.0

kandi X-RAY | clearml Summary

clearml is a Python library typically used in Artificial Intelligence, Machine Learning, Deep Learning, PyTorch, and Docker applications. clearml has no reported bugs or vulnerabilities, has a build file available, carries a Permissive License, and has medium support. You can install it using 'pip install clearml' or download it from GitHub or PyPI.

ClearML is an ML/DL development and production suite; it contains three main modules. Instrumenting these components is the ClearML-server (see Self-Hosting & Free tier Hosting).
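As a quick illustration of the experiment-tracking module, instrumenting a training script usually takes a couple of lines; the project and task names below are placeholders:

from clearml import Task

# Register this run with the ClearML server (or the free hosted tier);
# from here on, console output, framework calls and artifacts are auto-logged.
task = Task.init(project_name="examples", task_name="my first experiment")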

            kandi-support Support

              clearml has a medium active ecosystem.
It has 4466 stars, 590 forks, and 83 watchers.
There were 10 major releases in the last 6 months.
There are 343 open issues and 467 closed issues; on average, issues are closed in 136 days. There are 5 open pull requests and 0 closed pull requests.
It has a neutral sentiment in the developer community.
The latest version of clearml is 1.15.1.

            kandi-Quality Quality

              clearml has 0 bugs and 0 code smells.

            kandi-Security Security

clearml has no reported vulnerabilities, and neither do its dependent libraries.
              clearml code analysis shows 0 unresolved vulnerabilities.
              There are 0 security hotspots that need review.

            kandi-License License

              clearml is licensed under the Apache-2.0 License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

            kandi-Reuse Reuse

              clearml releases are available to install and integrate.
              Deployable package is available in PyPI.
              Build file is available. You can build the component from source.
Installation instructions, examples and code snippets are available.
              It has 85160 lines of code, 7121 functions and 293 files.
              It has high code complexity. Code complexity directly impacts maintainability of the code.

            Top functions reviewed by kandi - BETA

kandi has reviewed clearml and discovered the functions below as its top functions. This is intended to give you an instant insight into the functionality clearml implements, and to help you decide if it suits your requirements.
• Initializes the task.
• Creates a component for a component.
• Triggers a plot.
• Adds a function step.
• Creates a task.
• Parses content.
• Copies the arguments to the parser.
• Runs the wizard.
• Creates a task from a function.
• Starts the given runs.
            Get all kandi verified functions for this library.

            clearml Key Features

            No Key Features are available at this moment for clearml.

            clearml Examples and Code Snippets

            python train_pytorch_caltech_birds.py --config configs/timm/resnext101_32x8d_config.yaml
            
            python train_pytorch_ignite_caltech_birds.py --config configs/timm/resnext101_32x8d_config.yaml
            
            python train_clearml_pytorch_ignite_caltech_birds.py --config c  

            Community Discussions

            QUESTION

What would stop credentials from validating on a ClearML server?
            Asked 2021-Dec-22 at 08:33

            I've set up a ClearML server in GCP using the sub-domain approach. I can access all three domains (https://app.clearml.mydomain.com, https://api.clearml.mydomain.com and https://files.clearml.mydomain.com) in a browser and see what I think is the correct response, but when connecting with the python SDK via clearml-init I get the following error:

            ...

            ANSWER

            Answered 2021-Dec-22 at 08:33

            Following the discussion here, it seemed that the load balancer being used was blocking GET requests with a payload which are used by ClearML. A fix is being worked on to allow the method to be changed to a POST request via an environment variable.
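For reference, the sub-domain setup described in the question corresponds to an api section in clearml.conf (the file clearml-init writes) roughly like the following sketch; the credential values are placeholders:

api {
    web_server: https://app.clearml.mydomain.com
    api_server: https://api.clearml.mydomain.com
    files_server: https://files.clearml.mydomain.com
    credentials {
        "access_key" = "PLACEHOLDER_ACCESS_KEY"
        "secret_key" = "PLACEHOLDER_SECRET_KEY"
    }
}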

            Source https://stackoverflow.com/questions/70397010

            QUESTION

            Mounting an S3 bucket in docker in a clearml agent
            Asked 2021-May-14 at 19:57

What is the best practice for mounting an S3 bucket inside a docker image that will be used as a ClearML agent? I can think of 3 solutions, but have been unable to get any of them to work so far:

            1. Use prefabbed configuration in ClearML, specifically CLEARML_AGENT_K8S_HOST_MOUNT. For this to work, the S3 bucket would be mounted separately on the host using rclone and then remapped into docker. This appears to only apply to Kubernetes and not Docker - and therefore would not work.
2. Mount using s3fs-fuse as specified here. The issue is whether it will work with the S3 bucket secret stored in ClearML browser sessions. This also appears to be complicated and to require custom docker images, not to mention running the docker image as --privileged or similar.
3. Pass arguments to docker using the docker_args and docker_bash_setup_script arguments to Task.create(), as specified in the 1.0 release notes. This would be similar to (1), but the arguments would be used for bind-mounting the volume. I do not see much documentation or examples on how this new feature may be used to this end (see the sketch after this list).
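A rough sketch of option 3, assuming the bucket is already mounted on the agent host at /mnt/s3 (the mount path, docker image, and script name are illustrative, not from the question):

from clearml import Task

# The agent runs this task inside docker; docker_args is appended to `docker run`,
# so a bind mount of the host-side rclone/s3fs mount can be added there.
task = Task.create(
    project_name="examples",
    task_name="train with mounted bucket",
    script="train.py",
    docker="python:3.8",
    docker_args="-v /mnt/s3:/data",
    docker_bash_setup_script="ls /data",  # optional sanity check inside the container
)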
            ...

            ANSWER

            Answered 2021-May-12 at 04:32

I would recommend checking out AWS Storage Gateway (the S3 File Gateway); behind the gateway you can use NFS, EFS, or an S3 bucket.

Read more at: https://aws.amazon.com/storagegateway/?whats-new-cards.sort-by=item.additionalFields.postDateTime&whats-new-cards.sort-order=desc

There are multiple ways you can do this. You can also use a CSI driver to connect S3:

https://github.com/ctrox/csi-s3

rclone is a nice option if you can use it; it syncs data to the pod's host system, although large files may take a while to transfer depending on file size and network latency.

A personal suggestion: S3 is object storage, so if you plan on file operations such as writing to files or zipping files, those operations may be slow, based on my experience.

Remember that S3 is NOT a file system but an object store. While mounting is an incredibly useful capability, I wouldn't rely on anything more than reading or creating files: don't try to append to a file, and don't rely on file-system trickery.

If that is the case, I would recommend attaching NFS or SSD storage to the container.

s3fs-fuse, on the other hand, has its own benefits, such as multipart upload, MD5 checks, and local caching.

The easiest way is to write your own script that syncs a local directory with the S3 bucket over HTTP; otherwise, Storage Gateway for S3 is a good option.

Amazon S3 File Gateway provides a seamless way to connect to the cloud in order to store application data files and backup images as durable objects in Amazon S3 cloud storage. It offers SMB or NFS-based access to data in Amazon S3 with local caching.

            Source https://stackoverflow.com/questions/67496760

            QUESTION

            How to manage datasets in ClearML Web UI?
            Asked 2021-Mar-15 at 17:59

            Using a self-deployed ClearML server with the clearml-data CLI, I would like to manage (or view) my datasets in the WebUI as shown on the ClearML webpage (https://clear.ml/mlops/clearml-feature-store/):

            However, this feature does not show up in my Web UI. According to the pricing page, the feature store is not a premium feature. Do I need to configure my server in a special way to use this feature?

            ...

            ANSWER

            Answered 2021-Mar-15 at 17:59

            Disclaimer: I'm part of the ClearML (formerly Trains) Team

I think this screenshot is taken from the premium version... The feature itself exists in the open-source version, but I "think" some of the dataset visualization capabilities are not available in the open-source, self-hosted version.

Nonetheless, you have a fully featured feature store, with the ability to add your own metrics/samples for every dataset/feature version. The open-source version also includes the advanced versioning and delta-based storage for datasets/features (i.e. only the change set from the parent version is stored).
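As an aside, the clearml-data workflow referenced in the question can also be driven from Python through the Dataset API; a minimal sketch, with project, dataset name, and path as placeholders:

from clearml import Dataset

# Create a dataset version, attach local files, then upload and finalize it
ds = Dataset.create(dataset_project="datasets", dataset_name="caltech-birds")
ds.add_files(path="./data/images")
ds.upload()    # push the files to the configured storage
ds.finalize()  # close the version so it becomes immutable and shows as completed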

            Source https://stackoverflow.com/questions/66640850

            QUESTION

            ClearML get max value from logged values
            Asked 2021-Feb-25 at 01:04

I use ClearML to track my tensorboard logs (from PyTorch Lightning) during training. Later on I start another script which connects to the existing task and does some testing.

Unfortunately, I do not have all the information in the second script, so I want to query the logged values from the ClearML server.

How would I do this?

I thought about something like this, but haven't found anything in the documentation:

            ...

            ANSWER

            Answered 2021-Feb-25 at 01:04

Disclaimer: I'm part of the ClearML (formerly Trains) team.

To get an existing Task object for a running (or completed/failed) experiment, assuming we know the Task ID:
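The code itself is not included in this excerpt; a minimal sketch along those lines, assuming a known task ID and that a scalar titled "acc" with a "val" series was reported, might look like this:

from clearml import Task

task = Task.get_task(task_id="aabbcc112233")  # hypothetical task ID

# Reported scalars come back as {title: {series: {"x": [...], "y": [...]}}}
scalars = task.get_reported_scalars()
max_val_acc = max(scalars["acc"]["val"]["y"])
print(max_val_acc)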

            Source https://stackoverflow.com/questions/66320238

            QUESTION

            ClearML multiple tasks in single script changes logged value names
            Asked 2021-Feb-22 at 14:29

I trained multiple models with different configurations for a custom hyperparameter search. I use pytorch_lightning and its logging (TensorboardLogger). When running my training script after Task.init(), ClearML auto-creates a Task and connects the logger output to the server.

For each training stage (train, val and test) I log the following scalars at each epoch: loss, acc and iou.

When I have multiple configurations, e.g. networkA and networkB, the first training logs its values to loss, acc and iou, but the second logs to networkB:loss, networkB:acc and networkB:iou. This makes the values incomparable.

My training loop with Task initialization looks like this:

            ...

            ANSWER

            Answered 2021-Feb-19 at 22:31

Disclaimer: I'm part of the ClearML (formerly Trains) team.

pytorch_lightning creates a new TensorBoard logger for each experiment. When ClearML logs the TB scalars and captures the same scalar being re-sent, it adds a prefix so that reporting the same metric again does not overwrite the previous one. A good example would be reporting a loss scalar in the training phase vs the validation phase (producing "loss" and "validation:loss"). It might be that the task.close() call does not clear the previous logs, so ClearML "thinks" this is the same experiment and hence adds the networkB prefix to the loss. As long as you close the Task after each training run completes, all experiments should log with the same metric/variant (title/series). I suggest opening a GitHub issue; this should probably be considered a bug.
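A minimal sketch of the loop the answer describes, closing each Task before initializing the next one (project and configuration names are illustrative):

from clearml import Task

for name in ["networkA", "networkB"]:
    # each configuration gets its own experiment on the server
    task = Task.init(project_name="hparam-search", task_name=name)
    # ... build the model for `name`, then run trainer.fit() / trainer.test() here ...
    task.close()  # close the task so the next Task.init starts a fresh experiment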

            Source https://stackoverflow.com/questions/66279581

            QUESTION

            ClearML how to change clearml.conf file in AWS Sagemaker
            Asked 2021-Feb-19 at 22:41

I am working in an AWS SageMaker Jupyter notebook and have installed the clearml package there. The ClearML server was installed on AWS EC2. I need to store artifacts and models in an AWS S3 bucket, so I want to specify credentials for S3 in the clearml.conf file. How can I change the clearml.conf file on an AWS SageMaker instance? It looks like permission is denied for all folders on it. Or maybe somebody can suggest a better approach.

            ...

            ANSWER

            Answered 2021-Feb-19 at 22:41

Disclaimer: I'm part of the ClearML (formerly Trains) team.

            To set credentials (and clearml-server hosts) you can use Task.set_credentials. To specify the S3 bucket as output for all artifacts (and debug images for that matter) you can just set it as the files_server.

            For example:
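The example itself is not included in this excerpt; a sketch of what it might look like, with host names, keys, and the bucket as placeholders:

from clearml import Task

# Point the SDK at the clearml-server and authenticate without editing clearml.conf
Task.set_credentials(
    api_host="http://my-clearml-server:8008",
    web_host="http://my-clearml-server:8080",
    files_host="s3://my-bucket/clearml",  # artifacts and debug samples are uploaded here
    key="CLEARML_ACCESS_KEY",
    secret="CLEARML_SECRET_KEY",
)

# S3 access itself still needs AWS credentials, e.g. via the standard AWS environment variables
task = Task.init(project_name="sagemaker-experiments", task_name="training")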

            Source https://stackoverflow.com/questions/66216294

            QUESTION

ClearML SSH port forwarding fileserver not available in Web UI
            Asked 2021-Jan-11 at 18:39

Trying to use clearml-server on my own Ubuntu 18.04.5 machine with SSH port forwarding, and not being able to see my debug samples.

            My setup:

            • ClearML server on hostA
• SSH tunnel connections to access the Web App from my working machine via localhost:18080
• Web App: ssh -N -L 18080:127.0.0.1:8080 user@hostA
            • Fileserver: ssh -N -L 18081:127.0.0.1:8081 user@hostA

In the Web App, under Task->Results->Debug Samples, the images are still referenced by localhost:8081.

Where can I set the fileserver URL to be localhost:18081 in the Web App? I tried ~/clearml.conf, but this did not work (I think it only applies to my python script).

            ...

            ANSWER

            Answered 2021-Jan-11 at 18:39

            Disclaimer: I'm a member of the ClearML team (formerly Trains)

            In ClearML, debug images' URL is registered once they are uploaded to the fileserver. The WebApp doesn't actually decide on the URL for each debug image, but rather obtains it for each debug image from the server. This allows you to potentially upload debug images to a variety of storage targets, ClearML File Server simply being the most convenient, built-in option.

So, the WebApp will always look for localhost:8081 for debug images that have already been uploaded to the fileserver and contain localhost:8081 in their URL. A possible solution is to simply add another tunnel in the form of ssh -N -L 8081:127.0.0.1:8081 user@hostA.

            For future experiments, you can choose to keep using 8081 (and keep using this new tunnel), or to change the default fileserver URL in clearml.conf to point to port localhost:18081, assuming you're running your experiments from the same machine where the tunnel to 18081 exists.
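If you go with the second option, the relevant entry in clearml.conf would look something like the sketch below, assuming the tunnel to 18081 stays up (this shows only the standard api section, not the exact file from the question):

api {
    # route newly uploaded debug samples / artifacts through the local SSH tunnel
    files_server: http://localhost:18081
}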

            Source https://stackoverflow.com/questions/65671395

            QUESTION

            ClearML server IP address not used with localhost and SSH port forwarding
            Asked 2021-Jan-10 at 18:24

Trying to use clearml-server on my own Ubuntu 18.04.5 machine.

            I use env variables to set the IP Address of my clearml-server.

            ...

            ANSWER

            Answered 2021-Jan-10 at 18:24

            Disclaimer: I'm a ClearML (Trains) team member

Basically, the docker-compose setup will expose only the API/Web/File server. You can further limit the exposure to your localhost only by changing the following section in your ClearML server docker-compose.yml:
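For illustration, the relevant port mappings in the server's docker-compose.yml look roughly like the snippet below; binding each one to 127.0.0.1 keeps it reachable only from the local machine (service names and internal ports are from memory and may differ between server versions):

services:
  apiserver:
    ports:
      - "127.0.0.1:8008:8008"
  fileserver:
    ports:
      - "127.0.0.1:8081:8081"
  webserver:
    ports:
      - "127.0.0.1:8080:80"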

            Source https://stackoverflow.com/questions/65655382

            QUESTION

Can ClearML (formerly Trains) work on a local server?
            Asked 2020-Dec-31 at 15:03

            I am trying to start my way with ClearML (formerly known as Trains).

I see in the documentation that I need to have a server running, either on the ClearML platform itself or on a remote machine using AWS, etc.

I would really like to bypass this restriction and run experiments on my local machine, not connecting to any remote destination.

According to this, I can install the trains-server on any remote machine, so in theory I should also be able to install it on my local machine, but it still requires Kubernetes or Docker, and I am not using either of them.

Has anyone had any luck using ClearML (or Trains; I think it's still largely the same API) on a local server?

            • My OS is Ubuntu 18.04.
            ...

            ANSWER

            Answered 2020-Dec-30 at 16:18

            Disclaimer: I'm a member of the ClearML team (formerly Trains)

            I would really like to bypass this restriction and run experiments on my local machine, not connecting to any remote destination.

            A few options:

1. The ClearML Free tier offers free hosting for your experiments; these experiments are only accessible to you, unless you specifically want to share them with your colleagues. This is probably the easiest way to get started.
2. Install the ClearML-Server; basically all you need is Docker installed and you should be fine. There are full instructions here; this is the summary (sketched below):
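Roughly, the condensed server setup looks like the commands below, assuming Docker and docker-compose are already installed; the compose file URL reflects the allegroai/clearml-server repository layout and should be double-checked against the current instructions:

sudo mkdir -p /opt/clearml/data
# the full instructions also create per-service data directories (elastic, mongo, redis, fileserver) and set their permissions
sudo curl -o /opt/clearml/docker-compose.yml https://raw.githubusercontent.com/allegroai/clearml-server/master/docker/docker-compose.yml
docker-compose -f /opt/clearml/docker-compose.yml up -d
# the Web UI is then served on http://localhost:8080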

            Source https://stackoverflow.com/questions/65509754

Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install clearml

You can install it using 'pip install clearml' or download it from GitHub or PyPI.
You can use clearml like any standard Python library. You will need a development environment consisting of a Python distribution (including header files), a compiler, pip, and git. Make sure that your pip, setuptools, and wheel are up to date. When using pip, it is generally recommended to install packages in a virtual environment to avoid changes to the system, for example as sketched below.
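A minimal sequence along those lines (the environment name is arbitrary):

python3 -m venv clearml-env
source clearml-env/bin/activate
python -m pip install --upgrade pip setuptools wheel
pip install clearml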

            Support

            More information in the official documentation and on YouTube. For examples and use cases, check the examples folder and corresponding documentation. If you have any questions: post on our Slack Channel, or tag your questions on stackoverflow with 'clearml' tag (previously trains tag). For feature requests or bug reports, please use GitHub issues. Additionally, you can always find us at clearml@allegro.ai.
            Find more information at:

            Install
          • PyPI

            pip install clearml

          • CLONE
          • HTTPS

            https://github.com/allegroai/clearml.git

          • CLI

            gh repo clone allegroai/clearml

          • sshUrl

            git@github.com:allegroai/clearml.git
