sagemaker-mlops-end-to-end | An end-to-end demo of MLOps in SageMaker
kandi X-RAY | sagemaker-mlops-end-to-end Summary
sagemaker-mlops-end-to-end is a Python library. It has no reported bugs or vulnerabilities and has low support. However, its build file is not available. You can download it from GitHub.
An end-to-end demo of MLOps in SageMaker
Support
sagemaker-mlops-end-to-end has a low-activity ecosystem.
It has 3 stars, 1 fork, and 3 watchers.
It had no major release in the last 6 months.
sagemaker-mlops-end-to-end has no issues reported. There is 1 open pull request and 0 closed pull requests.
It has a neutral sentiment in the developer community.
The latest version of sagemaker-mlops-end-to-end is current.
Quality
sagemaker-mlops-end-to-end has no bugs reported.
Security
sagemaker-mlops-end-to-end has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
License
sagemaker-mlops-end-to-end does not have a standard license declared.
Check the repository for any license declaration and review the terms closely.
Without a license, all rights are reserved, and you cannot use the library in your applications.
Reuse
sagemaker-mlops-end-to-end releases are not available. You will need to build from source code and install.
sagemaker-mlops-end-to-end has no build file. You will need to create the build yourself to build the component from source.
Installation instructions are available. Examples and code snippets are not available.
sagemaker-mlops-end-to-end Key Features
No Key Features are available at this moment for sagemaker-mlops-end-to-end.
sagemaker-mlops-end-to-end Examples and Code Snippets
No Code Snippets are available at this moment for sagemaker-mlops-end-to-end.
Community Discussions
No Community Discussions are available at this moment for sagemaker-mlops-end-to-end. Refer to the Stack Overflow page for discussions.
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install sagemaker-mlops-end-to-end
If not using Feature Store, ignore the steps above and follow the steps below. To set up Model Monitor, once all the endpoints have been created, navigate to the Endpoint UI in SageMaker Studio and click on the endpoint deployed using the notebook in the model-monitor folder.
If using Feature Store, the first step of the Pipeline will need to read data from Feature Store.
In the file sagemaker-pipeline/pipeline-dw-fs.py lines 131 to 178 need to be replaced with the code in the notebook created by the DataWrangler Export.
The first step of the Pipeline will be step_read_train.
Replace the first step in sagemaker-pipeline/pipeline.py with step_read_train and step_process from sagemaker-pipeline/pipeline-dw-fs.py.
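As a rough illustration only, a minimal sketch of what a step_read_train ProcessingStep could look like is shown below; the script name read_feature_store.py, the job name, and the output path are assumptions for illustration and are not taken from this repo.

from sagemaker import get_execution_role
from sagemaker.processing import ProcessingOutput
from sagemaker.sklearn.processing import SKLearnProcessor
from sagemaker.workflow.steps import ProcessingStep

role = get_execution_role()  # in the actual pipeline module the role is usually passed in as a parameter

read_processor = SKLearnProcessor(
    framework_version="0.23-1",
    role=role,
    instance_type="ml.m5.xlarge",
    instance_count=1,
    base_job_name="read-feature-store",  # assumed job name
)

# read_feature_store.py (hypothetical script) would query the Feature Group's
# offline store, e.g. via an Athena query, and write the training data to
# /opt/ml/processing/train for the downstream processing and training steps.
step_read_train = ProcessingStep(
    name="ReadTrainDataFromFeatureStore",
    processor=read_processor,
    outputs=[ProcessingOutput(output_name="train", source="/opt/ml/processing/train")],
    code="read_feature_store.py",
)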
Navigate to the model build repo created by the SageMaker Project and replace the code in pipelines/abalone/ with the code in sagemaker-pipeline/.
Trigger the pipeline by pushing the new code to the CodeCommit/Git repo (depending on the template selected).
Once the pipeline has completed, find the model package group in the Model Registry and note the ARN of the model package created in the group.
Approve the model in the Model Registry; this will trigger the model deployment pipeline, and you should see an endpoint being created in SageMaker.
This endpoint will have the suffix -staging. You can navigate to CodePipeline, and under Pipelines you will see one with your project name and model-deploy. Click on that Pipeline and you will see a manual approval option. When approved, a new endpoint will be created with the suffix -prod.
These endpoints are created by the default seed code in the 1st party template and do not have Data Capture enabled.
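For reference, the model package lookup and approval steps above can also be scripted with boto3; this is a minimal sketch, and the model package group name is a placeholder for the one created by your SageMaker Project.

# Sketch: find the latest model package in the group and approve it (placeholder group name).
import boto3

sm = boto3.client("sagemaker")

packages = sm.list_model_packages(
    ModelPackageGroupName="<your-model-package-group>",
    SortBy="CreationTime",
    SortOrder="Descending",
)
latest_arn = packages["ModelPackageSummaryList"][0]["ModelPackageArn"]

# Approving the package triggers the model-deploy pipeline created by the project template.
sm.update_model_package(
    ModelPackageArn=latest_arn,
    ModelApprovalStatus="Approved",
)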
Navigate to model-monitor/create_endpoint.ipynb to create an endpoint with Data Capture enabled.
Run model-monitor/data_quality_monitor.ipynb to set up a Data Quality Monitoring schedule on the endpoint.
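Both notebooks rely on the SageMaker Python SDK. As a rough sketch only (the endpoint name, S3 URIs, baseline dataset, and model package ARN are placeholders rather than values from this repo), enabling data capture at deployment and attaching an hourly data quality schedule typically looks like this:

from sagemaker import ModelPackage, get_execution_role
from sagemaker.model_monitor import (
    CronExpressionGenerator,
    DataCaptureConfig,
    DefaultModelMonitor,
)
from sagemaker.model_monitor.dataset_format import DatasetFormat

role = get_execution_role()  # assumes execution inside a SageMaker Studio notebook

# Build a deployable model from the approved model package (placeholder ARN).
model = ModelPackage(role=role, model_package_arn="<model-package-arn>")

# Capture 100% of requests and responses to S3 (placeholder bucket).
capture_config = DataCaptureConfig(
    enable_capture=True,
    sampling_percentage=100,
    destination_s3_uri="s3://<bucket>/data-capture",
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
    endpoint_name="mlops-demo-monitored",  # placeholder endpoint name
    data_capture_config=capture_config,
)

monitor = DefaultModelMonitor(
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
    volume_size_in_gb=20,
    max_runtime_in_seconds=3600,
)

# Baseline statistics and constraints from the training data (placeholder path).
monitor.suggest_baseline(
    baseline_dataset="s3://<bucket>/baseline/train.csv",
    dataset_format=DatasetFormat.csv(header=True),
    output_s3_uri="s3://<bucket>/baseline-results",
)

# Hourly data quality checks against the captured traffic.
monitor.create_monitoring_schedule(
    monitor_schedule_name="mlops-demo-data-quality",
    endpoint_input=predictor.endpoint_name,
    output_s3_uri="s3://<bucket>/monitor-reports",
    statistics=monitor.baseline_statistics(),
    constraints=monitor.suggested_constraints(),
    schedule_cron_expression=CronExpressionGenerator.hourly(),
)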
Things to highlight in the demo:
End to end lineage
View of the trial component from the model in the Model Registry
Lineage from the Endpoint to the Model Package Group and Version
Pipelines integration with Experiments
Debugging a Pipeline through the DAG view
CI/CD for automatic training and deployment
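A similar lineage view can be reproduced programmatically with the SageMaker SDK's lineage visualizer; this is a sketch, and the model package ARN is a placeholder.

import sagemaker
from sagemaker.lineage.visualizer import LineageTableVisualizer

# Tabular lineage for a registered model package (placeholder ARN).
viz = LineageTableVisualizer(sagemaker.session.Session())

# Returns a pandas DataFrame of the artifacts, actions, and contexts associated
# with the model package (training job, datasets, endpoint, etc.).
df = viz.show(model_package_arn="<model-package-arn>")
print(df)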
Support
For any new features, suggestions, and bugs, create an issue on GitHub.
If you have any questions, check and ask on the Stack Overflow community page.