sparksteps | :star: CLI tool to launch Spark jobs on AWS EMR | AWS library
kandi X-RAY | sparksteps Summary
:star: CLI tool to launch Spark jobs on AWS EMR
Top functions reviewed by kandi - BETA
- Get the directory name of a path
- Get the basename of a path
- Create argument parser
- Determine the spot price for this instance type
- Parse CLI arguments
- Wait for a single step to complete
sparksteps Key Features
sparksteps Examples and Code Snippets
Community Discussions
Trending Discussions on sparksteps
QUESTION
I am new to pySpark, and I'm trying to implement a multi-step EMR/Spark job using MRJob. Do I need to create a new SparkContext for each SparkStep, or can I share the same SparkContext across all SparkSteps?
I looked in the MRJob manual, but unfortunately it was not clear on this point.
Can someone please advise what's the correct approach?
Creating a separate SparkContext:
...
ANSWER
Answered 2018-Apr-10 at 07:25

According to Dave on the MRJob discussion group, we should create a new SparkContext for each step, since each step is a completely new invocation of Hadoop and Spark (i.e., option #1 above is the correct approach).
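The "one SparkContext per step" pattern can be sketched as follows. This is an illustration of the pattern only, not MRJob's API: `FakeSparkContext` stands in for `pyspark.SparkContext` so the sketch runs without Spark installed, and the step bodies are invented for demonstration. In a real MRJob `SparkStep`, each step method would construct its own `pyspark.SparkContext` and stop it before returning, exactly as shown here.

```python
# Illustrative sketch: each step builds its own context, uses it,
# and stops it before the next step begins. FakeSparkContext is a
# stand-in for pyspark.SparkContext (hypothetical, for demonstration).
class FakeSparkContext:
    def __init__(self, appName):
        self.appName = appName
        self.stopped = False

    def parallelize(self, data):
        # Real SparkContext.parallelize returns an RDD; a plain list
        # suffices for this sketch.
        return data

    def stop(self):
        self.stopped = True


def step_one(sc_factory, records):
    sc = sc_factory(appName="step-one")   # fresh context for this step
    out = [r.lower() for r in sc.parallelize(records)]
    sc.stop()                             # release it before the next step
    return out


def step_two(sc_factory, records):
    sc = sc_factory(appName="step-two")   # another fresh context
    out = sorted(sc.parallelize(records))
    sc.stop()
    return out


result = step_two(FakeSparkContext, step_one(FakeSparkContext, ["B", "a"]))
```

Because each EMR step is a separate Hadoop/Spark invocation, nothing created in one step's context survives into the next; any data shared between steps must go through storage (e.g. S3 or HDFS), not the context.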
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install sparksteps
You can use sparksteps like any standard Python library. You will need a development environment with a Python distribution (including header files), a compiler, pip, and git installed. Make sure your pip, setuptools, and wheel are up to date. When using pip, it is generally recommended to install packages in a virtual environment to avoid modifying the system installation.
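The steps above can be sketched as the following shell session. The environment name `sparksteps-env` is arbitrary, and the install commands themselves are shown as comments because they require network access to PyPI:

```shell
# Create an isolated virtual environment (assumes python3 is on PATH).
python3 -m venv sparksteps-env
. sparksteps-env/bin/activate      # activate it (POSIX shells)
python -m pip --version            # confirm pip is available inside the venv
# With network access, update build tooling and install from PyPI:
#   python -m pip install --upgrade pip setuptools wheel
#   pip install sparksteps
```

Once installed, the `sparksteps` command-line tool is available inside the activated environment.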