spark-jobserver | REST job server for Apache Spark
kandi X-RAY | spark-jobserver Summary
spark-jobserver provides a RESTful interface for submitting and managing Apache Spark jobs, jars, and job contexts. This repo contains the complete Spark job server project, including unit tests and deploy scripts. It was originally started at Ooyala, but this is now the main development repo. Other useful links: Troubleshooting, cluster, YARN client, YARN on EMR, Mesos, JMX tips. Also see Chinese docs / 中文.
Community Discussions
QUESTION
I'm currently working with spark-jobserver, and when spark-jobserver goes down my app simply stops working without any notification.
Is there a health check for spark-jobserver?
ANSWER
Answered 2022-Feb-22 at 23:18
If the service goes down, it will not notify you itself, so there isn't a built-in option for this. Ideally you'd use external monitoring such as the Prometheus Blackbox exporter, Nagios, or simply a cron job that probes your service with curl or netcat.
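The probe the answer suggests can be sketched in a few lines. This is a minimal example, not an official spark-jobserver feature: it assumes the server's REST API is reachable at some base URL (the default port is 8090) and treats any 2xx response from the `GET /binaries` endpoint as healthy. The function name and the choice of endpoint are the author's illustration, not part of the project.

```python
import urllib.error
import urllib.request


def jobserver_is_up(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if the job server answers an HTTP request.

    Probes GET /binaries (a cheap, read-only spark-jobserver endpoint)
    and treats any 2xx response as healthy. Connection errors and
    timeouts are reported as the server being down.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/binaries",
                                    timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (urllib.error.URLError, OSError):
        return False
```

A cron job could call this every minute and raise an alert (email, pager, etc.) whenever it returns `False`.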
Community Discussions and Code Snippets include sources from the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install spark-jobserver
Build and run Job Server in local development mode within SBT. NOTE: this does NOT work for YARN, and in fact is only recommended with spark.master set to local[*]. Use one of the deployment options below if you want to run against YARN or another real cluster.
Deploy job server to a cluster. There are two alternatives (see the deployment section): server_deploy.sh deploys job server to a directory on a remote host; server_package.sh packages job server into a local directory, from which you can deploy it or create a .tar.gz for Mesos or YARN deployment.
EC2 Deploy scripts - follow the instructions in EC2 to spin up a Spark cluster with job server and an example application.
EMR Deploy instructions - follow the instructions in EMR.
When POSTing new binaries, the Content-Type header must be set to one of the types supported by the subclasses of the BinaryType trait, e.g. "application/java-archive", "application/python-egg", or "application/python-wheel". If you are using the curl command, you must pass the header explicitly, for example "-H 'Content-Type: application/python-wheel'".
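The same upload can be built programmatically instead of with curl. The sketch below constructs the equivalent POST request with Python's standard library; the base URL, app name, and payload are placeholders for illustration, and only the Content-Type requirement comes from the text above.

```python
import urllib.request


def upload_binary_request(base_url: str, app_name: str,
                          payload: bytes,
                          content_type: str) -> urllib.request.Request:
    """Build a POST request for /binaries/<appName> that carries the
    required Content-Type header (equivalent to curl -H 'Content-Type: ...')."""
    return urllib.request.Request(
        url=f"{base_url}/binaries/{app_name}",
        data=payload,
        method="POST",
        headers={"Content-Type": content_type},
    )


# Example: prepare a jar upload (placeholder bytes, not a real jar).
req = upload_binary_request("http://localhost:8090", "my-app",
                            b"placeholder jar bytes",
                            "application/java-archive")
```

Sending the request with `urllib.request.urlopen(req)` would then perform the actual upload against a running server.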