gcloud | GitHub Action for interacting with Google Cloud Platform
gcloud Summary
GitHub Action which allows interacting with Google Cloud Platform.
Community Discussions: Trending Discussions on gcloud
QUESTION
Trying to use google-cloud-dataproc-serverless with the spark.jars.repositories option
ANSWER
Answered 2022-Mar-25 at 05:05: You need to have a Java trust store with your certificate imported. Then submit the batch with the trust store attached.
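A minimal sketch of such a submission, assuming the trust store file my-truststore.jks is shipped with the batch; the repository URL, region, class, and jar names here are illustrative:

    gcloud dataproc batches submit spark \
        --region=us-central1 \
        --files=gs://my-bucket/my-truststore.jks \
        --properties="spark.jars.repositories=https://repo.example.com/maven,spark.driver.extraJavaOptions=-Djavax.net.ssl.trustStore=./my-truststore.jks,spark.executor.extraJavaOptions=-Djavax.net.ssl.trustStore=./my-truststore.jks" \
        --class=org.example.MyApp \
        --jars=gs://my-bucket/app.jar

Files passed via --files are staged in each container's working directory, which is why the trustStore path can be relative.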
QUESTION
GitHub Actions were working in my repository until yesterday. I didn't make any changes to the .github/workflows/dev.yml file or to the Dockerfile. But suddenly, in recent pushes, my GitHub Actions fail with the error
Setup, Build, Publish, and Deploy
...
ANSWER
Answered 2021-Jul-27 at 13:24: I fixed it by changing the uses value to
uses: google-github-actions/setup-gcloud@master
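For context, a sketch of how the updated step might sit in .github/workflows/dev.yml (the action moved to the google-github-actions organization, which is why workflows pinned to older paths stopped resolving); the with: inputs shown are illustrative and depend on your setup:

    - name: Set up gcloud
      uses: google-github-actions/setup-gcloud@master
      with:
        project_id: ${{ secrets.GCP_PROJECT_ID }}
        service_account_key: ${{ secrets.GCP_SA_KEY }}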
QUESTION
I have searched a lot for how to authenticate/authorize Google's client libraries, and it seems no one agrees on how to do it.
Some people state that I should create a service account, create a key out of it, and give that key to each developer who wants to act as this service account. I hate this solution because it leaks the identity of the service account to multiple people.
Others mention that you simply log in with the Cloud SDK and ADC (Application Default Credentials) by doing:
...
ANSWER
Answered 2021-Oct-02 at 14:00: You can use a new gcloud feature and have your local credentials impersonate the service account, like this:
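A minimal sketch of that flow (the service account email is illustrative); ADC then mints tokens for the service account without anyone ever handling its key:

    gcloud auth application-default login \
        --impersonate-service-account=my-sa@my-project.iam.gserviceaccount.com

Each developer only needs the roles/iam.serviceAccountTokenCreator role on that service account, so no key file is distributed.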
QUESTION
I'm trying to publish an npm package to GAR (Google Artifact Registry) through GitHub using google-github-actions/auth@v0 and google-artifactregistry-auth.
For the authentication to Google from GitHub, here is what I did to use Workload Identity Federation:
...
ANSWER
Answered 2022-Feb-11 at 12:44: I finally figured it out! But I'm not sure whether it carries any security risk, so if anyone can advise, I'll edit the answer.
What changed (though, again, I'm unsure about the security implications) is here:
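For reference, a sketch of the Workload Identity Federation auth step in question; the project number, pool, provider, and service account names are all illustrative:

    - id: auth
      uses: google-github-actions/auth@v0
      with:
        workload_identity_provider: projects/123456789/locations/global/workloadIdentityPools/my-pool/providers/my-provider
        service_account: npm-publisher@my-project.iam.gserviceaccount.com
        token_format: access_token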
QUESTION
With the upgrade to Google Cloud SDK 360.0.0-0, I started seeing the following error when running the dev_appserver.py command for my Python 2.7 App Engine project.
ANSWER
Answered 2022-Feb-08 at 08:52: This issue seems to have been resolved with Google Cloud SDK version 371. On my Debian-based system, I fixed it by downgrading the app-engine-python component to the previous version.
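A sketch of that downgrade on a Debian-based install, assuming the SDK came from Google's apt repository; the pinned version 370.0.0-0 is illustrative:

    sudo apt-get install --allow-downgrades \
        google-cloud-sdk-app-engine-python=370.0.0-0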
QUESTION
I am trying to submit a Google Dataproc batch job. As per the Batch Job documentation, we can pass subnetwork as a parameter, but when I use it, it gives me:
ERROR: (gcloud.dataproc.batches.submit.spark) unrecognized arguments: --subnetwork=
Here is the gcloud command I used:
...
ANSWER
Answered 2022-Feb-01 at 11:28: According to the Dataproc batches docs, the subnetwork URI needs to be specified using the argument --subnet.
Try:
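For instance, a minimal sketch of the corrected submission (region, subnet name, class, and jar are illustrative):

    gcloud dataproc batches submit spark \
        --region=us-central1 \
        --subnet=my-subnet \
        --class=org.example.MyApp \
        --jars=gs://my-bucket/app.jar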
QUESTION
I'm trying to deploy a GCloud App Engine flexible service. I have a YAML file in which the Node.js runtime and the env are specified.
ANSWER
Answered 2021-Oct-06 at 00:48: Take a look at this doc, with particular attention to this line:
The engines.node property is optional, but if present, the value must be compatible with the Node.js version specified in your app.yaml file.
I believe the default version is 12 (i.e. runtime: nodejs). To correct this, in your app.yaml file set the runtime as follows: runtime: nodejs14 or newer.
Also bear in mind that minor patches are updated automatically, so you can only specify the major version, i.e. 14.X.X. Additionally, if your stated version is not available, the build process will fail.
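For illustration, a matching pair of version pins, assuming Node 14. In app.yaml:

    runtime: nodejs14
    env: flex

and, in package.json, an engines.node value compatible with it, such as "node": "14.x".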
Note: If you are using Cloud Build with a cloudbuild.yaml and a flex environment, you may get a build error. Move cloudbuild.yaml into its own folder to prevent this, and use the --config option to state the location of the YAML. See this doc for further guidance.
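For instance, with cloudbuild.yaml moved into a build/ folder (the folder name is illustrative), the build can be pointed at it explicitly:

    gcloud builds submit --config=build/cloudbuild.yaml .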
QUESTION
We used to spin up a cluster with the configuration below. It ran fine until last week, but now it fails with the error:
ERROR: Failed cleaning build dir for libcst
Failed to build libcst
ERROR: Could not build wheels for libcst which use PEP 517 and cannot be installed directly
ANSWER
Answered 2022-Jan-19 at 21:50: It seems you need to upgrade pip; see this question. But there can be multiple pips in a Dataproc cluster, and you need to choose the right one.
For init actions, at cluster creation time, /opt/conda/default is a symbolic link to either /opt/conda/miniconda3 or /opt/conda/anaconda, depending on which Conda env you choose; the default is Miniconda3, but in your case it is Anaconda. So you can run either /opt/conda/default/bin/pip install --upgrade pip or /opt/conda/anaconda/bin/pip install --upgrade pip.
For custom images, at image creation time, you want to use the explicit full path: /opt/conda/anaconda/bin/pip install --upgrade pip for Anaconda, or /opt/conda/miniconda3/bin/pip install --upgrade pip for Miniconda3.
So you can simply use /opt/conda/anaconda/bin/pip install --upgrade pip for both init actions and custom images.
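As an illustration, a minimal init-action script along those lines (the file name upgrade-pip.sh is hypothetical; stage it in a GCS bucket and pass it to --initialization-actions at cluster creation):

    #!/bin/bash
    # Upgrade pip in the Anaconda env so PEP 517 packages such as libcst build cleanly
    /opt/conda/anaconda/bin/pip install --upgrade pip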
QUESTION
- standard dataproc image 2.0
- Ubuntu 18.04 LTS
- Hadoop 3.2
- Spark 3.1
I am trying to run a very simple script on a Dataproc PySpark cluster:
testing_dep.py
...
ANSWER
Answered 2022-Jan-19 at 21:26: The error is expected when running Spark in YARN cluster mode if the job doesn't create a Spark context. See the source code of ApplicationMaster.scala.
To avoid this error, you need to create a SparkContext or SparkSession, e.g.:
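A minimal sketch of testing_dep.py with a session created up front, so the YARN ApplicationMaster finds a Spark context:

    # testing_dep.py
    from pyspark.sql import SparkSession

    # Creating the session registers a Spark context with YARN in cluster mode
    spark = SparkSession.builder.appName("testing_dep").getOrCreate()

    # ... job logic here ...

    spark.stop()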
QUESTION
I would like to connect to a Cloud SQL instance from Cloud Run, using a service account. The connection used to be created within the VPC, and we would just provide a connection string with a user and a password to our PostgreSQL client. But now we want the authentication to be managed by Google Cloud IAM, with the service account associated with the Cloud Run service.
On my machine, I can use the enable_iam_login argument to use my own service account. The command to run the Cloud SQL proxy would look like this:
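That local proxy invocation might look like the following sketch (v1 proxy flag syntax; the instance connection name is illustrative):

    ./cloud_sql_proxy \
        -instances=my-project:us-central1:my-instance=tcp:5432 \
        -enable_iam_login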
ANSWER
Answered 2021-Nov-18 at 20:32: Unfortunately, there isn't a way to configure Cloud Run's use of the Cloud SQL proxy to do this for you. If you are using Java, Python, or Go, there are language-specific connectors you can use from Cloud Run. These all have the option to use IAM DB AuthN as part of them.
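For example, with the Cloud SQL Python Connector (package cloud-sql-python-connector), IAM DB AuthN is a single flag; the instance name, IAM user, and database below are illustrative:

    # Sketch: connect to Postgres from Cloud Run using automatic IAM database auth
    from google.cloud.sql.connector import Connector

    connector = Connector()
    conn = connector.connect(
        "my-project:us-central1:my-instance",  # instance connection name (illustrative)
        "pg8000",                              # PostgreSQL driver
        user="my-sa@my-project.iam",           # IAM service account user, no password
        db="my-database",
        enable_iam_auth=True,                  # use IAM DB AuthN instead of a password
    )

Note the IAM database user for a service account is its email with the ".gserviceaccount.com" suffix dropped.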
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.