cloudrun-demo | Cloud Run CD Demo | GCP library
kandi X-RAY | cloudrun-demo Summary
Cloud Run CD Demo
Community Discussions
Trending Discussions on GCP
QUESTION
I have a PySpark job on GCP Dataproc that should be triggered from Airflow, as shown below:
...ANSWER
Answered 2022-Mar-28 at 08:18
You have to pass a Sequence[str]. If you check DataprocSubmitJobOperator, you will see that its job param implements the class google.cloud.dataproc_v1.types.Job.
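The point about Sequence[str] can be sketched with a plain job payload. The project, cluster, bucket and flag names below are placeholders, not from the original question:

```python
# Hypothetical job payload for DataprocSubmitJobOperator; project, cluster
# and file names are placeholders. The key point: "args" must be a sequence
# of strings (Sequence[str]), not one space-joined string.
PYSPARK_JOB = {
    "reference": {"project_id": "my-project"},      # placeholder project
    "placement": {"cluster_name": "my-cluster"},    # placeholder cluster
    "pyspark_job": {
        "main_python_file_uri": "gs://my-bucket/job.py",
        # list of strings, not "--date 2022-03-28 --mode full"
        "args": ["--date", "2022-03-28", "--mode", "full"],
    },
}

# Sanity check mirroring what the operator expects:
assert all(isinstance(a, str) for a in PYSPARK_JOB["pyspark_job"]["args"])
print(PYSPARK_JOB["pyspark_job"]["args"])
```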
QUESTION
What is the equivalent of header=0 in pandas, which recognises the first line as a heading, in gspread?
pandas import statement (correct)
...ANSWER
Answered 2022-Mar-16 at 08:12
Looking at the API documentation, you probably want to use:
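In gspread, Worksheet.get_all_records() treats the first row as the header and returns one dict per remaining row. Its effect can be mimicked in pure Python (the sample values below are made up):

```python
# Mimics gspread's Worksheet.get_all_records(): the first row becomes the
# keys and each remaining row becomes a dict, much like header=0 in pandas.
def rows_to_records(values):
    header, *rows = values
    return [dict(zip(header, row)) for row in rows]

values = [
    ["name", "score"],   # header row, like header=0 in pandas
    ["alice", "10"],
    ["bob", "7"],
]
records = rows_to_records(values)
print(records[0])  # {'name': 'alice', 'score': '10'}
```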
QUESTION
I'm trying to grab the latest secret version. Is there a way to do that without specifying the version number? Such as using the keyword "latest". I'm trying to avoid having to iterate through all the secret versions with a for loop as GCP documentation shows:
...ANSWER
Answered 2021-Sep-12 at 18:54
import com.google.cloud.secretmanager.v1.AccessSecretVersionResponse;
import com.google.cloud.secretmanager.v1.SecretManagerServiceClient;
import com.google.cloud.secretmanager.v1.SecretVersionName;
import java.io.IOException;
public class AccessSecretVersion {

  public static void accessSecretVersion() throws IOException {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    String secretId = "your-secret-id";
    String versionId = "latest"; // <-- specify version
    accessSecretVersion(projectId, secretId, versionId);
  }

  // Access the payload for the given secret version if one exists. The version
  // can be a version number as a string (e.g. "5") or an alias (e.g. "latest").
  public static void accessSecretVersion(String projectId, String secretId, String versionId)
      throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (SecretManagerServiceClient client = SecretManagerServiceClient.create()) {
      SecretVersionName secretVersionName = SecretVersionName.of(projectId, secretId, versionId);

      // Access the secret version.
      AccessSecretVersionResponse response = client.accessSecretVersion(secretVersionName);

      // Print the secret payload.
      //
      // WARNING: Do not print the secret in a production environment - this
      // snippet is showing how to access the secret material.
      String payload = response.getPayload().getData().toStringUtf8();
      System.out.printf("Plaintext: %s\n", payload);
    }
  }
}
QUESTION
I'm working on a Terraform project that will set up all the GCP resources needed for a large project spanning multiple GitHub repos. My goal is to be able to recreate the cloud infrastructure from scratch completely with Terraform.
The issue I'm running into is that, in order to set up build triggers with Terraform within GCP, the GitHub repo that sets off the trigger first needs to be connected. Currently, I've only been able to do that manually via the Google Cloud Build dashboard. I'm not sure if this is possible via Terraform or with a script, but I'm looking for any solution I can automate this with. Once the projects are connected, updating everything with Terraform works fine.
TLDR; How can I programmatically connect a GitHub project with a GCP project instead of using the dashboard?
...ANSWER
Answered 2022-Feb-12 at 16:16
Currently there is no way to programmatically connect a GitHub repo to a Google Cloud project. This must be done manually via Google Cloud.
My workaround is to manually connect an "admin" project, build containers and save them to that project's artifact registry, and then deploy the containers from the registry in the programmatically generated project.
QUESTION
I'm unable to create a Cloud Function in my GCP project using the GUI, even though I have admin roles for GCF, SA and IAM.
Here is the error message:
Missing necessary permission iam.serviceAccounts.actAs for cloud-client-api-gae on the service account serviceaccountname@DOMAIN.iam.gserviceaccount.com. Grant the role 'roles/iam.serviceAccountUser' to cloud-client-api-gae on the service account serviceaccountname@DOMAIN.iam.gserviceaccount.com.
cloud-client-api-gae is not an SA nor a User on my IAM list. It must be a creature living underneath the Graphical User Interface.
I have enabled the APIs for GCF and App Engine, and I have the Service Account Admin role.
I had literally 0 search results when googling for cloud-client-api-gae.
ANSWER
Answered 2022-Jan-18 at 13:53
I contacted GCP support and it seems I was missing a single permission for my user: Service Account User - that's it.
PS: The person from support didn't know what this thing called "cloud-client-api-gae" is either.
QUESTION
I have a TypeScript project that has been deployed several times without any problems to Google App Engine, Standard environment, running Node 10. However, when I try to update the App Engine project to either Node 12 or 14 (by editing the engines.node value in package.json and the runtime value in app.yaml), the deploy fails, printing the following to the console:
ANSWER
Answered 2022-Jan-16 at 14:32
I encountered the exact same problem and just put typescript in dependencies, not devDependencies.
It worked after that, but I cannot be sure it was due to this change (since I have no proof of that).
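The suggested change amounts to moving the typescript package within package.json; a minimal sketch (the version numbers are illustrative only, not from the original question):

```json
{
  "engines": { "node": "14" },
  "dependencies": {
    "typescript": "^4.5.0"
  },
  "devDependencies": {}
}
```

The likely reason this matters is that App Engine's build step installs only production dependencies before compiling, so a compiler listed under devDependencies is not available at build time.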
QUESTION
I am following this article to submit a job to an existing Dataproc cluster via the Dataproc API.
For the following line of code :
...ANSWER
Answered 2022-Jan-14 at 19:22
The method com.google.api.gax.core.GoogleCredentialsProvider$Builder.setUseJwtAccessWithScope(boolean) was introduced in com.google.api:gax in version 2.3.0.
Can you:
run mvn dependency:tree and confirm that your version of com.google.api:gax is 2.3.0 or above?
upgrade all Google libraries to the latest version?
Here is a similar issue found on the internet.
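One common way to keep the Google Java libraries mutually compatible is to import Google's libraries BOM in pom.xml; a sketch (the version shown is only an example, use the current release):

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <!-- Example version only; pick the latest release -->
      <version>24.2.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```

With the BOM imported, individual google-cloud artifacts can omit explicit versions, and mvn dependency:tree should then show a consistent gax version.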
QUESTION
I'm currently building a PoC Apache Beam pipeline in GCP Dataflow. In this case, I want to create a streaming pipeline with its main input from PubSub and a side input from BigQuery, and store the processed data back to BigQuery.
Side pipeline code
...ANSWER
Answered 2022-Jan-12 at 13:12
Here you have a working example:
QUESTION
echo Give yearmonth "yyyyMM"
setlocal enabledelayedexpansion
SET /p yearmonth=
SET ClientName[0]=abc
SET ClientName[1]=def
SET i = 0
:myLoop
if defined ClientName[%i%] (
call bq query --use_legacy_sql=false "CREATE EXTERNAL TABLE `test.!ClientName[%%i]!.%yearmonth%` OPTIONS (format = 'CSV',skip_leading_rows = 1 uris = ['gs://test/!ClientName[%%i]!/AWS/%yearmonth%/Metrics/data/*.csv'])"
set /a "i+=1"
GOTO :myLoop
)
...ANSWER
Answered 2022-Jan-09 at 11:04
It is bad practice to set variables as standalone alphabetical characters like i. One reason is exactly as you have experienced: you have confused the for metavariable %%i with a set variable %i%.
You are expanding in the loop, but have not enabled delayed expansion, so there are 2 ways, which we will get to in a second.
When setting variables you should not have spaces before or after = (excluding the likes of set /a); SET i = 0 actually defines a variable named "i " with a trailing space, so %i% stays undefined.
So, Method 1, without delayed expansion (note how the variables are used with double %% in the loop with the call command). A sketch of the corrected loop, reconstructed from the advice above:
echo Give yearmonth "yyyyMM"
SET /p yearmonth=
SET ClientName[0]=abc
SET ClientName[1]=def
SET "index=0"
:myLoop
if defined ClientName[%index%] (
call bq query --use_legacy_sql=false "CREATE EXTERNAL TABLE `test.%%ClientName[%index%]%%.%yearmonth%` OPTIONS (format = 'CSV', skip_leading_rows = 1, uris = ['gs://test/%%ClientName[%index%]%%/AWS/%yearmonth%/Metrics/data/*.csv'])"
set /a "index+=1"
GOTO :myLoop
)
QUESTION
I'm struggling to correctly set up a Vertex AI pipeline which does the following:
- read data from an API, store it to GCS and use it as the input for batch prediction
- get an existing model (video classification on Vertex AI)
- create a batch prediction job with the input from point 1
As will be seen, I don't have much experience with Vertex Pipelines/Kubeflow, so I'm asking for help/advice and hope it's just some beginner mistake. This is the gist of the code I'm using as the pipeline:
ANSWER
Answered 2021-Dec-21 at 14:35
I'm glad you solved most of your main issues and found a workaround for the model declaration.
Regarding your input.output observation on gcs_source_uris: the reason behind it is the way the function/class returns the value. If you dig inside the classes/methods of google_cloud_pipeline_components, you will find that it implements a structure that allows you to use .outputs on the returned value of the function called.
If you go to the implementation of one of the components of the pipeline, you will find that it returns an output array from the convert_method_to_component function. So, in order to have that implemented in your custom class/function, your function should return a value which can be called as an attribute. Below is a basic implementation of it.
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install cloudrun-demo
In this step we will create a GCP Cloud Source Repository and set up a local git repo around the template we downloaded above.
First we enable the Source Repo API: gcloud services enable sourcerepo.googleapis.com
We will create an empty repository called cloud_run_demo: gcloud source repos create cloud_run_demo
Now we will initialize a local git repo: git init
Next we will configure git to authenticate using gcloud credentials: git config credential.helper gcloud.sh
Next we will add the remote repository: git remote add google https://source.developers.google.com/p/[PROJECT_NAME]/r/cloud_run_demo
Where: [PROJECT_NAME] is the name of your GCP project.
In this stage we will build a sample container which will run our static site.
Create a blank file called Dockerfile
Add the following:
FROM nginx:alpine
LABEL maintainer="Rgreaves@google.com"
COPY Code/ /usr/share/nginx/html
RUN sed -i 's/80\;/8080\;/g' /etc/nginx/conf.d/default.conf
EXPOSE 8080
Let's walk through each line:
FROM nginx:alpine - Our base image, in this case Nginx running on top of the Alpine OS.
LABEL maintainer="Rgreaves@google.com" - Using labels in Dockerfiles organizes information cleanly inside the container and provides a point of contact for future issues.
COPY Code/ /usr/share/nginx/html - Adds our code to the nginx root folder.
RUN sed -i 's/80\;/8080\;/g' /etc/nginx/conf.d/default.conf - Cloud Run assumes the container is reachable on port 8080; by default nginx listens on port 80, and this fixes that.
EXPOSE 8080 - This documents for other users which port the container listens on.
We can test running this image locally by using the following commands: docker build -t mysite . && docker run --name test-site -d -p 8080:8080 mysite
Then we can browse the site at localhost:8080
Once we are happy the container works, we kill and remove it with: docker kill test-site && docker rm test-site
In this stage we are going to build a sample Cloudbuild config file to turn our code into a container and deploy it on Cloud Run.
First we enable the Cloud Build API: gcloud services enable cloudbuild.googleapis.com
We are going to create a blank file at the root of the repository called cloudbuild.yaml
Inside the file we will add the following content:
steps:
# Build the container image
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/[PROJECT_ID]/[IMAGE]', '.']
# Push the image to Container Registry
- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'gcr.io/[PROJECT_ID]/[IMAGE]']
# Deploy image to Cloud Run
- name: 'gcr.io/cloud-builders/gcloud'
  args: ['beta', 'run', 'deploy', '[SERVICE_NAME]', '--image', 'gcr.io/[PROJECT_ID]/[IMAGE]', '--region', 'us-central1', '--platform', 'managed', '--allow-unauthenticated']
images:
- gcr.io/[PROJECT_ID]/[IMAGE]
Inside this file we need to replace the following values:
[PROJECT_ID] - Your project name
[IMAGE] - Name of our container; for this purpose call it mysite-public
[SERVICE_NAME] - For demo's sake we will call this public-site
In this stage we tell Cloud Build to listen to our Source Repository, and enable the Cloud Build service account to modify Cloud Run.
This step needs to be configured inside the console sadly
Open the console and goto Cloud Build / Triggers page console.cloud.google.com/cloud-build/triggers
Along the top select + CREATE TRIGGER
Select cloud_run_demo
Under Build configuration select Cloud Build configuration file (yaml or json)
Press Create trigger
After this is complete we need to configure the Cloud Build service account to be able to talk to Cloud Run. To do this, inside the Console go to Cloud Build Settings (console.cloud.google.com/cloud-build/settings).
Inside here, next to Cloud Run, change the Status to ENABLED.
When prompted about enabling extended permissions, select Agree.
This creates a trigger which listens for any push to the repo on any branch, and then looks for a cloudbuild.yaml file in the repository root. It then uses that cloudbuild.yaml file to work out what it needs to do next.