Google-Cloud-Storage | Setup Google Cloud Storage to upload and download | Cloud Storage library
kandi X-RAY | Google-Cloud-Storage Summary
Set up Google Cloud Storage to upload and download files using a Python program
Community Discussions
Trending Discussions on Google-Cloud-Storage
QUESTION
I want to upload binary data directly to GCP Storage, without writing the file to disk. Below is the code snippet I have created to show where I am so far.
...ANSWER
Answered 2022-Mar-25 at 12:43 As suggested by @stefan, it should be to_send = StringIO.new(data), i.e. without .read (which would return a string again).
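The same in-memory idea applies in the Python client this page covers: wrap the raw bytes in an io.BytesIO and hand that file object to the blob, instead of writing a temporary file. A minimal sketch, assuming a google-cloud-storage-style Bucket interface; the Fake* classes below are invented here only so the sketch runs without GCP credentials:

```python
import io


def upload_bytes(bucket, blob_name, data):
    """Upload raw bytes to a bucket without touching the local disk.

    `bucket` is expected to expose the google-cloud-storage Bucket shape
    (bucket.blob(name).upload_from_file(file_obj)); a fake object with
    the same shape works for local testing.
    """
    blob = bucket.blob(blob_name)
    # Wrap the bytes in an in-memory file object instead of a temp file.
    blob.upload_from_file(io.BytesIO(data))
    return blob


# Minimal fakes standing in for google.cloud.storage objects, so the
# sketch can be exercised without credentials or network access.
class FakeBlob:
    def __init__(self, name):
        self.name = name
        self.contents = None

    def upload_from_file(self, file_obj):
        self.contents = file_obj.read()


class FakeBucket:
    def __init__(self):
        self.blobs = {}

    def blob(self, name):
        self.blobs[name] = FakeBlob(name)
        return self.blobs[name]


bucket = FakeBucket()
upload_bytes(bucket, "report.bin", b"\x00\x01binary payload")
print(bucket.blobs["report.bin"].contents == b"\x00\x01binary payload")  # True
```

Against a real bucket you would pass an actual google.cloud.storage Bucket instead of the fake; the function body stays the same.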
QUESTION
I am testing out a Cloud Function and I have things set up, but the output is not populating correctly: nothing is being saved into Cloud Storage and my print statements are not appearing. Here are my code and my requirements below. I have set up the Cloud Function to run with an HTTP request trigger type, unauthenticated invocations, and a specified runtime service account that has write access to Cloud Storage. I have verified that I am calling the correct entry point.
logs
...ANSWER
Answered 2022-Mar-23 at 01:23 As @dko512 mentioned in the comments, the issue was resolved by recreating and redeploying the Cloud Function.
Posting the answer as community wiki for the benefit of the community that might encounter this use case in the future.
Feel free to edit this answer for additional information.
QUESTION
In my application config I have defined the following properties:
...ANSWER
Answered 2022-Feb-16 at 13:12 According to this answer: https://stackoverflow.com/a/51236918/16651073, Tomcat falls back to default logging if it cannot resolve the location.
Try saving the properties without the spaces, like this:
logging.file.name=application.logs
QUESTION
I searched a lot for how to authenticate/authorize Google's client libraries, and it seems no one agrees on how to do it.
Some people state that I should create a service account, create a key from it, and give that key to each developer who wants to act as this service account. I hate this solution because it leaks the identity of the service account to multiple people.
Others mentioned that you simply log in with the Cloud SDK and ADC (Application Default Credentials) by doing:
...ANSWER
Answered 2021-Oct-02 at 14:00 You can use a new gcloud feature and impersonate your local credential like this:
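The gcloud invocation elided above is presumably the impersonation flag for Application Default Credentials; a sketch, where SA_EMAIL is a placeholder you would substitute with your service account's email address:

```shell
# Let ADC impersonate a service account instead of distributing its key.
# SA_EMAIL is a placeholder for the service account's email address.
gcloud auth application-default login \
  --impersonate-service-account=SA_EMAIL
```

Each developer authenticates as themselves and only borrows the service account's permissions, so no key file is shared.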
QUESTION
I'm trying to incorporate the google-cloud-tasks Python client within my FastAPI app, but it's giving me an import error like this:
ANSWER
Answered 2022-Feb-09 at 17:35 After doing some more research online, I realized that the installation of some packages is skipped because of existing packages. This issue helped me realize I needed to reorder the position of google-cloud-tasks in my requirements.txt. So what I did was pretty simple: I created a new virtualenv, installed google-cloud-tasks as my first package, then installed everything else, and the problem was solved.
Long story short, the issue is the order in which packages are installed, and that is why some packages are getting missed.
QUESTION
I'm trying to use the Gmail API in Python to send email, but I can't get past importing the Google module despite using "pip install --upgrade google-api-python-client" or "pip install google".
However, pip freeze shows:
...ANSWER
Answered 2021-Sep-20 at 10:55 Implicit relative imports are no longer supported, as documented:
There is no longer any implicit import machinery
So if Google.py is in the same directory as the code you pasted, you have to reference its relative location explicitly.
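The fix can be demonstrated with a throwaway package; the myapp package name and the value 42 are invented for illustration (only the Google.py filename comes from the question):

```python
import importlib
import os
import sys
import tempfile

# Python 3 removed implicit relative imports: inside a package,
# "import Google" no longer finds a sibling Google.py on its own; you
# must write "from . import Google". Build a throwaway package to show
# the explicit form working.
tmp = tempfile.mkdtemp()
pkg = os.path.join(tmp, "myapp")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "Google.py"), "w") as f:
    f.write("VALUE = 42\n")
with open(os.path.join(pkg, "main.py"), "w") as f:
    f.write("from . import Google  # explicit relative import\n"
            "VALUE = Google.VALUE\n")

sys.path.insert(0, tmp)
mod = importlib.import_module("myapp.main")
print(mod.VALUE)  # 42
```

Writing plain "import Google" inside main.py would instead raise ModuleNotFoundError under Python 3, which matches the symptom in the question.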
QUESTION
My target is to upload a file to a GCP bucket from CloudHub. I am using a service account, and my only option is to use a JSON key file.
This is what I've done:
- I have referred to this GCP information and this post to upload a file to GCP Cloud Storage. I am able to push the file locally, but in CloudHub I am not able to push the file to the bucket; I receive a 401 Unauthorized error.
- In both links a step is mentioned to set GOOGLE_APPLICATION_CREDENTIALS = "path-to-json-key" in the environment variables. When I pass an absolute path (/Users/..json-key) locally, it works and the file gets uploaded to the bucket, but when I try in CloudHub Runtime Manager, passing ${app.home}/"json-key" or ${mule.home}/apps/${app.name}/"json-key" as the value, it fails.
- The file is located in the src/main/resources directory and I am using a Java class (invoke static, as mentioned in the 2nd link) to upload the file. I need some guidance regarding this.
An alternative approach I thought of: is it possible to use this JSON key file in the Java class itself rather than passing it via an environment variable?
If the above two options do not work, is there any other approach using a service account for app deployment in CloudHub? (An API key is an approach I tried, but it does not restrict the API key to a particular bucket.)
Java Class
...ANSWER
Answered 2021-Dec-16 at 12:41 If the problem is that the file cannot be located in CloudHub, try instead this method to reference it, as described in this KB article:
QUESTION
I'm using juicefs-csi in GKE, with Postgres as the meta-store and GCS as storage. The corresponding settings are as follows:
...ANSWER
Answered 2021-Dec-15 at 13:53 OK, I misunderstood you at the beginning.
When you are creating a GKE cluster you can specify which GCP Service Account will be used by this cluster, like below:
By default it's the Compute Engine default service account (71025XXXXXX-compute@developer.gserviceaccount.com), which lacks a few Cloud product permissions (for Cloud Storage, it has Read Only). It's even described in this message.
If you want to check which Service Account was set by default on a VM, you can do this via Compute Engine > VM Instances > choose one of the VMs from this cluster > in the details, find API and identity management.
So you have 3 options to solve this issue:
1. During cluster creation
In Node Pools > Security you have Access scopes, where you can add some additional permissions. Allow full access to all Cloud APIs allows access to all listed Cloud APIs; Set access for each API lets you set access per API. In your case you could just use Set access for each API and change Storage to Full.
2. Set permissions with a Service Account
You would need to create a new Service Account and provide proper permissions for Compute Engine and Storage. More details about how to create an SA can be found in Creating and managing service accounts.
3. Use Workload Identity
Workload Identity allows workloads in your GKE clusters to impersonate Identity and Access Management (IAM) service accounts to access Google Cloud services. For more details, check Using Workload Identity.
Useful links
- Configuring Velero - Velero is software for backup and restore; however, steps 2 and 3 are mentioned there. You would just need to adjust the commands/permissions to your scenario.
- Authenticating to Google Cloud with service accounts
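Option 2 above can be sketched with two gcloud commands; PROJECT_ID and SA_NAME are placeholders, and roles/storage.objectAdmin is one reasonable role choice for read/write bucket access, not the only one:

```shell
# Create a dedicated service account for the node pool / workload.
gcloud iam service-accounts create SA_NAME

# Grant it object read/write access on Cloud Storage in the project.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SA_NAME@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/storage.objectAdmin"
```

You would then attach this service account to the cluster's node pool (option 2) or bind it to a Kubernetes service account via Workload Identity (option 3).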
QUESTION
I'm trying to deploy a cloud function with Python 3.9 but when I run
...ANSWER
Answered 2021-Dec-10 at 16:03 In the thread you linked there are several solutions, and an interesting one seems to be that a package named fitz conflicts with PyMuPDF, as they both use the same top-level name inside a script (fitz). I see both libraries are in your requirements.txt, so this could be the cause of this error. I tested adding both libraries inside a Cloud Function and received the same error, which was resolved after removing fitz 0.0.1.dev2 from the file and using only PyMuPDF.
You can see another example of this behavior from this GitHub issue.
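A sketch of the requirements.txt change described above; only the fitz 0.0.1.dev2 pin comes from the answer, the surrounding lines are illustrative:

```
# before: both packages install a top-level module named "fitz"
fitz==0.0.1.dev2
PyMuPDF

# after: keep only PyMuPDF, which itself provides "import fitz"
PyMuPDF
```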
QUESTION
I'm writing some code for a class project that sends jobs to a dataproc cluster in GCP. I recently ran into an odd error and I'm having trouble wrapping my head around it. The error is as follows:
...ANSWER
Answered 2021-Dec-01 at 19:46 Using mvn dependency:tree you can discover there's a mix of grpc-java 1.41.0 and 1.42.1 versions in your dependency tree. google-cloud-datastore:2.2.0 brings in grpc-api:1.42.1, but the other dependencies bring in grpc version 1.40.1.
grpc-java recommends always using requireUpperBoundDeps from maven-enforcer to catch Maven silently downgrading dependencies.
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install Google-Cloud-Storage
You can use Google-Cloud-Storage like any standard Python library. You will need a development environment consisting of a Python distribution (including header files), a compiler, pip, and git. Make sure that your pip, setuptools, and wheel are up to date. When using pip, it is generally recommended to install packages in a virtual environment to avoid changes to the system.
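The setup above can be sketched as follows, assuming the library is the google-cloud-storage distribution on PyPI:

```shell
# Create and activate an isolated environment, then install the library.
python3 -m venv .venv
source .venv/bin/activate
python -m pip install --upgrade pip setuptools wheel
pip install google-cloud-storage
```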