google-drive-upload | Bash scripts to upload files to google drive | REST library
kandi X-RAY | google-drive-upload Summary
google-drive-upload is a collection of shell scripts runnable on all POSIX-compatible shells (sh / ksh / dash / bash / zsh, etc.). It uses the Google Drive API v3 and Google OAuth 2.0 to generate access tokens and authorize the application to upload files and folders to your Google Drive.
Trending Discussions on google-drive-upload
QUESTION
Goal
Download and upload a file to Google Drive purely in-memory using Google Drive APIs Resumable URL.
Challenge / Problem
I want to buffer the file as it's being downloaded to memory (not the filesystem) and subsequently upload it to Google Drive. The Google Drive API requires each chunk to be a minimum of 256 * 1024 bytes (262,144 bytes).
The process should pass a chunk from the buffer to be uploaded. If a chunk errors, that buffer chunk is retried up to 3 times. If a chunk succeeds, it should be cleared from the buffer, and the process should continue until complete.
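The 256 KiB granularity and the Content-Range bookkeeping this implies can be sketched as plain arithmetic, independent of any HTTP library. A minimal sketch in Node.js (the function name is illustrative, not part of any Google API):

```javascript
// Sketch: compute the Content-Length / Content-Range headers for one
// chunk of a resumable upload. Every chunk except the last must be a
// multiple of 256 KiB (262144 bytes), per the Drive API chunk rule.
const CHUNK_GRANULARITY = 256 * 1024; // 262144 bytes

function chunkHeaders(totalSize, chunkSize, offset) {
  // Clamp the final chunk to the end of the file; ranges are inclusive.
  const end = Math.min(offset + chunkSize, totalSize) - 1;
  const length = end - offset + 1;
  return {
    'Content-Length': String(length),
    'Content-Range': `bytes ${offset}-${end}/${totalSize}`,
  };
}

// Example: a 600000-byte file uploaded in 262144-byte chunks.
// First chunk:  bytes 0-262143/600000       (262144 bytes)
// Last chunk:   bytes 524288-599999/600000  (75712 bytes)
```

Because the headers are a pure function of (totalSize, chunkSize, offset), a retry of a failed chunk simply re-sends the same buffer with the same headers.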
Background Efforts / Research (references below)
Most of the articles, examples, and packages I've researched and tested offer some insight into streaming, piping, and chunking, but use the filesystem as the starting point for a readable stream.
I've tried different approaches with streams, such as PassThrough with highWaterMark, and third-party libraries such as request, gaxios, and got, which have built-in stream/piping support, but to no avail on the upload end of the process. Meaning, I'm not sure how to structure the piping or chunking mechanism, whether with a buffer or pipeline, so that it flows properly to the upload process until completion, and how to handle the progress and finalizing events in an efficient manner.
Questions
- With the code below, how do I appropriately buffer the file and PUT to the Google-provided URL with the correct Content-Length and Content-Range headers, while having enough buffer space to handle 3 retries?
- In terms of handling back-pressure or buffering, is leveraging .cork() and .uncork() an efficient way to manage the buffer flow?
- Is there a way to use a Transform stream with highWaterMark and pipeline to manage the buffer efficiently? e.g...
ANSWER
Answered 2021-Jan-06 at 01:51
I believe your goal and current situation are as follows.
- You want to download data and upload the downloaded data to Google Drive using Axios with Node.js.
- For uploading the data, you want to use a resumable upload with multiple chunks, retrieving the data from the stream.
- Your access token can be used for uploading the data to Google Drive.
- You already know the data size and mimeType of the data you want to upload.
In this case, in order to achieve a resumable upload with multiple chunks, I would like to propose the following flow.
- Download the data from the URL.
- Create the session for the resumable upload.
- Retrieve the downloaded data from the stream and convert it to a buffer.
  - For this, I used stream.Transform. In this case, I stop the stream and upload the data to Google Drive; I couldn't think of a method to achieve this without stopping the stream.
  - I thought this section might be the answer to your questions 2 and 3.
- When the buffer size equals the declared chunk size, upload the buffer to Google Drive.
  - I thought this section might be the answer to your question 3.
- When an upload errors, the same buffer is uploaded again. In this sample script, 3 retries are run; when all 3 retries fail, an error occurs.
  - I thought this section might be the answer to your question 1.
When the above flow is reflected in your script, it becomes as follows.
Modified script: Please set the variables in the function main().
QUESTION
I'm trying to upload media files to Google Drive using the REST API and a service account. I have a Cloud Functions backend where I authenticate with the right scopes for the Google Drive API and return the access token (shown in the snippet below) to the client, which can then make an upload request to Google Drive.
ANSWER
Answered 2020-Oct-23 at 11:06
If the service account uploads files to a Drive that is not its own, you need to set the parameter supportsAllDrives to true.
Alternatively, use domain-wide delegation with impersonation to make the service account upload files on your behalf; in this case you do not need to share your folder with the service account.
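The supportsAllDrives flag is passed as a query parameter on the upload endpoint. A small sketch of building the request URL (the endpoint is the documented Drive v3 upload endpoint; the function name is illustrative and the actual HTTP call is omitted):

```javascript
// Sketch: build a Drive v3 upload URL that works on Shared Drives by
// setting supportsAllDrives=true. uploadType is 'media', 'multipart',
// or 'resumable' per the Drive v3 docs.
function driveUploadUrl(uploadType) {
  const url = new URL('https://www.googleapis.com/upload/drive/v3/files');
  url.searchParams.set('uploadType', uploadType);
  url.searchParams.set('supportsAllDrives', 'true');
  return url.toString();
}
```

The client would PUT/POST to this URL with the access token returned by the Cloud Functions backend in the Authorization header.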
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.