drive-uploader | User-friendly tool to upload directories into Google Drive | REST library
kandi X-RAY | drive-uploader Summary
User-friendly tool to upload directories into Google Drive
Top functions reviewed by kandi - BETA
- Creates a resumable upload for the given file
- Get the resumable URI for resumable upload
- Gets the HTTP client
- Start the application
- Sets proxy system property
- Loads the configuration file
- Starts the app
- Display a confirm dialog
- Authorizes the user secret
- Compares two pairs
- Returns the data store directory
- The progress of the uploader
- Event handler
- Creates a resumable upload
- Returns a unique hash code for this instance
- Initialize the title field
- Initialize the columns
- Reads all bytes from the InputStream and returns the MD5 hash value
- Compares this proxy with the specified proxy object
- Initialize columns
- Initializes this TreeView
- Add a drive task
- Sets the configuration fields
- Set proxy
- Handles an HTTP response
- Get the temporary directory for the application
Community Discussions
Trending Discussions on drive-uploader
QUESTION
Goal
Download and upload a file to Google Drive purely in-memory using Google Drive APIs Resumable URL.
Challenge / Problem
I want to buffer the file as it's being downloaded to memory (not the filesystem) and subsequently upload it to Google Drive. The Google Drive API requires chunks to be a minimum of 256 * 1024 bytes (262,144 bytes).
The process should pass a chunk from the buffer to be uploaded. If the chunk errors, that buffer chunk is retried up to 3 times. If the chunk succeeds, that chunk from the buffer should be cleared, and the process should continue until complete.
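The retry behaviour described above can be sketched as follows. This is a minimal, hypothetical helper (not from the original post): the upload function is injected so the retry logic is independent of any particular HTTP client.

```javascript
// Hypothetical sketch of the retry behaviour described above: a chunk is
// handed to an injected async upload function and retried up to 3 times
// before the whole process fails. On success, the caller can drop the
// chunk from its buffer and continue.
async function uploadChunkWithRetry(uploadFn, chunk, maxRetries = 3) {
  let lastError;
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await uploadFn(chunk); // success: chunk may now be released
    } catch (err) {
      lastError = err; // failure: keep the chunk and try again
    }
  }
  throw lastError; // all retries exhausted
}
```

In a real uploader, `uploadFn` would be the PUT to the resumable session URI; keeping it injectable also makes the retry policy easy to test with a mock.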
Background Efforts / Research (references below)
Most of the articles, examples, and packages I've researched and tested have given some insight into streaming, piping, and chunking, but they use the filesystem as the starting point for a readable stream.
I've tried different approaches with streams, such as PassThrough with highWaterMark, and third-party libraries such as request, gaxios, and got, which have built-in stream/piping support, but to no avail on the upload end of the process.
Meaning, I am not sure how to structure the piping or chunking mechanism, whether with a buffer or pipeline, to properly flow into the upload process until completion, and how to handle the progress and finalizing events in an efficient manner.
Questions
- With the code below, how do I appropriately buffer the file and PUT to the Google-provided URL with the correct Content-Length and Content-Range headers, while having enough buffer space to handle 3 retries?
- In terms of handling back-pressure or buffering, is leveraging .cork() and .uncork() an efficient way to manage the buffer flow?
- Is there a way to use a Transform stream with highWaterMark and pipeline to manage the buffer efficiently? e.g...
ANSWER
Answered 2021-Jan-06 at 01:51
I believe your goal and current situation are as follows.
- You want to download data and upload the downloaded data to Google Drive using Axios with Node.js.
- For uploading, you want to use a resumable upload with multiple chunks, retrieving the data from the stream.
- Your access token can be used for uploading the data to Google Drive.
- You already know the size and mimeType of the data you want to upload.
In this case, in order to achieve a resumable upload with multiple chunks, I would like to propose the following flow.
- Download data from the URL.
- Create the session for the resumable upload.
- Retrieve the downloaded data from the stream and convert it to a buffer. For this, I used stream.Transform. In this case, I stop the stream and upload the data to Google Drive; I couldn't think of a way to achieve this without stopping the stream. I thought this section might be the answer to your questions 2 and 3.
- When the buffer size is the same as the declared chunk size, upload the buffer to Google Drive. I thought this section might be the answer to your question 3.
- When an upload fails with an error, the same buffer is uploaded again. In this sample script, 3 retries are run; when 3 retries fail, an error is thrown. I thought this section might be the answer to your question 1.
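The Content-Range bookkeeping needed when each buffered chunk is uploaded can be sketched as follows. The helper name is hypothetical; it assumes the total size is known in advance, as stated in the answer's premises.

```javascript
// Hypothetical helper: compute the Content-Length and Content-Range
// headers for one chunk of a resumable upload. `start` is the byte
// offset of the chunk within the file, `chunkLength` its size, and
// `totalSize` the full file size. Note the Content-Range end offset
// is inclusive.
function resumableHeaders(start, chunkLength, totalSize) {
  const end = start + chunkLength - 1;
  return {
    'Content-Length': String(chunkLength),
    'Content-Range': `bytes ${start}-${end}/${totalSize}`,
  };
}
```

For example, the first full chunk of a 600,000-byte file would carry Content-Range: bytes 0-262143/600000, and the caller advances `start` by each uploaded chunk's length until the file is complete.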
When the above flow is reflected in your script, it becomes as follows.
Modified script: Please set the variables in the function main().
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install drive-uploader
You can use drive-uploader like any standard Java library. Please include the jar files in your classpath. You can also use any IDE to run and debug the drive-uploader component as you would any other Java program. Best practice is to use a build tool that supports dependency management, such as Maven or Gradle. For Maven installation, please refer to maven.apache.org. For Gradle installation, please refer to gradle.org.