s3-stream-upload | Manages streaming of data to AWS S3
kandi X-RAY | s3-stream-upload Summary
This library lets you efficiently stream large amounts of data to AWS S3 in Java without storing the whole object in memory or writing it to a file. The S3 API requires a content length to be set before an upload starts, which is a problem when you want to generate a large amount of data on the fly. The standard Java AWS SDK simply buffers all the data in memory so that it can calculate the length, which consumes RAM and delays the upload. You could write the data to a temporary file first, but disk IO is slow (and if your data is already in a file, this library is pointless). Instead, this library provides an OutputStream that packages the data written to it into chunks, which are sent as parts of a multipart upload. You can also use several streams and upload the data in parallel. The entry point is the class StreamTransferManager; see its javadoc for details, including a usage example. The library is available from Maven Central.
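A minimal usage sketch, based on the pattern described in the library's javadoc. The bucket name, key, and generated data below are placeholders, and the builder methods (numStreams, numUploadThreads, queueCapacity, partSize) reflect recent releases of the library, so check them against the javadoc of the version you depend on.

```java
import java.util.List;

import alex.mojaki.s3upload.MultiPartOutputStream;
import alex.mojaki.s3upload.StreamTransferManager;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class StreamUploadExample {
    public static void main(String[] args) {
        AmazonS3 s3Client = AmazonS3ClientBuilder.defaultClient();

        // "my-bucket" and "my-key" are placeholder names.
        StreamTransferManager manager =
                new StreamTransferManager("my-bucket", "my-key", s3Client)
                        .numStreams(1)        // one stream is enough for a single writer
                        .numUploadThreads(2)  // parts are uploaded on background threads
                        .queueCapacity(2)     // finished parts allowed to wait for upload
                        .partSize(10);        // part size in MB

        List<MultiPartOutputStream> streams = manager.getMultiPartOutputStreams();
        MultiPartOutputStream out = streams.get(0);
        try {
            // Generate the data on the fly; only a few parts are ever held in memory.
            for (int i = 0; i < 1_000_000; i++) {
                out.write("some generated line of data\n".getBytes());
            }
            out.close();        // each stream must be closed once its data is written
            manager.complete(); // completes the multipart upload on S3
        } catch (Exception e) {
            manager.abort();    // discard the partial upload so S3 does not keep the parts
            throw new RuntimeException(e);
        }
    }
}
```

Raising numStreams and writing to the extra streams from separate threads lets the data be produced in parallel; complete() then combines all uploaded parts into a single S3 object.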
Top functions reviewed by kandi - BETA
- Waits for the multipart upload
- Uploads a stream part
- Calculates MD5 digest value
- Generate MD5
- Gets the output stream to write to
- Submits a task
- Throws an InterruptedException when interrupted
- Aborts the upload
- Details about this upload operation
- Removes the middle of the string
- Sets the number of streams to write to (see the parallel upload sketch after this list)
- Ensures that the queue can be set
- Puts any remaining data into the stream
- Puts the current stream
- Provides an iterator over the items in the queue
- Aborts the task
- Configures the data integrity check
- Sets the number of threads to upload
- Sets the queue capacity
- Appends the given stream to this one
- Inserts an item into the queue waiting if necessary
- String representation of this output stream
- String representation of this object
- Sets the size of the part
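Several of the functions above (submitting tasks, uploading stream parts, setting the number of streams and upload threads, aborting on failure) come together when writing with more than one stream. A hedged sketch of that parallel pattern, reusing the placeholder bucket and key from the earlier example and assuming the same builder-style API:

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import alex.mojaki.s3upload.MultiPartOutputStream;
import alex.mojaki.s3upload.StreamTransferManager;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class ParallelStreamUploadSketch {
    public static void main(String[] args) throws InterruptedException {
        AmazonS3 s3Client = AmazonS3ClientBuilder.defaultClient();

        // Builder methods as in recent releases; check your version's javadoc.
        StreamTransferManager manager =
                new StreamTransferManager("my-bucket", "my-key", s3Client)
                        .numStreams(4)        // four output streams, written from four threads
                        .numUploadThreads(4)  // parts are uploaded concurrently in the background
                        .queueCapacity(8)     // finished parts allowed to wait in the upload queue
                        .partSize(10);        // part size in MB

        List<MultiPartOutputStream> streams = manager.getMultiPartOutputStreams();
        ExecutorService pool = Executors.newFixedThreadPool(streams.size());
        for (MultiPartOutputStream stream : streams) {
            pool.submit(() -> {
                try {
                    for (int i = 0; i < 250_000; i++) {
                        stream.write("another generated line of data\n".getBytes());
                    }
                    stream.close(); // every stream must be closed by the thread that writes it
                } catch (Exception e) {
                    manager.abort(); // discard the multipart upload on failure
                    throw new RuntimeException(e);
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);

        manager.complete(); // combines the uploaded parts into a single S3 object
    }
}
```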
Community Discussions
Trending Discussions on s3-stream-upload
QUESTION
I download a zip archive from an API, which contains gzipped files, and I need to take the .gz files and save them to S3. I don't want to uncompress anything, just move them to S3.
When I open the archive, it has a folder with random numbers, /12345/file1.gz, and many files, /12345/file2.gz, etc.
I've tried yauzl and adm-zip, but I don't understand how to take each entry in the archive and just send it to S3. I have the s3-stream-upload package, which I can use to send it; I just can't get it right. Thanks for any help.
ANSWER
Answered 2018-Apr-19 at 14:55
The answer was doing a straight S3 put with readStream as the body of the object...
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install s3-stream-upload
You can use s3-stream-upload like any standard Java library: include the jar files in your classpath. You can also use any IDE to run and debug the s3-stream-upload component as you would any other Java program. Best practice is to use a build tool that supports dependency management, such as Maven or Gradle. For Maven installation, refer to maven.apache.org; for Gradle installation, refer to gradle.org.