laravel-chunk-upload | A basic implementation of chunked upload with support for multiple providers such as jQuery-file-upload | File Upload library
kandi X-RAY | laravel-chunk-upload Summary
Supports Laravel 5.2 through 7 (covered by integration tests on all versions). An easy-to-use service/library for chunked uploads that supports multiple JS upload libraries on top of Laravel's file upload, designed with a low memory footprint in mind. Features include cross-domain requests, an automatic clean-up schedule, and simple usage. An example repository with integration tests can be found in laravel-chunk-upload-example. Before opening pull requests, read CONTRIBUTION.md. Help fix bugs by debugging your issues with XDEBUG (and try to submit a fix yourself; it will help you become a better developer).
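The typical flow described above can be sketched as a Laravel controller. This is a minimal sketch based on the package's documented FileReceiver/HandlerFactory API; the storage path ('videos') and the JSON response shape are illustrative assumptions, not part of the package.

```php
<?php
// Minimal chunked-upload controller sketch (assumes pion/laravel-chunk-upload
// is installed; class names follow the package's documented API).
use Illuminate\Http\Request;
use Pion\Laravel\ChunkUpload\Exceptions\UploadMissingFileException;
use Pion\Laravel\ChunkUpload\Handler\HandlerFactory;
use Pion\Laravel\ChunkUpload\Receiver\FileReceiver;

class UploadController extends \App\Http\Controllers\Controller
{
    public function upload(Request $request)
    {
        // Detect the matching handler (DropZone, Resumable.js, Content-Range, ...)
        // from the request and wrap the uploaded "file" field in a receiver.
        $receiver = new FileReceiver('file', $request, HandlerFactory::classFromRequest($request));

        if (!$receiver->isUploaded()) {
            throw new UploadMissingFileException();
        }

        // Receive the current chunk; it is appended to the partial file on disk.
        $save = $receiver->receive();

        if ($save->isFinished()) {
            // All chunks received: move the assembled file to its final location
            // (the 'videos' directory here is an illustrative choice).
            $file = $save->getFile();
            $path = $file->storeAs('videos', $file->getClientOriginalName());
            return response()->json(['path' => $path, 'done' => 100]);
        }

        // Not finished yet: report progress back to the JS uploader.
        return response()->json(['done' => $save->handler()->getPercentageDone()]);
    }
}
```

Because `HandlerFactory::classFromRequest()` inspects the request, the same endpoint can serve several JS upload libraries without provider-specific branching in your code.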
Support
Quality
Security
License
Reuse
Top functions reviewed by kandi - BETA
- Create the chunk file name
- Handle the chunk storage
- Get the old chunk files
- Boot the application
- Check request parameters
- Create the full chunked file
- Get the handler class from the request
- Append a file to the destination
- Handle an uploaded file
- Try to parse the content range
laravel-chunk-upload Key Features
laravel-chunk-upload Examples and Code Snippets
Community Discussions
Trending Discussions on laravel-chunk-upload
QUESTION
We have a setup using Google App Engine with a Docker container running a Laravel application. Our users need to upload large video files (max 1028MB) to the server, which are in turn stored in GCS. But GAE returns a "413 Request Entity Too Large" error from nginx. I've confirmed this is not an issue with our server configs but a restriction on GAE. This is a pretty common requirement. How do you get around this?
What i've tried:
- Chunking using this package https://github.com/pionl/laravel-chunk-upload and dropzone.js to break the file down when sending (still results in 413)
- The Blobstore API is not applicable for us, as we need to constantly retrieve and play the files.
ANSWER
Answered 2021-Jan-14 at 11:53

As mentioned by @GAEfan, you can't change this limit on GAE. The recommended approach would be to upload your files to Google Cloud Storage and then process the file from Google Cloud Storage.
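One common way to implement the recommended approach is to have the Laravel backend issue a V4 signed URL and let the browser PUT the file directly to Cloud Storage, so the upload never passes through GAE's request-size limit. A minimal sketch using the google/cloud-storage PHP client; the project ID, bucket name, and object path are placeholder assumptions.

```php
<?php
// Sketch: generate a V4 signed upload URL with the google/cloud-storage client.
// 'my-project', 'my-bucket', and the object path are placeholders.
use Google\Cloud\Storage\StorageClient;

$storage = new StorageClient(['projectId' => 'my-project']);
$bucket  = $storage->bucket('my-bucket');

$url = $bucket->object('videos/upload.mp4')->signedUrl(
    new \DateTime('+30 minutes'),          // URL validity window
    [
        'method'      => 'PUT',            // client uploads with an HTTP PUT
        'version'     => 'v4',
        'contentType' => 'video/mp4',      // must match the client's Content-Type header
    ]
);

// Return $url to the browser; the JS uploader then PUTs the file bytes
// straight to GCS, bypassing the App Engine front end entirely.
```

The signed URL encodes the allowed method, expiry, and content type, so the endpoint can be exposed to the browser without granting it any broader bucket permissions.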
QUESTION
I am using laravel-chunk-upload in combination with dropzone.js. Everything works fine: the chunks are uploaded, and once they are all in, the final file is saved to S3. The problem is that one chunk is always missing. Sometimes .8.part is missing, sometimes .7.part (they remain in the chunks directory after uploading). When I upload a video with a total size of 9.7MB, the file in S3 is 8.7MB. That is 1MB missing, the same size as the missing part. All the chunks are 1MB in size.
What could be the problem and how can I fix this?
Edit: I think I have found the problem, but not the fix. When the last chunk (10th) is uploaded it thinks all the chunks are uploaded but the 8th chunk isn't finished uploading yet.
ANSWER
Answered 2020-Apr-01 at 13:14

I extended DropZoneUploadHandler with my own isLastChunk method, so the file is only assembled once the last chunk has actually finished uploading. The only remaining problem is that dropzone.js fires the chunksUploaded event for the last chunk request, not for the last completed chunk.
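The fix described above could be sketched roughly as follows. This is a hypothetical illustration: `DropZoneUploadHandler` is real, but the chunk-directory layout, the `$this->fileName`/`$this->request` properties, and the dropzone `dztotalchunkcount` lookup are assumptions about how such an override might verify completeness, not the package's actual internals.

```php
<?php
// Hypothetical sketch: treat the upload as complete only when every chunk
// part file is actually present on disk, not merely when the request that
// carries the highest chunk index arrives (chunks may upload out of order).
use Pion\Laravel\ChunkUpload\Handler\DropZoneUploadHandler;

class SafeDropZoneUploadHandler extends DropZoneUploadHandler
{
    public function isLastChunk()
    {
        if (!parent::isLastChunk()) {
            return false;
        }

        // Even on the "last" request, verify every part was written.
        // The glob pattern mirrors the ".N.part" naming mentioned in the
        // question; the chunk directory property is an assumption.
        $written = count(glob($this->chunksDir.'/'.$this->fileName.'.*.part'));
        $total   = (int) $this->request->input('dztotalchunkcount');

        return $written >= $total;
    }
}
```

Even with a check like this, dropzone.js still fires chunksUploaded on the last *request*, so the client side may also need to poll or retry until the server reports the assembly as finished.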
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install laravel-chunk-upload
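The package is installed via Composer. The package name below matches the GitHub repository (pionl/laravel-chunk-upload, published as pion/laravel-chunk-upload); the vendor:publish provider class follows the package's namespace convention and should be checked against the package's README.

```shell
# Install the package into a Laravel project
composer require pion/laravel-chunk-upload

# Optionally publish the package configuration (chunk storage path,
# clean-up schedule, etc.)
php artisan vendor:publish --provider="Pion\Laravel\ChunkUpload\Providers\ChunkUploadServiceProvider"
```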