rclone | "rsync for cloud storage" - Google Drive | Cloud Storage library
kandi X-RAY | rclone Summary
Rclone ("rsync for cloud storage") is a command-line program to sync files and directories to and from different cloud storage providers.
Community Discussions
Trending Discussions on rclone
QUESTION
I want to migrate files from Digital Ocean Storage into Google Cloud Storage programmatically, without rclone.
I know the exact location of the file that resides in Digital Ocean Storage (DOS), and I have the signed URL for Google Cloud Storage (GCS).
How can I modify the following code so I can copy the DOS file directly into GCS without an intermediate download to my computer?
...ANSWER
Answered 2022-Mar-27 at 16:18
Google's Storage Transfer Service should be an answer for this type of problem (particularly because DigitalOcean Spaces, like most object stores, is S3-compatible). But (!) I think (I'm unfamiliar with it and unsure) it can't be used for this configuration.
There is no way to transfer files from a source to a destination without some form of intermediate transfer, but what you can do is use memory rather than file storage as the intermediary. Memory is generally more constrained than file storage, and if you wish to run multiple transfers concurrently, each will consume some amount of it.
It's curious that you're using signed URLs. Generally signed URLs are provided by a third party to limit access to third-party buckets. If you own the destination bucket, then it will be easier to use Google Cloud Storage buckets directly from one of Google's client libraries, such as the Python client library.
The Python examples include uploading from a file and from memory. It will likely be best to stream the files into Cloud Storage if you'd prefer not to create intermediate files.
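As a minimal sketch of that streaming approach, assuming the boto3 and google-cloud-storage client libraries and hypothetical endpoint, credentials, bucket and key names:

# Illustrative sketch: stream an object from DigitalOcean Spaces (S3-compatible)
# into Google Cloud Storage without writing it to local disk.
# Endpoint, credentials, buckets and keys are hypothetical placeholders.
import boto3
from google.cloud import storage

# DigitalOcean Spaces exposes the S3 API, so boto3 can read from it.
spaces = boto3.client(
    "s3",
    region_name="nyc3",
    endpoint_url="https://nyc3.digitaloceanspaces.com",
    aws_access_key_id="SPACES_KEY",
    aws_secret_access_key="SPACES_SECRET",
)
source = spaces.get_object(Bucket="my-space", Key="path/to/file.bin")

# upload_from_file accepts any file-like object; the streaming body returned
# by boto3 qualifies, so the transfer passes through memory only.
gcs = storage.Client()
blob = gcs.bucket("my-gcs-bucket").blob("path/to/file.bin")
blob.upload_from_file(source["Body"], size=source["ContentLength"])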
QUESTION
I am currently working on loading files from SFTP into a GCS bucket. I am able to do it for a limited number of files in any given SFTP directory by getting the list of files and iterating over their absolute paths. However, if the directory has too many files (or files within another folder), I am not able to do a simple ls and get the list of files to download from SFTP. The following is the working code to get the list of files in any given directory recursively from SFTP:
...ANSWER
Answered 2022-Jan-31 at 17:00
You can get a file list quickly by executing the find(1) command over ssh:
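As a minimal sketch of that idea, assuming paramiko and hypothetical host, credentials and remote path: find(1) over SSH returns every file path in one round trip instead of recursing through the tree with per-directory SFTP listings.

# Illustrative sketch: list all files under a remote directory by running
# find(1) over SSH, rather than recursing with SFTP listdir calls.
# Host, credentials and the remote path are hypothetical placeholders.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("sftp.example.com", username="user", password="secret")

# find prints one absolute path per line for every regular file in the tree.
_, stdout, _ = client.exec_command("find /remote/base/dir -type f")
remote_files = stdout.read().decode().splitlines()
client.close()

# remote_files can now be fed to the existing SFTP -> GCS download step.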
QUESTION
Windows 11/ Powershell 7.2.1
I've added the following to both my user PATH and system PATH.
C:\Program Files\rclone\rclone-v1.57.0-windows-amd64\rclone.exe
When I try to run rclone from PowerShell or cmd I get the following message:
PS C:\Windows\System32> rclone rclone: The term 'rclone' is not recognized as a name of a cmdlet, function, script file, or executable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Successfully ran refreshenv and, just to be sure, I restarted Windows.
After running $env:path -split ";" I can see C:\Program Files\rclone\rclone-v1.57.0-windows-amd64\rclone.exe is set correctly.
When I run rclone from within the program folder I get this notice:
PS C:\Program Files\rclone\rclone-v1.57.0-windows-amd64> rclone rclone: The term 'rclone' is not recognized as a name of a cmdlet, function, script file, or executable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Suggestion [3,General]: The command rclone was not found, but does exist in the current location. PowerShell does not load commands from the current location by default. If you trust this command, instead type: ".\rclone". See "get-help about_Command_Precedence" for more details.
After adding rclone to PATH it still isn't "seen". What am I doing wrong here?
...ANSWER
Answered 2021-Dec-24 at 06:22
You have to specify the directory containing rclone.exe, not the path of the executable itself. You should add C:\Program Files\rclone\rclone-v1.57.0-windows-amd64 to the PATH environment variable, not C:\Program Files\rclone\rclone-v1.57.0-windows-amd64\rclone.exe.
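As a quick illustrative check, assuming Python is available: shutil.which resolves a command against PATH the same way the shell does, so it shows whether the directory fix has taken effect.

# Illustrative check: shutil.which searches PATH like the shell does, so it
# only finds rclone once the directory containing rclone.exe is on PATH.
import shutil

print(shutil.which("rclone"))
# Prints the full path to rclone.exe once PATH is correct; None otherwise.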
QUESTION
I would like to copy a directory (with all its subfiles, folders, etc.) from Azure file storage (not Azure blob storage) to an AWS S3 bucket using PowerShell.
So : Azure Files -> Amazon Web Services (AWS) S3
What I tried :
using Rclone, but rclone only supports blob storage and not file storage for the moment (see here)
using azcopy, but azcopy does not allow the following combination: Azure Files (SAS) -> Amazon Web Services (AWS) S3 (Access Key)
The process must not go through a local location (Virtual machine).
Any ideas? Thanks!
...ANSWER
Answered 2021-Dec-23 at 14:39
I thought of an alternative that works.
The idea is:
- Mount the Azure file storage as a disk. So it's not really "local" but rather a shared file share. (here)
- Use Rclone to copy from the local path of the mounted disk to S3.
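As a minimal automation sketch of that idea, assuming a hypothetical mount point and an S3 remote already defined in rclone.conf:

# Illustrative sketch: copy from a mounted Azure file share to an S3 remote
# with rclone. The mount point and the "s3remote" name are hypothetical and
# assume rclone.conf already defines the S3 backend.
import subprocess

mount_point = "/mnt/azurefiles"              # where the Azure file share is mounted
destination = "s3remote:my-bucket/backup"    # hypothetical rclone S3 remote and path

subprocess.run(
    ["rclone", "copy", mount_point, destination, "--progress"],
    check=True,
)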
QUESTION
Question: I have two Google Shared Drives (Team Drives), let's say Movies and MoviesBackup. I need to back up whatever I upload to Movies into MoviesBackup. For this, I need a unidirectional sync from Movies to MoviesBackup, once daily.
What I have tried: I know of Rclone, and have used its command-line interface to sync between two Shared Drives. Maybe if Rclone can be used from Google Apps Script, I would set a daily trigger. But I seem to find no way to do so.
Any other solutions that work will be appreciated.
...ANSWER
Answered 2021-Dec-18 at 05:13
Although I'm not sure whether this is the direct solution to your issue, in your situation, how about the following sample script? This sample script uses a Google Apps Script library. Ref
When this library is used, a source folder can be copied to a specific destination folder. And, when the files in the source folder are updated, the updated files are copied to the destination folder, overwriting the existing ones.
Usage:
1. Install the library. Please install the library in your Google Apps Script project. The method for installing it can be seen here.
2. Sample script. Please copy and paste the following script into the script editor of your Google Apps Script project and save it. This library uses the Drive API, so please enable Drive API at Advanced Google services. And please set the source and destination folder IDs in the following object.
QUESTION
This question assumes you have used Google Drive Sync or at least have knowledge of what files it creates in your cloud drive.
While using rclone to sync a local Ubuntu directory to a Google Drive (a.k.a. gdrive) location, I found that rclone wasn't able to do so (error googleapi: Error 500: Internal Error, internalError; the Google Cloud Platform API console revealed that the gdrive API call drive.files.create was failing).
By location I mean the root of the directory structure that the Google Drive Sync app creates in the cloud (e.g. the root of, say: Computers/laptopName/(syncedFolder1, syncedFolder2, ...)). In the current case, the gdrive sync app (famously unavailable on Linux) was running on a separate Windows machine. It was in this location that rclone wasn't able to create a dir.
Forget rclone. Trying to manually create the folder in the web app also fails as follows.
Working...
Could not execute action
Why is this happening, and how can I achieve this: making a directory in the cloud region where gdrive sync has put all my synced folders?
...ANSWER
Answered 2021-Dec-16 at 19:33
Basically, you can't. I found an explanation here.
If I am correct in my suspicion, there are a few things you have to understand:
- Even though you may be able to create folders inside the Computers isolated containers, doing so will immediately create that folder not only in your cloud, but on that computer/device. Any changes to anything inside the Computers container will automatically be synced to the device/computer the container is linked to- just like any change on the device/computer side is also synced to the cloud.
- It is not possible to create anything at the "root" level of each container in the cloud. If that were permitted then the actual preferences set in Backup & Sync would have to be magically altered to add that folder to the preferences. Thus this is not done.
So while folders inside the synced folders may be created, no new modifications may be made in the "root" dir.
QUESTION
Before this, I checked this, snakemake's documentation, this, and this. Maybe they actually answered this question but I just didn't understand it.
In short, I create in one rule a number of files from other files, both of which conform to a wildcard format. I don't know how many of these I create, since I don't know how many I originally download.
In all of the examples I've read so far, the output is directory("the/path"), while I have "the/path/{id}.txt". So this, I guess, modifies how I call the checkpoints in the function itself, and the use of expand.
The rules in question are:
download_mv
textgrid_to_ctm_txt
get_MV_IDs
merge_ctms
The order of the rules should be:
download_mv (creates {MV_ID}.TEX and .wav files (though not necessarily the same amount))
textgrid_to_ctm_txt (creates from {MV_ID}.TEX matching .txt and .ctm)
get_MV_IDs (should make a list of the .ctm files)
merge_ctms (should concatenate the ctm files)
kaldi_align (from the .wav and .txt directories creates one ctm file)
analyse_align (compares the ctm file from kaldi_align to the merge_ctms output)
upload_print_results
I have tried with the outputs of download_mv being directories, and then trying to get the IDs, but I had different errors then. Now with snakemake --dryrun I get:
ANSWER
Answered 2021-Dec-07 at 05:19
I can see the reason why you got the error: you use an input function in rule merge_ctms to access the files generated by the checkpoint, but merge_ctms doesn't have a wildcard in its output file name, so snakemake doesn't know which value should be filled into MV_ID in your checkpoint.
I'm also a bit confused about the way you use the checkpoint. Since you are not sure how many .TEX files will be downloaded (I guess), shouldn't you use the directory that stores the .TEX files as the output of the checkpoint, then use glob_wildcards to find out how many .TEX files you downloaded?
An alternative solution I can think of is to let download_mv become your checkpoint and set its output as the directory containing the .TEX files, then, in the input function, replace the .TEX files with .ctm files to do the format conversion.
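A minimal Snakefile sketch of that structure, assuming hypothetical paths and placeholder shell scripts for the download and conversion steps:

# Illustrative Snakefile sketch (not the asker's actual pipeline): download_mv
# is the checkpoint whose output is the directory holding the .TEX files, and
# the input function of merge_ctms discovers the IDs only after the checkpoint
# has run. The paths and the ./download_mv.sh / ./textgrid_to_ctm_txt.sh shell
# commands are hypothetical placeholders.
import os

checkpoint download_mv:
    output:
        directory("downloads")
    shell:
        "mkdir -p {output} && ./download_mv.sh {output}"

rule textgrid_to_ctm_txt:
    input:
        "downloads/{MV_ID}.TEX"
    output:
        ctm="ctm/{MV_ID}.ctm",
        txt="txt/{MV_ID}.txt"
    shell:
        "./textgrid_to_ctm_txt.sh {input} {output.ctm} {output.txt}"

def ctm_files(wildcards):
    # Evaluated only after download_mv has run, so glob_wildcards can see how
    # many IDs were actually downloaded.
    ckpt_dir = checkpoints.download_mv.get(**wildcards).output[0]
    ids = glob_wildcards(os.path.join(ckpt_dir, "{MV_ID}.TEX")).MV_ID
    return expand("ctm/{MV_ID}.ctm", MV_ID=ids)

rule merge_ctms:
    input:
        ctm_files
    output:
        "merged.ctm"
    shell:
        "cat {input} > {output}"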
QUESTION
I am trying to exec an rclone command via a PHP script. The plain text version of the call looks like this:
...ANSWER
Answered 2021-Dec-04 at 18:02
If the below is working fine:
rclone copy /media/storage/Events/01//999/001 events:testing-events-lowres/Events/01/999/001 --size-only --exclude /HiRes/* --include /Thumbs/* --include /Preview/* -P --checkers 64 --transfers 8 --config /home/admin/.config/rclone/rclone.conf -v --log-file=/www/html/admin/scripts/upload_status/001.json --use-json-log
Then below is the related PHP code, assuming your variables contain the correct values. You are making a mistake while concatenating, not using proper . (dots) and " (quotes).
exec("rclone copy ".$baseDir."/".$mainEventCode."/".$eventCode." events:testing-events-lowres/Events/01/".$mainEventCode."/".$eventCode." --size-only --exclude /HiRes/* --include /Thumbs/* --include /Preview/* -P --checkers 64 --transfers 8 --config /www/html/admin/scrips/rclone.conf -v --log-file=$directoryName/".$eventCode.".json --use-json-log");
QUESTION
I have an Alpine Docker image I'm deploying using Fargate. This is the command it runs:
...ANSWER
Answered 2021-Dec-03 at 19:10
I figured out a solution with help from this answer to a similar question, though I'm still unsure why tail -f wasn't working for me.
In order to ensure that Fargate instantiates the image with a virtual terminal, I added this line to the ContainerDefinition inside my TaskDefinition in my CloudFormation template:
QUESTION
I'm on OSX Big Sur, previously using Google Backup and Sync to sync files between my computer and google drive.
I have set up Backup and Sync to sync any files in the folder /Users/doe/ODrive, which contains 16GB of files.
After migrating to the new google drive since backup and sync got deprecated, I see a different behaviour.
The new Google Drive by default works like rclone. It creates a virtual drive under /Volumes/GoogleDrive and at the same time makes a symbolic link to /Users/doe/Google Drive for quick access.
Here's my problem:
1. If I choose to access any files offline, it starts downloading them to disk, taking unnecessary disk space, since I already have all the files downloaded on disk but in a different location, /Users/doe/ODrive. How do I tell Google Drive to use those files and not download anything?
2. There's a preference setting in the new Google Drive allowing you to choose your desired directory location for Google Drive. If I change that preference from the current setting /Volumes/GoogleDrive ---> /Users/doe/ODrive, will that mess up my ODrive folder and its content? I'd rather die than lose its content.
3. What's the difference between Folders from my computer and Folders from Drive? Isn't this two-way communication like Backup and Sync was?
ANSWER
Answered 2021-Nov-15 at 20:53
I did a bit of research & testing on my end and here's what I have found:
- If I choose to access any files offline it starts downloading them to disk, taking unnecessary disk space, since I already have all the files downloaded on disk but in a different location, /Users/doe/ODrive. How do I tell Google Drive to use those files and not download anything?
It seems like this is not possible. If you want to tell Google Drive to use those files and not download anything, the only option you have is to select the Stream Files option and then add the folder /Users/doe/ODrive in the My MacBook Pro preferences. This way, the files from your ODrive will be uploaded back to your drive instead. But there's a catch, as the uploaded files will now be duplicates because the Google Drive app will treat this as a new upload. And also, if you have Google Docs, Sheets, Slides or Forms in your ODrive, the app seems not to upload these files back and it will show you an error on the app's activity screen.
Once the folder /Users/doe/ODrive in the My MacBook Pro preferences has been successfully added and synced, you will then see the ODrive folder at drive.google.com > Computers (left side) > My MacBook Pro > ODrive. At the same time, the ODrive files are backed up and synced from your Drive to your computer and will also be available for offline use.
- There's a preference setting in the new Google Drive allowing you to choose your desired directory location for Google Drive. If I change that preference from the current setting /Volumes/GoogleDrive ---> /Users/doe/ODrive, will that mess up my ODrive folder and its content? I'd rather die than lose its content.
No. The Google Drive app will show you a message to reset the folder back to the default, because it has to be an empty folder before you attempt to change and save the default directory folder.
See this result on my end:
- What's the difference between Folders from my computer and Folders from Drive? Isn't this two-way communication like Backup and Sync was?
From my observation, Folders from my computer is the section where you can see/access all of the synced folders that you've added from the Google Drive app in the My MacBook Pro preferences. You can then view these folders and their synced files at drive.google.com > Computers (left side option) > My MacBook Pro.
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install rclone
Installation
Documentation & configuration
Changelog
FAQ
Storage providers
Forum
...and more