AzureRmStorageTable | sample module to manipulate Azure Storage Table rows
kandi X-RAY | AzureRmStorageTable Summary
Repository for a sample module to manipulate Azure Storage Table rows/entities. For complete documentation with examples, a troubleshooting guide, and more, please refer to this link.
AzureRmStorageTable Key Features
AzureRmStorageTable Examples and Code Snippets
Community Discussions
Trending Discussions on AzureRmStorageTable
QUESTION
I need advice on how to upload a large CSV file (about one million lines) into Azure Table Storage with PowerShell.
I am aware of Add-AzTableRow -table $using:table -partitionKey $partitionKey -rowKey $rowKey -property $tableProperty
https://github.com/paulomarquesc/AzureRmStorageTable/tree/master/docs
Based on tests with a smaller file of 10,000 lines, a sequential upload takes about 10 minutes (which extrapolates to roughly 16-20 hours for 1,000,000 lines).
I have tried the smaller file (10,000 lines) with PowerShell 7 using ForEach-Object -Parallel, but the funny thing is that it takes about three times longer: out of three tests, two uploaded the 10,000 lines in about 30 minutes, and one in 47 minutes.
Just for comparison, it took less than an hour to upload the one-million-line file with Storage Explorer! So I was wondering what process it uses and whether it could be used with PowerShell.
I have reviewed the following article:
It should work, but it returns an error at: $table.CloudTable.ExecuteBatch($batchOperation)
So my question would be: Is there any way to load data into Azure table storage in parallel?
As requested, adding the code used.
Note: The code works just fine, but it takes time and I believe it could be faster. Looking for suggestions on how to improve it.
...ANSWER
Answered 2020-May-18 at 03:39

According to the script you provided, you use the command (Get-AzStorageTable -Name $tableName -Context $saContext).CloudTable to get a CloudTable object; its type is Microsoft.Azure.Cosmos.Table.CloudTable.
So if we want to execute batch operations against it, we need to build the batch as a Microsoft.Azure.Cosmos.Table.TableBatchOperation. Besides, please note that all entities in a batch must have the same PartitionKey; entities with different PartitionKeys need to go in separate batches.
For example
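The answer's original code sample did not survive extraction into this page. The following is a sketch of what a batched insert with Microsoft.Azure.Cosmos.Table types could look like; the CSV path, table name, and the "Name" column are placeholder assumptions, and it relies on the two constraints noted above (one PartitionKey per batch, at most 100 operations per batch).

```powershell
# Assumes the Az.Storage module is installed and $saContext is a valid storage context.
$table = (Get-AzStorageTable -Name "MyTable" -Context $saContext).CloudTable

# Group the CSV rows by PartitionKey: a batch may only contain one PartitionKey.
$rows   = Import-Csv -Path ".\data.csv"
$groups = $rows | Group-Object -Property PartitionKey

foreach ($group in $groups) {
    # Azure Table Storage allows at most 100 operations per batch.
    for ($i = 0; $i -lt $group.Count; $i += 100) {
        $batch = New-Object Microsoft.Azure.Cosmos.Table.TableBatchOperation
        $group.Group | Select-Object -Skip $i -First 100 | ForEach-Object {
            $entity = New-Object Microsoft.Azure.Cosmos.Table.DynamicTableEntity($_.PartitionKey, $_.RowKey)
            $entity.Properties.Add("Name", $_.Name)  # placeholder column from the CSV
            $batch.Add([Microsoft.Azure.Cosmos.Table.TableOperation]::InsertOrReplace($entity))
        }
        $table.ExecuteBatch($batch)
    }
}
```

Each ExecuteBatch call is a single round trip for up to 100 entities, which is the main reason batching is so much faster than one Add-AzTableRow call per line.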
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Install AzureRmStorageTable
In Windows Server 2016/Windows 10, execute the following cmdlets in order to install the required modules:
  Install-Module Az.Resources -AllowClobber -Force
  Install-Module Az.Storage -AllowClobber -Force
Then install AzureRmStorageTable:
  Install-Module AzureRmStorageTable
Add-AzTableRow
Get-AzTableRow
Get-AzTableRowAll
Get-AzTableRowByColumnName
Get-AzTableRowByCustomFilter
Get-AzTableRowByPartitionKey
Get-AzTableRowByPartitionKeyRowKey
Get-AzTableTable
Remove-AzTableRow
Update-AzTableRow
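The cmdlets above can be combined as in the following sketch; the resource group, storage account, table name, and the "status" property are placeholder assumptions, and the full parameter reference is in the module docs linked earlier.

```powershell
# Get a table reference (Get-AzTableTable creates the table if it does not exist).
$table = Get-AzTableTable -ResourceGroup "myRg" -TableName "devicesTable" -StorageAccountName "mystorageacct"

# Add a row/entity.
Add-AzTableRow -Table $table -PartitionKey "devices" -RowKey "device001" -Property @{ status = "online" }

# Query by partition key, then update a property on the returned entity.
$row = Get-AzTableRowByPartitionKey -Table $table -PartitionKey "devices" | Select-Object -First 1
$row.status = "offline"
$row | Update-AzTableRow -Table $table

# Remove the entity.
$row | Remove-AzTableRow -Table $table
```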