dbatools | About-MySQL/Linux/Redis Tools | Development Tools library
kandi X-RAY | dbatools Summary
Top functions reviewed by kandi - BETA
- Transform the table into SQLAlchemy statements
- Inserts a new child element at position
- Return a new instance of the given value
- Return the last descendant of this node
- Transform an event
- Transform a Routine
- Parse the logs
- Determine if a record is within a datetime range
- Add a new connection_id to the tracker
- Return the value of a key
- Detect the encoding of the given XML data
- Transform the view into SQL
- Compare two databases
- Transform rows into SQLAlchemy rows
- Parse a slowquery log entry
- Execute the query
- Transform the CREATE INDEX statement into a list of statements
- Print a list of dictionaries
- Check wix install
- Execute the sql
- Set up the substitution for the given tag
- Handle multiple lines
- Transform the database
- Start a new tag
- Convert a document to HTML
- Convert a list of dictionaries to a list of lists
dbatools Key Features
dbatools Examples and Code Snippets
Community Discussions
Trending Discussions on dbatools
QUESTION
I am getting the following error when running this command:
...ANSWER
Answered 2022-Mar-13 at 14:32
This has been acknowledged as a bug in the function by the dbatools team. This is fairly new functionality that was implemented in late 2021. This should get fixed in a future update to dbatools.
QUESTION
I am getting the following error now:
...ANSWER
Answered 2021-Dec-08 at 17:11
I added the following statements at the start of the PowerShell script file, which solved the error:
QUESTION
I am writing a dbatools script, for exporting various items from the database schema. When using Export-DbaScript
, the tables, indexes, PK/FK are dumped as expected.
However, the tables and their constraints are dumped in the wrong order. For example, table Foo is dumped with FK constraints referencing a table that doesn't appear until later in the script. This makes the dump unusable for execution.
Note that this probably applies to the SMO API as well, since dbatools is essentially a wrapper around SMO as far as I know. I also tried to fiddle with the various ScriptingOptions, without luck.
Example pseudo script:
...ANSWER
Answered 2021-Oct-20 at 10:16
Answering my own question: I acted on the assumption that Export-DbaScript handles this based on the configuration object. That's not the case, so the export has to be done in two iterations: table objects first, then the FKs.
Sample code:
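The original snippet was not captured on this page. A minimal sketch of the two-pass approach, assuming dbatools is installed (the instance name, database name, and output paths below are placeholders, not the original poster's values):

```powershell
# Pass 1: script the tables without their foreign keys.
$options = New-DbaScriptingOption
$options.DriForeignKeys = $false   # SMO scripting option: skip FK constraints

$tables = Get-DbaDbTable -SqlInstance localhost -Database MyDb
$tables | Export-DbaScript -ScriptingOptionsObject $options -FilePath C:\temp\01_tables.sql

# Pass 2: script only the foreign keys, once every table exists.
foreach ($table in $tables) {
    $table.ForeignKeys | Export-DbaScript -FilePath C:\temp\02_fks.sql -Append
}
```

Running 01_tables.sql before 02_fks.sql avoids the forward-reference problem, since every table exists before any FK constraint is created.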
QUESTION
I am applying log shipping to a database using Invoke-DbaDbLogShipping
, and I am following the example in the documentation. The jobs get created and, from what it appears, the database is restored on the secondary server instance, yet the jobs are failing. This is what the backup job log states:
Backing up transaction log. Primary Database: 'Test14' Log Backup File: 'C:\Users...\Documents\DB Log Shipping\Backups\Local\Test14\Test14_20210822193000.trn' Error: Could not log history/error message.(Microsoft.SqlServer.Management.LogShipping) Error: Failed to convert parameter value from a SqlGuid to a String.(System.Data) Error: Object must implement IConvertible.(mscorlib) First attempt to backup database 'Test14' to file 'C:\Users...\Documents\DB Log Shipping\Backups\Local\Test14\Test14_20210822193000.trn' failed because Cannot open backup device 'C:\Users...\Documents\DB Log Shipping\Backups\Local\Test14\Test14_20210822193000.trn
Why is it failing?
Here is my script:
...ANSWER
Answered 2021-Sep-08 at 05:14
I resolved the issue!
At C:\Users\...\Documents\DB Log Shipping\Backups\Local there was no Test14 directory created. I created the folder Test18 manually, and the backup is now succeeding.
However, this is insane, because Invoke-DbaDbLogShipping should have created the Test18 folder AUTOMATICALLY when I executed the command. Apparently it only creates the folder under the $SharedPath and $TransactionLogsCopyPath, but NOT $LocalPath.
For automation purposes, I ended up just setting the local path to the $SharedPath.
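A sketch of the workaround (the path and database name below are placeholders, not the original poster's values): pre-create the local backup folder yourself before calling the cmdlet, since it is not created automatically.

```powershell
# Invoke-DbaDbLogShipping does not create the folder under $LocalPath,
# so create it up front ('Test14' and the path are placeholder values).
$localPath = 'C:\LogShipping\Backups\Local\Test14'
if (-not (Test-Path $localPath)) {
    New-Item -ItemType Directory -Path $localPath | Out-Null
}
```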
QUESTION
I am trying to come up with a PowerShell script to dynamically do a 'Restore Database' in SQL Server 2019 with multiple TRN (or, in my case, BAK) files located in one folder, on a daily basis.
I will manually do the full backup first, and this task will be scheduled to run after it (once daily).
So, a Python script will grab only yesterday's files from another folder into this folder, and this PowerShell script will then run to restore a database using these TRN/BAK files (located in this folder).
The plan is to go through each TRN file (in the same folder) sequentially, ordered by file name rather than by file creation time.
For example, it will start from "..04" --> "..12" in this case.
I found some examples on this site, but I was not sure how to write the code so that it recognizes the sequence ("..04" --> "..12").
...ANSWER
Answered 2021-Apr-29 at 00:59
By default, I think Get-ChildItem should already list the files from lowest to highest, but if you want to make sure, you could try something like this and see if the output fits your case.
For starting the test I'll create files using the same names as yours:
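The answer's snippet was not captured on this page. As a rough sketch of the idea (the instance name, database name, and folder are assumptions): sort the files by name before handing them to Restore-DbaDatabase.

```powershell
# Apply the TRN/BAK files in file-name order, not creation-time order.
# 'sql01', 'MyDb', and the folder are placeholder values.
Get-ChildItem 'C:\Restore\Incoming\*.trn' |
    Sort-Object Name |
    Restore-DbaDatabase -SqlInstance sql01 -DatabaseName MyDb -Continue -NoRecovery
```

-Continue tells dbatools to apply these backups on top of the existing restore chain from the manual full backup.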
QUESTION
I am having a difficult time installing/updating my PowerShell modules. I noticed this when I tried installing the dbatools module. Reference links are https://dbatools.io/download/ and https://github.com/sqlcollaborative/dbatools.
It's a corporate PC. But I know that I have installed other modules before in the past. Does anyone have any idea what's going on?
PS (Admin)>
Install-Module DBATools
- NOTE: The Install-Module command pauses for many minutes before the command returns a warning message.
WARNING: Unable to resolve package source 'https://www.powershellgallery.com/api/v2'. ERROR: "PackageManagement\Install-Package : No match was found for the specified search criteria and module name 'PowerShellGet'".
Update-Module PowerShellGet
ERROR: "Update-Module : Module 'PowerShellGet' was not installed by using Install-Module, so it cannot be updated.".
Update-Module PowerShellGet -Force
ERROR: "Update-Module : Module 'PowerShellGet' was not installed by using Install-Module, so it cannot be updated.".
Find-Module dbatools
- NOTE: The Find-Module command pauses for many minutes before the command returns an error message.
ERROR: "No match was found for the specified search criteria and module name 'dbatools'. Try Get-PSRepository to see all available registered module repositories."
Get-PSRepository | fl *
Name : PSGallery
SourceLocation : https://www.powershellgallery.com/api/v2
Trusted : False
Registered : True
InstallationPolicy : Untrusted
PackageManagementProvider : NuGet
PublishLocation : https://www.powershellgallery.com/api/v2/package/
ScriptSourceLocation : https://www.powershellgallery.com/api/v2/items/psscript
ScriptPublishLocation : https://www.powershellgallery.com/api/v2/package/
ProviderOptions : {}
Get-Module PackageManagement -ListAvailable
Directory: C:\Program Files\WindowsPowerShell\Modules
ModuleType Version Name ExportedCommands
Binary 1.0.0.1 PackageManagement {Find-Package, Get-Package, Get-PackageProvider, Get-Packa...
Binary 1.0.0.1 PackageManagement {Find-Package, Get-Package, Get-PackageProvider, Get-Packa...
...ANSWER
Answered 2020-Sep-21 at 13:07
Try running Register-PSRepository -Default
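For context, a commonly suggested sequence for this symptom is below. The TLS line is an addition of mine (the PowerShell Gallery requires TLS 1.2, which older Windows PowerShell sessions do not use by default), not part of the original answer.

```powershell
# Force TLS 1.2 for this session, then re-register the default PSGallery source.
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
Register-PSRepository -Default
Install-Module dbatools -Scope CurrentUser
```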
QUESTION
Some background: currently I am querying 4 million rows (with 50 columns) from an MS SQL Server with dbatools into a PSObject (in batches of 10,000 rows per query), processing the data with PowerShell (a lot of RegEx stuff), and writing it back into a MariaDB with SimplySql. On average I get approx. 150 rows/sec. I had to use a lot of tricks (.NET's StringBuilder etc.) for this performance; it's not that bad IMHO.
As a new requirement I want to detect the language of some text cells, and I have to remove personal data (name & address). I found some good Python libs (spaCy and pycld2) for that purpose. I made tests with pycld2 - pretty good detection.
Simplified code for clarification (hint: I am a Python noob):
...ANSWER
Answered 2020-Nov-29 at 21:30
The following simplified example shows you how you can pass multiple [pscustomobject] ([psobject]) instances from PowerShell to a Python script (passed as a string via -c in this case):
- by using JSON as the serialization format, via ConvertTo-Json ...
- ... and passing that JSON via the pipeline, which Python can read via stdin (standard input).
Important:
Character encoding: PowerShell uses the encoding specified in the $OutputEncoding preference variable when sending data to external programs (such as Python), which commendably defaults to BOM-less UTF-8 in PowerShell [Core] v6+, but regrettably to ASCII(!) in Windows PowerShell.
Just as PowerShell limits you to sending text to an external program, it also invariably interprets what it receives as text, namely based on the encoding stored in [Console]::OutputEncoding; regrettably, both PowerShell editions as of this writing default to the system's OEM code page.
To both send and receive (BOM-less) UTF-8 in both PowerShell editions, (temporarily) set $OutputEncoding and [Console]::OutputEncoding as follows:
$OutputEncoding = [Console]::OutputEncoding = [System.Text.Utf8Encoding]::new($false)
If you want your Python script to also output objects, again consider using JSON, which on the PowerShell side you can parse into objects with ConvertFrom-Json.
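Putting those pieces together, a minimal sketch (the object properties and the Python one-liner are invented for illustration):

```powershell
# Send objects to Python as JSON via the pipeline; read JSON back out.
$OutputEncoding = [Console]::OutputEncoding = [System.Text.Utf8Encoding]::new($false)

$rows = [pscustomobject]@{ Id = 1; Text = 'hello' },
        [pscustomobject]@{ Id = 2; Text = 'world' }

$result = $rows | ConvertTo-Json |
    python -c "import json, sys; rows = json.load(sys.stdin); print(json.dumps({'count': len(rows)}))" |
    ConvertFrom-Json

$result.count   # number of rows the Python side received
```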
QUESTION
I have two SQL Server 2019 instances running on Linux. These two instances both contain a single database which is synchronized using AlwaysOn Availability Group. Data in the database is synchronized, but the problem is that the SQL Agent jobs are not part of the database itself.
Therefore, when I create a SQL Server Agent job on the primary replica, this configuration does not copy to the secondary replica. So, after creating each job, I always have to also go to the secondary and create the job there as well. And I have to keep track of all the changes I make all the time.
Is there a built-in way to automate this cross-replica synchronization of SQL Server jobs on Linux when using availability groups? Job synchronization across AG replicas seems like something that should be natively supported by SQL Server/SQL Server Agent tooling, but I found nothing from Microsoft, only a third-party tool called dbatools that I can use to write my own automation scripts in PowerShell.
...ANSWER
Answered 2020-Aug-26 at 06:02
dbatools can sync them, but I haven't tried it on an AG running on Linux. Let me know if it works or not! The first parameter is the name of your AG, the second is the virtual network name of your cluster.
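The exact command was not captured on this page. One dbatools option for this (an assumption on my part, not necessarily what the answer used; instance names are placeholders) is Copy-DbaAgentJob:

```powershell
# Copy all SQL Agent jobs from the primary replica to the secondary.
# 'sql-primary' and 'sql-secondary' are placeholder instance names.
Copy-DbaAgentJob -Source sql-primary -Destination sql-secondary
```

Scheduled regularly (e.g. as an Agent job on the primary), this keeps the replicas' jobs in sync.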
QUESTION
I am trying to use Analysis Services Cmdlets to process partitions on a Power BI Premium Model. My PowerShell script works fine when run from ISE, the command line, and when scheduled using windows task scheduler; however, when I try to schedule the PowerShell script using a SQL Server 2019 Agent job using a step type of Operating System (CmdExec)" the following error message is encountered.
Message Executed as user: MyDomain\MyUser. Invoke-ProcessPartition : The connection string is not valid. At C:\Users\MyUser\Desktop\PS1\SSAS\wtf.ps1:15 char:11 + $asResult = Invoke-ProcessPartition -Credential $UserCredential -Server...+
CategoryInfo : NotSpecified: (:) [Invoke-ProcessPartition], ConnectionException + FullyQualifiedErrorId : Microsoft.AnalysisServices.ConnectionException,Microsoft.AnalysisServices.PowerShell.Cmd lets.ProcessPartition.
I have followed the steps in this blog article to set up the job. The same Windows user is used in all three run scenarios. The SQL Server is my local development instance, on which the Windows user is SA and a Windows admin. The machine hosting the SQL instance is the same one used for the three successful ways of running the PS scripts (ISE, command line, and Windows Task Scheduler).
If I run the following from the command line on the same machine as the SQL server, my local host, the PowerShell script runs successfully.
...ANSWER
Answered 2020-Sep-08 at 23:06
I have found a solution to the issue. The resolution was twofold.
The first problem was that when PowerShell was run from the SQL Server Agent, the version of the SqlServer module was an older, outdated one. I found this out by executing a ps1 file from the SQL Server Agent job using the following code and looking at the job history results.
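A sketch of that diagnostic step (the exact script the answer used is not shown on this page): log the module versions visible to the Agent's PowerShell session.

```powershell
# Write the SqlServer module versions this session can see into the job output.
Get-Module SqlServer -ListAvailable |
    Select-Object Name, Version, Path |
    Format-List
```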
QUESTION
How to migrate SQL Server Instance from Local Instance to Azure VM SQL Server Instance?
Seeking expert support in resolving the following issue.
Scenario:
LocalInstance: SQLSRV01
Azure VM: 23.96.20.20
- The local SQL Server and Azure VM SQL Server instances are both SQL Server 2017 (14.0)
- Added an inbound port rule for SQL Server
- SharedPath is accessible from both sides (the local computer as well as the Azure VM: 23.96.20.20)
- The DBSERVER17 instance is accessible and can be connected to from the local computer
- The same command worked well on my local computer with two different SQL Server instances
PowerShell script:
...ANSWER
Answered 2020-Jul-16 at 06:17
I supplied the parameters $scred and $dcred, then passed the $scred object to SourceSqlCredential and the $dcred object to DestinationSqlCredential; that's it.
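A sketch of that pattern with Copy-DbaDatabase (the instance names come from the question; the database name and share are placeholders):

```powershell
# Prompt for SQL credentials for each side, then pass them explicitly.
$scred = Get-Credential   # credential for the local instance
$dcred = Get-Credential   # credential for the Azure VM instance

Copy-DbaDatabase -Source SQLSRV01 -Destination 23.96.20.20 `
    -SourceSqlCredential $scred -DestinationSqlCredential $dcred `
    -Database MyDb -BackupRestore -SharedPath '\\share\backups'
```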
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install dbatools
You can use dbatools like any standard Python library. You will need to make sure that you have a development environment consisting of a Python distribution including header files, a compiler, pip, and git installed. Make sure that your pip, setuptools, and wheel are up to date. When using pip it is generally recommended to install packages in a virtual environment to avoid changes to the system.