odbc | Connect to ODBC databases | SQL Database library

by r-dbi | C++ | Version: v1.3.4 | License: Non-SPDX

kandi X-RAY | odbc Summary

odbc is a C++ library typically used in Database, SQL Database, MariaDB, and Oracle applications. odbc has no bugs, it has no vulnerabilities, and it has low support. However, odbc has a Non-SPDX License. You can download it from GitHub.

The goal of the odbc package is to provide a DBI-compliant interface to Open Database Connectivity (ODBC) drivers. This allows for an efficient, easy-to-set-up connection to any database with ODBC drivers available, including SQL Server, Oracle, MySQL, PostgreSQL, SQLite, and others. The implementation builds on the nanodbc C++ library.
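
As a minimal sketch of that DBI-compliant interface: the DSN name ("MyDataSource") and the table ("flights") below are placeholders, not anything defined by this page, and the SQL dialect depends on the backend.

library(DBI)

# Connect through an ODBC data source name; "MyDataSource" is a placeholder
# DSN that would have to be registered with your ODBC driver manager first.
con <- dbConnect(odbc::odbc(), dsn = "MyDataSource")

# The usual DBI verbs then work against whatever database the driver points at.
dbListTables(con)
result <- dbGetQuery(con, "SELECT * FROM flights LIMIT 10")

dbDisconnect(con)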

            kandi-support Support

              odbc has a low active ecosystem.
              It has 350 stars and 102 forks. There are 31 watchers for this library.
              It had no major release in the last 12 months.
              There are 71 open issues and 387 have been closed. On average, issues are closed in 805 days. There are 5 open pull requests and 0 closed pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of odbc is v1.3.4.

            kandi-Quality Quality

              odbc has 0 bugs and 0 code smells.

            kandi-Security Security

              odbc has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
              odbc code analysis shows 0 unresolved vulnerabilities.
              There are 0 security hotspots that need review.

            kandi-License License

              odbc has a Non-SPDX License.
              A Non-SPDX license may be an open-source license that is simply not SPDX-compliant, or it may not be an open-source license at all, so you need to review it closely before use.

            kandi-Reuse Reuse

              odbc releases are available to install and integrate.
              Installation instructions, examples and code snippets are available.


            odbc Key Features

            No Key Features are available at this moment for odbc.

            odbc Examples and Code Snippets

            No Code Snippets are available at this moment for odbc.

            Community Discussions

            QUESTION

            How do I install the ODBC driver for Snowflake successfully on an M1 Apple Silicon Mac?
            Asked 2022-Mar-30 at 15:00

            I'm having issues getting the ODBC driver for Snowflake to work on an M1 Apple Silicon Mac running Big Sur.

            Following the instructions on Snowflake's website gets me to the point where testing the driver from the command line (using iodbctest) with the DSN results in the following error:

            ...

            ANSWER

            Answered 2021-Sep-21 at 20:50

            Big Sur is macOS v11.n.

            Snowflake supports macOS 10.14 and 10.15 (see the Supported OSs page).

            So what you are trying to do is not supported and is unlikely to work.

            Source https://stackoverflow.com/questions/69275130

            QUESTION

            Download DML and DDL sql from Teradata using python
            Asked 2022-Mar-23 at 11:14

            What approach should I follow to download DDL, DML, and stored procedures from the Teradata database using Python?

            I have created the sample code, but what is the approach to download these SQL files for the data migration process?

            ...

            ANSWER

            Answered 2022-Mar-23 at 11:14

            Happy to share that I got the solution for this. To get the files in SQL format, use the given code to extract the DDL and DML.

            The given code is for the sample database dbc.

            Source https://stackoverflow.com/questions/70331340

            QUESTION

            MS Access / SQL Server - VBA: Upload local file to filestream on remote SQL server
            Asked 2022-Mar-13 at 23:33

            I need to upload a file (<10 MB) around once a week to a SQL Server 2016 database on a remote server in the same network. Until now it was all within an Access FE/BE, but I want to migrate to SQL Server as the backend.

            The attachments I had in MS Access now need to be handled on the SQL database, as I do not want to do this on a file share.

            I found many threads about using something like this from SQLShack

            ...

            ANSWER

            Answered 2022-Mar-13 at 23:33

            I may have found a solution using the links from @AlwaysLearning. The first sub actually answers my question about uploading a file to a remote FILESTREAM SQL Server. The second sub downloads all uploaded files into a given directory.

            Source https://stackoverflow.com/questions/71448275

            QUESTION

            Hadoop to SQL through SSIS Package: Data in incorrect format
            Asked 2022-Mar-13 at 20:05

            I am using an ODBC source connected to the Hadoop system to read the column PONum, which has the value 4400023488 and the datatype text stream [DT_TEXT]. The data is converted into a string [DT_WSTR] using a data conversion transformation and then inserted into SQL Server using an OLE DB Destination (the destination column's type is a Unicode string, DT_WSTR).

            I am able to insert the value into the SQL Server table, but in an incorrect format: 㐴〰㌵㠵㔹. The expected value is 4400023488.

            Any suggestions?

            ...

            ANSWER

            Answered 2022-Mar-13 at 20:04

            I have two suggestions:

            1. Instead of using a data conversion transformation, use a derived column that converts the DT_TEXT value to DT_STR before converting it to Unicode:

            Source https://stackoverflow.com/questions/71451745

            QUESTION

            python pyodbc: Closing the cursor before conn.commit()?
            Asked 2022-Mar-08 at 16:07

            I have an issue with inserting data into a database using the Python package pyodbc, and since I am pretty new to pyodbc and databases in general, I might lack some basic understanding.

            I open a connection, and then I want to execute my query. Actually, in this query I call a stored procedure (which I didn't write and am not allowed to change!). This procedure does "one or two" inserts. When I use pyodbc like this

            ...

            ANSWER

            Answered 2022-Mar-08 at 16:07

            You seem to have encountered a quirk in MySQL Connector/ODBC's handling of result sets from a stored procedure. For this example procedure:

            Source https://stackoverflow.com/questions/71392955

            QUESTION

            Oracle to Postgres Table load using SSIS load performance improvement suggestions?
            Asked 2022-Feb-20 at 09:22

            I'm trying to load a table from Oracle to Postgres using SSIS, with ~200 million records. Oracle, Postgres, and SSIS are on separate servers.

            Reading data from Oracle

            To read data from the Oracle database, I am using an OLE DB connection using "Oracle Provider for OLE DB". The OLE DB Source is configured to read data using an SQL Command.

            In total there are 44 columns, mostly varchar, 11 numeric, and 3 timestamps.

            Loading data into Postgres

            To load data into Postgres, I am using an ODBC connection. The ODBC destination component is configured to load data in batch mode (not row-by-row insertion).

            SSIS configuration

            I created an SSIS package that only contains a straightforward Data Flow Task.

            Issue

            The load seems to take many hours to reach even a million rows. The source query returns results quickly when executed in SQL Developer, but when I tried to export, it threw a limit-exceeded error.

            In SSIS, when I tried to preview the result of the source SQL command, it returned: The system cannot find message text for message number 0x80040e51 in the message file for OraOLEDB. (OraOLEDB)

            Note that the source (SQL command) and target table don't have any indexes.

            Could you please suggest any methods to improve the load performance?

            ...

            ANSWER

            Answered 2022-Feb-20 at 09:22

            I will try to give some tips to help you improve your package performance. You should start troubleshooting your package systematically to find the performance bottleneck.

            Some of the provided links relate to SQL Server. No worries! The same rules apply to all database management systems.

            1. Available resources

            First, you should ensure that you have sufficient resources to load the data from the source server into the destination server.

            Ensure that the available memory on the source, ETL, and destination servers can handle the amount of data you are trying to load. Besides, make sure that your network connection bandwidth is not decreasing the data transfer performance.

            Make sure that the following hardware issues are not occurring in any of the servers:

            1. Drive out of storage
            2. Server is out of memory

            Make sure that your machine is not running out of memory. You can simply use the Task Manager to identify the amount of available memory.

            2. The data source

            Make sure that the table is not a heap

            After checking the available resources, you should ensure that your data source is not a heap. At least it would be best if you had a primary key created on that table.

            Indexes

            If your SQL command contains any filtering, ordering, or joins, you should create the indexes needed by those operations.

            OLE DB provider

            Instead of using OLE DB source to connect to Oracle, try using the Microsoft Connector for Oracle (Previously known as Attunity connectors). Microsoft previously mentioned that it should provide faster performance than traditional OLE DB providers.

            Use the Oracle connection manager rather than OLE DB connection manager.

            To be honest, I am not sure if this component makes the data load faster, since I haven't tested it myself.

            Removing the destination and adding a dummy task

            The last thing to try is to remove the ODBC destination and add any dummy task. For example, use a row count component.

            Run the package; if the data is loaded faster, then loading data from Oracle is not decreasing the performance.

            3. SSIS configuration

            Now, let us start troubleshooting the SSIS package.

            Running in 64-bit mode

            First, try to execute the package in 64-bit mode. You can change this from the package configuration. Make sure that the Run64bitRuntime property is set to True.

            Data flow task buffer size/limits

            Using SSIS, data is loaded into memory while being transferred from source to destination. There are two properties of the data flow task that specify how much data is transferred in the memory buffers used by the SSIS pipeline.

            Based on the following Integration Services performance tuning white paper:

            DefaultMaxBufferRows – DefaultMaxBufferRows is a configurable setting of the SSIS Data Flow task that is automatically set at 10,000 records. SSIS multiplies the Estimated Row Size by the DefaultMaxBufferRows to get a rough sense of your dataset size per 10,000 records. You should not configure this setting without understanding how it relates to DefaultMaxBufferSize.

            DefaultMaxBufferSize – DefaultMaxBufferSize is another configurable setting of the SSIS Data Flow task. The DefaultMaxBufferSize is automatically set to 10 MB by default. As you configure this setting, keep in mind that its upper bound is constrained by an internal SSIS parameter called MaxBufferSize, which is set to 100 MB and cannot be changed.

            You should try to change those values and test your package performance each time you change them until the package performance increases.

            4. Destination

            Indexes/Triggers/Constraints

            You should make sure that the destination table does not have any constraints or triggers since they significantly decrease the data load performance; each batch inserted should be validated or preprocessed before storing it.

            Besides, the more indexes you have, the lower the data load performance.

            ODBC Destination custom properties

            The ODBC destination has several custom properties that can affect data load performance, such as BatchSize (rows per batch) and TransactionSize (which is only available in the advanced editor).

            Source https://stackoverflow.com/questions/71175983

            QUESTION

            select into temporary table
            Asked 2022-Feb-19 at 20:28

            I believe I should be able to do select * into #temptable from othertable (where #temptable does not previously exist), but it does not work. Assuming that othertable exists and has valid data, and that #sometemp does not exist,

            ...

            ANSWER

            Answered 2022-Feb-19 at 20:28

            There are a few different approaches:

            1. Use the immediate arg in your DBI::dbExecute call, as sketched below

            Source https://stackoverflow.com/questions/71188354
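
            The answer's own code is truncated above, but a rough sketch of that first approach, assuming an existing DBI/odbc connection con to SQL Server and a source table named othertable (both placeholders), might look like this.

            library(DBI)

            # immediate = TRUE executes the statement directly instead of preparing it
            # first, which is the workaround the answer points to for creating a local
            # temp table that remains visible on the connection.
            dbExecute(con, "SELECT * INTO #temptable FROM othertable", immediate = TRUE)

            # The temp table can then be queried on the same connection.
            dbGetQuery(con, "SELECT COUNT(*) AS n FROM #temptable")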

            QUESTION

            BIML: Issues about Datatype-Handling on ODBC-Source Columns with varchar > 255
            Asked 2022-Feb-18 at 07:48

            I'm just getting into BIML and have written some scripts to create a few DTSX packages. In general most things are working, but one thing is driving me crazy.

            I have an ODBC source (PostgreSQL), and I'm getting data out of a table using an ODBC Source component. The table has a text column (the column name is "description"). I cast this column to varchar(4000) in the query in the ODBC Source (I know there will be truncation, but that's OK). If I do this manually in Visual Studio, the Advanced Editor of the ODBC Source shows "Unicode string [DT_WSTR]" with a length of 4000 for both the external and the output column, so everything is fine there. But if I do the same thing with BIML and generate the SSIS package, the external column still says "Unicode string [DT_WSTR]" with a length of 4000, while the output column says "Unicode text stream [DT_NTEXT]". So the mapping done by BIML differs from the mapping done by SSIS (manually). This causes two warnings:

            1. A warning that the metadata has changed and should be synced
            2. A warning that the source uses LOB columns and is set to row-by-row fetch

            Both warnings are not nice, but the second one also causes a drastic degradation in performance! If I set the cast to varchar(255), the mapping is fine (the external and output columns are then "Unicode string [DT_WSTR]" with a length of 255). But as soon as I go higher, like varchar(256), it is again treated as [DT_NTEXT] in the output.

            Is there anything I can do about this? I invested days in evaluating BIML and find many things an increase in quality of life, but this issue is killing it. It defeats the purpose of BIML if I have to correct its errors manually after every build.

            Does anyone know how I can solve this issue? A correct automatic mapping between external and output columns would be great, but at least the option to define the mapping myself would be OK.

            Any Help is appreciated!

            Greetings Marco

            Edit: As requested, a minimal example for better understanding:

            • The column in the ODBC source (Postgres) has the type "text" (column name: description)
            • I select it in an ODBC Source with this query (DirectInput): SELECT description::varchar(4000) from mySourceTable
            • The ODBC Source in BIML looks like this: SELECT description::varchar(4000) from mySourceTable
            • If I now generate the dtsx package, the ODBC Source throws the above-mentioned warnings with the above-mentioned datatypes for the external and output columns
            ...

            ANSWER

            Answered 2022-Feb-18 at 07:48

            As mentioned in the comment above, I got an answer from another direction:

            You have to use DataflowOverrides in the ODBC Source in BIML. For my example, you have to do something like this:

            Source https://stackoverflow.com/questions/71162537

            QUESTION

            Avoid dynamic SQL generation when calling stored procedure on SQL server from R/DBI/ODBC
            Asked 2022-Feb-16 at 22:26

            Using the profiler on SQL Server to monitor a stored procedure call via DBI/odbc shows that dynamic SQL / a prepared statement is generated:

            ...

            ANSWER

            Answered 2022-Feb-16 at 22:26

            I found what I was looking for in the odbc package documentation: direct execution.

            The odbc package uses Prepared Statements to compile the query once and reuse it, allowing large or repeated queries to be more efficient. However, prepared statements can actually perform worse in some cases, such as many different small queries that are all only executed once. Because of this the odbc package now also supports direct queries by specifying immediate = TRUE.

            This will use a prepared statement:

            Source https://stackoverflow.com/questions/71052779
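
            The answer's code is truncated above, but a small sketch of the difference, assuming an existing DBI/odbc connection con to SQL Server and a stored procedure dbo.my_proc (both placeholders), might look like this.

            library(DBI)

            # Default behaviour: the query is prepared and then executed, which is what
            # shows up in the profiler as dynamic SQL / a prepared statement.
            dbGetQuery(con, "EXEC dbo.my_proc")

            # Direct execution: immediate = TRUE skips the prepare step, so the call
            # reaches the server as a plain batch.
            dbGetQuery(con, "EXEC dbo.my_proc", immediate = TRUE)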

            QUESTION

            Getting a warning when using a pyodbc Connection object with pandas
            Asked 2022-Feb-11 at 16:32

            I am trying to make sense of the following error that I started getting when I set up my Python code to run on a VM server, which has 3.9.5 installed instead of the 3.8.5 on my desktop. Not sure that matters, but it could be part of the reason.

            The error

            ...

            ANSWER

            Answered 2022-Feb-11 at 16:30

            Is pyodbc becoming deprecated?

            No. For at least the last couple of years pandas' documentation has clearly stated that it wants either a SQLAlchemy Connectable (i.e., an Engine or Connection object) or a SQLite DBAPI connection. (The switch-over to SQLAlchemy was almost universal, but they continued supporting SQLite connections for backwards compatibility.) People have been passing other DBAPI connections (like pyodbc Connection objects) for read operations and pandas hasn't complained … until now.

            Is there a better way to achieve similar results without warning?

            Yes. You can take your existing ODBC connection string and use it to create a SQLAlchemy Engine object as described here:

            Source https://stackoverflow.com/questions/71082494

            Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install odbc

            For Unix and macOS, ODBC drivers should be compiled against unixODBC. Drivers compiled against iODBC may also work, but are not fully supported. After installing the driver manager and driver, you will have to register the driver in an odbcinst.ini file for it to appear in odbc::odbcListDrivers(). odbc and its dependencies use C++11 features, so you need gcc 4.8, clang 3.3, or Rtools 3.3 or later.
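
            A short sketch of the corresponding R side is below; the driver name ("MariaDB") and all connection details are placeholders for whatever you registered in odbcinst.ini.

            # Install the package from CRAN.
            install.packages("odbc")

            # List the drivers known to the driver manager; the driver registered in
            # odbcinst.ini should appear here.
            odbc::odbcListDrivers()

            # Connect using the registered driver name rather than a DSN; every value
            # below is a placeholder.
            con <- DBI::dbConnect(
              odbc::odbc(),
              driver   = "MariaDB",
              server   = "localhost",
              database = "test",
              uid      = "user",
              pwd      = "password"
            )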

            Support

            For any new features, suggestions, or bugs, create an issue on GitHub. If you have questions, check and ask on the Stack Overflow community page.