Microsoft SQL Server connection

Create a connection asset for Microsoft SQL Server.

Microsoft SQL Server is a relational database management system.

Supported versions

  • Microsoft SQL Server 2000+
  • Microsoft SQL Server 2000 Desktop Engine (MSDE 2000)
  • Microsoft SQL Server 7.0

Create a connection to Microsoft SQL Server

To create the connection asset, specify these connection details:

  • Database name: You don't have to specify the database. With no database specified, you can import metadata from every database that is available for that connection.

  • Hostname or IP address

  • Port number or Instance name: If the server is configured for dynamic ports, use the Instance name.

  • Username and password

  • Domain name: If the Microsoft SQL Server has been set up in a domain that uses NTLM (New Technology LAN Manager) authentication, select Use Active Directory and enter the name of the domain that is associated with the username and password.

  • SSL certificate: If required by the database server.

To use Private connectivity to connect to a database that is not externalized to the internet (for example, a database behind a firewall), you must set up a secure connection.
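
For illustration, the following sketch shows how the connection details above might map to a client connection string when you connect from your own code with the open-source pyodbc driver and the Microsoft ODBC Driver 18 for SQL Server. All names and values are placeholders, and this is not the form that the platform connection asset itself uses.

    import pyodbc

    # All values below are placeholders for the details listed above.
    conn_str = (
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=myhost.example.com,1433;"   # hostname or IP address, port number
        "DATABASE=myDB;"                    # optional; omit to work with all databases
        "UID=myUser;PWD=myPassword;"
        "Encrypt=yes;"                      # TLS; the server certificate must be trusted
    )
    # For an instance name instead of a port (dynamic ports), use for example:
    # "SERVER=myhost.example.com\\SQLEXPRESS;"
    conn = pyodbc.connect(conn_str)
    try:
        print(conn.cursor().execute("SELECT @@VERSION").fetchone()[0])
    finally:
        conn.close()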

Choose the method for creating a connection based on where you are in the platform

In a project
Click Assets > New asset > Connect to a data source. See Adding a connection to a project.
In a catalog
Click Add to catalog > Connection. See Adding a connection asset to a catalog.
In a deployment space
Click Import assets > Data access > Connection. See Adding data assets to a deployment space.
In the Platform assets catalog
Click New connection. See Adding platform connections.

Next step: Add data assets from the connection

Where you can use this connection

You can use Microsoft SQL Server connections in the following workspaces and tools:

Projects

  • Cognos Dashboards (Cognos Dashboard Embedded service)
  • Data quality rules (IBM Knowledge Catalog)
  • Data Refinery (Watson Studio or IBM Knowledge Catalog)
  • DataStage (DataStage service). For more information, see Connecting to a data source in DataStage.
  • Decision Optimization (Watson Studio and Watson Machine Learning)
  • Metadata enrichment (IBM Knowledge Catalog)
  • Metadata import (IBM Knowledge Catalog)
  • Notebooks (Watson Studio). Click Read data on the Code snippets pane to get the connection credentials and load the data into a data structure. For more information, see Load data from data source connections. (See the sketch after this list for an illustrative example.)
  • SPSS Modeler (Watson Studio)
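
As an illustration of the Notebooks item in the preceding list, the following sketch loads a table into a pandas DataFrame over a pyodbc connection. The connection values and table name are placeholders; the code snippet that the platform generates may differ.

    import pandas as pd
    import pyodbc

    # Placeholders: connection string and table name are hypothetical.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=myhost.example.com,1433;DATABASE=myDB;"
        "UID=myUser;PWD=myPassword;Encrypt=yes;"
    )
    df = pd.read_sql("SELECT TOP (100) * FROM dbo.MyTable", conn)  # load into a DataFrame
    print(df.shape)
    conn.close()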

Catalogs

  • Platform assets catalog

  • Other catalogs (IBM Knowledge Catalog)

Data lineage

  • Metadata import (lineage) (IBM Knowledge Catalog and Manta Data Lineage)

Watson Query service
You can connect to this data source from Watson Query.

Microsoft SQL Server setup

Microsoft SQL Server installation

Restriction

Except for NTLM authentication, Windows Authentication is not supported.

Running SQL statements

To ensure that your SQL statements run correctly, refer to the Transact-SQL Reference for the correct syntax.
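
For example, Transact-SQL uses TOP (n) rather than LIMIT n and GETDATE() rather than NOW(). The following sketch runs one such statement through a pyodbc cursor, assuming a connection object like the placeholder one shown earlier.

    # Transact-SQL syntax example: TOP (n) instead of LIMIT, GETDATE() for the
    # current timestamp. Reuses the placeholder pyodbc connection from above.
    sql = """
        SELECT TOP (10) name, create_date
        FROM sys.tables
        WHERE create_date <= GETDATE()
        ORDER BY create_date DESC;
    """
    for row in conn.cursor().execute(sql):
        print(row.name, row.create_date)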

Configuring lineage metadata import for Microsoft SQL Server

When you create a metadata import for the Microsoft SQL Server connection, you can set options specific to this data source, and define the scope of data for which lineage is generated. For details about metadata import, see Designing metadata imports.

Scope of lineage metadata import

Include and exclude lists
You can include or exclude assets up to the schema level. Provide databases and schemas in the format database/schema. Each part is evaluated as a regular expression. Assets that are added to the data source later are also included or excluded if they match the conditions that are specified in the lists. Example values:
  • myDB/: all schemas in myDB database.
  • myDB2/.*: all schemas in myDB2 database.
  • myDB3/mySchema1: mySchema1 schema from myDB3 database.
  • myDB4/mySchema[1-5]: any schema in the myDB4 database with a name that starts with mySchema and ends with a digit between 1 and 5.
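
Because each part of an entry is evaluated as a regular expression, you can reason about a pattern such as the last example with ordinary regex matching. The following sketch is purely illustrative, with made-up schema names; the exact anchoring that the import applies is an assumption here.

    import re

    # Illustration only: each part of "myDB4/mySchema[1-5]" is treated as a
    # regular expression. Prefix anchoring is an assumption in this sketch.
    db_pattern, schema_pattern = "myDB4", r"mySchema[1-5]"
    for schema in ["mySchema1", "mySchema5", "mySchema9", "otherSchema"]:
        included = re.match(schema_pattern, schema) is not None
        print(f"{db_pattern}/{schema}: {'matched' if included else 'not matched'}")
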
External inputs
If you use external Microsoft SQL Server SQL scripts and T-SQL scripts that are not extracted directly from the connected server, you can add them in a ZIP file as an external input. You can organize the ZIP file into subfolders that represent databases and schemas. After the scripts are scanned, they are added under the respective databases and schemas in the selected catalog or project. The ZIP file can have the following structure:
    <database_name>
        <schema_name>
           <script_name.sql>
    <database_name>
        <script_name.sql>
    <script_name.sql>
    replace.csv
    linkedServerConnectionsConfiguration.prm

The replace.csv file contains placeholder replacements for the scripts that are added in the ZIP file. For more information about the format, see Placeholder replacements.

The linkedServerConnectionsConfiguration.prm file contains linked server connection definitions. The following structure defines a single connection:

[{Shortcut_Name}]
Type={connection_type}
Connection_String={connection_string}
Server_Name={server_name}
Database_Name={database_name}
Schema_Name={schema_name}
User_Name={user_name}
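
As an illustration of this layout, the following sketch packages two hypothetical scripts into a ZIP archive with database and schema subfolders. The names and script contents are placeholders, and the optional replace.csv and linkedServerConnectionsConfiguration.prm files would be added at the root of the archive in the same way.

    import zipfile

    # Sketch: package external scripts as <database>/<schema>/<script>.sql entries.
    # File names and SQL contents are hypothetical placeholders.
    with zipfile.ZipFile("external_scripts.zip", "w") as zf:
        zf.writestr("salesDB/dbo/load_orders.sql",
                    "INSERT INTO dbo.Orders SELECT * FROM staging.Orders;")
        zf.writestr("salesDB/create_views.sql",   # a script at the database level
                    "CREATE VIEW dbo.OpenOrders AS "
                    "SELECT * FROM dbo.Orders WHERE status = 'OPEN';")
        # Optional root-level entries (contents not shown here):
        # zf.writestr("replace.csv", "...")
        # zf.writestr("linkedServerConnectionsConfiguration.prm", "...")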

Advanced metadata import options

Extract extended attributes
You can extract extended attributes of columns, such as primary key, unique, and referential integrity constraints. By default, these attributes are not extracted.
Extraction mode
You can decide which extraction mode to run for the imported metadata. You have the following options:
  • Prefetch: use it for relational databases.
  • Parallel bulk: use it for analytical processing engines.
  • Single-thread: use it to avoid parallelism and large queries during extraction. When you select this mode, performance might be low.
Transformation logic extraction
You can enable building transformation logic descriptions from SQL code in SQL scripts.

Learn more

Parent topic: Supported connections
