SAP IQ OCF Connector: Install and Configure¶
Network Connectivity¶
Open outbound TCP port 2638 to the SAP IQ server.
Service Account¶
Create a service account for Alation and grant it the permissions to perform metadata extraction, sampling and profiling, and query log ingestion.
Example:
CREATE USER alation IDENTIFIED BY 'password';
Permissions for Metadata Extraction¶
GRANT SELECT ON SYS.SYSUSER TO alation;
GRANT SELECT ON SYS.SYSTABLE TO alation;
GRANT SELECT ON SYS.SYSCOLUMNS TO alation;
GRANT SELECT ON SYS.SYSUSERPERMS TO alation;
GRANT SELECT ON SYS.SYSINDEX TO alation;
GRANT SELECT ON SYS.SYSINDEXES TO alation;
| Permission | Purpose |
|---|---|
| GRANT SELECT ON SYS.SYSUSER | Required for schema extraction |
| GRANT SELECT ON SYS.SYSTABLE | Required for table extraction |
| GRANT SELECT ON SYS.SYSCOLUMNS | Required for column extraction |
| GRANT SELECT ON SYS.SYSUSERPERMS | Required for view extraction |
| GRANT SELECT ON SYS.SYSINDEX | Required for index extraction |
| GRANT SELECT ON SYS.SYSINDEXES | Required for index, primary key, and foreign key extraction |
Permissions for Sampling and Profiling¶
Grant the service account the SELECT permission on all schemas and tables you want to sample in Alation. For views, grant SHOW VIEW on all views you want to sample.
Example:
GRANT SELECT ON [database].* TO alation;
GRANT SHOW VIEW ON [database].* TO alation;
Permissions for Query Log Ingestion¶
Grant the service account the SELECT permission on the table that stores query history. Refer to Query Log Ingestion below for detailed information on how to configure QLI.
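A minimal sketch of the grant, assuming the QLI table user_connections_event_table created in the Configure QLI in SAP IQ section below (adjust the schema and table name to your setup):

```sql
-- Illustrative: grant read access on the QLI history table to the service account
GRANT SELECT ON user_connections_event_table TO alation;
```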
JDBC URI¶
Format¶
Use the following format to build the JDBC URI:
sybase:Tds:<hostname_or_ip>:<port>?ServiceName=<database_name>
Example¶
sybase:Tds:10.13.9.72:2638?ServiceName=alation
Configuration in Alation¶
STEP 1: Install the Connector¶
Alation On-Prem¶
Important
Installation of OCF connectors requires Alation Connector Manager to be installed as a prerequisite.
To install an OCF connector:
If this has not been done on your instance, install the Alation Connector Manager: Install Alation Connector Manager.
Ensure that the OCF connector Zip file that you received from Alation is available on your local machine.
Install the connector on the Connectors Dashboard page using the steps in Manage Connectors.
Alation Cloud Service¶
Connecting to Cloud Databases¶
Note
On Alation Cloud Service instances, Alation Connector Manager is available by default.
To install an OCF connector:
Ensure that the OCF connector Zip file that you received from Alation is available on your local machine.
Install the connector on the Connectors Dashboard page using the steps in Manage Connectors.
Connecting to On-Prem Databases¶
Important
To connect to an on-prem database from an Alation Cloud Service instance, you may need to use Alation Agent.
Ensure that the Alation Agent functionality is enabled on your Alation instance. If it is not, create a Support ticket with Alation to have an Alation representative enable the Alation Agent feature on your instance and to receive the Alation Agent installer.
Install Alation Agent using the information in Alation Agent.
Install the OCF connector on Alation Agent.
STEP 2: Create and Configure a New Data Source¶
In Alation, add a new data source:
Log in to Alation as a Server Admin.
Expand the Apps menu on the right of the main toolbar and select Sources.
On the Sources page, click +Add on the top right of the page and in the list that opens, click Data Source. This will open the Add a Data Source wizard.
On the first screen of the wizard, specify a name for your data source, assign additional Data Source Admins, if necessary, and click the Continue Setup button on the bottom. The Add a Data Source screen will open.
On the Add a Data Source screen, the only field you should populate is Database Type. From the Database Type dropdown, select the connector name. After that you will be navigated to the Settings page of your new data source.
The name of this connector is SybaseIQ OCF Connector.
Access¶
On the Access tab, set the data source visibility using these options:
Public Data Source—The data source will be visible to all users of the catalog.
Private Data Source—The data source will be visible to the users allowed access to the data source by Data Source Admins.
You can add new Data Source Admin users in the Data Source Admins section.
General Settings¶
Application Settings¶
Specify Application Settings if applicable. Click Save to save the changes after providing the information.
| Parameter | Description |
|---|---|
| BI Connection Info | Used to generate lineage between a BI source and this data source. Specify the host and the port used by BI connections that pull data from this data source. You can provide multiple values as a comma-separated list. The host and port details can also be found under Properties > Database connections on the BI Datasource page > Connections tab. |
| Disable Automatic Lineage Generation | Select this checkbox to disable automatic lineage generation from QLI, MDE, and Compose queries. By default, automatic lineage generation is enabled. |
Connector Settings¶
Data Source Connection¶
Populate the data source connection information and save the values by clicking Save in this section.
| Parameter | Description |
|---|---|
| JDBC URI | Specify the JDBC URI in the required format. |
| Username | Specify the service account username. |
| Password | Specify the service account password. |
Logging Configuration¶
Select the logging level for the connector logs and save the values by clicking Save in this section. The available log levels are based on the Log4j framework.
| Parameter | Description |
|---|---|
| Log level | Select the log level to generate logs. The available options are INFO, DEBUG, WARN, TRACE, ERROR, FATAL, ALL. |
You can view the connector logs in Admin Settings > Manage Connectors > SybaseIQ OCF Connector.
Obfuscate Literals¶
Obfuscate Literals—Enable this toggle to hide actual values in the query statements that are ingested during query log ingestion or executed in Compose. This toggle is disabled by default.
Test Connection¶
Under Test Connection, click Test to validate network connectivity.
Metadata Extraction¶
You can configure metadata extraction (MDE) for an OCF data source on the Metadata Extraction tab of the Settings page. Refer to Configure Metadata Extraction for OCF Data Sources for information about the available configuration options.
The default queries that the connector uses to extract metadata can be found in Extraction Queries for SAP IQ. You can customize these queries to adjust the extraction to your needs.
Sampling and Profiling¶
Sampling and profiling is supported. For details, see Configure Sampling and Profiling for OCF Data Sources.
Compose¶
For details about configuring the Compose tab of the Settings, refer to Configure Compose for OCF Data Sources.
Query Log Ingestion¶
Configure QLI in SAP IQ¶
Step 1: Check Request Logging State¶
Begin with checking the state of the Request Logging and Request Log files. By default, the value of Request Logging is none and the value of Request Log is empty. Run the following query to check the current state:
SELECT property('RequestLogFile'), property('RequestLogging');
Step 2: Enable SQL Log Storage¶
Set Request Logging to SQL.
CALL sa_server_option('RequestLogging','SQL');
Execution of this call enables SQL log storage. Set the log file to sqllog.txt:

CALL sa_server_option('RequestLogFile', 'sqllog.txt');
Note
Setting the log file to sqllog.txt must be done every time the database server is restarted.
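Since the log file setting does not persist across server restarts, you can re-apply both server options together after each restart. A minimal sketch, assuming the same sqllog.txt file name:

```sql
-- Re-enable SQL request logging and point it at the log file after a restart
CALL sa_server_option('RequestLogging','SQL');
CALL sa_server_option('RequestLogFile', 'sqllog.txt');
```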
Step 3: Create a Table for Logging Events¶
Create a table to store query history in a schema of your choice. The example below creates the table user_connections_event_table.

SELECT * INTO user_connections_event_table FROM sp_iqconnection() WHERE 1=2;

Note

If the query returns an error, run the query SELECT * FROM user_connections_event_table to check whether the table already exists.
Step 4: Enable Query History¶
Use the following example queries to enable storage of query history and grant the service account permissions to select from the QLI table.
CREATE EVENT user_connections_event TYPE CONNECT
HANDLER
BEGIN
    INSERT INTO user_connections_event_table
    SELECT * FROM sp_iqconnection() a
    WHERE a.Name != 'user_connections_event'
      AND datediff(Second, now(), a.conncreatetime) < 1;
END;

GRANT SELECT ON user_connections_event_table TO alation;
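To confirm that the event is capturing connections, you can open a new session and then query the table. A quick sanity check (the rows returned depend on your connection activity):

```sql
-- Each new connection should add a row via the CONNECT event
SELECT COUNT(*) FROM user_connections_event_table;
```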
Configure QLI in Alation¶
You can configure QLI on the Query Log Ingestion tab of the data source Settings page. Select a QLI option for your data source and schedule the QLI job if necessary.
You can choose to create a table or a view on your database to store the query history data (table-based QLI) or to provide a custom query to retrieve query history every time you run QLI (custom query-based QLI).
Table-Based QLI¶
Use the query format below to create a view and flush the query history from the data source:
Substitute the placeholder view name schema.view with your actual schema and view names. This example selects from schema.user_connections_event_table; substitute it with the name of your QLI table.
CREATE VIEW schema.view AS
SELECT * FROM
(SELECT
u.Userid as username,
u.ConnCreateTime AS sessionStartTime,
c.start_time AS startTime,
trim(u.Userid) || '\/' || trim(CAST( DATEFORMAT( u.ConnCreateTime , 'yyyy-mm-ddHH:NN:SS.SS' ) AS CHAR(25)))
|| '\/' || trim(c.conn_id) AS sessionId,
c.req_id AS idx,
c.millisecs/1000.0 AS seconds,
c.stmt AS queryString,
c.rowcount AS num_result_rows,
'N' as cancelled,
DB_NAME() AS defaultDatabases,
RANK() over
(PARTITION BY c.req_id, c.start_time, c.stmt
ORDER BY u.ConnCreateTime desc) AS RANK
FROM satmp_request_time c
INNER JOIN
(SELECT DISTINCT
ConnCreateTime,
ConnHandle,
Userid
FROM schema.user_connections_event_table) u
ON c.conn_handle = u.ConnHandle
AND c.start_time > u.ConnCreateTime ) AS T
WHERE T.RANK = 1
AND username IS NOT NULL;
On the Query Log Ingestion tab under Connector Settings > Query Extraction, in the Table Name field specify the name of the view in which the query logs are available. Make sure that the service account has the permissions to access this view. The table name must be provided in the format schema.view.
Custom Query-Based QLI¶
If you cannot create a table or view on the database, you can use a custom query to perform QLI. In the Custom QLI Query field, provide a query with the structure shown below and click Save.

This example selects from schema.user_connections_event_table; substitute it with the name of your schema and QLI table.
SELECT * FROM
(SELECT
u.Userid as username,
u.ConnCreateTime AS sessionStartTime,
c.start_time AS startTime,
trim(u.Userid) || '\/' || trim(CAST( DATEFORMAT( u.ConnCreateTime , 'yyyy-mm-ddHH:NN:SS.SS' ) AS CHAR(25)))
|| '\/' || trim(c.conn_id) AS sessionId,
c.req_id AS idx,
c.millisecs/1000.0 AS seconds,
c.stmt AS queryString,
c.rowcount AS num_result_rows,
'N' AS cancelled,
DB_NAME() AS defaultDatabases,
RANK() over
(PARTITION BY c.req_id, c.start_time, c.stmt
ORDER BY u.ConnCreateTime desc) AS RANK
FROM satmp_request_time c
INNER JOIN
(SELECT DISTINCT
ConnCreateTime,
ConnHandle,
Userid
FROM schema.user_connections_event_table) u
ON c.conn_handle = u.ConnHandle AND c.start_time > u.ConnCreateTime) AS T
WHERE T.RANK = 1
AND username IS NOT NULL
AND startTime >= DATEFORMAT(STARTTIME, 'YYYY-MM-DD HH:mm:ss.SSS')
AND startTime < DATEFORMAT(ENDTIME, 'YYYY-MM-DD HH:mm:ss.SSS');
Automated and Manual QLI¶
You can either perform QLI manually on demand or enable automated QLI:
To perform manual QLI, under the Automated and Manual Query Log Ingestion section of the Query Log Ingestion tab, ensure that the Enable Automated Query Log Ingestion toggle is disabled.
Click Preview to get a sample of the query history data to be ingested.
Click the Import button to perform QLI on demand.
Note
Metadata extraction must be completed before you can run QLI.
To schedule QLI, enable the Enable Automated Query Log Ingestion toggle.
Set a schedule under Automated Query Log Ingestion Time by specifying values in the week, day and time fields. The next QLI job will run on the schedule you have specified.
Note
The hourly schedule for automated QLI is not supported.
Log Location¶
Refer to Troubleshooting for information about logs.