Teradata OCF Connector: Install and Configure¶
Prerequisites¶
Ports¶
Open outbound TCP port 1025 to the Teradata server.
Create Service Account¶
Sample SQL to create an account:
CREATE USER alation FROM [database] AS PASSWORD = [password] PERM = 20000000;
Permissions for Metadata Extraction¶
Ensure that the user alation has the following privileges on system tables:
GRANT SELECT ON dbc.dbase TO alation;
GRANT SELECT ON dbc.databasesv TO alation;
GRANT SELECT ON dbc.tablesv TO alation;
GRANT SELECT ON dbc.tabletextv TO alation;
GRANT SELECT ON dbc.tablesizev TO alation;
GRANT SELECT ON dbc.columnsv TO alation;
GRANT SELECT ON dbc.indicesv TO alation;
GRANT SELECT ON dbc.all_ri_parentsv to alation;
GRANT SELECT ON dbc.indexconstraintsv TO alation;
GRANT SELECT ON dbc.tvm TO alation;
GRANT SELECT ON dbc.tvfields TO alation;
If SELECT can only be granted on databasesv (dbase cannot be granted), enable the following feature flag using alation_conf in the Alation shell (refer to alation_conf):
alation_conf alation.feature_flags.teradata_use_databasesv -s True
Then restart celery:
alation_supervisor restart celery:celery-beat
Permissions for Sampling and Profiling¶
For sampling and profiling, grant the service account SELECT rights on all schemas and tables that you want to retrieve samples from, as in the sketch below.
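A minimal example, assuming a hypothetical database named sales_db:
GRANT SELECT ON sales_db TO alation; -- sales_db is a placeholder; repeat for each database to be sampled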
Access Rules¶
Make sure that the service account user has the following access rules granted:
GRANT SELECT ON dbc.rolemembersv TO alation;
GRANT SELECT ON dbc.allRoleRightsv TO alation;
GRANT SELECT ON dbc.databasesv TO alation;
GRANT SELECT ON dbc.allrightsv TO alation;
JDBC URI¶
Format¶
teradata://<hostname_or_ip>/DBS_PORT=<port>,COP=OFF,CHARSET=UTF8
No additional parameters are required in the URI string when using an LDAP account.
Example¶
teradata://10.13.75.49/DBS_PORT=1025,COP=OFF,CHARSET=UTF8
Installation¶
Step 1: Install the Connector¶
Alation On-Prem¶
Important
Installation of OCF connectors requires Alation Connector Manager to be installed as a prerequisite.
To install an OCF connector:
If this has not been done on your instance, install the Alation Connector Manager: Install Alation Connector Manager.
Ensure that the OCF connector Zip file that you received from Alation is available on your local machine.
Install the connector on the Connectors Dashboard page using the steps in Manage Connectors.
Alation Cloud Service¶
Connecting to Cloud Databases¶
Note
On Alation Cloud Service instances, Alation Connector Manager is available by default.
To install an OCF connector:
Ensure that the OCF connector Zip file that you received from Alation is available on your local machine.
Install the connector on the Connectors Dashboard page using the steps in Manage Connectors.
Connecting to On-Prem Databases¶
Important
To connect to an on-prem database from an Alation Cloud Service instance, you may need to use Alation Agent.
Ensure that the Alation Agent functionality is enabled on your Alation instance. Create a Support ticket with Alation for an Alation representative to enable the Alation Agent feature on your instance and to receive the Alation Agent installer.
Install Alation Agent using the information in Alation Agent.
Install the OCF connector on Alation Agent.
Step 2: Create and Configure a New Data Source¶
In Alation, add a new data source:
Log in to Alation as a Server Admin.
Expand the Apps menu on the right of the main toolbar and select Sources.
On the Sources page, click +Add on the top right of the page and in the list that opens, click Data Source. This will open the Add a Data Source wizard.
On the first screen of the wizard, specify a name for your data source, assign additional Data Source Admins if necessary, and click the Continue Setup button at the bottom of the screen. The Add a Data Source screen will open.
The only field that you need to populate on the Add a Data Source screen is Database Type. From the Database Type dropdown, select the connector name. You will be navigated to the Settings page of your new data source.
The name of this connector is Teradata OCF Connector.

Access¶
On the Access tab, set the data source visibility using these options:
Public Data Source—The data source will be visible to all users of the catalog.
Private Data Source—The data source will be visible to the users allowed access to the data source by Data Source Admins.
You can add new Data Source Admin users in the Data Source Admins section.
General Settings¶
Perform the configuration on the General Settings tab:
Application Settings¶
Specify Application Settings if applicable. Click Save to save the changes after providing the information.
Parameter | Description
---|---
BI Connection Info | This parameter is used to generate lineage between a BI source and this data source. Specify the host and the port used by BI connections that pull data from this data source, using the format host:port. You can provide multiple values as a comma-separated list, for example: host1:port1,host2:port2. The host and port details can also be found under Properties > Database connections on the BI Datasource page > Connections tab.
Disable Automatic Lineage Generation | Select this checkbox to disable automatic lineage generation from QLI, MDE, and Compose queries. By default, automatic lineage generation is enabled.
Connector Settings¶
Data Source Connection¶
Populate the data source connection information and save the values by clicking Save in this section.
Parameter | Description
---|---
JDBC URI | Specify the JDBC URI in the required format.
Use LDAP Server | Select the Use LDAP Server checkbox to authenticate with the LDAP protocol.
Username | Specify the service account username.
Password | Specify the service account password.
Logging Configuration¶
Select the logging level for the connector logs and save the values by clicking Save in this section. The available log levels are based on the Log4j framework.
Parameter | Description
---|---
Log level | Select the log level to generate logs. The available options are INFO, DEBUG, WARN, TRACE, ERROR, FATAL, and ALL.
Obfuscate Literals¶
Obfuscate Literals — Enable this toggle to hide literal values in queries shown on catalog pages for queries that are ingested via QLI or executed in Compose. This toggle is disabled by default.
Test Connection¶
Under Test Connection, click Test to validate network connectivity.
Metadata Extraction¶
You can configure metadata extraction (MDE) for an OCF data source on the Metadata Extraction tab of the Settings page. Refer to Configure Metadata Extraction for OCF Data Sources for information about the available configuration options.
Metadata Extraction Queries¶
To use query-based metadata extraction, you will need to write custom queries to extract the metadata. Alation expects that these queries conform to a specific structure and use some reserved identifiers. After providing custom queries, save them by clicking Save in this section.
Refer to Extraction Queries for Teradata for information about query customization.

Compose¶
For details about configuring the Compose tab of the Settings, refer to Configure Compose for OCF Data Sources.
Note
Compose is not available for on-prem databases connected to Alation Cloud Service instances through Alation Agent.
Sampling and Profiling¶
Sampling and profiling is supported. For details, see Configure Sampling and Profiling for OCF Data Sources.
Per-Object Parameters¶
Refer to Sampling and Profiling.
Query Log Ingestion¶
You can either create a table or view on your database to store the query history data, or provide a custom query to retrieve the query history data.
Table-Based QLI¶
Use the following query format to create a new view that flushes the query history from the data source.
CREATE VIEW <View_Name> AS
SELECT
s.ProcID,
s.CollectTimeStamp,
s.QueryID,
UserID,
AcctString,
ExpandAcctString,
SessionID,
LogicalHostID,
RequestNum,
InternalRequestNum,
LogonDateTime,
AcctStringTime,
AcctStringHour,
AcctStringDate,
LogonSource,
AppID,
ClientID,
ClientAddr,
QueryBand,
ProfileID,
StartTime,
FirstStepTime,
FirstRespTime,
LastStateChange,
NumSteps,
NumStepswPar,
MaxStepsInPar,
NumResultRows,
TotalIOCount,
AMPCPUTime,
ParserCPUTime,
UtilityByteCount,
UtilityRowCount,
ErrorCode,
ErrorText,
WarningOnly,
((firstresptime - starttime) hour(4) to second) (Named ElapsedTime),
DelayTime,
AbortFlag,
CacheFlag,
StatementType,
StatementGroup,
sqltextinfo AS QueryText,
NumOfActiveAMPs,
MaxAMPCPUTime,
MaxCPUAmpNumber,
MinAmpCPUTime,
MaxAmpIO,
MaxIOAmpNumber,
MinAmpIO,
SpoolUsage,
WDID,
OpEnvID,
SysConID,
LSN,
NoClassification,
WDOverride,
ResponseTimeMet,
ExceptionValue,
FinalWDID,
TDWMEstMaxRows,
TDWMEstLastRows,
TDWMEstTotalTime,
TDWMAllAmpFlag,
TDWMConfLevelUsed,
TDWMRuleID,
UserName,
DefaultDatabase,
AMPCPUTimeNorm,
ParserCPUTimeNorm,
MaxAMPCPUTimeNorm,
MaxCPUAmpNumberNorm,
MinAmpCPUTimeNorm,
EstResultRows,
EstProcTime,
EstMaxRowCount,
ProxyUser,
ProxyRole,
SessionTemporalQualifier,
CalendarName,
SessionWDID,
DataCollectAlg,
ParserExpReq,
CallNestingLevel,
NumRequestCtx,
KeepFlag,
QueryRedriven,
ReDriveKind,
CPUDecayLevel,
IODecayLevel,
TacticalCPUException,
TacticalIOException,
SeqRespTime,
ReqIOKB,
ReqPhysIO,
ReqPhysIOKB,
r.sqlrowno
-- <Query_Log_Table> and <SQL_Text_Table> are placeholders; on most systems these are
-- the DBQL tables, for example DBC.DBQLogTbl (alias s) and DBC.DBQLSqlTbl (alias r).
FROM <Schema_Name>.<Query_Log_Table> s,
<Schema_Name>.<SQL_Text_Table> r
WHERE r.queryid = s.queryid;
In the Table Name field, specify the name of the view in which the query logs are available. Make sure that the service account has the permissions to access this table or view. The table name must be provided in the following format:
schema.table_name or schema.view_name
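For example: dbc.qrylog_full (the view referenced in the custom QLI query example below).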
Custom Query-Based QLI¶
If you cannot create a table or view on the database, you can use a custom QLI query to perform QLI. In the Custom QLI Query field, provide a query with the structure shown below and click Save.
SELECT
ProfileId AS profileId,
TRIM(AbortFlag) AS cancelled,
NumOfActiveAmps AS numActiveNodes,
ClientAddr AS clientAddress,
EstProcTime AS estProcessingTime,
LogicalHostId AS logicalHostId,
MaxAmpCPUTime AS maxAmpCpuTime,
QueryID AS queryIndex,
UtilityByteCount AS bytesTransferred,
QueryBand AS queryBand,
RequestNum AS requestId,
(CASE ErrorCode WHEN 0 THEN NULL ELSE ErrorCode END) AS executionErrorCode,
Trim(UserName) || '/' || Trim(CAST(CAST(LogonDateTime AS FORMAT 'yyyy-mm-ddbhh:mi:ss.s(2)') AS CHAR(25))) || '/' || Trim(CAST(CAST(SessionId AS INTEGER) AS CHAR(10))) AS sessionId,
sqlrowno - 1 AS sequence,
(extract(hour from elapsedtime)*3600)+(extract(minute from elapsedtime)*60)+extract(second from elapsedtime) AS seconds,
SpoolUsage AS spoolUsage,
ClientId AS clientId,
startTime AS startTime,
LogonDateTime AS sessionStartTime,
UtilityRowCount AS rowsTransferred,
AMPCPUTime AS totalCpuTime,
ErrorText AS executionError,
Hashamp() + 1 AS numTotalNodes,
EstResultRows AS estResultRows,
TotalIOCount AS totalIoCount,
DefaultDatabase AS defaultDatabases,
NumResultRows AS numResultRows,
DelayTime AS delayTime,
querytext AS queryString,
MaxAMPIO AS maxAmpIo,
AppId AS appId,
LogonSource AS logonSource,
trim(userName) AS userName
FROM
DBC.QRYLOG_FULL
WHERE
querytext is not null and AMPCPUTime > 0
AND LogonDateTime >= STARTTIME AND LogonDateTime < ENDTIME
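STARTTIME and ENDTIME in the WHERE clause are reserved identifiers: at run time, Alation replaces them with the start and end timestamps of the ingestion window.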
Automated and Manual Query Log Ingestion¶
You can either perform QLI manually on demand or enable automated QLI:
To perform manual QLI, under the Automated and Manual Query Log Ingestion section of the Query Log Ingestion tab, ensure that the Enable Automated Query Log Ingestion toggle is disabled.
Click Preview to get a sample of the query history data to be ingested.
Click the Import button to perform QLI on demand.
Note
Metadata extraction must be completed before running QLI.
To schedule QLI, enable the Enable Automated Query Log Ingestion toggle.
Set a schedule under Automated Query Log Ingestion Time by specifying values in the week, day, and time fields. The next QLI job will run on the schedule you have specified.
Note
Hourly schedules for automated QLI are not supported.
Custom Settings¶
This configuration option is available if Profiling V2 is enabled.
To profile a column, Alation runs a default query if no custom query is specified for this column. Default queries for column profiling differ depending on the data type: numeric or non-numeric.
You can modify the default query supplied by Alation based on your requirements and data source type. To revert from a custom query to the default query, use the following default query templates:
Numeric columns:
SELECT MIN({column_name}) AS MIN, MAX({column_name}) AS MAX, AVG({column_name}) AS MEAN, (COUNT(*) - COUNT({column_name})) AS "#NULL", (CASE WHEN COUNT(*) > 0 THEN ((COUNT(*) - COUNT({column_name})) * 100.0 / COUNT(*)) ELSE 0.0 END) AS "%NULL" FROM {schema_name}.{table_name};
Non-numeric columns:
SELECT ((SUM(CASE WHEN {column_name} IS NULL THEN 1 ELSE 0 END))) AS "#NULL", (CASE WHEN COUNT(*) > 0 THEN ((((SUM(CASE WHEN {column_name} IS NULL THEN 1 ELSE 0 END))) * 100.0 / count(*)) ) ELSE 0.0 END ) AS "%NULL" FROM {schema_name}.{table_name};
Important
The profiling queries MUST contain the {column_name}, {schema_name}, and {table_name} placeholders. Check the compatibility of the default query with your data source type and modify it if required.
The default profiling query calculates the profiling stats that are displayed on the Overview tab of the Column catalog page. When you customize the query, you can also customize the statistics that are calculated and displayed, as in the sketch below.
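A minimal sketch of a customized query for numeric columns that adds a standard deviation statistic; STDDEV_SAMP is a standard Teradata aggregate, and the STDDEV output alias is an assumption:
SELECT
MIN({column_name}) AS MIN,
MAX({column_name}) AS MAX,
AVG({column_name}) AS MEAN,
STDDEV_SAMP({column_name}) AS "STDDEV", -- added statistic; the alias name is an assumption
(COUNT(*) - COUNT({column_name})) AS "#NULL",
(CASE WHEN COUNT(*) > 0 THEN ((COUNT(*) - COUNT({column_name})) * 100.0 / COUNT(*)) ELSE 0.0 END) AS "%NULL"
FROM {schema_name}.{table_name};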
Troubleshooting¶
Refer to Troubleshooting.