Supported connectors

When you create and edit transform jobs with the DataStage® service, you can choose from many connectors to access data sources. To add a connector to a job, drag it from the palette onto the canvas. To configure a connector, click it to open the Details card on the Transform data page of DataStage.

Connectors

The following connectors are supported:
  • Amazon Redshift
  • Amazon S3
  • Apache Cassandra
  • Apache HBase
  • Apache Kafka
  • Azure Data Lake Storage
  • Azure Storage
  • Big Data File stage (BDFS)
  • BigQuery Connector
  • Classic Federation
  • Cloud Object Storage
  • Complex flat file
  • Data Set
  • Db2®
  • Db2 for z/OS
  • Db2 on Cloud
  • Db2 Warehouse
  • Distributed Transaction
  • DRS Connector
  • External source
  • External target
  • File
  • File Set
  • File System
  • FTP Enterprise
  • FTP (remote file system transfer)
  • Google Cloud Storage
  • Greenplum
  • Google BigQuery (BigQuery)
  • Hierarchical
    • JSON composer step
    • JSON parser step
    • REST step
    • Test assembly
    • Details inspector
    • Create / view contract libraries
    • Administration
  • Hive JDBC
  • Hive JDBC - CDH
  • Hive JDBC - HDP
  • Informix Enterprise
  • Informix Load
  • ISD Input
  • ISD Output
  • Java Integration
  • Generic JDBC
  • Lookup File Set
  • Microsoft Azure (Blob and File)
  • Netezza®
  • ODBC
  • Oracle
  • Pivotal Greenplum (Greenplum)
  • Salesforce.com
  • SAP OData
  • SAP Packs (requires license)
  • Sequential File
  • Snowflake
  • Sybase Enterprise
  • Sybase IQ Load
  • Sybase OC
  • Teradata
  • WebSphere® MQ
  • z/OS file