As of 1 July 2020, Qlik Core is no longer available to new customers. No further maintenance will be done in this repository.
This connector exemplifies how a JDBC gRPC connector can be written. It contains examples that include a MySQL database, a PostgreSQL database, a QIX Engine, and the JDBC gRPC connector.
Go to the examples folder and run the following:
ACCEPT_EULA=<yes/no> docker-compose up --build -d
Then follow the instructions for either corectl or node.
Head into /examples/node, install the dependencies, and then run the index.js script using the following commands:
npm install
npm start
To run integration tests you can use:
npm test
To run the corectl example, head into /examples/corectl. If you do not yet have corectl installed, just follow the corectl download instructions.
Once installed, you can build against either the MySQL or PostgreSQL database with the following commands, respectively:
corectl build --script mysql.qvs
corectl build --script postgres.qvs
Take a peek at corectl.yml to see how the connections are set up for corectl. To view the tables you can then simply type:
corectl get tables
The performance of the JDBC gRPC connector can be tuned with a few environment variables.
You can use the DATABASE_FETCH_SIZE environment variable to limit memory consumption in the connector when fetching data from the database. DATABASE_FETCH_SIZE sets the number of rows that are fetched from the database and loaded into memory in each batch. The default DATABASE_FETCH_SIZE is 100000. If DATABASE_FETCH_SIZE is not set, the entire result of the database query is loaded into the connector's memory.
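As a rough sketch of what this setting controls, the example below applies a fetch size through the standard JDBC Statement.setFetchSize hint; the connection URL, credentials, and query are placeholders and are not taken from this repository.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Hypothetical illustration of how DATABASE_FETCH_SIZE could map onto the
// standard JDBC fetch-size hint. URL, credentials, and query are placeholders.
public class FetchSizeSketch {
    public static void main(String[] args) throws Exception {
        // Read the environment variable, falling back to the documented default.
        String raw = System.getenv("DATABASE_FETCH_SIZE");
        int fetchSize = (raw != null) ? Integer.parseInt(raw) : 100_000;

        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/testdata", "user", "password");
             Statement stmt = conn.createStatement()) {
            // Hint to the driver to fetch rows from the server in batches of
            // fetchSize instead of materializing the whole result set at once.
            stmt.setFetchSize(fetchSize);
            try (ResultSet rs = stmt.executeQuery("SELECT * FROM airports")) {
                while (rs.next()) {
                    // Forward each row to the gRPC data stream (omitted here).
                }
            }
        }
    }
}
```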
You can use the MAX_DATA_CHUNK_SIZE environment variable to tweak the size of the data chunks sent over gRPC to the QIX Engine. MAX_DATA_CHUNK_SIZE represents how many fields can be batched together in one package. This setting is highly dependent on the content of the fields, and the package should be kept below the default 4 MB gRPC package size limit. The default MAX_DATA_CHUNK_SIZE is 300.
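To make the field-count semantics concrete, here is a minimal sketch, assuming a simple accumulate-and-flush strategy, of how rows could be grouped into chunks of at most MAX_DATA_CHUNK_SIZE field values before being handed to the gRPC writer; it is not the connector's actual implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical batching logic: flush a chunk once it holds roughly
// MAX_DATA_CHUNK_SIZE field values, keeping each gRPC message well below
// the default 4 MB message size limit.
public class ChunkingSketch {
    private static final int MAX_DATA_CHUNK_SIZE =
            Integer.parseInt(System.getenv().getOrDefault("MAX_DATA_CHUNK_SIZE", "300"));

    private final List<Object[]> chunk = new ArrayList<>();
    private int fieldsInChunk = 0;

    void addRow(Object[] row) {
        chunk.add(row);
        fieldsInChunk += row.length;
        if (fieldsInChunk >= MAX_DATA_CHUNK_SIZE) {
            flush();
        }
    }

    void flush() {
        if (chunk.isEmpty()) {
            return;
        }
        // sendChunkOverGrpc(chunk); // placeholder for the actual gRPC call
        chunk.clear();
        fieldsInChunk = 0;
    }
}
```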
In the examples, these settings can be changed in the docker-compose.yml file.
Building the connector requires:
- Java JDK 8.0
- Maven 3.3.9
Build and run the connector with:
mvn install
java -jar ./target/core-grpc-jdbc-connector.jar
Other JDBC Drivers can be added to the pom.xml file in the following section:
<dependencies>
  <dependency>
    <groupId>org.postgresql</groupId>
    <artifactId>postgresql</artifactId>
    <version>42.2.2</version>
  </dependency>
</dependencies>
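Once a driver is declared in pom.xml and ends up on the classpath, a connection can typically be obtained through the standard JDBC DriverManager. The sketch below only illustrates that mechanism; the URL and credentials are placeholders, not values used by this repository.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch of using a driver pulled in via pom.xml. JDBC 4 drivers
// register themselves automatically, so no explicit Class.forName is needed.
public class DriverSketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/postgres", "postgres", "postgres");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1)); // prints 1 if the connection works
            }
        }
    }
}
```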
Make sure you start your Qlik Associative Engine with the proper gRPC connector string to enable your JDBC driver. See an example here.
The AWS Athena driver is not officially deployed on a Maven repository, so you have to download the jar file and place it in the connector project manually.
You can download the driver here.
Then add the following pom.xml entry:
<dependency>
  <groupId>com.amazonaws.athena.jdbc</groupId>
  <artifactId>jdbcdriver</artifactId>
  <version>2.0.5</version>
</dependency>
Put the following lines in your Dockerfile before the RUN mvn install command:
COPY AthenaJDBC42_2.0.5.jar /usr/src/app
RUN mvn install:install-file -Dfile=/usr/src/app/AthenaJDBC42_2.0.5.jar -DgroupId=com.amazonaws.athena.jdbc -DartifactId=jdbcdriver -Dversion=2.0.5 -Dpackaging=jar
Connection string:
{
  qType: 'jdbc',
  qName: 'jdbc',
  qConnectionString: 'CUSTOM CONNECT TO "provider=jdbc;driver=awsathena;AwsRegion=eu-central-1;S3OutputLocation=s3://aws-athena-query-results-athenatest1-eu-central-1"',
  qUserName: 'AWS Key',
  qPassword: 'AWS Token',
}
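As a hedged illustration of how the semicolon-separated key=value parameters in such a CUSTOM CONNECT TO string might be split apart, here is a small parsing sketch; it is not necessarily how this connector handles the string.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical parsing of the semicolon-separated key=value pairs found in
// the CUSTOM CONNECT TO string, e.g. "provider=jdbc;driver=awsathena;...".
public class ConnectionStringSketch {
    static Map<String, String> parse(String connectionString) {
        Map<String, String> params = new HashMap<>();
        for (String pair : connectionString.split(";")) {
            String[] kv = pair.split("=", 2);
            if (kv.length == 2) {
                params.put(kv[0].trim(), kv[1].trim());
            }
        }
        return params;
    }

    public static void main(String[] args) {
        Map<String, String> p = parse(
                "provider=jdbc;driver=awsathena;AwsRegion=eu-central-1;"
              + "S3OutputLocation=s3://aws-athena-query-results-athenatest1-eu-central-1");
        System.out.println(p.get("driver"));    // awsathena
        System.out.println(p.get("AwsRegion")); // eu-central-1
    }
}
```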
LOAD statement:
sql SELECT * FROM yourathenadatabase.yourathenatable;
This repository is licensed under MIT but components used in the Dockerfile examples are under other licenses. Make sure that you are complying with those licenses when using the built images.