IBM Security Discover and Classify-V4.0.0-Integrations-Guardium Data Protection Integration Admin Guide
GUARDIUM DATA PROTECTION INTEGRATION ADMIN GUIDE
VERSION 3.10.0
TABLE OF CONTENTS
IBM Security Guardium Data Protection Integration
Chapter 1: Guardium Data Protection Installation
  Prerequisites
  Installation Instructions
Chapter 2: Guardium Data Protection Configuration
Chapter 4: Guardium Data Protection Usage
  Guardium Data Protection Data Source Sync
    Prerequisites
    Workflow
  Guardium Groups and Policies Sync
    Prerequisites
    Workflow
  Reporting
    Prerequisites
    Workflow
  API endpoints usage
    Get import_repositories_data Task Status
    Import Data Sources Data to ISDC
    Get Data Sources Data from Database
3.2. Add a new route under the services section. Pay attention to spacing: Helm accepts only spaces, so replace any tabs with spaces:
- name: guardium-data-protection
  url: http://guardium-data-protection-app:8000
  plugins:
  routes:
    - name: guardium-data-protection
      paths:
        - /integration/guardium-data-protection/*
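Once the gateway configuration is applied, the new route can be checked by requesting the integration's API documentation page through it (the same URL is referenced later in this guide). This quick check assumes the CM hostname resolves and that a self-signed certificate may be in use:
# Verify the route responds through the CM gateway (path from the route above).
curl -k 'https://<cm_hostname>/integration/guardium-data-protection/docs'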
INSTALLATION INSTRUCTIONS
1. For an ISO deployment, copy the installation tar into the /home/admin directory or download it from the storage.
cp /opt/install/guardium-data-protection-0.0.1.tgz /home/admin/
cd guardium-data-protection
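The cd command above assumes the archive has already been unpacked into a guardium-data-protection directory. If it has not, a minimal extraction step (not part of the original numbered instructions, shown here for completeness) would be:
# Unpack the chart archive copied in step 1, then change into the resulting directory.
cd /home/admin
tar -xzf guardium-data-protection-0.0.1.tgz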
4. Create a values file or edit values-test.yaml and add/edit the content below:
guardium-data-protection-app:
  global:
    imageRegistry: "repo-docker.1touch.io"
    appType: "subscription-plane-plane"
    imagePullSecrets: [cm-docker-registry, control-plane-docker-registry]
    kafka:
      connectionCm: "cm-kafka-configuration"
      credsSecretName: "kafka-cm-user"
      tlsSecretName: "cm-cluster-ca-cert"
    postgres:
      connectionCm: "cm-postgres-configuration"
      rootCredsSecret: "cm-postgres"
  image:
    repository: "integrations/guardium-data-protection"
    tag: "0.0.2-18-g55994c17-mr"
    pullPolicy: IfNotPresent
  initContainers:
    - name: init-db
      image:
        repository: "integrations/guardium-data-protection-init"
        tag: "0.0.2-18-g55994c17-mr"
        pullPolicy: IfNotPresent
  env:
    INVENTA_IP: <isdc_hostname>
    INVENTA_LOGIN: <username>
    INVENTA_PASSWORD: <password>
    ITERATION: 1
    REDIS_URL: "redis://int-redis-master.<k8s-namespace>.svc.cluster.local:6379"
    POSTGRES_SCHEMA: guardium_data_protection
    POSTGRES_DB: guardium_data_protection
    POSTGRES_DB_NAME: guardium_data_protection
    POSTGRES_POOL_MAX_CONN: 3
    TASK_ENABLE: True
    TASK_DELAY_IN_SECONDS: 3600
    GUARDIUM_HOST: "10.192.191.43"
    GUARDIUM_PORT: 8443
    GUARDIUM_CLI_PASSWORD: guardium
    GUARDIUM_CLIENT_ID: integration
    GUARDIUM_CLIENT_SECRET: "06fae4c7-7b47-487d-b7c1-2c5e46d7d705"
    GUARDIUM_USERNAME: admin
    GUARDIUM_PASSWORD: guardium
    GUARDIUM_SCHEME: "https"
    APPLIANCEID: "7708f98b-77ad-437c-a832-492cbce74819"
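With the values file prepared, the chart can be deployed. The release name, chart path, and namespace in the command below are assumptions for illustration; substitute the values used in your deployment:
# Release name and chart path are examples; adjust to your environment.
helm upgrade --install guardium-data-protection ./guardium-data-protection -f values-test.yaml -n <k8s-namespace>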
https://<cm_hostname>/integration/guardium-data-protection/docs
Endpoint:
/connection_statuses
Sample response:
{
  "database_connection_status": "Connected"
}
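The same check can be run from the command line. The call below assumes the gateway route configured during installation and that this endpoint does not require additional authorization:
# Add -H 'Authorization: Bearer <jwt_token>' if your deployment enforces a token.
curl -k -X 'GET' \
'https://<cm_hostname>/integration/guardium-data-protection/connection_statuses' \
-H 'accept: application/json'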
9. Run the endpoint below and confirm that repositories from GDP come through into ISDC:
Endpoint:
/get_repositories_data
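Through the CM gateway, the call could look like the following sketch (the Bearer token requirement is an assumption based on the API examples later in this guide):
curl -k -X 'GET' \
'https://<cm_hostname>/integration/guardium-data-protection/get_repositories_data' \
-H 'accept: application/json' \
-H 'Authorization: Bearer <jwt_token>'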
ISDC configuration
SFTP/SMB configuration
Guardium configuration
GUARDIUM_URL=guardium_url
GUARDIUM_CLIENT_ID=guardium_client_id
GUARDIUM_CLIENT_SECRET=guardium_client_secret
GUARDIUM_USERNAME=guardium_user
GUARDIUM_PASSWORD=guardium_pass
REDIS_URL=redis://127.0.0.1:6379
REDIS_POOL_MAX_CONN=10
WR_REDIS_QUEUE_DB=1
WR_REDIS_TIMEOUT=5
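As an illustration, a filled-in Guardium configuration section could look like the example below. The values reuse the sample credentials from the values file earlier in this guide and are placeholders only; the URL format (scheme://host:port) is assumed from the GUARDIUM_SCHEME, GUARDIUM_HOST, and GUARDIUM_PORT settings shown there:
# Placeholder values; replace with the details of your own Guardium instance.
GUARDIUM_URL=https://10.192.191.43:8443
GUARDIUM_CLIENT_ID=integration
GUARDIUM_CLIENT_SECRET=06fae4c7-7b47-487d-b7c1-2c5e46d7d705
GUARDIUM_USERNAME=admin
GUARDIUM_PASSWORD=guardium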
PREREQUISITES
The TASK_ENABLED property is set to True.
WORKFLOW
The process is automated. It is triggered on app startup and resyncs once an hour by default, as controlled by the TASK_DELAY_IN_HOURS parameter.
The process can also be triggered manually using the Integration API (a sketch follows below). It checks the data sources supported by ISDC and imports them into ISDC via the Kafka API connection. Credentials are not imported, meaning that after a data source is created, its credentials have to be set separately.
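A manual trigger corresponds to the POST /import_repositories_data call documented in the API endpoints usage section; against a local Integration API instance it could look like this (host, port, and token are placeholders):
curl -X 'POST' \
'http://127.0.0.1:5005/import_repositories_data' \
-H 'accept: application/json' \
-H 'Authorization: Bearer <jwt_token>'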
PREREQUISITES
The properties listed below control the groups created based on ISDC discovery results. If a group with that name already exists, it is updated; otherwise, it is created.
DB_VENDOR_GROUP_NAME=Inventa_db_vendor
HOST_GROUP_NAME=Inventa_hosts
DB_GROUP_NAME=Inventa_db_name
TABLENAME_GROUP_NAME=Inventa_db_tablename
WORKFLOW
The process is automated. It is triggered on app startup and resyncs once an hour by default, as controlled by the TASK_DELAY_IN_HOURS parameter. In future versions, it will be controlled via the API.
Policies are installed based on the policy templates from Guardium Data Protection. The integration automatically assigns the required groups to those policies, which ensures that the data imported into the ISDC-related groups is monitored.
REPORTING
PREREQUISITES
Table 1: Prerequisites for security dashboard

PREREQUISITE: GDP instance up and running
DESCRIPTION: A preconfigured OAuth2 app in the Guardium CLI. See the instructions in the Guardium documentation.
Parameters to be shared with 1touch for integration:
• clientID
• clientSecret
• username/password of a user with admin permissions (Guardium Data Protection web UI)

PREREQUISITE: Integration with ISDC
DESCRIPTION: Connecting GDP to ISDC requires setting up an extension, which is a set of Python containers orchestrated by docker-compose. The extension takes one config file as input. The config file contains mostly predefined values, but the user has to specify Guardium connection details (OAuth2 details, credentials), ISDC Kafka configuration (host, port, security protocol, etc.), and app configs (schedule, etc.).

PREREQUISITE: Enabled and configured reporting functionality
DESCRIPTION: The report is an XML file generated manually in the GDP Security Assessment functionality. You can configure an assessment that contains many or all of the CVEs GDP knows about. After running it, a user can view the results, export them as an XML file, and upload the XML into the Integration Extension API. This generates a DPS topic message containing the aggregated assessment results and all the information on the data source, including whether there are policies or S-TAP associated with it.
To transfer information on data sources from GDP to ISDC:
1. On the GDP side, configure and export an XML report on data sources.
2. In the Integration API UI, upload the XML report and execute POST /parse_xml_report (see the sketch after this table).

PREREQUISITE: Data source analysis by ISDC
DESCRIPTION: Run the data source analysis in the ISDC Data Source Catalog (CM > Inventory > Data source catalog).
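The upload in step 2 can also be performed directly against the Integration Extension API. The sketch below assumes the same local host, port, and Bearer token as the other API examples in this guide, and that the endpoint accepts the report as a multipart field named file; the field name and report filename are assumptions, not taken from the product documentation:
curl -X 'POST' \
'http://127.0.0.1:5005/parse_xml_report' \
-H 'accept: application/json' \
-H 'Authorization: Bearer <jwt_token>' \
-F 'file=@assessment_report.xml'
# The multipart field name "file" and the report filename are placeholders.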
WORKFLOW
1. Log into IBM GDP. Go to Harden > Vulnerability Assessment > Assessment Builder.
The output of this command is the token used to authorize requests to the Integration API endpoints. Sample output:
13. Click the Authorize button in the top right corner and paste the token you generated in the shell. Then click Authorize.
16. After the request is done, you'll see the JSON representation of the XML report. The data has been moved to ISDC Kafka.
API ENDPOINTS USAGE
GET IMPORT_REPOSITORIES_DATA TASK STATUS
curl -X 'GET' \
'http://127.0.0.1:5005/import_repositories_data' \
-H 'accept: application/json' \
-H 'Authorization: Bearer <jwt_token>'
Request URL
http://127.0.0.1:5005/import_repositories_data
Response
Response body:
{
  "success": true,
  "task_state": "done"
}
IMPORT DATA SOURCES DATA TO ISDC
curl -X 'POST' \
'http://127.0.0.1:5005/import_repositories_data' \
-H 'accept: application/json' \
-H 'Authorization: Bearer <jwt_token>'
Request URL
http://127.0.0.1:5005/import_repositories_data
Response
Response body:
{
  "success": true,
  "task_state": "in progress"
}
GET DATA SOURCES DATA FROM DATABASE
curl -X 'GET' \
'http://127.0.0.1:5005/get_repositories_data' \
-H 'accept: application/json' \
-H 'Authorization: Bearer <jwt_token>'
Request URL
http://127.0.0.1:5005/get_repositories_data