Deploy DataPower on CP4I using Jenkins

Kok Sing Khong
5 min read · May 30, 2022


This write-up is the 3rd installment of the series; the first two cover Deploying MQ and Deploying ACE. Before we go through the steps to set up the Jenkins pipeline, let us review the DataPower development approach and the relationship between the DataPower Kubernetes custom resources.

In the traditional world, where DataPower runs on physical or virtual appliances, the developer defines the services and configuration within a DataPower domain using the DataPower WebGUI. The changes are applied and saved, and the service becomes active immediately. In the container world, the development process is slightly different. The following diagram illustrates this process:

DataPower development on Kubernetes/OpenShift

Development Process

  1. The DataPower developer can use DataPower running in a Docker container to develop the DataPower services. There are a couple of good articles on how to do this: https://github.com/fxnaranjo/datapower-operator or https://github.com/ibm-datapower/datapower-tutorials/blob/master/getting-started/start-with-docker.md.
  2. Once the development work is completed and unit tested, the developer can export the configuration to an export file.
  3. The developer can use the migrate-backup.sh script to extract the configuration from the export file into YAML files (in a folder). These generated YAML files can be applied directly to create ConfigMaps.
  4. The developer then checks the export folder into a Git repository. The check-in event can be configured to trigger a Jenkins pipeline.

The detailed steps are found in this IBM documentation.
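For illustration, each generated YAML file wraps a domain's exported configuration in a ConfigMap. The sketch below is hypothetical: the names (hello-cfg, the dp namespace) follow the example used later in this article, and the embedded data is a placeholder, not real exported configuration.

```yaml
# Hypothetical sketch of a ConfigMap produced by migrate-backup.sh.
# The real data section holds the exported DataPower configuration for the domain.
apiVersion: v1
kind: ConfigMap
metadata:
  name: hello-cfg
  namespace: dp
data:
  hello.cfg: |
    # ... exported domain configuration goes here ...
```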

Jenkins Pipeline

  1. Pre-deploy stage: creates the Namespace, the pull Secret for the IBM entitlement key, the admin credentials Secret, Secrets for TLS keys and certificates (not shown in this article), and ConfigMaps for the configurations.
  2. Deploy stage: creates the DataPowerService (which in turn creates a DataPowerMonitor), a Service that sends requests to the defined DataPower service endpoint, and a Route that sends requests to the Service.
  3. Test service stage: Runs a curl command to test the endpoint (via the Route).
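The three stages above can be sketched as a minimal declarative Jenkinsfile. The script names and the arguments to 02-deploy.sh match the console output shown later in this article; the credential IDs and the argument order of 01-pre-deploy.sh are assumptions.

```groovy
pipeline {
  agent any
  environment {
    // Credential IDs are assumptions; use the IDs you created in Jenkins
    ENTITLEMENT_KEY = credentials('ibm-entitlement-key')
    ADMIN_PASSWORD  = credentials('dp-admin-password')
  }
  stages {
    stage('Pre-Deploy') {
      // Assumed argument order: <entitlement-key> <app> <namespace> <admin-password>
      steps { sh './scripts/01-pre-deploy.sh "$ENTITLEMENT_KEY" hello dp "$ADMIN_PASSWORD"' }
    }
    stage('Deploy') {
      // Arguments as seen in the deploy log: app, namespace, license ID, use, version
      steps { sh './scripts/02-deploy.sh hello dp L-RJON-C5SF54 production 10.0.4.0' }
    }
    stage('Test Service') {
      steps { sh './scripts/03-test-service.sh hello' }
    }
  }
}
```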

Step by Step:

Before you begin, you must have the following set up:

  • Jenkins — see my previous article on installing Jenkins on the same OpenShift cluster as the CP4I.
  • Red Hat OpenShift cluster installed (my version is 4.8.36). Also tested on v4.10.17.
  • Cloud Pak for Integration installed on the OpenShift cluster (my version is 2021.4.1). Also tested on v2022.2.1.
  • GitHub account. This article has been updated for v2022.2.1 (using DataPower v10.5.0.0).
  • Clone/fork this repository.

Create a credential in Jenkins to store the IBM entitlement key and DataPower admin password (from Jenkins Console)

  1. Create Global credentials (kind: Secret text) to store the IBM entitlement key and DataPower admin password. See the detailed steps in the previous article.
  2. These two credentials will be referenced in the Jenkinsfile to be used by the pipeline.
Global credentials
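Inside the Jenkinsfile, Secret-text credentials are typically bound with a withCredentials block, which is also visible in the pipeline closers in the console output later. The credential IDs below are assumptions; match them to the IDs you created above.

```groovy
// Sketch: binding the two Secret-text credentials in the Jenkinsfile.
// Jenkins masks the bound values (shown as **** in the console output).
withCredentials([
  string(credentialsId: 'ibm-entitlement-key', variable: 'ENTITLEMENT_KEY'),
  string(credentialsId: 'dp-admin-password', variable: 'ADMIN_PASSWORD')
]) {
  sh './scripts/01-pre-deploy.sh "$ENTITLEMENT_KEY" hello dp "$ADMIN_PASSWORD"'
}
```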

Configure a pipeline to deploy DataPower service (from Jenkins Console)

  1. Click on Dashboard > New Item > Pipeline. Provide a name (e.g. dp-pipeline) and click OK.
  2. Go to the Pipeline section, specify the following, and click Save.
Setup Jenkins Pipeline

Perform test build (from Jenkins Console)

  1. Click Build Now.
  2. Click Build History > #1 > Console Output
  3. When the pipeline is completed, you should see the following output, with the test results of the curl command.
Successful Pipeline Execution

Pre Deploy Output

Pre-Deploy ~ setup configuration before deploy
[Pipeline] sh
+ ./scripts/01-pre-deploy.sh **** hello dp ****
----------------------------------------------------------------------
INFO: Pre deploy
----------------------------------------------------------------------
Namespace dp found
create_pull_secret ibm-entitlement-key, dp, cp.icr.io, cp, ****, khongks@gmail.com
Secret ibm-entitlement-key already created
Apply all YAMLs from the backup
export-output/default-cfg.yaml
oc apply -f export-output/default-cfg.yaml -n dp
configmap/default-cfg configured
export-output/default-local.yaml
oc apply -f export-output/default-local.yaml -n dp
configmap/default-local configured
export-output/hello-cfg.yaml
oc apply -f export-output/hello-cfg.yaml -n dp
configmap/hello-cfg configured
Delete and Create admin-credentials secret
secret "admin-credentials" deleted
secret/admin-credentials created
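The pre-deploy log above corresponds roughly to the following sketch. The create_pull_secret helper is the author's; here it is expanded into plain oc commands, and the argument order is an assumption.

```shell
#!/bin/sh
# Sketch of 01-pre-deploy.sh, assuming args: <entitlement-key> <app> <namespace> <admin-password>
KEY="$1"; APP="$2"; NS="$3"; PASSWORD="$4"

# Create the namespace if it does not exist yet
oc get namespace "$NS" >/dev/null 2>&1 || oc create namespace "$NS"

# Create (or refresh) the pull secret for the IBM entitled registry
oc create secret docker-registry ibm-entitlement-key -n "$NS" \
  --docker-server=cp.icr.io --docker-username=cp \
  --docker-password="$KEY" \
  --dry-run=client -o yaml | oc apply -f -

# Apply all ConfigMaps generated by migrate-backup.sh
for f in export-output/*.yaml; do
  oc apply -f "$f" -n "$NS"
done

# Recreate the admin-credentials secret with the admin password
oc delete secret admin-credentials -n "$NS" --ignore-not-found
oc create secret generic admin-credentials -n "$NS" \
  --from-literal=password="$PASSWORD"
```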

Deploy Output

Deploy ~ deploy DataPowerService
[Pipeline] sh
+ ./scripts/02-deploy.sh hello dp L-RJON-C5SF54 production 10.0.4.0
----------------------------------------------------------------------
INFO: Deploy
----------------------------------------------------------------------
datapowerservice.datapower.ibm.com/hello unchanged
service/hello-service unchanged
route.route.openshift.io/hello-route unchanged
Waiting for [hello] of type [DatapowerService] in namespace [dp] to be in [Running] status
Installation status: Failed
=== Installation has failed ===
=== Retry again ===
Sleeping 30 seconds...
Installation status: Running
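The DataPowerService custom resource created in this stage looks roughly like the sketch below. The license ID, use, and version are taken from the deploy log above; the domain and ConfigMap wiring follows the operator's documented shape, but the exact values are assumptions.

```yaml
# Sketch of the DataPowerService CR; values other than the license ID,
# use, and version (which appear in the deploy log) are assumptions.
apiVersion: datapower.ibm.com/v1beta3
kind: DataPowerService
metadata:
  name: hello
  namespace: dp
spec:
  license:
    accept: true
    license: L-RJON-C5SF54
    use: production
  version: 10.0.4.0
  replicas: 1
  username: admin
  passwordSecret: admin-credentials
  domains:
    - name: default
      dpApp:
        config:
          - default-cfg
        local:
          - default-local
    - name: hello
      dpApp:
        config:
          - hello-cfg
```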

Test Service Output

Test Service
[Pipeline] sh
+ ./scripts/03-test-service.sh hello
----------------------------------------------------------------------
INFO: Test endpoint https://hello.itzroks-3100015379-wxzo6e-6ccd7f378ae819553d37d5f2ee142bd6-0000.au-syd.containers.appdomain.cloud/api/v1/users
----------------------------------------------------------------------
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0...{"createdAt":"2022-05-27T09:40:32.058Z","name":"Anthony Gorczany","avatar":"https://cloudflare-ipfs.com/ipfs/Qmd3W5DuhgHirLHGVixi6V76LhCkZUz6pnFt5AJBiyvHye/avatar/409.jpg","id":"45"},{"createdAt":"202
100 8562 0 8562 0 0 325 0 --:--:-- 0:00:26 --:--:-- 1756
100 9084 0 9084 0 0 345 0 --:--:-- 0:00:26 --:--:-- 2346
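The test script essentially resolves the Route host and curls the endpoint, as the log above shows. A minimal sketch, assuming the route name follows the <app>-route convention used in the deploy stage and that the namespace is dp:

```shell
#!/bin/sh
# Sketch of 03-test-service.sh, assuming arg: <app>
APP="$1"
# Look up the externally routable host of the Route created during deploy
HOST=$(oc get route "${APP}-route" -n dp -o jsonpath='{.spec.host}')
echo "INFO: Test endpoint https://${HOST}/api/v1/users"
# -k skips TLS verification, since the route may use a self-signed certificate
curl -k "https://${HOST}/api/v1/users"
```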
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS

References

https://github.com/fxnaranjo/datapower-operator

https://github.com/ibm-datapower/datapower-tutorials/blob/master/getting-started/start-with-docker.md

https://www.ibm.com/docs/en/cloud-paks/cp-integration/2021.4?topic=integration-datapower-gateway-transformation-guide

https://www.ibm.com/docs/en/cloud-paks/cp-integration/2022.2?topic=integration-datapower-gateway-transformation-guide

Disclaimer:

All opinions expressed here are very much my own and not of IBM. All codes/scripts/artifacts are provided as-is with no support unless otherwise stated.
