How to integrate IBM ACE with ELK
February 2, 2021
Andrei Postoiu

The ELK stack has slowly become the world’s most used log management platform, mostly because it is fast, reliable and open-source.

According to the official description, "ELK" is the acronym for three open source projects: Elasticsearch, Logstash, and Kibana. Elasticsearch is a search and analytics engine. Logstash is a server‑side data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a "stash" like Elasticsearch. Kibana lets users visualize data with charts and graphs in Elasticsearch. ELK evolved into the Elastic stack with the addition of Beats - lightweight agents that are installed on hosts to collect data and send it into the stack, to Elasticsearch or Logstash.

Good news for IBM App Connect Enterprise (ACE) developers: fix pack V11.0.0.8 brought ELK connectivity. This is particularly welcome for Windows-based ACE developers, who finally have the opportunity to get rid of the Windows Event Viewer.

More precisely, from that version onwards ACE can publish logging data to a Logstash input in an ELK stack. Let’s examine the steps needed to have the logging data (BIP messages) show up in a Kibana dashboard.

I assume that you already have the ELK stack installed and running (Elasticsearch + Logstash + Kibana). The latest version at the time of writing is 7.10.0, and that’s what I’ll use for the examples shown here.

First we need to set up Logstash to listen for the ACE logging data. Go to your Logstash config folder and edit pipelines.yml to add a new pipeline:
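A minimal pipelines.yml entry for the new pipeline might look like the following sketch; the pipeline id and config path are illustrative and should match your own layout:

```yaml
# Add to pipelines.yml alongside any existing pipelines.
# The id is arbitrary; path.config must point at the file created below.
- pipeline.id: ace_logs
  path.config: "/etc/logstash/conf.d/ace_logs.conf"
```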

Create a new pipeline configuration file “ace_logs.conf” and specify the port used by Logstash to listen for ACE data (any unused port) and the Elasticsearch URL. You can also specify the index name where the new data will be stored and if you want the data to be printed in the Logstash console (stdout):
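A sketch of what ace_logs.conf could contain, assuming ACE sends its data using the Beats protocol (so Logstash listens with the beats input plugin); the port, Elasticsearch host, and index name are placeholders to adapt to your environment:

```
input {
  # Listen for ACE logging data on an unused port (5040 in my case)
  beats {
    port => 5040
  }
}

output {
  # Store the records in Elasticsearch under a dedicated index
  elasticsearch {
    hosts => ["http://elastic_url:9200"]
    index => "ace_logs"
  }
  # Optional: also print each record to the Logstash console
  stdout {
    codec => rubydebug
  }
}
```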

Start or restart Logstash to activate the new pipeline. Logstash will now wait for logging data to be sent to the specified port (5040 in my case).

Now, let’s configure ACE to send logging data. There’s a nice explanation in the Knowledge Center. You can send data for a specific integration server or for the whole integration node. I chose to have the ELK connection at node level, so I edited the node.conf.yaml file found in the ACE working directory. First, enable sending the logs to ELK:
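As a sketch, the relevant part of the Log section in node.conf.yaml would look something like this; the property names follow the sample node.conf.yaml shipped with ACE as I recall them, so verify against your fix pack's sample file, and the connection name is arbitrary:

```yaml
Log:
  elkLog: true                       # enable publishing BIP messages to ELK
  elkConnections: 'elkConnection1'   # name of the connection defined below
```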

Then define the connection details:
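The connection itself is defined in the ELKConnections section of the same file. Again a hedged sketch based on the sample configuration; hostname and port are placeholders and must match the port your Logstash pipeline listens on:

```yaml
ELKConnections:
  elkConnection1:
    elkProtocol: 'Beats'        # protocol used to talk to Logstash
    hostname: 'logstash_host'   # host where Logstash is running
    port: 5040                  # port configured in the Logstash input
```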

Restart the integration node and the logging data should start to flow to Logstash. Check the ACE system logs for the confirmation that the connection is functional.

Check the Elasticsearch indices summary (http://elastic_url:9200/_cat/indices). Your new index should appear.

Now the fun stuff begins… Open your Kibana main page (http://kibana_url:5601/app/home) and choose Stack Management from the main menu. Go to Data / Index Management and you should see the new index:

Go to Kibana / Index Patterns and create a new pattern based on the new index. Now you can see the actual logging data with the available fields:

Afterwards, you can create different charts or tables based on the index fields, like the one below that shows the log entries grouped by log level and integration server:

You can then filter on a specific integration server, date, and log level, and choose “Explore underlying data”, which will show you all the available fields for the data that matches that filter.

Hopefully, the steps provided here gave you an idea of how you can use the new ELK connectivity added to ACE. If you already have the ELK stack set up in your infrastructure, you should consider sending the ACE logging data to it. And if your ACE environment is Windows-based, you really have no other choice – imagine trying to find last week’s ACE errors in Event Viewer!




