How to send Mule logs to Elasticsearch with Log4j


In our previous posts, we learned how to install and set up Elasticsearch and Kibana to collect, analyze and visualize our Mule logs.

There are a few ways to send the logs of our Mule apps to ELK. One of them is using the Log4j framework. If you want to know more about it, have a look at the post Log4j in Mule - What you need to know.


In this post, we’ll see what we need to configure in our Mule apps to send logs to Elasticsearch using a Log4j appender.

Prerequisites

For this tutorial, we need an Elasticsearch instance and a Kibana instance. Follow these posts to get your Elasticsearch and Kibana instances ready for our Mule logs:
For our Mule app, we’ll use Mule runtime version 4.6.8 with Java 17.

Create a Mule app for testing

The first thing we need is an app for testing. Head over to Anypoint Studio, create a New Mule Project and drag & drop the following elements into our flow:
  • An HTTP listener - A simple GET /hello
  • A Logger processor to show how the app writes to the log. Write any text in the message that helps you identify that the log comes from this component when we see the logs in ELK
  • A Set Payload processor to create a response for our test endpoint. Enter any text that confirms the app is running correctly
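The flow above can be sketched in Mule XML roughly like this (a sketch only - the config-ref name and the message texts are illustrative; Studio generates its own listener config):

```xml
<flow name="hello-test-flow">
  <!-- GET /hello listener; HTTP_Listener_config is the config Studio generates -->
  <http:listener config-ref="HTTP_Listener_config" path="/hello" allowedMethods="GET" />
  <!-- A message we can easily spot later in Kibana -->
  <logger level="INFO" message="Hello endpoint hit - this line should show up in ELK" />
  <!-- Response body confirming the app is running -->
  <set-payload value="Hello from our Mule test app!" />
</flow>
```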


Set up the HTTP appender

In this tutorial, we’ll be using an HTTP appender to send the logs to Elasticsearch. This appender will use the Elasticsearch REST API. For that, we’ll need to provide our Elastic instance details:
  • The URL and port
  • The index - Remember how we created a specific index in Elasticsearch for our Mule logs in our post - How to Set Up Elastic for our Mule Logs
  • The authentication details. Remember that we also created a custom role and a user with permissions only to our mule logs index
For the authentication, we’ll pass the username and password in the HTTP header. For that, we need to encode the credentials with Base64. From a terminal, run the following command:

echo -n "username:password" | base64

In our case:

echo -n "mule-user:Mule1234" | base64

Copy the output of the command, we’ll paste it in the appender configuration.
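As a quick sanity check, you can decode the value back before pasting it, to confirm it matches the username:password pair (this assumes the mule-user account we created in the setup post; replace it with your own credentials):

```shell
# Encode the credentials of our Elasticsearch user
AUTH=$(echo -n "mule-user:Mule1234" | base64)
echo "$AUTH"   # → bXVsZS11c2VyOk11bGUxMjM0

# Decoding the value back should print the original username:password pair
echo "$AUTH" | base64 --decode   # → mule-user:Mule1234
```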


Next, open the log4j2.xml file. It is located under src/main/resources. Insert the following within the Appenders section and provide the previous details:

<Appenders>
  ...
  <Http name="ELK"
        url="http://[YOUR_SERVER]:9200/mule-logs/_doc">
    <Property name="kbn-xsrf" value="true" />
    <Property name="Content-Type" value="application/json" />
    <Property name="Authorization" value="Basic bXVsZS11c2VyOk11bGUxMjM0" />
    <JsonLayout compact="false" eventEol="true" properties="true" />
  </Http>
  ...
</Appenders>
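Before running the app, we can reproduce the same request the appender will make, using curl, to confirm the URL, index and credentials are correct. This is only a hedged sketch - replace [YOUR_SERVER] with your host, and note the JSON body is just a manual test event, not what the JsonLayout produces:

```shell
# Manually index one test document the same way the appender will
curl -X POST "http://[YOUR_SERVER]:9200/mule-logs/_doc" \
  -H "Content-Type: application/json" \
  -H "Authorization: Basic bXVsZS11c2VyOk11bGUxMjM0" \
  -d '{"level": "INFO", "message": "manual test event before wiring the appender"}'
```

If the index and credentials are set up correctly, Elasticsearch responds with a JSON document containing `"result": "created"`.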

Notice that:
  • The URL points to our mule-logs index and ends in _doc, the Elasticsearch endpoint for indexing a single document
  • The Authorization property carries the Base64 string we generated earlier, prefixed with the word Basic
  • The JsonLayout sends each log event as a JSON document, which is the format Elasticsearch expects

And lastly, we need to add the appender reference to our root logger. For that, we'll include the following inside the Loggers element:

<Loggers>
  ...
  <AsyncRoot level="INFO">
    <AppenderRef ref="ELK" />
  </AsyncRoot>
  ...
</Loggers>

This is how our log4j2.xml file looks in our example:



Test from Anypoint Studio

Run the app in Studio, send a few requests to our app's endpoint, and head over to Kibana. We should start seeing some logs.
To see the logs, go to Kibana > Analytics > Discover and select the Mule Logs data view we created during the setup.
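If you prefer the terminal, the steps above can also be driven with curl (a sketch under assumptions: 8081 is Studio's default HTTP listener port, and the host and credentials are the same ones we used in the appender config; the _count endpoint is Elasticsearch's standard document-count API):

```shell
# Send a few test requests to our app
curl -s "http://localhost:8081/hello"

# Check how many documents landed in the index - the count should grow
# as you keep hitting /hello
curl -s -u mule-user:Mule1234 "http://[YOUR_SERVER]:9200/mule-logs/_count"
```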


Next steps - Deployment

With that, we’ve verified that the configuration of the appender is correct and that the app can send logs to our Elastic instance. We are now ready to deploy the app to our Deployment Target. But be careful: depending on what your Deployment Target is, you might need to do some extra steps: