In our previous posts, we learned how to install and set up Elasticsearch and Kibana to collect, analyze and visualize our Mule logs.
There are a few ways to send the logs of our Mule apps to ELK. One of them is using the Log4j framework. If you want to know more about it, have a look at the post Log4j in Mule - What you need to know.
In this post, we'll see what we need to configure in our Mule apps to send logs to Elasticsearch using a Log4j appender.
Prerequisites
For this tutorial, we need an Elasticsearch and a Kibana instance. Follow these posts to get your Elasticsearch and Kibana instances ready for our Mule logs:
- How to Install Elasticsearch and Kibana on Linux - Part I
- How to Install Elasticsearch and Kibana on Linux - Part II
- How to Install Elasticsearch on Docker
- How to Install Kibana on Docker
- Install Elasticsearch and Kibana with Docker Compose
Create a Mule app for testing
The first thing we need is an app for testing. Head over to Anypoint Studio, create a New Mule Project and drag & drop the following elements onto our flow (sketched below):
- An HTTP listener - a simple GET /hello
- A Logger processor to show how the app writes to the log. Write any text in the message that helps you identify that the entry comes from this component when we see the logs in ELK
- A Set Payload processor to create a response for our test endpoint. Enter any text that confirms the app is running well
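For reference, here is a minimal sketch of how the resulting flow could look in the configuration XML. The flow name, the HTTP_Listener_config reference and the messages are just illustrative defaults, and the namespace declarations Studio generates are omitted:
<flow name="hello-flow">
    <!-- HTTP listener: exposes GET /hello (HTTP_Listener_config is the connector
         configuration Studio creates, listening on port 8081 by default) -->
    <http:listener config-ref="HTTP_Listener_config" path="/hello" allowedMethods="GET" />
    <!-- Logger: a distinctive message makes these entries easy to spot in ELK -->
    <logger level="INFO" message="Greetings from the hello-flow Logger - this should show up in ELK" />
    <!-- Set Payload: the response that confirms the app is running -->
    <set-payload value="Hello! The Mule app is up and running." />
</flow>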
Set up the HTTP appender
In this tutorial, we'll be using an HTTP Appender to send the logs to Elasticsearch. This appender will use the Elastic REST API. For that, we'll need to provide our Elastic instance details:
- The URL and port
- The index - Remember how we created a specific index in Elasticsearch for our Mule logs in our post - How to Set Up Elastic for our Mule Logs
- The authentication details - Remember that we also created a custom role and a user with permissions only to our Mule logs index
echo -n "username:password" | base64
echo -n "mule:Mule1234" | base64
Next, open the log4j2.xml file, located under src/main/resources. Insert the following within the Appenders section, filling in the details from the previous steps:
<Appenders>
...
<Http name="ELK"
url="http://[YOUR_SERVER]:9200/mule-logs/_doc" >
<Property name="kbn-xsrf" value="true" />
<Property name="Content-Type" value="application/json" />
<Property name="Authorization" value="Basic bXVsZS11c2VyOk11bGUxMjM0" />
<JsonLayout compact="false" eventEol="true" properties="true" />
</Http>
...
</Appenders>
Notice that:
- url - contains the details of your Elasticsearch server: the public DNS name of the instance, the port (9200) and the index (mule-logs)
- The kbn-xsrf header is a security measure used in Kibana to prevent Cross-Site Request Forgery (CSRF) attacks
- The Authorization header contains the Base64-encoded username:password string we generated earlier
- We are using the JsonLayout so that our logs are sent directly in JSON format to Elasticsearch. Check out our other posts on Log4j Layouts to understand better which ones to use and how to set them up
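For reference, with this layout each log event arrives in Elasticsearch as a JSON document along these lines (the field names come from Log4j's JsonLayout and vary slightly between Log4j versions; the values shown are illustrative):
{
  "instant" : { "epochSecond" : 1700000000, "nanoOfSecond" : 123456789 },
  "thread" : "[MuleRuntime].uber.04",
  "level" : "INFO",
  "loggerName" : "org.mule.runtime.core.internal.processor.LoggerMessageProcessor",
  "message" : "Greetings from the hello-flow Logger - this should show up in ELK",
  "endOfBatch" : false,
  "loggerFqcn" : "org.apache.logging.log4j.spi.AbstractLogger",
  "threadId" : 60,
  "threadPriority" : 5
}
Next, reference the new appender from the Loggers section. Attaching it to the AsyncRoot logger means log events are delivered asynchronously, so the HTTP calls to Elasticsearch don't block our flows: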
<Loggers>
...
<AsyncRoot level="INFO">
<AppenderRef ref="ELK" />
</AsyncRoot>
...
</Loggers>
Test from Anypoint Studio
Run the app in Studio, send a few requests to the endpoint of our app and head over to Kibana. We should start seeing some logs. To see them, go to Kibana > Analytics > Discover and select the Mule Logs data view we created during the setup.
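As a quick check from the command line (assuming the HTTP listener runs on the default port 8081 and you used the example credentials above), we can hit the endpoint and then query the index directly:
# Call our test endpoint a few times to generate log entries
curl http://localhost:8081/hello

# Query the mule-logs index to confirm the documents arrived
curl -H "Authorization: Basic bXVsZS11c2VyOk11bGUxMjM0" \
     "http://[YOUR_SERVER]:9200/mule-logs/_search?pretty"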
Next steps - Deployment
With that, we've verified that the configuration of the appender is correct and the app can send logs to our Elastic instance. We are now ready to deploy the app to our Deployment Target. But be careful: depending on what your Deployment Target is, you might need to do some extra steps:
- If you're deploying this app to a Standalone instance, you don't need to do anything else.
- If you’re deploying this app to CH1.0 follow the post - How to externalize logs in Cloudhub 1.0
- If you’re deploying this app to CH2.0 follow the post - How to externalize logs in Cloudhub 2.0
- If you’re deploying this app to RTF follow the post - How to Externalize Logs with Log4j in Runtime Fabric