With Log4j in Mule we can:
- Set up what information will be logged - log message, timestamp, correlation IDs...
- Define the format of our log messages - JSON, XML, plain text
- Specify where the generated logs will be sent:
- Basic destinations - a file, a database, a queue.
- Advanced destinations - Log aggregation systems like Splunk or ELK
The component responsible for defining where to send the logs is what Log4j calls an Appender. As we mentioned, there are multiple appenders. Have a look at this post to understand them in detail: How to set up Log4j Appenders for your Mule apps
In this post, we will see how we can externalize the logs of our Mule apps to Splunk using an HTTP Appender.
Prerequisites
First of all, let's see what we need to have in place before setting up the appender:
- A Splunk instance - If you don't have one, you can quickly spin up an Enterprise trial instance in Docker. Check out this post on How to install Splunk Enterprise in a Docker Container
- Set up your Splunk instance - Before sending any logs to Splunk, we need to:
- Create an Index
- Create an HTTP Event Collector (HEC)
Create a Mule app for testing
Once we've got our Splunk instance up and running with an Index and an HEC, we're ready to start the Appender setup. For that, the first thing we need is an app. In this tutorial, we'll create a very basic app for testing with the following elements:
- An HTTP listener - A simple GET /hello
- Two Logger processors to show how the app writes to the log
- The first logger to inform of the start of the flow
- The second logger to inform of the end of the flow
- Head over to Anypoint Studio
- Create a new flow. Drag and drop an HTTP listener and two Loggers from the Mule Palette to the canvas.
- Configure the HTTP listener to listen on the endpoint http://localhost:8081/hello for GET requests.
- For the Loggers:
- The first logger will include the custom message - “The flow has started”
- The second logger will include the custom message - “The flow has ended”
- For example:
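As a rough sketch, the resulting flow in Studio's Configuration XML tab could look like this (the config and flow names, and the use of allowedMethods to restrict the listener to GET, are illustrative choices, not required):

    <http:listener-config name="HTTP_Listener_config">
        <http:listener-connection host="0.0.0.0" port="8081"/>
    </http:listener-config>

    <flow name="hello-flow">
        <http:listener config-ref="HTTP_Listener_config" path="/hello" allowedMethods="GET"/>
        <logger level="INFO" message="The flow has started"/>
        <logger level="INFO" message="The flow has ended"/>
    </flow>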
Setting up the HTTP Appender
The HTTP Appender in Log4j will send logs over HTTP to the HEC that we defined in Splunk. It is not a Splunk-specific appender; it's a generic appender that ships logs by sending POST requests to a specified URL. So, for Splunk, we can use this appender and fill in its details with those of the HTTP Event Collector. For that, we just need to provide the URL (with the port included) and the authorization token as a Property element.
Make sure you get the following details from your HEC:
- The port on which your HEC is listening. By default, Splunk uses 8088
- The URL - It should look something like http://[YOUR_SPLUNK_HOSTNAME]:8088/services/collector/raw (for the default port 8088)
- The Authorization token
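Before touching the Mule app, it can be worth verifying these HEC details with a quick curl from a terminal. This is an optional sketch; the hostname, port, and token below are placeholders you must replace with your own values:

```shell
# Placeholder HEC details - all three values below are assumptions, replace them
SPLUNK_HOST="splunk.example.com"
HEC_PORT=8088   # Splunk's default HEC port
HEC_TOKEN="00000000-0000-0000-0000-000000000000"

# This raw endpoint is the same URL we'll give the Log4j HTTP Appender
HEC_URL="http://${SPLUNK_HOST}:${HEC_PORT}/services/collector/raw"
echo "${HEC_URL}"

# Send a test event (uncomment once the placeholders are real values):
# curl "${HEC_URL}" -H "Authorization: Splunk ${HEC_TOKEN}" -d 'hello from curl'
```

If the curl returns a success response from Splunk, the same URL and token should work from the appender.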
Open the log4j2.xml file. It is located under src/main/resources. There are multiple options to configure an Appender. The minimum configuration to export logs is:
- Insert the following within the Appenders section and provide the previous details:
<Http name="Splunk" url="http://[YOUR_SPLUNK_HOSTNAME]:[YOUR_EVENT_COLLECTOR_PORT]/services/collector/raw">
    <Property name="Authorization" value="Splunk [YOUR_HEC_TOKEN]"/>
    <PatternLayout pattern="%m%n"/>
</Http>
- Make sure you enter the Splunk details previously mentioned. Notice that the Property element is used to send your HEC token in the Authorization header; its value is the word Splunk followed by the token of the Event Collector we created before
- We also include the PatternLayout property, which is how we specify, in the Log4j framework, the format and content of our logs. Here we use the PatternLayout for simplicity, but remember that the JSONLayout is a better option. Check out these posts to better understand which Layout to use and how to set it up:
- Breaking Down the Log4j JSONLayout
- Deep Dive into Log4j Layouts
- Why and When we should use the Log4j JSONLayout for our Mule Apps
- Breaking down the Log4j PatternLayout
- And lastly, we need to add the logger for Splunk. For that, we'll include the following inside the Loggers element:
<AsyncRoot level="INFO">
    <AppenderRef ref="Splunk"/>
</AsyncRoot>
For example, this is the content of my log4j2.xml file that sends logs to a Splunk instance installed on an AWS EC2 instance:
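Putting the snippets above together, a complete log4j2.xml could look roughly like the sketch below. The hostname and token are placeholders, and the Console appender is just the default one that Studio projects typically ship with - your generated file may include additional appenders and loggers that you should keep:

    <?xml version="1.0" encoding="utf-8"?>
    <Configuration>
        <Appenders>
            <Console name="Console" target="SYSTEM_OUT">
                <PatternLayout pattern="%d [%t] %-5p %c - %m%n"/>
            </Console>
            <Http name="Splunk" url="http://[YOUR_SPLUNK_HOSTNAME]:8088/services/collector/raw">
                <Property name="Authorization" value="Splunk [YOUR_HEC_TOKEN]"/>
                <PatternLayout pattern="%m%n"/>
            </Http>
        </Appenders>
        <Loggers>
            <AsyncRoot level="INFO">
                <AppenderRef ref="Console"/>
                <AppenderRef ref="Splunk"/>
            </AsyncRoot>
        </Loggers>
    </Configuration>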
Test from Anypoint Studio
Before we deploy our app to our target deployment, let's check that our configuration is correct and that Splunk can get the logs from our app. Run the app in Studio, send a few requests to the entry endpoint, and head over to Splunk. We should start seeing some logs.
To see the logs, go to Apps > Search & Reporting (or to the Splunk app you've created for your logs, if you're using a different one). In the search bar, we'll search for the events in our index. Type index="YOUR INDEX" and hit enter.
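For instance, assuming the index we created earlier is named mule_logs (an assumed name - use your own), the search would simply be:

    index="mule_logs"

Each request to GET /hello should produce at least the two Logger messages as events in that index.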
Next steps - Deploy to our Target Deployment
There might be some extra steps to allow the log forwarding, depending on where we are deploying our Mule app:
- If you're deploying this app to CH1.0, check out this post - How to Externalize Logs in Cloudhub 1.0
- If you're deploying this app to CH2.0, check out this post - How to Externalize Logs in Cloudhub 2.0
- If you're deploying this app to RTF, check out this post - How to Externalize Logs in Runtime Fabric