How to set up log appenders for Splunk - Splunk Appender


In the previous post, we've seen how we can set up an HTTP Appender in log4j to send our Mule app logs to Splunk. The HTTP appender is just a generic appender for sending logs over HTTP, which we adapted with the configuration of our HTTP Event Collector in Splunk.


There is also a specific Splunk appender, which basically does the same thing (an HTTP request to the HTTP Event Collector endpoint) but with a few differences. In this post, we'll see how to use the specific Splunk Appender.

As in the previous post, we assume you're familiar with the log4j framework and that you've already created your Splunk instance and an HTTP Event Collector. If not, have a look at these posts:



Prerequisites

To follow this tutorial we need:
  • A Splunk instance: If you don’t have one, you can quickly spin up an Enterprise trial instance in Docker. Check out this post on How to install Splunk Enterprise in a Docker Container
  • Set up your Splunk instance - Before sending any logs to Splunk we need to: 
    • Create an Index
    • Create an HTTP Event Collector (HEC)
Check out this post to learn How to set up your Splunk instance to get your Mule logs


Create a Mule app for testing

Once we’ve got our Splunk instance up and running with an Index and a HEC, we’re ready to start the Appender setup. For that, the first thing we need is an app. In this tutorial we’ll create a very basic app for testing with the following elements:
  • An HTTP listener - A simple GET /hello
  • Two Logger processors to show how the app writes to the log
    • The first logger to inform of the start of the flow
    • The second logger to inform of the end of the flow
For that:
  • Head over to Anypoint Studio
  • Create a new flow. Drag and drop an HTTP listener and two Loggers from the Mule Palette to the canvas.
  • Configure the HTTP listener to listen on the endpoint http://localhost:8081/hello for GET requests.
  • For the loggers
    • The first logger will include the custom message - “The flow has started”
    • The second logger will include the custom message - “The flow has ended”
  • For example:
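In the XML view, the flow would look something like this - a minimal sketch where the flow and config names are just the placeholders Studio would generate:

<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns="http://www.mulesoft.org/schema/mule/core"
      xmlns:http="http://www.mulesoft.org/schema/mule/http"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
                          http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd">

    <!-- Listener config for local testing on port 8081 -->
    <http:listener-config name="HTTP_Listener_config">
        <http:listener-connection host="0.0.0.0" port="8081"/>
    </http:listener-config>

    <flow name="helloFlow">
        <!-- Entry point: GET /hello -->
        <http:listener config-ref="HTTP_Listener_config" path="/hello" allowedMethods="GET"/>
        <logger level="INFO" message="The flow has started"/>
        <logger level="INFO" message="The flow has ended"/>
    </flow>
</mule>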


Configure the Splunk Appender

The Splunk appender requires some extra steps compared to the generic HTTP appender. Let’s see them:


POM file changes - dependency and repository

To use the Splunk appender we need to add a dependency and a repository to the pom file. The dependency includes the necessary code (classes, methods, resources) for:
  • Formatting logs
  • Sending logs to Splunk - handling HTTP connections, HEC tokens and batch processing
  • Integrating with the log4j framework
The repository tells Maven where to download the dependency from. In this tutorial we’re using JFrog Artifactory.
To include the dependency and repository open the pom.xml file and:
  • Within the dependencies node add:

<dependency>
    <groupId>com.splunk.logging</groupId>
    <artifactId>splunk-library-javalogging</artifactId>
    <version>x.x.x</version>
</dependency>

Enter the version of the library. For the latest version, check the splunk-library-javalogging repository on GitHub.
  • Within the repositories node add the following:

<repository>
    <id>splunk-artifactory</id>
    <name>Splunk Releases</name>
    <url>https://splunk.jfrog.io/splunk/ext-releases-local</url>
</repository>
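Putting both changes together, the relevant fragment of the pom.xml would look roughly like this (the rest of your project’s dependencies and repositories stay as they are, and the version is a placeholder):

<project>
    <!-- ... rest of your pom.xml ... -->
    <dependencies>
        <!-- Splunk logging library: log formatting, HEC connections, log4j integration -->
        <dependency>
            <groupId>com.splunk.logging</groupId>
            <artifactId>splunk-library-javalogging</artifactId>
            <version>x.x.x</version>
        </dependency>
    </dependencies>
    <repositories>
        <!-- Tells Maven where to download the Splunk library from -->
        <repository>
            <id>splunk-artifactory</id>
            <name>Splunk Releases</name>
            <url>https://splunk.jfrog.io/splunk/ext-releases-local</url>
        </repository>
    </repositories>
</project>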


Changes to the log4j2.xml file

As we’ve seen in previous posts, log4j appenders are configured in the log4j2.xml file located under src/main/resources in your Mule project. Open the file and:
  • Change the configuration root item to include the splunk package:
<Configuration status="INFO" name="cloudhub" packages="com.splunk.logging,org.apache.logging.log4j">

The packages attribute on the Configuration root element specifies a comma-separated list of package names where log4j will search for custom plugins. This is how we let log4j know that we’re using a custom appender.


  • Add the Splunk appender with the details for the URL, port, token and index of your Splunk instance. Check out our previous post on How to Setup... to understand how to get these values:
<!-- source and host use log4j environment lookups (${env:...}) -
     set APP_NAME and POD_NAME in your deployment, or replace them with literal values -->
<SplunkHttp name="SPLUNK"
            source="${env:APP_NAME}"
            host="${env:POD_NAME}"
            sourceType="mule-app"
            url="http://[YOUR_SPLUNK_HOSTNAME]:[YOUR_EVENT_COLLECTOR_PORT]/"
            token="[YOUR_EVENT_COLLECTOR_TOKEN]"
            index="[YOUR_INDEX]">
    <PatternLayout pattern="[%d{MM-dd HH:mm:ss}] %-5p %c{1} [%t]: %m%n" />
</SplunkHttp>
  • Notice that, just for simplicity, we’ve added a PatternLayout. However, if we’re using Splunk for log aggregation, the JSONLayout is probably a better choice - see the sketch after this bullet. Check out these articles to understand it in detail:
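As a rough idea of what that swap could look like, here’s the same appender with log4j’s built-in JsonLayout in place of the PatternLayout (the attribute choices are just a reasonable starting point, and JsonLayout requires the Jackson libraries on the classpath):

<SplunkHttp name="SPLUNK"
            source="${env:APP_NAME}"
            host="${env:POD_NAME}"
            sourceType="mule-app"
            url="http://[YOUR_SPLUNK_HOSTNAME]:[YOUR_EVENT_COLLECTOR_PORT]/"
            token="[YOUR_EVENT_COLLECTOR_TOKEN]"
            index="[YOUR_INDEX]">
    <!-- One JSON object per event, one event per line; properties="true" includes MDC values -->
    <JsonLayout compact="true" eventEol="true" properties="true"/>
</SplunkHttp>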
  • Your final log4j2.xml file should look like this:
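Something along these lines - a sketch that assumes the default RollingFile appender Studio generates (here for an app called test-app; adjust the file names and logger levels to your project):

<?xml version="1.0" encoding="utf-8"?>
<Configuration status="INFO" name="cloudhub" packages="com.splunk.logging,org.apache.logging.log4j">
    <Appenders>
        <!-- Default file appender from the Studio project template -->
        <RollingFile name="file"
                     fileName="${sys:mule.home}${sys:file.separator}logs${sys:file.separator}test-app.log"
                     filePattern="${sys:mule.home}${sys:file.separator}logs${sys:file.separator}test-app-%i.log">
            <PatternLayout pattern="%-5p %d [%t] [processor: %X{processorPath}; event: %X{correlationId}] %c: %m%n"/>
            <SizeBasedTriggeringPolicy size="10 MB"/>
            <DefaultRolloverStrategy max="10"/>
        </RollingFile>
        <!-- The Splunk appender we’ve just configured -->
        <SplunkHttp name="SPLUNK"
                    source="${env:APP_NAME}"
                    host="${env:POD_NAME}"
                    sourceType="mule-app"
                    url="http://[YOUR_SPLUNK_HOSTNAME]:[YOUR_EVENT_COLLECTOR_PORT]/"
                    token="[YOUR_EVENT_COLLECTOR_TOKEN]"
                    index="[YOUR_INDEX]">
            <PatternLayout pattern="[%d{MM-dd HH:mm:ss}] %-5p %c{1} [%t]: %m%n"/>
        </SplunkHttp>
    </Appenders>
    <Loggers>
        <!-- Send everything at INFO and above to both the file and Splunk -->
        <AsyncRoot level="INFO">
            <AppenderRef ref="file"/>
            <AppenderRef ref="SPLUNK"/>
        </AsyncRoot>
    </Loggers>
</Configuration>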


Test in Anypoint Studio

  • Before we deploy our app to Cloudhub, let's check that our configuration is correct
  • Run the app in Studio, send a few requests to the entry endpoint and head over to Splunk. We should start seeing some logs
    • To see the logs, go to Apps > Search & Reporting (or to the Splunk App you've created, if you're using a different App for your logs)
    • In the search bar we'll search for the events in our index. Type index="[YOUR_INDEX]" and hit enter.

Next step - Deploy to your deployment target

There might be some extra steps to allow the log forwarding depending on where we are deploying our Mule app.
  • If you’re deploying this app to CH1.0 check out this post - How to Externalize Logs in Cloudhub 1.0
  • If you’re deploying this app to CH2.0 check out this post - How to Externalize Logs in Cloudhub 2.0
  • If you’re deploying this app to RTF check out this post - How to Externalize Logs in Runtime Fabric