JSON logging in Mule: How to get the most out of your logs


Logging is arguably one of the most neglected tasks on any given project. It’s not uncommon for teams to treat logging as a second-class citizen and take it for granted; until, of course, the team goes live and tries to troubleshoot anything.

Once this happens, upper management might start doing some finger pointing.

This scenario, interestingly enough, has been exacerbated in the last few years by the rise of data aggregation platforms like Splunk, ELK, and others, and by their ability to do full-text searches across any piece of “raw” data that developers manage to push into them.

Rather than encouraging better logging practices, this scenario gives developers the misconception that they don’t need to put in any effort as long as they are able to aggregate their logs.

More often than I’d like to admit, the answer to the question “How do you troubleshoot in production?” is: “We just type error in the Splunk search box to see if something is wrong.”

What many teams don’t understand, however, is that text search is just a very basic and ad-hoc capability. These data aggregation frameworks only reach their full potential when they are able to understand, index, and exploit the data they are gathering. For this to happen, the data formats need to be somewhat normalized.

In my career, I’ve had the pleasure to work with very mature companies and brilliant architects who have the vision to understand the benefits of standardizing the way data gets logged across any business application.

A popular choice for most companies is obviously JSON, because its flexibility, efficiency, and human readability are hard to beat.

So “Let’s do everything in JSON!”

Now, hold your horses (or Mules?), because the decision to use JSON for logging is only the first step of getting the most out of your log data. The next important decision is: “What should I log?”

In a world where everything is becoming an API or microservice, chances are that any action performed on any application (e.g. placing an order) will span tens of different applications, systems, and devices. With data coming from such disparate sources, how do we ensure we get end-to-end visibility across every touch point?

To make things a bit more challenging, we architects typically put the burden of these decisions on our developers, without even empowering them with the necessary tools or frameworks to do so.

The result is that simple, core best practices – such as embedding correlation IDs on every log entry (for transaction correlation) – are not always followed, especially because they depend on the developer’s willingness to follow them. Failure to follow these best practices dramatically impacts the DevOps teams’ ability to troubleshoot down the road.

In this article, I’ll present an approach to logging in Mule applications and some best practices around defining the right data structure.

JSON logging in Mule

The out-of-the-box logger component offered by Mule runtime is really easy to use and understand, but it hardly enforces any standards and does not facilitate the creation of structured logs (nor does the underlying log4j, for that matter).
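For reference, the standard Logger component boils down to a single free-form text attribute, so any structure is left entirely to the developer:

```xml
<!-- Standard Mule 3 Logger: free-form text, no enforced structure -->
<logger level="INFO" message="Order received: #[payload.orderId]" doc:name="Logger"/>
```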

What if there were a way to keep the logging experience as simple as writing a single text entry, while producing a full-fledged JSON structure as the output?

What if we could automagically prepopulate logs with default values and dynamic expressions?

What if developers didn’t need to worry about setting correlation IDs and tracepoints, calculating elapsed times, or any other piece of data that can be derived from the environment or from MEL expressions?

Meet the JSON logger module

In order to fill this gap, I created a generic JSON Logger module (based on our DevKit SDK), which is intended as a drop-in replacement for the existing Logger component. At the core of this implementation is the assumption that not every company will want the exact same data structure. Thus, the module was architected in a way that lets you customize the output JSON log by defining a matching JSON Schema.

Once you are satisfied with your JSON schema, all of the module’s code can be dynamically re-generated by building and packaging the module. The outcome will be a new JSON logger module that developers can install in Anypoint Studio and leverage from the palette like any other component.
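To give a flavor of this (an illustrative fragment only; the actual schema lives in the GitHub repo), adding a custom field is a matter of extending the schema’s properties. Here, `businessUnit` is a hypothetical custom field:

```json
{
  "$schema": "http://json-schema.org/draft-04/schema#",
  "type": "object",
  "properties": {
    "tracePoint": { "type": "string" },
    "message": { "type": "string" },
    "businessUnit": { "type": "string" }
  }
}
```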

How do I use the JSON logger module?

As you can see in the image above, once installed, the JSON logger shows up in the Studio Palette like any standard component; it can also be dragged and dropped onto the canvas.

From a usage perspective, the out-of-the-box implementation can be configured as follows:

While it may look a bit more daunting than the base Logger component, the only part that a developer would need to define most of the time is the one highlighted in blue above.

In this scenario, we are logging the current #[payload]. We are also using the START Trace Point. Trace Points aim to facilitate the standardization of certain “check points” throughout the flow execution, which, in turn, enables more intelligent dashboards and reporting based on this data (e.g. to determine all the in-flight transactions, we could query for all the transactions without an END trace point).

The current JSON logger ships with the following Trace Points:

    • START (Default)
    • AFTER_API_OR_BACKEND
    • END
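In XML form, a START entry like the one above comes down to something like this (a sketch; the exact element and attribute names depend on the module generated from your schema):

```xml
<!-- Sketch: logging the current payload at the START trace point -->
<json-logger:logger config-ref="JSON_Logger_Configuration"
                    message="#[payload]"
                    tracePoint="START"
                    doc:name="JSON Logger"/>
```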

Last, but not least, below is an example of what a flow may look like using JSON Logger:

And this would be the JSON logs that are generated:
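A representative entry looks something like the following (field names come from the default schema; the values are illustrative):

```json
{
  "applicationName" : "json-logger-test-app",
  "applicationVersion" : "1.0.0",
  "environment" : "dev",
  "timestamp" : "2017-12-01T08:06:46.775-05:00",
  "rootCorrelationId" : "319bc5b0-d666-11e7-875d-448500a76039",
  "correlationId" : "319bc5b0-d666-11e7-875d-448500a76039",
  "threadName" : "[json-logger-test].HTTP_Listener_Configuration.worker.01",
  "priority" : "INFO",
  "tracePoint" : "START",
  "message" : "Inbound payload received"
}
```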

As you can see, while the developer only had to worry about changing the trace point and choosing what to log in the message (e.g. #[payload]), all this other data gets generated automatically.

How can I download the JSON logger module?

If you like what you see and want to try it for yourself, you can get the default version from our MuleSoft Services account for Anypoint Exchange and install it directly into Anypoint Studio.

To make things even easier, I provisioned a guest account that you can use to install the connector:

User: guest-mule
Pass: Mulesoft1

Just open Exchange from Anypoint Studio:

Log in with the guest credentials, select JSON Logger and click Install (top right):

Then just follow the installation steps and voilà!

What is the default JSON structure?

If you are curious about what the current JSON schema covers out-of-the-box you can find it on GitHub.

How can I configure the JSON logger module?

It should be as straightforward as configuring any Anypoint Connector. You just need to define a Global Configuration and the fields are self-explanatory.

Starting with version 1.1.0, a new configuration option was introduced: Logger with Anypoint MQ Configuration.

As the name implies, this configuration not only generates standard logs but also forwards them to an Anypoint MQ destination (exchange or queue) asynchronously.


The reasoning behind this option is simple:

    • Not every user has a data aggregation platform yet
    • We don’t recommend pushing logs directly into a database

Thus, the logical option is to leverage our very own flexible and scalable messaging solution as the persistence layer for these messages. Users can then decide how to process them in any way they see fit (e.g. read them from a queue and push them into a database).
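A downstream consumer could then look roughly like this (a sketch only: the element and attribute names approximate the Mule 3 Anypoint MQ and Database connectors, and the table and config names are placeholders):

```xml
<!-- Sketch: drain JSON log entries from a queue and persist them to a database -->
<flow name="persist-json-logs">
  <anypoint-mq:subscriber config-ref="Anypoint_MQ_Configuration"
                          destination="json-logger-queue"/>
  <db:insert config-ref="Database_Configuration">
    <db:parameterized-query><![CDATA[
      INSERT INTO app_logs (log_entry) VALUES (#[payload])
    ]]></db:parameterized-query>
  </db:insert>
</flow>
```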

How can I further enforce best practices?

If you have followed along this far, I hope this next part won’t disappoint!

What better way to enforce best practices than unit tests? What if we define as a developer practice – validated through code review – that every main flow should have a certain minimum set of tracepoints?

Let’s use the default json-logger-test-app as an example:

Now let’s add an MUnit test that verifies the following:

    • START needs to be there exactly once
    • AFTER_API_OR_BACKEND needs to be there at least once (we are connecting to something, after all, right?)
    • END needs to be there exactly once

This very simple test would look like:
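As a sketch, using MUnit 1.x (Mule 3) mock verification (the flow name and message processor matcher are assumptions; adjust them to your application):

```xml
<!-- Sketch: verify minimum tracepoints via MUnit verify-call; names are assumptions -->
<munit:test name="main-flow-has-minimum-tracepoints">
  <flow-ref name="json-logger-test-appFlow" doc:name="Run main flow"/>
  <!-- START must appear exactly once -->
  <mock:verify-call messageProcessor="json-logger:logger" times="1">
    <mock:with-attributes>
      <mock:with-attribute name="tracePoint" whereValue="#['START']"/>
    </mock:with-attributes>
  </mock:verify-call>
  <!-- AFTER_API_OR_BACKEND must appear at least once -->
  <mock:verify-call messageProcessor="json-logger:logger" atLeast="1">
    <mock:with-attributes>
      <mock:with-attribute name="tracePoint" whereValue="#['AFTER_API_OR_BACKEND']"/>
    </mock:with-attributes>
  </mock:verify-call>
  <!-- END must appear exactly once -->
  <mock:verify-call messageProcessor="json-logger:logger" times="1">
    <mock:with-attributes>
      <mock:with-attribute name="tracePoint" whereValue="#['END']"/>
    </mock:with-attributes>
  </mock:verify-call>
</munit:test>
```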

And of course, failing to comply with the test would result in this:


I’m confident some of these practices can help you start creating a better culture around DevOps (read more here) and ensure that you will always have the proper data when it is most needed.

Stay tuned for my next series of articles on what this process looks like in practice when integrated with Splunk, and how you can aggregate the logs into a cloud database using Anypoint MQ.

GitHub repository

Check out the GitHub repository for more information.

UPDATE: check out part 2 of this blog post to learn about JSON logging in Mule 4.


23 Responses to “JSON logging in Mule: How to get the most out of your logs”

  1. Hi Andres! I couldn’t agree more with you on the importance of having best practices around logging. I’ve just downloaded the connector and I noticed:
    1) I cannot download it from Anypoint Studio -> Exchange, I went for the workaround instead (from Exchange -> download the zip -> install as a plugin)
    2) I see that the connector is a community one, it seems that by default we don’t get access to the connectors from “MuleSoft Services” organization, will this be moved to “MuleSoft” later on?
    3) I see we can also use the logger with Anypoint MQ Configuration, could you tell us more about that?
    4) I made a simple flow where I make an HTTP POST followed by the JSON Logger that uses the payload; I believe I am missing something in the project setup. Exception logged: “org.mule.modules.jsonloggermodule.JsonLoggerModuleConnector: Failed parsing expression for key: timestamp” and the message logged by the logger has missing values:
    INFO 2017-12-01 08:06:46,775 [[json-logger-test].HTTP_Listener_Configuration.worker.01] org.mule.modules.JsonLogger: {
    "applicationName" : "${json.logger.application.name}",
    "applicationVersion" : "${json.logger.application.version}",
    "environment" : "${mule.env}",
    "timestamp" : "#[new org.joda.time.DateTime().withZone(org.joda.time.DateTimeZone.forID(\"${json.logger.timezone}\")).toString(\"${json.logger.dateformat}\")]",
    "rootCorrelationId" : "319bc5b0-d666-11e7-875d-448500a76039",
    "correlationId" : "319bc5b0-d666-11e7-875d-448500a76039",
    "threadName" : "[json-logger-test].HTTP_Listener_Configuration.worker.01",
    "priority" : "INFO",
    "tracePoint" : "START",
    "message" : " {\n \"ana\": \"testing\"\n }"
    }

    Great article btw and such a custom logger component is needed!

    • Hi Ana,

      1. If by “you can’t” you mean you got an error window when trying to install from exchange, that is a known issue in Studio and we are currently working on a fix. Just trying again once or twice should fix the issue and install as usual.

      2. Currently the connector is a personal initiative but I guess depending on traction and usefulness we could move it to the much more open MuleSoft exchange

      3. This is a configuration that, on top of doing the regular “local logging,” also forwards the JSON message to an Anypoint MQ destination asynchronously. So basically, the extra info required is the client application credentials for a particular destination (Exchange or Queue) as well as the destination name itself (more info on getting that from Anypoint MQ here: https://docs.mulesoft.com/anypoint-mq/mq-client-apps). The next step would be to do something with those messages; whether that is pushing them to a DB or anything else, it’s totally up to you.

      4. For a working test application, I would just clone this repo: https://github.com/mulesoft-consulting/json-logger and test the “json-logger-test-app”. Just keep in mind that if you are using a Mavenized project (as you should :D), you will need to follow the Maven setup pre-requisites also explained in the README of the GitHub repo.

      Regarding your issue in particular: if you open the json-logger global configuration, you will see a bunch of predefined expressions. Some of those expressions depend on environment variables (defined by the ${} syntax) that you should pass to your Mule application either through the default mule-app.properties or an external property file. To help with that initial setup, in the properties folder (https://github.com/mulesoft-consulting/json-logger/tree/master/properties) I’ve provided a sample mule-app.properties with the basic properties required, like the timezone, date format, Mule environment, etc. You could also opt to define those in a more global way (e.g. wrapper.conf if it is an on-prem setup), as you probably want the exact same timezone and date format across applications.

      I also added a mule.dev.properties file. The reason the properties in it look “weird” (setting the value of a property to the value of another property) is that, in standard Maven development, the name and version of the application are defined in the pom file. Thus, I think it would be error-prone to hardcode the name and version of your application manually when they are already defined and managed in the pom file. With the setup described in the README (https://github.com/mulesoft-consulting/json-logger#access-maven-values-from-property-files), we are basically telling Maven to replace the values of property files inside the src/main/resources folder with those inferred from the pom file (e.g. project.artifactId would be the application name and project.version would be the application version).

      If I just confused you more, the easiest way to make it work would be defining the following properties in the mule-app.properties file: json.logger.application.name and json.logger.application.version
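      As a minimal sketch, a working mule-app.properties could look like this (the property names come from the expressions quoted above; the values are illustrative placeholders):

```properties
# Minimal mule-app.properties sketch; values are illustrative placeholders
json.logger.application.name=json-logger-test-app
json.logger.application.version=1.0.0
json.logger.timezone=US/Eastern
json.logger.dateformat=yyyy-MM-dd'T'HH:mm:ss.SSSZ
mule.env=dev
```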

      Thanks for reading and appreciate the interest 🙂

  2. This is an invaluable asset and I have already used this as part of my current engagement. IT JUST WORKS!!! It’s very easy to use and the best part is that it is very customizable based on one’s custom requirements. JSON is easily parsed by the log management solutions out there which makes this module a natural choice for logging mule events. Also, having a standardized log format is fundamental to any Integration. I am glad that a fundamental best practice has been codified in the form of a module that is easy to plug and play.

  3. The requirement is for ALL log messages coming out of Mule to have a data enriched JSON format; not just the Logger messages. So the change needs to take place in the Mule platform.

    • Hi Phil,

      I agree there should be a way to do this more easily across the platform and we are working toward that goal in Mule 4.

      However, this solution is focused on application logs. For runtime logs in general, you can always tinker with the log4j patterns to generate JSON output. Granted, you won’t have the exact same data available, nor a friendly way to do so, but it should still give you a standardized structure for logging everything.
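      As a rough sketch (assuming an on-prem Mule 3 runtime whose log4j2.xml you can edit; the file paths are placeholders), swapping the default PatternLayout for log4j2’s built-in JsonLayout could look like:

```xml
<!-- Sketch only: a JSON-emitting appender for log4j2.xml -->
<Appenders>
  <RollingFile name="jsonFile"
               fileName="${sys:mule.home}/logs/my-app.json"
               filePattern="${sys:mule.home}/logs/my-app-%i.json">
    <JsonLayout compact="true" eventEol="true"/>
    <SizeBasedTriggeringPolicy size="10 MB"/>
  </RollingFile>
</Appenders>
```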

  4. Wow, v1.1 is out.
    I was really waiting for the root correlation stuff.
    Let me try it out.

    • Hello Andres, great job bud. Out of curiosity, how come this project was made a module and not a connector? Is it because of the new Exchange module pattern for downloads? I would like to make it a simple connector but am unsure if we can add the DevKit annotation class to the module project. I work for a partner and my current client has Mule EE.

      • Hi Rene,

        This component is actually based on the DevKit 3.9.0 SDK (which basically makes it a connector). The “module” name is just a name, given that it doesn’t really connect to anything (to me it sounded weird to have a logger connector :P), but you can install it either directly from Exchange or as a standard update site in Studio (just like any other connector).



  5. Have been trying to use the module to create a custom connector for a client, but the JUnit test included has an error (void expected), and I also can’t figure out how to supply input parameters in a Mule Event to send into the JUnit test.
    Can you help with either of these?

  6. Nice job Andres, it seems really useful! 🙂
    I have one question related to our current use cases. We are consuming a third-party API and sometimes (but luckily not often :), we can receive a response with HTTP status 500 which is not JSON (a proxy or firewall issue, a third-party API which is down, etc.).
    So, in our team, we deactivate exception throwing when the HTTP status is >= 500. That way we are able to handle the response body, log it, and apply a specific process.
    Can you manage this use case? Would it be possible to enable a default logging behavior in case the input data is not in JSON format?

    Thanks in advance for your answer! 🙂

    • Hi David,

      This logger is meant to be a drop-in replacement for our OOTB logger component. Effectively what “you” (a developer) would populate is the “message” field (e.g. Response from API was: #[payload]). The contents of whatever you set in there will be escaped as a string inside the main JSON logger structure. The main benefit (aside from being an easily parseable JSON) is all the additional metadata (e.g. elapsed time, app name, version, timestamps, etc.) on each log entry. Unfortunately, related to your specific use case, it won’t affect the behavior of other components, meaning that if you let the HTTP endpoint throw the exception, you would still need to handle it in your exception strategy.

      Hope this makes sense,


  7. This looks like a great one. I tried my hands on it using a simple app, but it shows an exception while trying to run the app.

    schema_reference.4: Failed to read schema document 'http://www.mulesoft.org/schema/mule/json-logger/current/mule-json-logger.xsd', because 1) could not find the document; 2) the document could not be read; 3) the root element of the document is not .

  8. Hi Andres.
    Great work. We started to use this connector in our applications, and it works perfectly except for an issue in our CI/CD pipeline.
    We got your source code from GitHub and published it to our VSTS repository. However, we have noticed that the package created is pretty big and has all the referenced libraries in the lib folder. So when we try to deploy applications that use this connector to CloudHub using our CI/CD pipeline, it takes about 20 minutes to finish.
    Is there a way to exclude all those jars from the lib folder when creating the package?
    Thank you so much

    • Looks like something might not be packaged correctly. I took a look and the connector generated JAR should be <300 KB. Please feel free to reach out directly if you are still finding any issues.



  9. Hi Andres!
    Much needed emphasis on logging. Most developers save this for the end, but then don’t find time to get back to it.

    I am using apps which are built on mule runtimes 3.8.4. and 3.9.2. Is this compatible with these runtimes?

  10. Hi Andres,

    Could you please explain how to use the Debug option in the JSON logger? I have added a JSON logger with debug, and also changed the logging level to debug in the runtime, but it is not working.

    Prabhu S

  11. Hi Andres!
    I’m trying to remove classic Loggers from some Mule 3 application and leave only JSON Loggers. Could you help me to figure out which specific AsyncLogger is responsible for your JSON Logger and should be placed in log4j2.xml?

    Best Regards

  12. Unable to get the GitHub link for Mule 3.x. Can you please provide the same, as we are working on 3.x?