
Rationale

Log messages are useful for human consumption, but they are not easily machine-parseable for further analysis. It is machine parsing of log event data that makes logs many times more valuable, especially in this age of information overload. Consider a log message and the same information as a structured log event:

"Order placed vide AXO/41294 with AcmeCorp for part number F-39942, quantity 5 pairs"

;; versus

{:event        "order.placed"
 :order-id     "AXO/41294"
 :customer     "AcmeCorp"
 :part-number  "F-39942"
 :quantity     5
 :denomination "pair"}

Data is a first-class concept in Clojure; our logs should be no different. Structured logging means treating logs as first-class data.

What is Cambium?

Cambium is a collection of libraries that help you log events as structured data. Cambium primarily wraps SLF4J and extends clojure/tools.logging to provide a data-oriented logging API. SLF4J needs a backing implementation for its API, so Cambium also provides implementation hooks for the data orientation.
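For instance, the order event from the Rationale section could be logged through the data-oriented API. This is a minimal sketch, assuming cambium.core and a codec-backed SLF4J setup are on the classpath; the log macros follow clojure/tools.logging conventions, with an optional leading map of attributes:

```clojure
(require '[cambium.core :as log])

;; Log a human-readable message together with structured attributes.
;; The attribute map is conveyed to the backend for this log event.
(log/info {:event        "order.placed"
           :order-id     "AXO/41294"
           :customer     "AcmeCorp"
           :part-number  "F-39942"
           :quantity     5
           :denomination "pair"}
          "Order placed")
```

With a JSON codec and a suitably configured backend, the attributes appear as fields in the emitted log event rather than being interpolated into the message string.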

You can control the degree of flexibility you need through Cambium's modularity. It builds only a little over the API exposed by clojure/tools.logging, so it should feel familiar to most people and be easy to adopt. Combined with the ubiquity of SLF4J on the JVM, you can tap into the entire Java library ecosystem without worry. You may choose any SLF4J backing implementation, as long as it is adapted to use Cambium's data orientation.

How it works

At minimum, you need a Cambium codec and the cambium.core module to start logging. You also need an SLF4J backend configured to use the chosen Cambium codec for the logs to appear at some destination. Cambium uses the SLF4J Mapped Diagnostic Context (MDC) to communicate the data attributes to the logging backend. Hence, the logging backend must support SLF4J MDC and should be configured to use the codec. The Cambium codec is also used to overcome the lack of built-in support for nested data in SLF4J MDC.
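As an illustration, a Leiningen dependency vector for this minimum setup might look like the sketch below. The artifact coordinates follow the cambium/* naming used by the project, but the versions are placeholders; check Clojars for the modules and versions that fit your needs:

```clojure
;; Sketch of a Leiningen :dependencies entry (versions are placeholders).
[[cambium/cambium.core         "x.y.z"]  ; data-oriented logging API
 [cambium/cambium.codec-simple "x.y.z"]  ; a Cambium codec
 [cambium/cambium.logback.core "x.y.z"]] ; Logback backend integration
```

The codec and backend modules are interchangeable: you could swap in a JSON codec and the matching Logback module to emit structured JSON logs.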

MDC is mutable and thread-local!

The SLF4J MDC is a mutable, thread-local map of string keys to string values. Cambium wraps the MDC such that the required mutation is carried out only for as long as needed, after which the original value is automatically restored. Since this temporary mutation is thread-local, it does not affect other concurrent code accessing the MDC.
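The mutate-and-restore behaviour can be seen with cambium.core/with-logging-context. This is a sketch; :request-id is an assumed attribute name:

```clojure
(require '[cambium.core :as log])

(log/with-logging-context {:request-id "r-1234"}
  ;; Within this scope the MDC carries :request-id,
  ;; so it is attached to every log event emitted here.
  (log/info "processing request"))
;; On leaving the scope, the previous MDC state is restored,
;; even if the body throws.
```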

Thread-locality also prevents the MDC from propagating to concurrently running code. For example, how do you carry the logging context (MDC) into (future (code-needing-logging-context))? Such carrying forward of context must be managed explicitly, and Cambium provides support for doing so.
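One explicit way to carry context across threads is to capture the MDC map before spawning the task and restore it inside. The sketch below uses SLF4J's own MDC API (getCopyOfContextMap and setContextMap are standard SLF4J calls) to show the underlying mechanism; :request-id is an assumed attribute name:

```clojure
(require '[cambium.core :as log])
(import '[org.slf4j MDC])

(log/with-logging-context {:request-id "r-1234"}
  (let [ctx (MDC/getCopyOfContextMap)]     ; capture the caller's MDC
    (future
      (when ctx (MDC/setContextMap ctx))   ; restore it on the worker thread
      (try
        (log/info "working in the background") ; sees :request-id
        (finally
          (MDC/clear))))))                 ; avoid leaking into pooled threads
```

Clearing the MDC in a finally block matters when the concurrent code runs on pooled threads, which would otherwise retain stale context.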

Backend integration

Cambium comes with Logback backend modules for SLF4J, named cambium/cambium.logback.*. Cambium does not itself configure or initialize Logback, though it includes all the dependencies required to do so. Logback configuration details can be found in the Logback manual.
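A Logback configuration then wires an appender to a Cambium-aware layout. The fragment below is a hypothetical logback.xml sketch; the layout and formatter class names are assumptions to be verified against the cambium.logback.* module you pick and the Logback manual:

```xml
<!-- Hypothetical logback.xml sketch; verify class names against
     the chosen cambium.logback.* module. -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
      <layout class="cambium.logback.json.FlatJsonLayout">
        <!-- renders the message and MDC attributes as a JSON event -->
        <jsonFormatter class="ch.qos.logback.contrib.jackson.JacksonJsonFormatter"/>
      </layout>
    </encoder>
  </appender>
  <root level="info">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```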
