MetricsLog

Send metrics as structured logs


Keywords
elasticsearch, kibana, logging, logs, logstash, metrics, rsyslog, syslog
License
BSD-3-Clause
Install
pip install MetricsLog==0.1.3

Documentation

Metrics collection and submission as structured logs, plus everything you need to set up structured logging for regular application logs.

Examples

  • examples/logstash – send structured logs directly into Logstash via the UDP or TCP protocol for further processing with dozens of Logstash plugins, among them the elasticsearch output plugin for storing logs in Elasticsearch and analysing them with Kibana (a minimal sketch of the UDP approach follows this list);
  • examples/syslog – send structured logs using rsyslog and the CEE format for fast and reliable log collection and processing. With rsyslog you can also store logs in Elasticsearch and analyse them with Kibana via the omelasticsearch output module;
  • examples/console – stream colored structured logs to stderr; Python tracebacks are also highlighted using the Pygments library (if installed).
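
To give an idea of what "structured logs over UDP" means in the Logstash case, here is a minimal standard-library sketch that ships a JSON event by hand. It does not use metricslog's own handlers; the port number and the assumption of a Logstash udp input with the json codec are illustrative only.

# A minimal sketch of shipping one structured log event to Logstash over UDP.
# Assumes a Logstash "udp" input with the "json" codec listening on port 5959;
# this bypasses metricslog and its handlers entirely.
import json
import socket
from datetime import datetime, timezone

def send_event(message, **fields):
    event = {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "@version": "1",
        "message": message,
        **fields,
    }
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(json.dumps(event).encode("utf-8"), ("localhost", 5959))
    finally:
        sock.close()

send_event("user logged in", logger="some.app.module.name", user=123)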

Issues with sending logs into Elasticsearch

Field types in Elasticsearch are static. Fields can be created dynamically, but their type can't be changed afterwards. This means that once you have indexed the document {"foo": 1} into Elasticsearch, you won't be able to index the document {"foo": "bar"}, because the foo field was created with the long type, which is incompatible with the string type.
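
The conflict is easy to reproduce against a local Elasticsearch instance; the index and type names and the use of the requests library below are just for illustration.

# Reproducing the field type conflict (localhost:9200 and the index/type
# names are assumptions for this sketch).
import requests

base = "http://localhost:9200/test-index/doc"

# The first document creates the "foo" field with the "long" type.
print(requests.post(base, json={"foo": 1}).status_code)      # 201

# The second document tries to put a string into the same field and is
# rejected with a mapper_parsing_exception.
print(requests.post(base, json={"foo": "bar"}).status_code)  # 400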

To overcome this issue, you can use a sophisticated mapping template for Elasticsearch:

  • examples/es2-template.yaml - mapping template for Elasticsearch 2.x
  • examples/es5-template.yaml - mapping template for Elasticsearch 5.x

These templates treat all unknown fields as the keyword type, which accepts both strings and numbers and can be used for filtering. Each such field is actually a multi-field with key, num and time subfields, which can be used for aggregation and sorting when the indexed value looks like a number or a timestamp.
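
The shipped templates are more elaborate than this, but the core idea is a dynamic template that maps unknown string fields to a keyword multi-field. The sketch below uses Elasticsearch 5.x syntax; the index pattern, template name and exact subfield layout are assumptions and differ from the real examples/es*-template.yaml files.

# A simplified illustration of the dynamic-template idea (Elasticsearch 5.x);
# see examples/es5-template.yaml for the real, more complete template.
import requests

template = {
    "template": "logstash-*",              # index pattern is an assumption
    "mappings": {
        "_default_": {
            "dynamic_templates": [
                {
                    "strings_as_keyword": {
                        "match_mapping_type": "string",
                        "mapping": {
                            "type": "keyword",
                            "fields": {
                                # subfields used for aggregation/sorting when
                                # the value looks numeric or date-like
                                "num": {"type": "double", "ignore_malformed": True},
                                "time": {"type": "date", "ignore_malformed": True},
                            },
                        },
                    }
                }
            ]
        }
    },
}

requests.put("http://localhost:9200/_template/structured-logs", json=template)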

So, for example, after indexing the document {"foo": "123"}, we can search this field with the foo:123 query and aggregate it as a numeric field using the foo.num subfield.
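
Continuing the {"foo": "123"} example, a search plus aggregation request might look like the sketch below (the index pattern and the use of a term query in place of Kibana's foo:123 query-string syntax are assumptions).

# Filtering on the keyword field and aggregating on its numeric subfield.
# Assumes documents like {"foo": "123"} were indexed into "logstash-*"
# indices covered by a template like the one above.
import requests

search = {
    "query": {"term": {"foo": "123"}},                    # exact keyword match
    "aggs": {"foo_avg": {"avg": {"field": "foo.num"}}},   # numeric subfield
}

resp = requests.post("http://localhost:9200/logstash-*/_search", json=search)
print(resp.json()["aggregations"]["foo_avg"]["value"])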

Note: there is one more limitation in Elasticsearch which can't be fixed by such a mapping template. Elasticsearch has an explicit object type alongside the scalar types, and they all share the same namespace in the field mappings. This means that if you indexed an object into the field foo, you won't be able to index any other data type into this field, only objects. So be very careful with what you're indexing into Elasticsearch: use objects only as namespaces with unique, self-describing names and prefer flat structures with scalar types over deeply nested objects.
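
A short illustration of this object-vs-scalar conflict (index and field names are again only illustrative):

# Once a field is mapped as an object, scalar values for it are rejected.
import requests

base = "http://localhost:9200/test-index/doc"

requests.post(base, json={"ctx": {"request_id": "abc"}})  # "ctx" becomes an object field
requests.post(base, json={"ctx": "abc"})                  # rejected: mapper_parsing_exception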

Installation

pip install metricslog

Example

Regular structured logging for application logs is possible using the extra keyword argument:

import logging

log = logging.getLogger('some.app.module.name')

log.info('As you can see, %s and %s logs', 'structured', 'formatted',
         extra={'user': 123})

The logged message will look like this in the console:

2017-01-01 00:00:00,000 INFO some.app.module.name As you can see, structured and formatted logs user=123

and like this when sent to syslog (formatted for readability):

<14>app-name: @cee: {"@timestamp": "2015-10-26T09:00:00.000Z",
                     "@version": "1",
                     "message": "As you can see, structured and formatted logs",
                     "logger": "some.app.module.name",
                     "user": 123}