How to set the logging level for the elasticsearch library differently to my own logging?


How can I set the logging level for the elasticsearch library differently to my own logging? To illustrate the issue, here is the module scenario. I have a module which uses elasticsearch like this:

import logging
logger = logging.getLogger(__name__)
import elasticsearch

def get_docs():
    logger.debug("search elastic")
    es = elasticsearch.Elasticsearch('http://my-es-server:9200/')
    res ='myindex', body='myquery')
    logger.debug("elastic returns %s hits", res['hits']['total'])

Then in my main file I do

import logging
logging.basicConfig(level=logging.DEBUG)

I get lots of debug messages from inside the Elasticsearch object. How can I suppress them with some code in the main file without suppressing the debug messages from my own module? The Elasticsearch class seems to have a logger object; I tried setting it to None, but this didn't change anything.

Asked By: halloleo



The following two lines have done the trick for me to suppress excessive logging from the es library.

es_logger = logging.getLogger('elasticsearch')
es_logger.setLevel(logging.WARNING)

Answered By: Thom

I have been using this:

import logging
from elasticsearch import logger as es_logger

es_logger.setLevel(logging.WARNING)

Answered By: A. P.

As a supplementary answer, if you have several loggers in your project, you can do it this way:

import logging
import elasticsearch

es_logger = elasticsearch.logger
es_logger.setLevel(logging.WARNING)

The pros:

  • You have full control and can make sure that the elasticsearch logger won’t collide with other loggers
  • You may set WARNING/ERROR or DEBUG levels without having to remember what the numeric values mean

The cons:

  • You are dependent on the internal logger implementation of elasticsearch
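To mitigate that con, a name-based alternative that avoids touching the library's internals is to configure all loggers centrally with logging.config.dictConfig. A minimal sketch (the 'urllib3' entry is an illustrative assumption for another chatty dependency, not taken from the question):

```python
import logging.config

# Configure every logger by name in one place; library loggers
# created later under these names inherit the configured levels.
logging.config.dictConfig({
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'console': {'class': 'logging.StreamHandler'},
    },
    'root': {'level': 'DEBUG', 'handlers': ['console']},
    'loggers': {
        'elasticsearch': {'level': 'WARNING'},
        'urllib3': {'level': 'WARNING'},  # hypothetical: also noisy at DEBUG
    },
})
```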
Answered By: Cookie

In recent (v8.5) versions of elasticsearch, the chatty POST log seems to have moved to elastic_transport:

import logging
logging.getLogger('elastic_transport.transport').setLevel(logging.WARNING)

Answered By: Rob Flickenger