How to set the logging level for the elasticsearch library differently from my own logging?
Question:
How can I set the logging level for the elasticsearch library differently from my own logging? To illustrate the issue, here is the module scenario. I have a module lookup.py which uses elasticsearch like this:
import logging
logger = logging.getLogger(__name__)
import elasticsearch

def get_docs():
    logger.debug("search elastic")
    es = elasticsearch.Elasticsearch('http://my-es-server:9200/')
    res = es.search(index='myindex', body='myquery')
    logger.debug("elastic returns %s hits", res['hits']['total'])
    ...
Then in my main file I do
import logging
import lookup

logging.root.setLevel(loglevel(args))
lookup.get_docs()
...
I get lots of debug messages from inside the Elasticsearch object. How can I suppress them with some code in lookup.py, without suppressing the debug messages of lookup.py itself? The Elasticsearch class seems to have a logger object; I tried to set it to None, but this didn't change anything.
Answers:
The following two lines did the trick for me to suppress the excessive logging from the elasticsearch library.
import logging

# Raise the level of the library's named logger so only WARNING and above get through.
es_logger = logging.getLogger('elasticsearch')
es_logger.setLevel(logging.WARNING)
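For context, a minimal sketch of how this can look inside lookup.py, assuming the library logger is named 'elasticsearch' as above; the module's own logger is untouched and still honors whatever level the main file sets on the root logger:

import logging

logger = logging.getLogger(__name__)  # lookup.py's own logger, still emits DEBUG

# Pin only the library's logger; no other logger is affected.
logging.getLogger('elasticsearch').setLevel(logging.WARNING)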
I have been using this:
from elasticsearch import logger as es_logger

LOGLEVEL = 50  # 50 == logging.CRITICAL, so effectively only CRITICAL records pass
es_logger.setLevel(LOGLEVEL)
As a supplementary answer, if you have several loggers in your project, you can do it this way:
import elasticsearch

es_logger = elasticsearch.logger
# elasticsearch.logging resolves because the elasticsearch module itself imports
# the stdlib logging module; WARNING here is the ordinary logging.WARNING.
es_logger.setLevel(elasticsearch.logging.WARNING)
The pros:
- You have full control and can make sure that the elasticsearch logger won't collide with other loggers
- You can set WARNING/ERROR or DEBUG levels without remembering what each numeric level means
The cons:
- You depend on the internal logger implementation of the elasticsearch library
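If you would rather not rely on the library's internals at all, you can inspect which logger names get registered once the library is imported; a minimal sketch using the stdlib logging registry (the exact names depend on the installed version):

import logging
import elasticsearch  # importing the library registers its loggers

# Every known logger lives in the manager's registry, keyed by dotted name.
for name in sorted(logging.root.manager.loggerDict):
    print(name)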
In recent versions of elasticsearch (v8.5), the chatty POST log seems to have moved to the elastic_transport package:
logging.getLogger('elastic_transport.transport').setLevel(logging.CRITICAL)
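So with a v8 client you may need to quiet both packages; a sketch, assuming the logger names 'elasticsearch' and 'elastic_transport.transport':

import logging

# Raise both the client logger and the v8 transport logger in one go.
for name in ('elasticsearch', 'elastic_transport.transport'):
    logging.getLogger(name).setLevel(logging.WARNING)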