How To Turn Off Logging in Scrapy (Python)
Question:
I have created a spider using Scrapy but I cannot figure out how to turn off the default logging. From the documentation it appears that I should be able to turn it off by doing
logging.basicConfig(level=logging.ERROR)
But this has no effect. From looking at the code for logging.basicConfig() I’m guessing this is because “the root logger has handlers configured” but perhaps I’m wrong about that. At any rate, can anyone explain what I need to do to get Scrapy to not output the usual
2015-10-18 17:42:00 [scrapy] INFO: Scrapy 1.0.3 started (bot: EF)
2015-10-18 17:42:00 [scrapy] INFO: Optional features available: ssl, http11, boto
etc.?
EDIT: As suggested by sirfz below, the line
logging.getLogger('scrapy').setLevel(logging.WARNING)
can be used to set the logging level. However, it appears that you must do this in the __init__ method (or later) in your spider.
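To illustrate why the ordering matters: Scrapy appears to (re)configure its loggers during startup, so a level set beforehand can simply be overwritten. A stdlib-only sketch of the effect ('scrapy' is just a logger name here; no Scrapy import is involved):

```python
import logging

scrapy_logger = logging.getLogger('scrapy')

scrapy_logger.setLevel(logging.WARNING)  # set too early...
scrapy_logger.setLevel(logging.INFO)     # ...startup code resets it to its default

# Setting the level again afterwards (e.g. in the spider's __init__)
# is what finally sticks:
scrapy_logger.setLevel(logging.WARNING)
print(scrapy_logger.isEnabledFor(logging.INFO))  # False: INFO is now filtered
```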
Answers:
You could add -s LOG_ENABLED=False as a parameter when launching your script. That should do the trick.
Note: for version 1.1 this changed a little bit: -s LOG_ENABLED=0
You can simply change the logging level for scrapy (or any other logger):
logging.getLogger('scrapy').setLevel(logging.WARNING)
This disables all log messages below the WARNING level.
To disable all scrapy log messages you can just set propagate to False:
logging.getLogger('scrapy').propagate = False
This prevents scrapy's log messages from propagating to the root logger (which prints to the console when configured using basicConfig()).
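A self-contained sketch of the propagate behaviour, using only the stdlib (the 'scrapy' name is just a logger name here; the capturing handler stands in for the console handler that basicConfig() installs):

```python
import logging

logging.basicConfig(level=logging.INFO)  # configures the root logger

captured = []

class Capture(logging.Handler):
    """Stands in for the root console handler so we can inspect output."""
    def emit(self, record):
        captured.append(record.getMessage())

logging.getLogger().addHandler(Capture())

scrapy_logger = logging.getLogger('scrapy')
scrapy_logger.setLevel(logging.INFO)

scrapy_logger.info('before')       # propagates up to the root handlers
scrapy_logger.propagate = False
scrapy_logger.info('after')        # no longer reaches the root logger
```

After this runs, captured holds only 'before': once propagate is False, records logged on the scrapy logger never reach the root logger's handlers.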
From the Python docs for logging.basicConfig(**kwargs):
"This function does nothing if the root logger already has handlers configured for it."
Scrapy has handlers configured for it, so this will not work.
You can simply add --nolog as a parameter when launching your spider using the scrapy command. I am using scrapy v1.7.3.
You can see more in the help using the command:
scrapy --help
This might help: as mentioned in the documentation, you can add this to your settings file:
LOG_LEVEL = 'WARNING'
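For reference, a minimal sketch of what that looks like in the project's settings.py (BOT_NAME here is hypothetical; LOG_LEVEL and LOG_ENABLED are documented Scrapy settings):

```python
# settings.py -- Scrapy project settings (hypothetical minimal example)
BOT_NAME = 'EF'

LOG_LEVEL = 'WARNING'    # suppress DEBUG and INFO messages
# LOG_ENABLED = False    # or turn Scrapy's logging off entirely
```

The same keys can also be set per spider via the custom_settings class attribute on the spider class.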