robots.txt

django serving robots.txt efficiently

Question: Here is my current method of serving robots.txt:

url(r'^robots.txt/$', TemplateView.as_view(template_name='robots.txt', content_type='text/plain')),

I don't think that this is the best way. I think it would be better if it were just a pure static resource and served statically. But the way my Django app is structured is that the static root …

Total answers: 2

Robotparser doesn't seem to parse correctly

Question: I am writing a crawler, and for this I am implementing the robots.txt parsing using the standard-library robotparser module. It seems that robotparser is not parsing correctly; I am debugging my crawler using Google's robots.txt. (The following examples are from IPython.)

In [1]: import robotparser
In [2]: …

Total answers: 5
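A quick way to check the parser's behaviour in isolation is to feed it rules from a string instead of fetching a live robots.txt. The rules below are made up for illustration; note that Python's parser applies the first matching rule line, so Allow lines must come before the broader Disallow they carve out.

```python
# Minimal sketch using urllib.robotparser (the Python 3 name for the old
# robotparser module), parsing rules from a string rather than a URL.
from urllib import robotparser

rules = """\
User-agent: *
Allow: /search/about
Disallow: /search
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())  # parse() accepts an iterable of lines

print(rp.can_fetch("*", "https://example.com/search"))        # False
print(rp.can_fetch("*", "https://example.com/search/about"))  # True
```

Debugging against a controlled rule set like this makes it easier to tell whether the surprise comes from the parser or from the site's actual robots.txt.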

Static files in Flask – robot.txt, sitemap.xml (mod_wsgi)

Question: Is there any clever solution to store static files in Flask's application root directory? robots.txt and sitemap.xml are expected to be found in /, so my idea was to create routes for them:

@app.route('/sitemap.xml', methods=['GET'])
def sitemap():
    response = make_response(open('sitemap.xml').read())
    response.headers["Content-Type"] = "text/plain"
    return response

There …

Total answers: 10
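A common pattern for this is a single view that serves both files out of Flask's static folder with send_from_directory, which handles headers and 404s for you. This is a hedged sketch of that approach, not the asker's code.

```python
# Sketch: one view serving both robots.txt and sitemap.xml from Flask's
# static folder, rather than opening files by hand in each route.
from flask import Flask, request, send_from_directory

app = Flask(__name__)

@app.route("/robots.txt")
@app.route("/sitemap.xml")
def static_from_root():
    # request.path is "/robots.txt" or "/sitemap.xml"; drop the leading
    # slash and let send_from_directory locate the file safely.
    return send_from_directory(app.static_folder, request.path[1:])
```

Under mod_wsgi, an Alias directive pointing those two URLs at the files on disk also works and avoids hitting the application at all.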