What is the smartest way to handle robots.txt in Express?
Question
I'm currently working on an application built with Express (Node.js) and I want to know what is the smartest way to handle different robots.txt for different environments (development, production).
This is what I have right now but I'm not convinced by the solution, I think it is dirty:
app.get '/robots.txt', (req, res) ->
  res.set 'Content-Type', 'text/plain'
  if app.settings.env == 'production'
    res.send 'User-agent: *\nDisallow: /signin\nDisallow: /signup\nDisallow: /signout\nSitemap: /sitemap.xml'
  else
    res.send 'User-agent: *\nDisallow: /'
(NB: it is CoffeeScript)
There should be a better way. How would you do it?
Thanks.
Answer
Use a middleware function. This way the robots.txt will be handled before any session, cookieParser, etc:
app.use(function (req, res, next) {
  if ('/robots.txt' == req.url) {
    res.type('text/plain');
    res.send("User-agent: *\nDisallow: /");
  } else {
    next();
  }
});
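To cover the environment part of the original question, the same middleware can pick its body once at startup instead of branching on every request. A minimal sketch, assuming the body strings from the question; `robotsFor` is a hypothetical helper, not part of Express:

```javascript
// Hypothetical helper: map an environment name to a robots.txt body.
// The two bodies are taken from the question's own code.
function robotsFor(env) {
  return env === 'production'
    ? 'User-agent: *\nDisallow: /signin\nDisallow: /signup\nDisallow: /signout\nSitemap: /sitemap.xml'
    : 'User-agent: *\nDisallow: /';
}

// Compute the body once at startup, then reuse it in the middleware:
// var body = robotsFor(app.settings.env);
// app.use(function (req, res, next) {
//   if (req.url === '/robots.txt') {
//     res.type('text/plain');
//     res.send(body);
//   } else {
//     next();
//   }
// });
```

This keeps the per-request path to a single string send, since the environment cannot change while the process is running.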
With Express 4, app.get now gets handled in the order it appears, so you can just use that:
app.get('/robots.txt', function (req, res) {
  res.type('text/plain');
  res.send("User-agent: *\nDisallow: /");
});
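Because the Express 4 handler is just a function of (req, res), it can be sanity-checked without starting a server by passing stub objects. A sketch, where the stub `res` is an assumption for illustration, not the Express API:

```javascript
// The handler from the answer, extracted into a named function so it can
// be exercised directly.
function robotsHandler(req, res) {
  res.type('text/plain');
  res.send('User-agent: *\nDisallow: /');
}

// Stub res that records what was set and sent (an assumption for
// illustration; real Express response objects do much more).
var sent = null;
var contentType = null;
var res = {
  type: function (t) { contentType = t; return this; },
  send: function (body) { sent = body; }
};
robotsHandler({ url: '/robots.txt' }, res);
// contentType is now 'text/plain' and sent holds the robots.txt body
```

The same stub technique works for the middleware version, by also passing a no-op `next` function.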