robots.txt: disallow access to snapshots
My dmesg is filled with the oom killer bringing down processes while the
Bingbot downloads every snapshot for every commit of the Linux kernel in
tar.xz format. Sure, I should be running with memory limits, and now I'm
using cgroups, but a more general solution is to prevent crawlers from
wasting resources like that in the first place.

Suggested-by: Natanael Copa <ncopa@alpinelinux.org>
Suggested-by: Julius Plenz <plenz@cis.fu-berlin.de>
Signed-off-by: Jason A. Donenfeld <Jason@zx2c4.com>
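As context for the cgroups remark above, here is a minimal sketch (not part of
this commit; it assumes a cgroup v2 hierarchy mounted at /sys/fs/cgroup, root
privileges, and a hypothetical group name "cgit") of bounding the memory of the
processes serving snapshots, so that a crawler burst is OOM-killed inside the
group rather than taking down unrelated processes:

    import os
    from pathlib import Path

    # Hypothetical cgroup for the CGI workers (cgroup v2 assumed; the
    # group name and the limits are illustrative, not from the commit).
    CGROUP = Path("/sys/fs/cgroup/cgit")

    CGROUP.mkdir(exist_ok=True)
    (CGROUP / "memory.high").write_text("384M\n")  # start reclaim/throttling here
    (CGROUP / "memory.max").write_text("512M\n")   # hard cap: OOM kills stay in the group

    def confine(pid: int) -> None:
        """Move an already-running process into the group via cgroup.procs."""
        (CGROUP / "cgroup.procs").write_text(f"{pid}\n")

    confine(os.getpid())  # e.g. call early in a worker, before handling requests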
Parent: 830eb6f6ff
Commit: 23debef621
Makefile | 1 +
@@ -78,6 +78,7 @@ install: all
 	$(INSTALL) -m 0644 cgit.css $(DESTDIR)$(CGIT_DATA_PATH)/cgit.css
 	$(INSTALL) -m 0644 cgit.png $(DESTDIR)$(CGIT_DATA_PATH)/cgit.png
 	$(INSTALL) -m 0644 favicon.ico $(DESTDIR)$(CGIT_DATA_PATH)/favicon.ico
+	$(INSTALL) -m 0644 robots.txt $(DESTDIR)$(CGIT_DATA_PATH)/robots.txt
 	$(INSTALL) -m 0755 -d $(DESTDIR)$(filterdir)
 	$(COPYTREE) filters/* $(DESTDIR)$(filterdir)
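One operational note (an assumption about deployment, not stated in the
commit): crawlers only fetch robots.txt from the root of a host, so installing
the file into $(CGIT_DATA_PATH) next to cgit.css merely stages it; the web
server still has to be configured to serve it at the /robots.txt URL.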
robots.txt | 3 +++ (new regular file)
@@ -0,0 +1,3 @@
+User-agent: *
+Disallow: /*/snapshot/*
+Allow: /
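The Disallow pattern relies on the "*" wildcard extension honored by the major
crawlers (including Bingbot); in the original robots.txt specification a rule
is a plain path prefix. A minimal sketch of that matching semantics, with
hypothetical cgit paths (the repository name and tarball are made up for
illustration):

    import re

    def rule_matches(pattern: str, path: str) -> bool:
        """Match a robots.txt path pattern against a URL path, treating '*'
        as any run of characters and '$' as an end-of-path anchor, per the
        common wildcard extension."""
        regex = re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
        return re.match(regex, path) is not None

    # Snapshot tarballs of any repository are blocked...
    print(rule_matches("/*/snapshot/*", "/linux.git/snapshot/linux-3.8.tar.xz"))  # True
    # ...while ordinary pages remain crawlable under 'Allow: /'.
    print(rule_matches("/*/snapshot/*", "/linux.git/log/"))  # False

A crawler that does not implement the extension treats "/*/snapshot/*" as a
literal prefix and matches nothing, which fails open: such a crawler would
still fetch snapshots.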