[gimp-web] Adding robots.txt to prevent crawling errors
- From: Michael Schumacher <schumaml src gnome org>
- To: commits-list gnome org
- Cc:
- Subject: [gimp-web] Adding robots.txt to prevent crawling errors
- Date: Mon, 29 Apr 2013 18:45:19 +0000 (UTC)
commit 0ecfa0c27e024c088c9564c816740f3ca29fd9fc
Author: Michael Schumacher <schumaml gmx de>
Date: Mon Apr 29 20:44:40 2013 +0200
Adding robots.txt to prevent crawling errors
robots.txt | 2 ++
1 file changed, 2 insertions(+), 0 deletions(-)
---
diff --git a/robots.txt b/robots.txt
new file mode 100644
index 0000000..084cd33
--- /dev/null
+++ b/robots.txt
@@ -0,0 +1,2 @@
+User-agent: *
+Disallow: /admin/
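
For reference, these two rules ask every crawler to stay out of /admin/ while leaving the rest of the site crawlable. A minimal sketch using Python's standard urllib.robotparser shows how a well-behaved crawler would interpret the file (the gimp.org URLs are illustrative examples, not paths confirmed by this commit):

    from urllib.robotparser import RobotFileParser

    # Feed the two lines added by this commit to the parser.
    rp = RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /admin/",
    ])

    # Anything under /admin/ is disallowed for all user agents...
    print(rp.can_fetch("*", "https://www.gimp.org/admin/"))       # False
    # ...while the rest of the site remains crawlable.
    print(rp.can_fetch("*", "https://www.gimp.org/news/"))        # True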