It turns out our redirect from http:// to https:// overrides the Alias in the
http:// vhost: mod_alias applies Redirect directives before Alias directives,
regardless of where they appear in the config.

Two new +1's to move this over to the https:// vhost, please.



diff --git a/configs/web/fedorahosted.org.conf b/configs/web/fedorahosted.org.conf
index 8f1f2e1..298d9e2 100644
--- a/configs/web/fedorahosted.org.conf
+++ b/configs/web/fedorahosted.org.conf
@@ -2,9 +2,6 @@
     ServerName fedorahosted.org
     ServerAlias www.fedorahosted.org
 
-    # Make robots.txt be used.
-    Alias /robots.txt /srv/web/fedorahosted.org/robots.txt
-
     Redirect 301 / https://fedorahosted.org/
 </VirtualHost>
 
@@ -18,6 +15,9 @@ Listen 443
         RemoveEncoding .gz
     </files>
 
+    # Make robots.txt be used. For real this time.
+    Alias /robots.txt /srv/web/fedorahosted.org/robots.txt
+
     SSLEngine on
     SSLCertificateFile    /etc/httpd/conf.d/fedorahosted.org/fedorahosted.org.cert
     SSLCertificateKeyFile /etc/httpd/conf.d/fedorahosted.org/fedorahosted.org.key
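
If it helps, a quick sanity check once this is pushed (same hostname as in
the config above):

    curl -I https://fedorahosted.org/robots.txt   # should come back 200 with the file
    curl -I http://fedorahosted.org/robots.txt    # should still be a 301 to the https:// URL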



On Tue, May 10, 2011 at 5:13 PM, Toshio Kuratomi <a.badger@gmail.com> wrote:
On Tue, May 10, 2011 at 03:45:15PM -0400, Ricky Elrod wrote:
> Hm... maybe not. Is that something we want to look into beforehand? I'm not
> sure what kind of options exist for that, as I don't use svn that often (or at
> all).
>
I'm +1 even without a separate, indexed-by-google web viewer for svn.

svn can be configured to be browsable over http with just svn itself; not
sure if there's a reason we aren't (apparently) doing that.
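
Roughly, the stock setup for that (mod_dav_svn; paths below are just
illustrative, not necessarily what hosted1 would use) looks something like:

    LoadModule dav_svn_module modules/mod_dav_svn.so

    <Location /svn>
        DAV svn
        # serve every repository under this directory and list them at /svn/
        SVNParentPath /srv/svn
        SVNListParentPath on
    </Location>

That gives a bare-bones browsable view of each repo over http without a
separate viewer.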

-Toshio

> On Tue, May 10, 2011 at 3:01 PM, Toshio Kuratomi <a.badger@gmail.com> wrote:
>
>     On Tue, May 10, 2011 at 02:02:11PM -0400, Ricky Elrod wrote:
>     > Wanted to do this before freeze, but never had a chance -- let's get
>     > robots.txt working on hosted1 and try to bring down cpu load and
>     > improve page load times a bit.
>     >
>     > Thoughts/+1's please. robots.txt is already there and has been for a
>     > long time, but nothing has told apache to use it, because apache
>     > requests go straight to trac.
>     >
>     > robots.txt is set to block crawlers from accessing fh.o/*/browser/*
>     > (which is the trac source code browser) -- as per
>     > https://fedorahosted.org/fedora-infrastructure/ticket/1848
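>     > i.e. something along these lines (just a sketch -- the live file on
>     > hosted1 is the one that counts):
>     >
>     >     # wildcard Disallow isn't in the original robots.txt spec, but the
>     >     # major crawlers honor it
>     >     User-agent: *
>     >     Disallow: /*/browser/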
>     >
>     +1
>
>     The web interfaces at git.fedorahosted.org/git/, bzr.fp.o/bzr, and
>     hg.fp.o/hg take care of this I think.  The one question I have in
>     regards to this is do we have an svn web viewer?
>
>     -Toshio
>     >
>     >
>     > diff --git a/configs/web/fedorahosted.org.conf b/configs/web/fedorahosted.org.conf
>     > index d0f7139..8f1f2e1 100644
>     > --- a/configs/web/fedorahosted.org.conf
>     > +++ b/configs/web/fedorahosted.org.conf
>     > @@ -2,6 +2,9 @@
>     >      ServerName fedorahosted.org
>     >      ServerAlias www.fedorahosted.org
>     >
>     > +    # Make robots.txt be used.
>     > +    Alias /robots.txt /srv/web/fedorahosted.org/robots.txt
>     > +
>     >      Redirect 301 / https://fedorahosted.org/
>     >  </VirtualHost>
>     >


_______________________________________________
infrastructure mailing list
infrastructure@lists.fedoraproject.org
https://admin.fedoraproject.org/mailman/listinfo/infrastructure