[conspire] (forw) Re: Your web-site Causes Firefox to Download the Pages Instead of Viewing Them
nick at zork.net
Wed Aug 25 01:26:59 PDT 2010
> It's a rather subtle and unexpected bit of breakage that occurred
> silently in consequence of the upgrade from Apache httpd version 2.2.9
> to 2.2.16 on Debian.
Yowch! Was this the result of accepting an upstream config change, or
was it caused by retaining prior system-local changes?
> It's necessary to add the following line to the Directory section of
> /etc/apache2/mods-enabled/userdir.conf :
> DirectoryIndex index.php index.html index.htm index.shtml index.cgi
> This is separate and in addition to needing a DirectoryIndex line in
> /etc/apache2/httpd.conf . That addition to userdir.conf has never
> been necessary before.
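For reference, the resulting stanza would look roughly like this. The surrounding Directory block is a sketch of Debian's stock userdir.conf layout (details vary a little between versions); only the DirectoryIndex line is the local addition described above:

```apache
# /etc/apache2/mods-enabled/userdir.conf (symlink into mods-available)
<IfModule mod_userdir.c>
        UserDir public_html
        UserDir disabled root

        <Directory /home/*/public_html>
                AllowOverride FileInfo AuthConfig Limit Indexes
                Options MultiViews Indexes SymLinksIfOwnerMatch IncludesNoExec
                # Local addition: without this, index files under ~user
                # directories stopped being found after the 2.2.16 upgrade.
                DirectoryIndex index.php index.html index.htm index.shtml index.cgi
        </Directory>
</IfModule>
```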
That strikes me as worth filing a bug about. Anything that requires
editing mods-available files (where I presume your userdir.conf actually
symlinks) just to retain basic functionality seems wrong to me.
In Debian it is often helpful to keep all your local changes in
sites-available stubs, with symlinks in sites-enabled. I often ditch
the stock 000-default site from the start and make my own 00vhost to
hold the NameVirtualHost directives, then do as much as possible per-vhost.
If things get repetitive, you can always rely on the Include directive.
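A minimal sketch of that layout, assuming hypothetical site names and paths (the vhost details are illustrative, not from the original post):

```apache
# /etc/apache2/sites-available/00vhost -- local stub holding the
# NameVirtualHost directive, enabled via a symlink in sites-enabled
NameVirtualHost *:80

# /etc/apache2/sites-available/example.com -- one stub per vhost
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/example.com
    # Shared settings pulled in with Include to avoid repetition
    Include /etc/apache2/conf-local/common.conf
</VirtualHost>
```

Each stub is then enabled with a2ensite (which just manages the sites-enabled symlink) followed by an Apache reload.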
The reasoning behind this is that I never get prompted by apache
upgrades about edits to the core config files like apache2.conf or
ports.conf, and I can be more confident that I haven't just accepted
half of a larger upstream config change that spans multiple files.
This breaks down a bit when you need to make performance-tuning
adjustments to the core files, but a quick comment can help the merge there.
> After doing so, the sysadmin must (1) restart Apache httpd, and (2)
> clear the browser cache, or you, the sysadmin affected by this
> gratuitous breakage, won't realise you've just fixed the problem, and
> will think it's still broken.
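On Debian of that era, step (1) amounts to one of the following (the graceful variant re-reads the config without dropping active connections):

```shell
# Restart Apache so the new DirectoryIndex takes effect:
sudo /etc/init.d/apache2 restart
# or, without interrupting in-flight requests:
sudo apache2ctl graceful
```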
This is one reason I tend to debug apache changes with lynx and
/usr/bin/HEAD from the libwww-perl package, or to try a browser I don't
normally use (such as chromium). I also occasionally find myself in a
maze of twisty little redirects, and 301s are damned troublesome to
scrape.