2014-03-03 23:13 GMT+01:00 Simo Sorce <simo@redhat.com>:
> On Mon, 2014-03-03 at 19:08 +0100, Miloslav Trmač wrote:
>> 1) The computer is assumed to be competently administered[1] on a
>> homogeneous network.  This implies that any service running with an
>> open port is intended to run and have that port open, so there is no
>> point in restricting it with a firewall.  There is obviously no
>> point in restricting closed ports with a firewall.  With this
>> assumption, the firewall should be either completely absent or
>> permitting almost all traffic (or perhaps enforcing some kind of
>> minimal policy, filtering out clearly bogus packets) by default.
>
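For concreteness, the "minimal policy, filtering out clearly bogus
packets" mentioned here could be as small as the sketch below; the
specific rules are only an illustration of the idea, not part of any
proposal:

import subprocess

# Drop packets that do not belong to any tracked connection and are not
# validly starting one (malformed / out-of-state traffic)...
subprocess.run(["iptables", "-A", "INPUT", "-m", "conntrack",
                "--ctstate", "INVALID", "-j", "DROP"], check=True)
# ...and packets claiming a loopback source on a non-loopback interface.
subprocess.run(["iptables", "-A", "INPUT", "!", "-i", "lo",
                "-s", "127.0.0.0/8", "-j", "DROP"], check=True)
# Everything else is accepted - the "permitting almost all traffic" default.
subprocess.run(["iptables", "-P", "INPUT", "ACCEPT"], check=True)
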
> I think that you badly characterize this case (and perhaps 2 too).
>
> What I think you fail to address is the case where the administrator is
> competent but the *users* of the system may not be.
>
> In this case services configured and run by the administrator should
> poke holes, but in general other ports should be firewalled, because
> users may inadvertently run services that open ports w/o realizing it.
>
> This is the case where a firewall makes sense in a default installation
> even though roles are allowed to automatically poke holes at
> configuration time.

Even in such a case it would not make sense for the role itself to decide whether to poke holes: either the system's roles are assumed to be competently administered or they are not, and in either case all roles on the system should be treated the same.
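
For what it's worth, the mechanics of "poking a hole at configuration
time" are trivial either way; the disagreement is only about who gets
to decide.  A rough sketch, assuming firewalld's firewall-cmd and an
example service name (nothing role-specific from this thread):

import subprocess

def open_firewall_for_role(service="https"):
    # Record the hole in the permanent configuration so it survives reboots...
    subprocess.run(["firewall-cmd", "--permanent",
                    "--add-service=" + service], check=True)
    # ...and also open it in the running firewall immediately.
    subprocess.run(["firewall-cmd", "--add-service=" + service], check=True)

Whether something like this gets called should be one system-wide
decision, though, not something each role determines for itself.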

And I don't think the case you describe is frequent in the first place: it requires a multi-user server, with a public IP address, where users can run arbitrary code.  Yes, there is one specific and frequent kind of these - web hosting servers; but those are also not something we can really support as a default setup, and longer-term I'd expect them to go the OpenShift way, giving each user a separate container with a separate network namespace.  Other than web hosting, are public multi-user servers where users can run arbitrary code really that frequent?
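
Purely to illustrate that direction: "a separate container with a
separate network namespace" per user means, at the network level,
roughly the sketch below (the namespace naming is made up), so whatever
ports a user's processes open are simply not present on the host's
public address:

import subprocess

def isolated_namespace_for(user):
    # Create a private network namespace for this user's workloads; by
    # default it contains only its own loopback interface, so ports opened
    # inside it are not reachable from the host's public address.
    ns = "ns-" + user
    subprocess.run(["ip", "netns", "add", ns], check=True)
    # Anything started via "ip netns exec" is confined to that namespace;
    # here we just list its interfaces as a demonstration.
    subprocess.run(["ip", "netns", "exec", ns, "ip", "link"], check=True)
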
    Mirek