I still stand by my conviction that servers should be responsible
for transmitting only fully validated data. That conviction, however,
addresses only the "be conservative in transmitting" half of the
dictum. The other side of the coin, "be liberal in receiving,"
certainly does require a flexible parser, but it doesn't (currently)
demand a full-featured SGML parser, nor should it.
Regards,
Glenn Adams
I think that most people would agree that in the best of all possible
worlds, servers would transmit only validated data --- but that's
different from saying that the onus for ensuring this should fall on
the authors of server software (as opposed to, say, the maintainers
of servers, or the original authors of the documents).
In any case, the same issue arises in the same form with document
types very different from HTML --- there are servers, for instance
(one from a major record company, no less!) which ship completely
bogus audio. (The company I have in mind falsely advertises its 12
kHz linear PCM files as being of MIME type audio/basic, which the
MIME RFCs define as 8 kHz u-law.) Should their server
be responsible for checking that as well? (And how?)
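
To make that "how" concrete: answering it means cracking open the
audio data itself. Here's a sketch, purely my own illustration and
not anything a shipping server actually does, which peeks at a Sun
.au header and asks whether the contents really are what audio/basic
is defined to be:

    #include <stdio.h>
    #include <string.h>

    /* audio/basic, per the MIME RFCs, means single-channel 8-bit
       u-law samples at 8000 Hz.  A Sun .au file begins with six
       big-endian 32-bit words: the magic ".snd", data offset, data
       size, encoding (1 means 8-bit u-law), sample rate, and
       channel count. */

    static unsigned long be32(const unsigned char *p)
    {
        return ((unsigned long)p[0] << 24)
             | ((unsigned long)p[1] << 16)
             | ((unsigned long)p[2] << 8)
             |  (unsigned long)p[3];
    }

    int looks_like_audio_basic(FILE *f)
    {
        unsigned char h[24];

        if (fread(h, 1, sizeof h, f) != sizeof h ||
            memcmp(h, ".snd", 4) != 0)
            return 0;                   /* not even an .au file  */
        return be32(h + 12) == 1        /* encoding: 8-bit u-law */
            && be32(h + 16) == 8000     /* sample rate: 8 kHz    */
            && be32(h + 20) == 1;       /* a single channel      */
    }

And note that even this much only works because .au files happen to
carry a self-describing header; hand the server a bare stream of
linear PCM samples and there is nothing in the bytes to check at all.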
Nor is the issue specific to the Web --- exactly the same thing can
happen with FTP servers (f'rinstance), many of which do ship HTML
these days.
I'm not belittling the need for validation, in any format, but the
notion that *server* software should be responsible for it strikes me
as inappropriate --- I just don't think servers, per se, should be in
the business of trying to understand data formats. That functionality
is far better placed in an authoring system, where (among other
advantages) authors can get direct feedback about where they have
gone wrong.
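
To put a face on "direct feedback", here is a toy checker, my own
illustration and nothing more, which reads HTML and reports, with a
line number, every end tag that fails to match the innermost open
element. A real validator would parse against the DTD; the point is
only that this kind of complaint belongs at authoring time, where
the author can act on it, not in a server's transmit path.

    #include <ctype.h>
    #include <stdio.h>
    #include <string.h>

    /* Read HTML on stdin; complain, with a line number, about any
       end tag that doesn't match the innermost open element.  This
       only matches tag names, and skips a few elements whose end
       tags the DTD makes optional. */

    static const char *optional_end[] =
        { "P", "LI", "DT", "DD", "BR", "HR", "IMG", NULL };

    static int end_tag_optional(const char *name)
    {
        int i;
        for (i = 0; optional_end[i] != NULL; i++)
            if (strcmp(name, optional_end[i]) == 0)
                return 1;
        return 0;
    }

    int main(void)
    {
        char stack[256][16];          /* open elements; depth < 256 */
        char name[16];
        int depth = 0, line = 1, is_end, n, c;

        while ((c = getchar()) != EOF) {
            if (c == '\n') { line++; continue; }
            if (c != '<') continue;

            is_end = ((c = getchar()) == '/');
            if (!is_end)
                ungetc(c, stdin);
            for (n = 0; (c = getchar()) != EOF && isalnum(c) && n < 15; )
                name[n++] = toupper(c);
            name[n] = '\0';
            while (c != EOF && c != '>') {      /* skip attributes */
                if (c == '\n') line++;
                c = getchar();
            }
            if (n == 0 || end_tag_optional(name))
                continue;           /* comments, <P>, <LI>, and kin */

            if (!is_end) {
                if (depth < 256)
                    strcpy(stack[depth++], name);
            } else if (depth == 0 || strcmp(stack[--depth], name) != 0) {
                printf("line %d: unexpected </%s>\n", line, name);
            }
        }
        return 0;
    }

Compiled and pointed at a draft, it says things like "line 12:
unexpected </UL>", which is exactly the sort of message that means
something to an author and nothing to a server.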
rst