
The problem is that "be liberal in what you accept" is, by definition, saying to go beyond the standards, accepting things that are technically illegal according to the standards.

So different software will necessarily do it differently. For all software to do it the same way, there would realistically need to be a specified standard for how to do it — and then we're no longer talking about 'be liberal in what you accept', but just 'accept exactly what the standard says.'

Of course, in this case the client software was not being 'strict in what you issue' -- I am not challenging that part; of course you should _always_ issue requests (or other protocol communications) that are exactly correct according to the standard. But there will inevitably be bugs; bugs happen.

"Be liberal in what you accept" makes it harder to find those bugs, and leaves them waiting to surprise you when the (non-standard) level of "liberalness" on the receiving end changes, which it inevitably will because it was not according to standard in the first place.

I think the HTML/JS/CSS web provides another good example of the dangers of 'be liberal in what you accept', very similarly -- you may think your web page is 'correct' because one or more browsers render it correctly while being 'liberal', and not realize it's in fact buggy and will not render correctly on one or more other past, present, or future browsers. This example has been commented upon by others, and I think it has led to a move away from 'be liberal in what you accept' in web user agents. http://books.google.com/books?id=5WXp4j4eV4UC&pg=PA136&lpg=P...




How about this as a middle-ground:

Be strict in what you issue (duh!), be liberal in what you accept - but both emit strong warnings when the input isn't strict, and have a strict mode.
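The proposed middle ground can be sketched as a small parser: accept sloppy input, but emit a warning, and reject it outright when a strict flag is set. This is a minimal illustration, not any particular library's API; the header format and function name are made up for the example.

```python
import warnings

def parse_header(line: str, strict: bool = False) -> tuple[str, str]:
    """Parse a 'Name: value' header line.

    Hypothetical example of "liberal, but loud": tolerate a missing
    space after the colon, but warn about it; in strict mode, treat
    the same sloppiness as a hard error.
    """
    name, sep, value = line.partition(":")
    if not sep or not name.strip():
        # Even the liberal mode has a floor: some inputs are just malformed.
        raise ValueError(f"malformed header: {line!r}")
    if not value.startswith(" "):
        if strict:
            raise ValueError(f"no space after colon: {line!r}")
        warnings.warn(f"lenient parse: no space after colon in {line!r}")
    return name.strip(), value.strip()
```

The idea is that the warning surfaces the sender's bug early, while strict mode lets you fail hard in tests or CI.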


That doesn't work. Strict mode ends up getting turned off by default, or turned off at the earliest problem. After all, what's the point in being so strict? I've seen security bugs arise from this, nicely commented in source with a "// spec says x but no need to be so pedantic".

If everyone could be strict in what's sent, the problem would be solved. But since that won't happen, even by accident, the only solution is to be harsh on receiving input and hope things fail early in the dev cycle.

Also, text-based protocols are especially prone to this poor handling: (a) because spec writers (like HTTP's) go moronically overboard being all creative (line folding? comments in HTTP headers? FFS!), and (b) because text is so easy that everyone figures anything goes and pays less attention.
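HTTP's line folding is a good concrete case of where liberal receivers diverge. A minimal sketch (function name invented for illustration) of the old RFC 822-style unfolding that lenient HTTP/1.x parsers used to apply, and which RFC 7230 later deprecated:

```python
def unfold_lenient(raw: str) -> list[str]:
    """Unfold 'obs-fold' header continuations, the lenient old way.

    A line beginning with a space or tab is treated as a continuation
    of the previous header line. A strict receiver that instead treats
    such a line as a new (malformed) header will see different headers
    from this one -- exactly the kind of disagreement that enables
    request-smuggling style bugs.
    """
    headers: list[str] = []
    for line in raw.split("\r\n"):
        if line[:1] in (" ", "\t") and headers:
            # Continuation: glue onto the previous header.
            headers[-1] += " " + line.strip()
        else:
            headers.append(line)
    return headers
```

Two receivers on the same wire bytes, one folding and one not, is the failure mode the parent comment is pointing at.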


I'd say the fault with HTML/JS/CSS is that the implementation of the renderer (the browser) broke the stack by not being strict in what it emitted. Put another way, a badly formed page should render badly and/or produce errors. For historical reasons, browsers did not and do not. Hence, the browsers are "broken".


It's quite a game-theoretic problem. Make a strictly standards-compliant browser and nobody will use it, since it won't display most websites. You have to render badly formed pages somehow if you want your browser to compete with other browsers, since they are all doing the same.



