ahh, very nice. What advantages, if any, would this approach have over, say, escaping special characters in the user input? Is that method insufficient in some cases?
Good on you for actually doing input validation at all. In the past, I've pulled off deep evil by taking advantage of a lack of validation at places like the NIH's website.
yup, I even have my own system for dealing with cross-site scripting :) When tags are allowed, I pass everything through HTML Tidy first, and then filter for the naughty stuff. I have tested it against all known XSS methods, but I may still have some work to do on it, since I'm using a blacklist filter rather than the recommended method of using a whitelist. Using Tidy of course has the added benefit that my pages are still valid XHTML, even when users put in invalid HTML.
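(To make the whitelist idea concrete: a minimal Python sketch of a whitelist filter, illustrating the general technique rather than the poster's actual system. The allowed tags, attributes, and URL schemes below are assumptions, not a vetted safe list.)

```python
# Minimal whitelist sanitizer sketch (stdlib only).
# ALLOWED_TAGS / ALLOWED_ATTRS / SAFE_SCHEMES are illustrative choices.
from html import escape
from html.parser import HTMLParser

ALLOWED_TAGS = {"b", "i", "em", "strong", "p", "br", "a"}
ALLOWED_ATTRS = {"a": {"href"}}
SAFE_SCHEMES = ("http://", "https://", "mailto:")

class WhitelistFilter(HTMLParser):
    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag not in ALLOWED_TAGS:
            return  # unknown tag: dropped (its text content is still kept, escaped)
        kept = []
        for name, value in attrs:
            if name not in ALLOWED_ATTRS.get(tag, set()):
                continue  # drops onclick=, style=, etc.
            if name == "href" and not (value or "").lower().startswith(SAFE_SCHEMES):
                continue  # rejects javascript: and other unexpected URL schemes
            kept.append(f' {name}="{escape(value or "", quote=True)}"')
        self.out.append(f"<{tag}{''.join(kept)}>")

    def handle_endtag(self, tag):
        if tag in ALLOWED_TAGS:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        self.out.append(escape(data))  # all text outside allowed markup is escaped

def sanitize(html_in: str) -> str:
    f = WhitelistFilter()
    f.feed(html_in)
    f.close()
    return "".join(f.out)

print(sanitize('<a href="javascript:alert(1)" onclick="evil()">hi</a><script>bad()</script>'))
# -> <a>hi</a>bad()   (script tag and dangerous attributes gone, inner text escaped)
```

The point of the whitelist direction is that anything not explicitly recognized is dropped, so the filter never has to enumerate every known attack the way a blacklist does.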
"pulled off deep evil by taking advantage.."

yes, I think I remember a post to that effect
"What advantages, if any, would this approach have over, say, escaping special characters in the user input? Is that method insufficient in some cases?"
Got it in one! Yup, and we have a paper with some fun math proving it. :)
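(A quick illustration of the limits of escaping; a hypothetical Python example, not taken from the paper. HTML-escaping neutralizes markup in text context, but a javascript: URL contains nothing for it to escape:)

```python
from html import escape

# In text context, escaping works: the payload becomes inert text.
user = "<script>alert(1)</script>"
print(f"<p>{escape(user)}</p>")
# -> <p>&lt;script&gt;alert(1)&lt;/script&gt;</p>

# In URL context, the same defense does nothing: no <, >, &, or quotes here.
link = "javascript:alert(1)"
print(f'<a href="{escape(link)}">profile</a>')
# -> <a href="javascript:alert(1)">profile</a>  (still runs script when clicked)
```

Escaping is context-dependent: the right transformation differs between text, attribute, URL, and script contexts, which is why it can be insufficient on its own.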
hehe :)