[personal profile] maradydd
I love Python because of this and this, which are both working together to make my life easier. That is all.

Re: just one question....

Date: 2005-02-18 12:25 am (UTC)
From: [identity profile] maradydd.livejournal.com
A colleague and I are working on a system to prevent SQL injection attacks, which involves parsing the SQL string being passed to a db and comparing it to a "known-good" string, as specified by the programmer for a given task; if the parses match, no injection. (That's a very brief overview. I can send you the paper if you want to read it.)

We have to allow developers to specify which part of the SQL string corresponds to user input, and find the narrowest enclosing scope for that substring (i.e., the lowest node in the parse tree that generates the whole substring), in order to deal with tokenization.
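
(To make the idea concrete, here is a rough Python sketch of the general principle: compare the structure of the runtime query against a known-good template that has a dummy literal where the user input belongs. This is only a toy illustration that compares token shapes; the actual system described above compares full parse trees and locates the narrowest enclosing node for the user-supplied substring, so treat everything below as an assumption-laden sketch rather than the paper's method.)

```python
import re

# Toy SQL "shape" checker -- a much-simplified illustration of the idea,
# not a real SQL parser.  Literals collapse to a single symbolic token,
# so two queries that differ only in literal values have the same shape.
TOKEN_RE = re.compile(r"""
    \s*(?:
        (?P<string>'(?:[^']|'')*')          # quoted string literal
      | (?P<number>\d+)                     # numeric literal
      | (?P<word>[A-Za-z_][A-Za-z_0-9]*)    # keyword or identifier
      | (?P<punct>[(),;*=<>.])              # punctuation / operators
    )""", re.VERBOSE)

def token_shape(sql):
    """Reduce a SQL string to a sequence of token types."""
    shape, pos, sql = [], 0, sql.strip()
    while pos < len(sql):
        m = TOKEN_RE.match(sql, pos)
        if not m:
            raise ValueError("cannot tokenize at: %r" % sql[pos:pos + 20])
        kind = m.lastgroup
        if kind == 'word':
            shape.append(m.group('word').upper())
        elif kind == 'punct':
            shape.append(m.group('punct'))
        else:
            shape.append(kind.upper())      # STRING or NUMBER
        pos = m.end()
    return shape

def looks_injected(template, runtime_query):
    """Compare a runtime query against a known-good template.

    `template` is the query the programmer intended, with a dummy literal
    where user input belongs.  If the user input escapes that literal (its
    narrowest enclosing scope), extra tokens appear and the shapes differ."""
    return token_shape(template) != token_shape(runtime_query)

template = "SELECT * FROM users WHERE name = 'x'"
print(looks_injected(template, "SELECT * FROM users WHERE name = 'alice'"))
# False -- same shape, only the literal changed
print(looks_injected(template, "SELECT * FROM users WHERE name = '' OR '1'='1'"))
# True  -- the injected OR clause changes the shape
```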

Re: just one question....

Date: 2005-02-18 06:19 am (UTC)
From: [identity profile] pturing.livejournal.com
ahh, very nice
What advantages, if any, would this approach have over, say, escaping special characters in the user input? Is that method insufficient in some cases?

Re: just one question....

Date: 2005-02-18 06:07 pm (UTC)
From: [identity profile] maradydd.livejournal.com
"Is that method insufficient in some cases?"

Got it in one! Yup, and we have a paper with some fun math proving it. :)
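
(One well-known case where escaping quote characters alone doesn't help -- not necessarily one of the cases proved in the paper -- is user input spliced into a numeric context, where there are no quotes to break out of:)

```python
# A classic case where escaping quote characters doesn't help: the input
# lands in a numeric context, so there are no quotes to escape.
# (Generic illustration only; the paper's cases may be different.)
user_input = "0 OR 1=1"
query = "SELECT * FROM accounts WHERE id = %s" % user_input
print(query)
# SELECT * FROM accounts WHERE id = 0 OR 1=1   -- matches every row
```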

Re: just one question....

Date: 2005-02-18 11:21 pm (UTC)
From: [identity profile] pturing.livejournal.com
well, shit. I guess this means I might have to go back and audit all my code again.
hehe :)

Re: just one question....

Date: 2005-02-18 11:46 pm (UTC)
From: [identity profile] maradydd.livejournal.com
Good on you for actually doing input validation at all. In the past, I've pulled off deep evil by taking advantage of a lack of validation at places like the NIH's website.

Re: just one question....

Date: 2005-02-19 07:18 am (UTC)
From: [identity profile] pturing.livejournal.com
yup, I even have my own system for dealing with cross-site scripting :)
When tags are allowed, I pass everything through HTML Tidy first, and then filter for the naughty stuff. I've tested it against all known XSS methods, but I may still have some work to do on it, since I'm using a blacklist filter rather than the recommended method of using a whitelist. Using Tidy of course has the added benefit that my pages are still valid XHTML, even when users put in invalid HTML.
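
(For comparison, here is a minimal Python sketch of the whitelist approach mentioned above, built on the standard library's HTMLParser: anything not explicitly allowed is dropped or escaped. The tag and attribute lists are purely illustrative, and this is not the Tidy-plus-blacklist filter described in the comment.)

```python
from html import escape
from html.parser import HTMLParser

# Minimal whitelist sanitizer: anything not explicitly allowed is dropped
# or escaped, the inverse of a blacklist filter.  The allowed tag and
# attribute sets below are illustrative, not exhaustive.
ALLOWED_TAGS = {'b', 'i', 'em', 'strong', 'p', 'br', 'blockquote', 'a'}
ALLOWED_ATTRS = {'a': {'href'}}
SAFE_SCHEMES = ('http://', 'https://', 'mailto:')

class WhitelistSanitizer(HTMLParser):
    def __init__(self):
        super().__init__(convert_charrefs=True)
        self.out = []

    def handle_starttag(self, tag, attrs):
        if tag not in ALLOWED_TAGS:
            return                          # unknown tag: drop it
        kept = []
        for name, value in attrs:
            if name not in ALLOWED_ATTRS.get(tag, set()):
                continue                    # drops onclick=, style=, etc.
            if name == 'href' and not (value or '').lower().startswith(SAFE_SCHEMES):
                continue                    # drops javascript: and friends
            kept.append(' %s="%s"' % (name, escape(value or '', quote=True)))
        self.out.append('<%s%s>' % (tag, ''.join(kept)))

    def handle_endtag(self, tag):
        if tag in ALLOWED_TAGS:
            self.out.append('</%s>' % tag)

    def handle_data(self, data):
        self.out.append(escape(data))       # everything else becomes plain text

def sanitize(html_text):
    parser = WhitelistSanitizer()
    parser.feed(html_text)
    parser.close()
    return ''.join(parser.out)

print(sanitize('<p onclick="evil()">hi <script>alert(1)</script>'
               '<a href="javascript:alert(2)">link</a></p>'))
# <p>hi alert(1)<a>link</a></p>
```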

"pulled off deep evil by taking advantage.."
yes, I think I remember a post to that effect
