Suru MITM web proxy for web application assessments
2
Introduction
  • From the makers of Wikto, Crowbar and BiDiBLAH,
    the producers of Hacking by Numbers, Setiri and
    Putting the Tea in CyberTerrorism, the directors
    of When the tables turn, several Syngress fairy
    tales and the inspiration for the Matrix trilogy
    (right) comes a presentation so powerful and
    compelling…

3
Why another proxy?
  • We wanted something that:
  • Does intelligent file and directory discovery
    (and Wikto was just not cutting it anymore).
  • Does intelligent fuzzing of web applications
    (without trying to be too clever about it).
  • After looking at length at how people use other
    web application assessment tools, we found that:
  • There is no one-button web application
    assessment tool.
  • Those who attempt to provide such tools mostly
    fail miserably.
  • People who are good at web application
    assessments still want to be in control of every
    request that they make (that's why the @stake
    webproxy rocked so much).
  • While they still want to be in control, they
    perform some actions over and over (but with
    enough variation that it cannot be automated).
  • They need something that can automate some
    parts of the assessment process effectively
    without taking away the flexibility or power of
    doing it manually.
  • The lines between the application and the web
    server are blurring.

4
It didn't happen in one day
  • We wanted something that works like Nikto, but
    won't be fooled by friendly 404s.
  • We created Wikto in 2004.
  • Some people still don't know how the AI option
    works :)
  • The cleverness of Wikto sits in the content
    comparison algorithm.
  • We created Crowbar early in 2005.
  • Most people don't know how it works :)
  • Sadly, most people don't know how to use it
    either.
  • With Crowbar we expanded the thinking: we wanted
    to create a generic brute forcer and ended up
    with something a lot more useful. Of all the
    tools up to this point, Crowbar was one of the
    most powerful, yet most people didn't know how
    to use it properly.
  • We really wanted a proxy (for E-Or, actually), so
    we took some proxy code and started mangling it
    early in 2006.

5
So…how DOES it work?
  • The content comparison algorithm basically
    compares two strings.
  • In Wikto it compares the response for a test file
    with that of a file that will never exist on the
    system. If the responses differ, we know that the
    file is there.
  • GET /scripts/moomoomoo.pl HTTP/1.0 (BRR)
  • GET /scripts/login.pl HTTP/1.0 (real test)
  • In Crowbar it compares the output of a test
    request with that of a base response. The user
    can choose the base response, and choose how she
    wants to construct the test response.
  • GET /scripts/login.pl?user=moo&pass=blah HTTP/1.0
    (BRR)
  • GET /scripts/login.pl?user=admin&pass=aaa
    HTTP/1.0 (real test)
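The base-response idea above can be sketched in Python. Here difflib's similarity ratio is only a stand-in for the actual content comparison algorithm (described on the next slides), and the sample responses and threshold are invented for illustration:

```python
import difflib

def looks_present(base_response: str, test_response: str,
                  threshold: float = 0.9) -> bool:
    """Return True if the test response differs enough from the base
    response (the reply to a file that cannot exist) to suggest the
    tested file is really there -- this defeats friendly 404 pages."""
    ratio = difflib.SequenceMatcher(None, base_response, test_response).ratio()
    return ratio < threshold

# A friendly 404 page returned for any missing file (made-up content)
friendly_404 = "<html><body>Oops, page not found. Try our search!</body></html>"
login_page   = "<html><body><form action='login.pl'>user/pass</form></body></html>"

print(looks_present(friendly_404, friendly_404))  # False: same page, file absent
print(looks_present(friendly_404, login_page))    # True: different page, file present
```

A plain status-code check would report 200 for both responses here; comparing content is what separates a real hit from a friendly 404.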

6
And what about the content compare?
  • Step 1: crop the header (if possible)
  • Step 2: split strings A and B on "\n", ">" and
    space → collectionA, collectionB
  • Step 3: count blank items in both A and B
  • foreach itemA in collectionA
  •   foreach itemB in collectionB
  •     if (itemA == itemB)
  •       increment counter
  •       break
  • Return counter x 2 / ((|collectionA| +
    |collectionB|) - blanks)

7
And what about the content compare?
  • See it in action:
  • <b> I am testing this </b> <b> doedelsakdoek</b>
  • <b> I am testing this </b><b> kaaskrulletjies</b>
  • Becomes:
  • Collection A: I am testing this doedelsakdoek
  • Collection B: I am testing this kaaskrulletjies
  • Matching count ("I am testing this"): 4
  • Blank count: zero
  • |A| + |B| = 5 + 5 = 10
  • Return (4 x 2) / 10 = 0.8, or an 80% match
  • <b> I was testing </b>
  • <b> I am testing them things </b>
  • Return (2 x 2) / 8 = 0.5, or a 50% match
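A minimal Python reconstruction of the algorithm from the previous slide. Stripping HTML tags before splitting is an assumption made here so the collections come out the same as in the slide's worked example; the real implementation may tokenise differently:

```python
import re

def content_match(a: str, b: str) -> float:
    """Content comparison score: matching items x 2, divided by the
    combined item count minus blanks (per the slide's pseudocode)."""
    # Step 1 (crop the HTTP header) is skipped: inputs are bodies already.
    # Step 2: reduce each response to a collection of items. Tags are
    # stripped first so the collections match the slide's example.
    coll_a = re.sub(r"<[^>]*>", " ", a).split()
    coll_b = re.sub(r"<[^>]*>", " ", b).split()
    blanks = 0  # str.split() already drops blank items (step 3)
    counter = 0
    for item_a in coll_a:
        for item_b in coll_b:
            if item_a == item_b:
                counter += 1
                break
    return counter * 2 / ((len(coll_a) + len(coll_b)) - blanks)

score = content_match("<b> I am testing this </b> <b> doedelsakdoek</b>",
                      "<b> I am testing this </b><b> kaaskrulletjies</b>")
print(score)  # 0.8 -- the 80% match from the worked example
```

The second worked example gives (2 x 2) / 8 = 0.5 the same way: only "I" and "testing" match across the two three- and five-item collections.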

8
So…how DOES it work?
  • Crowbar also started to provide us with the
    ability to filter certain responses using a fuzzy
    logic trigger.

9
So…how DOES it work?
  • Crowbar also allowed us to do content extraction.
  • For example, consider mining information from
    Google regarding how many results exist for a
    certain item (a name, in this case).

10
Why Wikto sucks
  • One of the most used features of Wikto is the
    BackEnd miner, used to discover directories and
    files.
  • What if the entire site is located behind
    /bigsite/? It fails to find anything because it's
    testing in /.
  • That's why we have the mirroring option in Wikto:
    to find directories and mine within the known
    directories.
  • But what if the site has form-based login (or
    something similar)?
  • That's why Wikto sucks - it wouldn't test
    anything beyond the login screen.
  • What about finding /bigsite/strange_form.bak from
    /bigsite/strange_form.asp? Or .backup or .zip?
    What about /bigsite/cgi-bin/bigsite?
  • That's why Wikto sucks - it does not know
    anything about the site itself. Wikto is a blind
    chicken, pecking away at dirt.

11
Why Wikto sucks
  • Now, if we had a proxy we could see where the
    user is browsing to and adjust our recon process
    accordingly:
  • If we see /bigsite/content.php
  • Automatically start looking for other directories
    within /bigsite/
  • If we see /bigsite/moo_form.asp
  • Automatically start looking for moo_form.XX, where
    XX is every other extension (like .zip and .backup
    and .old etc.)
  • If we see /scripts/abc_intranet/login.php
  • Automatically start looking for /abc_intranet in
    other places
  • And while we're at it: why not check the
    indexability of every directory we visited and
    mined?
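The candidate generation described above can be sketched as follows. The function name and the extension list are illustrative (the slide names .zip, .backup and .old as examples), not Suru's actual code:

```python
from posixpath import dirname, splitext

# Example backup extensions to try next to every file we see browsed.
BACKUP_EXTS = [".bak", ".backup", ".old", ".zip"]

def recon_candidates(path: str):
    """Given a URL path the user just browsed through the proxy,
    emit paths worth probing: backup copies of the file, plus each
    parent directory (for further mining and indexability checks)."""
    candidates = []
    base, ext = splitext(path)
    if ext:  # it's a file: look for backup copies next to it
        candidates += [base + e for e in BACKUP_EXTS]
    d = dirname(path)  # walk up the tree: each level is worth mining
    while d not in ("", "/"):
        candidates.append(d + "/")
        d = dirname(d)
    return candidates

print(recon_candidates("/bigsite/moo_form.asp"))
# ['/bigsite/moo_form.bak', '/bigsite/moo_form.backup',
#  '/bigsite/moo_form.old', '/bigsite/moo_form.zip', '/bigsite/']
```

Because the seed paths come from live browsing, the miner works inside authenticated areas the blind scanner never reaches.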

12
Recon demo
13
Fuzzing with Suru
  • If we have a content comparison algorithm, then
    we can see if an application reacts differently
    when we put junk into it compared to good data.
  • In other words, we can send a whole lot of
    requests, see what different responses are
    generated, and see how the good responses differ
    from the bad responses.
  • We can, in fact, group the responses by looking
    at how they differ from a base response.
  • In other words, when I send 1,000 different
    requests to the application, modifying a single
    parameter, I could get back just 2 distinct
    responses.
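The grouping step can be sketched like this. difflib again stands in for Suru's own comparison algorithm, and the tolerance value and sample responses are invented; the point is that 1,000 replies collapse into a handful of groups:

```python
import difflib

def group_responses(base: str, responses, tolerance: float = 0.1):
    """Group fuzz responses by how far they sit from the base response;
    'tolerance' is the adjustable grouping threshold."""
    groups = []  # each entry: (score of first member, list of members)
    for resp in responses:
        score = difflib.SequenceMatcher(None, base, resp).ratio()
        for rep_score, members in groups:
            if abs(rep_score - score) <= tolerance:
                members.append(resp)  # close enough: same group
                break
        else:
            groups.append((score, [resp]))  # start a new group
    return groups

base = "Welcome back, regular page"
replies = ["Welcome back, regular page",
           "Welcome back, regular pages",
           "SQL error near line 1: unterminated string"]
groups = group_responses(base, replies)
print(len(groups))  # 2: the 'normal' replies vs the error reply
```

The analyst then inspects one representative per group instead of every raw response, which is where the time saving comes from.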

14
Fuzzing with Suru
  • Having a proxy, we can thus parse the request,
    break it up nicely into name/value pairs and let
    the user decide what portion she wants to fuzz.

15
Fuzzing with Suru (Demo)
  • Of course, you can choose to fuzz ANYTHING in the
    HTTP request.
  • We can also choose to extract anything from the
    reply.
  • …and group results automatically, with
    adjustable tolerance.

16
Other reasons why Suru is nice
  • Automatic relationship discovery:
  • Compares the MD5, SHA1, Base64-encoding and
    Base64-decoding of every parameter with all other
    parameters (incl. cookie values).
  • WHY?
  • Example: after login, the application uses the
    MD5 of your username to populate a cookie that's
    used for session tracking (this is a very bad
    idea), or sends your password Base64-encoded in
    another parameter (also a bad idea).
  • Search and replace on both incoming and outgoing
    streams, with the ability to also change binary
    data.
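A minimal sketch of the relationship check described above; the function name and output format are invented for illustration:

```python
import base64
import hashlib

def find_relationships(params: dict):
    """For every parameter value, derive its MD5, SHA1, Base64-encoding
    and Base64-decoding, and report any derivation that equals another
    parameter (cookie values go in the same dict)."""
    derived = {}
    for name, value in params.items():
        derived[(name, "md5")] = hashlib.md5(value.encode()).hexdigest()
        derived[(name, "sha1")] = hashlib.sha1(value.encode()).hexdigest()
        derived[(name, "b64e")] = base64.b64encode(value.encode()).decode()
        try:
            derived[(name, "b64d")] = base64.b64decode(value).decode()
        except Exception:
            pass  # value was not valid Base64 / not decodable text
    hits = []
    for (name, how), candidate in derived.items():
        for other, value in params.items():
            if other != name and candidate == value:
                hits.append((other, "=", how, "of", name))
    return hits

# The slide's bad idea: a session cookie that is just md5(username).
cookie = hashlib.md5(b"admin").hexdigest()
print(find_relationships({"user": "admin", "session": cookie}))
# [('session', '=', 'md5', 'of', 'user')]
```

Spotting such a relationship instantly tells the analyst the session token is forgeable for any known username.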

17
Other reasons why Suru is nice
  • Usability:
  • Uses an IE browser object to replay requests - no
    issues with authentication etc.
  • Change and replay a request instantly whilst
    keeping track of what you've done.
  • Edited requests are marked - you don't need to
    find them in a sea of requests.
  • Handles XML (for web services) and MultiPart
    POSTs, and shows the verb and number of POST/GET
    parameters instantly (so you can choose the juicy
    requests quickly).
  • Saving and loading of sessions.
  • Instantly fuzz any variable (and edit your fuzz
    strings in the app).
  • Free-form fuzz strings (supports commenting) - no
    limitation but your imagination - sorted by file
    name.
  • Instant access to the raw HTTP request, with
    automatic content-length recalculation.
  • Raw replay or browsed replay.
  • One-click file/directory mining from the recon
    tree.
  • User-defined speed for recon (because you want to
    be able to still surf the app).
  • Etc., etc., etc.

18
Conclusion
  • Suru is a very nice new MITM web application
    proxy.
  • Suru still allows the analyst freedom of thought,
    but automates the mundane.
  • Suru is a combination of a useful proxy and the
    best features of Wikto, Crowbar and E-Or.
  • If you are new to web application assessment you
    should perhaps start off with another proxy -
    Suru is intense.
  • Suru was written by people who do hundreds of
    web application assessments every year. In other
    words: a web application testing tool by people
    who do web application testing.
  • Suru lives at:
  • http://www.sensepost.com/research/suru