Friday, September 3, 2010

Skipfish [General Tutorial]

Start skipfish in Matriux (it is installed under /pentest/web/skipfish, so change into that directory first).


Code: (type the following to list the available options)
./skipfish -h


Code: (the available options are listed)
tiger@tiger-desktop:/pentest/web/skipfish$ ./skipfish -h
skipfish version 1.01b by
Usage: ./skipfish [ options ... ] -o output_dir start_url [ start_url2 ... ]

Authentication and access options:

-A user:pass - use specified HTTP authentication credentials
-F host:IP - pretend that 'host' resolves to 'IP'
-C name=val - append a custom cookie to all requests
-H name=val - append a custom HTTP header to all requests
-b (i|f) - use headers consistent with MSIE / Firefox
-N - do not accept any new cookies
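For example, several of these options can be combined to scan a password-protected application while mimicking Firefox headers. The credentials, cookie value, and target URL below are placeholders; substitute your own test target:

```shell
# Hypothetical example: authenticated scan using Firefox-style headers
# and a fixed session cookie. Values are placeholders, not real targets.
./skipfish -A admin:secret -b f -C "PHPSESSID=0123456789abcdef" \
  -o auth-scan http://192.168.1.10/app/
```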

Crawl scope options:

-d max_depth - maximum crawl tree depth (16)
-c max_child - maximum children to index per node (1024)
-r r_limit - max total number of requests to send (100000000)
-p crawl% - node and link crawl probability (100%)
-q hex - repeat probabilistic scan with given seed
-I string - only follow URLs matching 'string'
-X string - exclude URLs matching 'string'
-S string - exclude pages containing 'string'
-D domain - crawl cross-site links to another domain
-B domain - trust, but do not crawl, another domain
-O - do not submit any forms
-P - do not parse HTML, etc, to find new links
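As an illustrative combination of the scope options above, the following limits crawl depth, keeps the scan inside a single application path, and skips logout links so the session is not destroyed mid-scan (the depth, strings, and URL are examples only):

```shell
# Illustrative example: shallow crawl restricted to /app/, avoiding
# any URL containing "logout". Target URL is a placeholder.
./skipfish -d 5 -I /app/ -X logout \
  -o scoped-scan http://192.168.1.10/app/
```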

Reporting options:

-o dir - write output to specified directory (required)
-J - be less noisy about MIME / charset mismatches
-M - log warnings about mixed content
-E - log all HTTP/1.0 / HTTP/1.1 caching intent mismatches
-U - log all external URLs and e-mails seen
-Q - completely suppress duplicate nodes in reports
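To capture more detail in the report, the logging switches above can simply be stacked onto a scan, for instance logging external URLs/e-mails and mixed-content warnings (target URL is a placeholder):

```shell
# Illustrative example: more verbose reporting.
./skipfish -U -M -o verbose-scan http://192.168.1.10/
```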

Dictionary management options:

-W wordlist - load an alternative wordlist (skipfish.wl)
-L - do not auto-learn new keywords for the site
-V - do not update wordlist based on scan results
-Y - do not fuzz extensions in directory brute-force
-R age - purge words hit more than 'age' scans ago
-T name=val - add new form auto-fill rule
-G max_guess - maximum number of keyword guesses to keep (256)
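For instance, skipfish 1.x ships several wordlists under its dictionaries/ subdirectory; to run with the minimal one instead of the default skipfish.wl (verify the path against your own install):

```shell
# Assumes the stock skipfish 1.x layout with ./dictionaries/minimal.wl.
./skipfish -W dictionaries/minimal.wl -o dict-scan http://192.168.1.10/
```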

Performance settings:

-g max_conn - max simultaneous TCP connections, global (50)
-m host_conn - max simultaneous connections, per target IP (10)
-f max_fail - max number of consecutive HTTP errors (100)
-t req_tmout - total request response timeout (20 s)
-w rw_tmout - individual network I/O timeout (10 s)
-i idle_tmout - timeout on idle HTTP connections (10 s)
-s s_limit - response size limit (200000 B)
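On a slow or fragile target you might lower the connection counts and raise the timeouts from their defaults shown above; the values here are purely illustrative:

```shell
# Illustrative example: gentler scan for a slow target --
# fewer connections, longer timeouts.
./skipfish -g 10 -m 2 -t 40 -w 20 -o slow-scan http://192.168.1.10/
```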



For a general, complete scan, type:

Code:
./skipfish -o outputdirectory targeturl


The output is written as a well-organized HTML report, like this:
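Skipfish writes an index.html into the output directory you passed with -o; open it in a browser to view the report, for example:

```shell
# View the generated report (directory name taken from the -o option
# used in the scan command above).
firefox outputdirectory/index.html
```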





Or check the video here:



