Class: Arachni::HTTP
- Includes:
- Mixins::Observable, Module::Utilities, UI::Output, Singleton
- Defined in:
- lib/arachni/http.rb
Overview
Arachni::Module::HTTP class
Provides a simple, high-performance and thread-safe HTTP interface to modules.
All requests are run asynchronously (courtesy of Typhoeus), providing great speed and performance.
Exceptions
Any exceptions or session corruption are handled by the class. Some are ignored; for others the HTTP session is refreshed. The point is, you don't need to worry about it.
@author: Tasos “Zapotek” Laskos
<[email protected]>
@version: 0.2.7
Instance Attribute Summary collapse
-
#cookie_jar ⇒ Hash
readonly
The user supplied cookie jar.
-
#curr_res_cnt ⇒ Object
readonly
Returns the value of attribute curr_res_cnt.
-
#curr_res_time ⇒ Object
readonly
Returns the value of attribute curr_res_time.
-
#init_headers ⇒ Hash
readonly
The headers with which the HTTP client is initialized. This is always kept updated.
-
#last_url ⇒ URI
readonly
-
#request_count ⇒ Object
readonly
Returns the value of attribute request_count.
-
#response_count ⇒ Object
readonly
Returns the value of attribute response_count.
-
#time_out_count ⇒ Object
readonly
Returns the value of attribute time_out_count.
-
#trainer ⇒ Object
readonly
Returns the value of attribute trainer.
Class Method Summary collapse
Instance Method Summary collapse
- #abort ⇒ Object
-
#after_run(&block) ⇒ Object
Gets called each time a hydra run finishes.
- #average_res_time ⇒ Object
-
#cookie(url, opts = { }) ⇒ Typhoeus::Request
Gets a URL with cookies and URL variables.
- #current_cookies ⇒ Object
-
#custom_404?(res) ⇒ Boolean
Checks whether or not the provided response is a custom 404 page.
- #fire_and_forget ⇒ Object
-
#get(url, opts = { }) ⇒ Typhoeus::Request
Gets a URL passing the provided query parameters.
-
#get_cookies_str(cookies = { }, with_existing = true) ⇒ string
Returns a hash of cookies as a string (merged with the cookie-jar).
-
#header(url, opts = { }) ⇒ Typhoeus::Request
Gets a URL with optional URL variables and modified headers.
-
#initialize ⇒ HTTP
constructor
A new instance of HTTP.
- #max_concurrency ⇒ Object
- #max_concurrency!(max_concurrency) ⇒ Object
- #parse_and_set_cookies(res) ⇒ Object
-
#parse_cookie_str(str) ⇒ Hash
Converts HTTP cookies from string to Hash.
-
#parse_url(url) ⇒ URI
Encodes and parses a URL String.
-
#post(url, opts = { }) ⇒ Typhoeus::Request
Posts a form to a URL with the provided query parameters.
- #q_to_h(url) ⇒ Object
-
#queue(req, async = true) ⇒ Object
Queues a Typhoeus::Request and applies an ‘on_complete’ callback on behalf of the trainer.
-
#request(url, opts) ⇒ Typhoeus::Request
Makes a generic request.
- #reset ⇒ Object
-
#run ⇒ Object
Runs Hydra (all the asynchronous queued HTTP requests).
-
#set_cookies(cookies) ⇒ void
Sets cookies for the HTTP session.
-
#trace(url, opts = { }) ⇒ Typhoeus::Request
Sends an HTTP TRACE request to “url”.
- #update_cookies(cookies) ⇒ Object
Methods included from Mixins::Observable
Methods included from Module::Utilities
#exception_jail, #get_path, #hash_keys_to_str, #normalize_url, #read_file, #seed, #uri_decode, #uri_encode, #uri_parse, #uri_parser, #url_sanitize
Methods included from UI::Output
#buffer, #debug!, #debug?, #flush_buffer, #mute!, #muted?, #only_positives!, #only_positives?, #print_bad, #print_debug, #print_debug_backtrace, #print_debug_pp, #print_error, #print_error_backtrace, #print_info, #print_line, #print_ok, #print_status, #print_verbose, #reroute_to_file, #reroute_to_file?, #uncap_buffer!, #unmute!, #verbose!, #verbose?
Constructor Details
#initialize ⇒ HTTP
Returns a new instance of HTTP.
# File 'lib/arachni/http.rb', line 77

def initialize
    reset
end
Dynamic Method Handling
This class handles dynamic methods through the method_missing method in the class Arachni::Mixins::Observable
Instance Attribute Details
#cookie_jar ⇒ Hash (readonly)
The user-supplied cookie jar.
# File 'lib/arachni/http.rb', line 65

def cookie_jar
    @cookie_jar
end
#curr_res_cnt ⇒ Object (readonly)
Returns the value of attribute curr_res_cnt.
# File 'lib/arachni/http.rb', line 73

def curr_res_cnt
    @curr_res_cnt
end
#curr_res_time ⇒ Object (readonly)
Returns the value of attribute curr_res_time.
# File 'lib/arachni/http.rb', line 72

def curr_res_time
    @curr_res_time
end
#init_headers ⇒ Hash (readonly)
The headers with which the HTTP client is initialized. This is always kept updated.
# File 'lib/arachni/http.rb', line 58

def init_headers
    @init_headers
end
#last_url ⇒ URI (readonly)
# File 'lib/arachni/http.rb', line 50

def last_url
    @last_url
end
#request_count ⇒ Object (readonly)
Returns the value of attribute request_count.
# File 'lib/arachni/http.rb', line 67

def request_count
    @request_count
end
#response_count ⇒ Object (readonly)
Returns the value of attribute response_count.
# File 'lib/arachni/http.rb', line 68

def response_count
    @response_count
end
#time_out_count ⇒ Object (readonly)
Returns the value of attribute time_out_count.
# File 'lib/arachni/http.rb', line 70

def time_out_count
    @time_out_count
end
#trainer ⇒ Object (readonly)
Returns the value of attribute trainer.
# File 'lib/arachni/http.rb', line 75

def trainer
    @trainer
end
Class Method Details
.content_type(headers_hash) ⇒ Object
# File 'lib/arachni/http.rb', line 614

def self.content_type( headers_hash )
    return if !headers_hash.is_a?( Hash )

    headers_hash.each_pair {
        |key, val|
        return val if key.to_s.downcase == 'content-type'
    }

    return
end
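A minimal standalone sketch of the same case-insensitive lookup, as a free function (the name `content_type_of` is hypothetical, not part of the Arachni API):

```ruby
# Look up the Content-Type header in a hash, matching the
# header name case-insensitively; returns nil when absent
# or when the argument is not a Hash.
def content_type_of( headers_hash )
    return if !headers_hash.is_a?( Hash )

    headers_hash.each_pair {
        |key, val|
        return val if key.to_s.downcase == 'content-type'
    }
    nil
end

content_type_of( 'CONTENT-TYPE' => 'text/html' ) # => "text/html"
```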
.parse_cookiejar(cookie_jar) ⇒ Hash
Class method
Parses netscape HTTP cookie files
# File 'lib/arachni/http.rb', line 592

def self.parse_cookiejar( cookie_jar )
    cookies = Hash.new

    jar = File.open( cookie_jar, 'r' )
    jar.each_line {
        |line|

        # skip empty lines
        if (line = line.strip).size == 0 then next end

        # skip comment lines
        if line[0] == '#' then next end

        cookie_arr = line.split( "\t" )
        cookies[cookie_arr[-2]] = cookie_arr[-1]
    }

    return cookies
end
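The parsing logic above can be sketched without the file I/O. This assumes the Netscape cookie-file layout (tab-separated columns: domain, flag, path, secure, expiry, name, value), where the name/value pair always occupies the last two columns; `parse_cookiejar_lines` is a hypothetical name:

```ruby
# Parse Netscape cookie-file text from an in-memory string.
def parse_cookiejar_lines( text )
    cookies = {}
    text.each_line {
        |line|
        line = line.strip

        # skip empty and comment lines
        next if line.empty? || line.start_with?( '#' )

        # fields: domain, flag, path, secure, expiry, name, value;
        # the cookie name/value pair sits in the last two columns
        fields = line.split( "\t" )
        cookies[fields[-2]] = fields[-1]
    }
    cookies
end

jar_text = "# Netscape HTTP Cookie File\n" +
           ".example.com\tTRUE\t/\tFALSE\t0\tsession\tabc123\n"
parse_cookiejar_lines( jar_text ) # => { "session" => "abc123" }
```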
Instance Method Details
#abort ⇒ Object
# File 'lib/arachni/http.rb', line 177

def abort
    exception_jail { @hydra.abort }
end
#after_run(&block) ⇒ Object
Gets called each time a hydra run finishes
# File 'lib/arachni/http.rb', line 275

def after_run( &block )
    @after_run << block
end
#average_res_time ⇒ Object
# File 'lib/arachni/http.rb', line 183

def average_res_time
    return 0 if @curr_res_cnt == 0
    return @curr_res_time / @curr_res_cnt
end
#cookie(url, opts = { }) ⇒ Typhoeus::Request
Gets a URL with cookies and URL variables.
# File 'lib/arachni/http.rb', line 417

def cookie( url, opts = { } )
    opts[:cookies] = opts[:params].dup || {}
    opts[:params]  = nil
    request( url, opts )
end
#current_cookies ⇒ Object
# File 'lib/arachni/http.rb', line 468

def current_cookies
    parse_cookie_str( @init_headers['cookie'] )
end
#custom_404?(res) ⇒ Boolean
Checks whether or not the provided response is a custom 404 page
# File 'lib/arachni/http.rb', line 643

def custom_404?( res )
    @_404 ||= {}
    path = get_path( res.effective_url )
    @_404[path] ||= {}

    if( !@_404[path]['file'] )
        # force a 404 and grab the html body
        force_404 = path + Digest::SHA1.hexdigest( rand( 9999999 ).to_s )
        @_404[path]['file'] = Typhoeus::Request.get( force_404 ).body

        # force another 404 and grab the html body
        force_404  = path + Digest::SHA1.hexdigest( rand( 9999999 ).to_s )
        not_found2 = Typhoeus::Request.get( force_404 ).body

        @_404[path]['file_rdiff'] = @_404[path]['file'].rdiff( not_found2 )
    end

    if( !@_404[path]['dir'] )
        force_404 = path + Digest::SHA1.hexdigest( rand( 9999999 ).to_s ) + '/'
        @_404[path]['dir'] = Typhoeus::Request.get( force_404 ).body

        force_404  = path + Digest::SHA1.hexdigest( rand( 9999999 ).to_s ) + '/'
        not_found2 = Typhoeus::Request.get( force_404 ).body

        @_404[path]['dir_rdiff'] = @_404[path]['dir'].rdiff( not_found2 )
    end

    return @_404[path]['dir'].rdiff( res.body )  == @_404[path]['dir_rdiff'] ||
           @_404[path]['file'].rdiff( res.body ) == @_404[path]['file_rdiff']
end
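The idea behind the fingerprinting above: request two random non-existent URLs, capture what stays stable between their bodies, and check whether the response under test matches that stable part. Arachni's `String#rdiff` helper does the diffing; the sketch below approximates it by keeping only the lines two bodies share (`stable_part` and `looks_like_custom_404?` are hypothetical names, not Arachni methods):

```ruby
# Keep the lines two response bodies have in common -- a crude
# stand-in for Arachni's String#rdiff.
def stable_part( body_a, body_b )
    body_a.lines.select { |line| body_b.lines.include?( line ) }
end

# A response looks like a custom 404 when diffing it against one
# probe yields the same stable part as diffing the two probes
# against each other.
def looks_like_custom_404?( probe_a, probe_b, res_body )
    fingerprint = stable_part( probe_a, probe_b )
    stable_part( probe_a, res_body ) == fingerprint
end

a = "<h1>Not found</h1>\nrequest id 111\n"
b = "<h1>Not found</h1>\nrequest id 222\n"
looks_like_custom_404?( a, b, "<h1>Not found</h1>\nrequest id 333\n" ) # => true
looks_like_custom_404?( a, b, "<h1>Welcome</h1>\n" )                   # => false
```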
#fire_and_forget ⇒ Object
# File 'lib/arachni/http.rb', line 171

def fire_and_forget
    exception_jail { @hydra.fire_and_forget }
end
#get(url, opts = { }) ⇒ Typhoeus::Request
Gets a URL passing the provided query parameters
# File 'lib/arachni/http.rb', line 368

def get( url, opts = { } )
    request( url, opts )
end
#get_cookies_str(cookies = { }, with_existing = true) ⇒ string
Returns a hash of cookies as a string (merged with the cookie-jar)
# File 'lib/arachni/http.rb', line 550

def get_cookies_str( cookies = { }, with_existing = true )
    if with_existing
        jar     = parse_cookie_str( @init_headers['cookie'] )
        cookies = jar.merge( cookies )
    end

    str = ''
    cookies.each_pair {
        |name, value|
        value = '' if !value
        val   = uri_encode( uri_encode( value ), '+;' )
        str  += "#{name}=#{val};"
    }
    return str
end
#header(url, opts = { }) ⇒ Typhoeus::Request
Gets a URL with optional URL variables and modified headers.
# File 'lib/arachni/http.rb', line 434

def header( url, opts = { } )
    headers      = opts[:params] || {}
    orig_headers = @init_headers.clone

    opts[:headers]    = @init_headers = @init_headers.merge( headers )
    opts[:user_agent] = @init_headers['User-Agent']

    opts[:params] = nil

    req = request( url, opts )

    @init_headers = orig_headers.clone
    return req
end
#max_concurrency ⇒ Object
# File 'lib/arachni/http.rb', line 192

def max_concurrency
    @hydra.max_concurrency
end
#max_concurrency!(max_concurrency) ⇒ Object
# File 'lib/arachni/http.rb', line 188

def max_concurrency!( max_concurrency )
    @hydra.max_concurrency = max_concurrency
end
#parse_and_set_cookies(res) ⇒ Object
# File 'lib/arachni/http.rb', line 491

def parse_and_set_cookies( res )
    cookie_hash = {}

    # extract cookies from the header field
    begin
        [res.headers_hash['Set-Cookie']].flatten.each {
            |set_cookie_str|
            break if !set_cookie_str.is_a?( String )

            cookie_hash.merge!(
                WEBrick::Cookie.parse_set_cookies( set_cookie_str ).inject({}) do |hash, cookie|
                    hash[cookie.name] = cookie.value if !!cookie
                    hash
                end
            )
        }
    rescue Exception => e
        print_debug( e.to_s )
        print_debug_backtrace( e )
    end

    # extract cookies from the META tags
    begin
        # get the head in order to check if it has an http-equiv for set-cookie
        head = res.body.match( /<head(.*)<\/head>/imx )

        # if it does, feed the head to the parser in order to extract the cookies
        if head && head.to_s.substring?( 'set-cookie' )
            Nokogiri::HTML( head.to_s ).search( "//meta[@http-equiv]" ).each {
                |elem|
                next if elem['http-equiv'].downcase != 'set-cookie'
                k, v = elem['content'].split( ';' )[0].split( '=', 2 )
                cookie_hash[k] = v
            }
        end
    rescue Exception => e
        print_debug( e.to_s )
        print_debug_backtrace( e )
    end

    return if cookie_hash.empty?

    # update framework cookies
    Arachni::Options.instance.cookies = cookie_hash

    current = parse_cookie_str( @init_headers['cookie'] )
    set_cookies( current.merge( cookie_hash ) )
end
#parse_cookie_str(str) ⇒ Hash
Converts HTTP cookies from string to Hash
# File 'lib/arachni/http.rb', line 574

def parse_cookie_str( str )
    cookie_hash = Hash.new
    str.split( ';' ).each {
        |kvp|
        cookie_hash[kvp.split( "=" )[0]] = kvp.split( "=" )[1]
    }
    return cookie_hash
end
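The same split-on-';', split-on-'=' logic as a standalone sketch, with the second split limited to two parts so values containing '=' survive intact (`cookie_str_to_hash` is a hypothetical name):

```ruby
# Convert a 'name=value;name=value' cookie string into a Hash.
def cookie_str_to_hash( str )
    cookies = {}
    str.split( ';' ).each {
        |kvp|
        # limit the split so '=' inside a value is preserved
        name, value = kvp.split( '=', 2 )
        cookies[name] = value
    }
    cookies
end

cookie_str_to_hash( 'session=abc123;lang=en' )
# => { "session" => "abc123", "lang" => "en" }
```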
#parse_url(url) ⇒ URI
Encodes and parses a URL String
# File 'lib/arachni/http.rb', line 632

def parse_url( url )
    URI.parse( URI.encode( url ) )
end
#post(url, opts = { }) ⇒ Typhoeus::Request
Posts a form to a URL with the provided query parameters
# File 'lib/arachni/http.rb', line 384

def post( url, opts = { } )
    request( url, opts.merge( :method => :post ) )
end
#q_to_h(url) ⇒ Object
# File 'lib/arachni/http.rb', line 450

def q_to_h( url )
    params = {}

    begin
        query = URI( url.to_s ).query
        return params if !query

        query.split( '&' ).each {
            |param|
            k, v = param.split( '=', 2 )
            params[k] = v
        }
    rescue
    end

    return params
end
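The query-to-hash logic above is self-contained and can be tried on its own; this sketch mirrors it, including the rescue that turns malformed URLs into an empty hash (`query_to_hash` is a hypothetical name):

```ruby
require 'uri'

# Extract the query component of a URL and split it into
# a parameter Hash; malformed URLs yield an empty Hash.
def query_to_hash( url )
    params = {}
    begin
        query = URI( url.to_s ).query
        return params if !query

        query.split( '&' ).each {
            |param|
            k, v = param.split( '=', 2 )
            params[k] = v
        }
    rescue
        # swallow parse errors, as q_to_h does
    end
    params
end

query_to_hash( 'http://test.com/?a=1&b=2' ) # => { "a" => "1", "b" => "2" }
```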
#queue(req, async = true) ⇒ Object
Queues a Typhoeus::Request and applies an ‘on_complete’ callback on behalf of the trainer.
# File 'lib/arachni/http.rb', line 203

def queue( req, async = true )
    req.id = @request_count

    call_on_queue( req, async )

    if( !async )
        @hydra_sync.queue( req )
    else
        @hydra.queue( req )
    end

    @request_count += 1

    print_debug( '------------' )
    print_debug( 'Queued request.' )
    print_debug( 'ID#: ' + req.id.to_s )
    print_debug( 'URL: ' + req.url )
    print_debug( 'Method: ' + req.method.to_s )
    print_debug( 'Params: ' + req.params.to_s )
    print_debug( 'Headers: ' + req.headers.to_s )
    print_debug( 'Train?: ' + req.train?.to_s )
    print_debug( '------------' )

    req.on_complete( true ) {
        |res|

        @response_count += 1
        @curr_res_cnt   += 1
        @curr_res_time  += res.start_transfer_time

        call_on_complete( res )

        parse_and_set_cookies( res ) if req.update_cookies

        print_debug( '------------' )
        print_debug( 'Got response.' )
        print_debug( 'Request ID#: ' + res.request.id.to_s )
        print_debug( 'URL: ' + res.effective_url )
        print_debug( 'Method: ' + res.request.method.to_s )
        print_debug( 'Params: ' + res.request.params.to_s )
        print_debug( 'Headers: ' + res.request.headers.to_s )
        print_debug( 'Train?: ' + res.request.train?.to_s )
        print_debug( '------------' )

        if res.timed_out?
            # print_error( 'Request timed-out! -- ID# ' + res.request.id.to_s )
            @time_out_count += 1
        end

        if( req.train? )
            # handle redirections
            if( ( redir = redirect?( res.dup ) ).is_a?( String ) )
                req2 = get( redir, :remove_id => true )
                req2.on_complete {
                    |res2|
                    @trainer.add_response( res2, true )
                } if req2
            else
                @trainer.add_response( res )
            end
        end
    }

    exception_jail { @hydra_sync.run if !async }
end
#request(url, opts) ⇒ Typhoeus::Request
Makes a generic request
# File 'lib/arachni/http.rb', line 287

def request( url, opts )
    params    = opts[:params]    || {}
    remove_id = opts[:remove_id]
    train     = opts[:train]
    timeout   = opts[:timeout]
    cookies   = opts[:cookies]
    update_cookies = opts[:update_cookies]

    async = opts[:async]
    async = true if async == nil

    follow_location = opts[:follow_location] || false
    headers = opts[:headers] || {}

    #
    # the exception jail function wraps the block passed to it
    # in exception handling and runs it
    #
    # how cool is Ruby? Seriously....
    #
    exception_jail {
        headers           = @init_headers.merge( headers )
        headers['cookie'] = get_cookies_str( cookies, false ) if cookies

        params = params.merge( { @rand_seed => '' } ) if !remove_id

        #
        # There are cases where the url already has a query and we also have
        # some params to work with. Some webapp frameworks will break
        # or get confused...plus the url will not be RFC compliant.
        #
        # Thus we need to merge the provided params with the
        # params of the url query and remove the latter from the url.
        #
        cparams = params.dup
        curl    = normalize_url( url.dup )

        if opts[:method] != :post
            begin
                cparams = q_to_h( curl ).merge( cparams )
                curl.gsub!( "?#{URI(curl).query}", '' ) if URI(curl).query
            rescue
                return
            end
        end

        opts = {
            :headers => headers,
            :params  => cparams.empty? ? nil : cparams,
            :method  => opts[:method].nil? ? :get : opts[:method]
        }.merge( @opts )

        opts[:follow_location] = follow_location if follow_location
        opts[:timeout]         = timeout if timeout

        req = Typhoeus::Request.new( curl, opts )
        req.train!          if train
        req.update_cookies! if update_cookies

        queue( req, async )
        return req
    }
end
#reset ⇒ Object
# File 'lib/arachni/http.rb', line 81

def reset
    opts = Options.instance

    # someone wants to reset us although nothing has been *set* in the first place
    # otherwise we'd have a url in opts
    return if !opts.url

    req_limit = opts.http_req_limit

    hydra_opts = {
        :max_concurrency => req_limit,
        :username        => opts.url.user,
        :password        => opts.url.password,
        :method          => :auto,
    }

    @hydra      = Typhoeus::Hydra.new( hydra_opts )
    @hydra_sync = Typhoeus::Hydra.new( hydra_opts.merge( :max_concurrency => 1 ) )

    @hydra.disable_memoization
    @hydra_sync.disable_memoization

    @trainer = Arachni::Module::Trainer.new
    @trainer.http = self

    @init_headers = {
        'cookie'     => '',
        'From'       => opts.authed_by || '',
        'Accept'     => 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
        'User-Agent' => opts.user_agent
    }.merge( opts.custom_headers )

    cookies = {}
    cookies.merge!( self.class.parse_cookiejar( opts.cookie_jar ) ) if opts.cookie_jar
    cookies.merge!( opts.cookies ) if opts.cookies

    set_cookies( cookies ) if !cookies.empty?

    proxy_opts = {}
    proxy_opts = {
        :proxy          => "#{opts.proxy_addr}:#{opts.proxy_port}",
        :proxy_username => opts.proxy_user,
        :proxy_password => opts.proxy_pass,
        :proxy_type     => opts.proxy_type
    } if opts.proxy_addr

    @opts = {
        :follow_location               => false,
        :disable_ssl_peer_verification => true,
        :timeout                       => 50000
    }.merge( proxy_opts )

    @request_count  = 0
    @response_count = 0
    @time_out_count = 0

    # we'll use it to identify our requests
    @rand_seed = seed( )

    @curr_res_time = 0
    @curr_res_cnt  = 0

    @after_run = []
end
#run ⇒ Object
Runs Hydra (all the asynchronous queued HTTP requests)
Should only be called by the framework after all module threads have been joined!
# File 'lib/arachni/http.rb', line 154

def run
    exception_jail {
        @hydra.run

        @after_run.each { |block| block.call }
        @after_run.clear

        call_after_run_persistent( )

        @curr_res_time = 0
        @curr_res_cnt  = 0
    }
end
#set_cookies(cookies) ⇒ void
This method returns an undefined value.
Sets cookies for the HTTP session
# File 'lib/arachni/http.rb', line 483

def set_cookies( cookies )
    @init_headers['cookie'] = ''
    @cookie_jar = cookies.each_pair {
        |name, value|
        @init_headers['cookie'] += "#{name}=#{value};"
    }
end
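The header-building part of the method above, isolated as a standalone sketch: a cookie hash is flattened into `name=value;` pairs for the 'cookie' request header (`cookies_to_header` is a hypothetical name):

```ruby
# Flatten a cookie Hash into the string form used for the
# 'cookie' request header.
def cookies_to_header( cookies )
    header = ''
    cookies.each_pair {
        |name, value|
        header += "#{name}=#{value};"
    }
    header
end

cookies_to_header( 'session' => 'abc123', 'lang' => 'en' )
# => "session=abc123;lang=en;"
```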
#trace(url, opts = { }) ⇒ Typhoeus::Request
Sends an HTTP TRACE request to “url”.
# File 'lib/arachni/http.rb', line 400

def trace( url, opts = { } )
    request( url, opts.merge( :method => :trace ) )
end
#update_cookies(cookies) ⇒ Object
# File 'lib/arachni/http.rb', line 472

def update_cookies( cookies )
    set_cookies( current_cookies.merge( cookies ) )
end