Class: Cobweb
- Inherits: Object
  - Object
  - Cobweb
- Defined in: lib/cobweb.rb
Class Method Summary
- .version ⇒ Object
  TODO: redesign to have a Resque stack and a single-threaded stack; DRY up the code below, which has a lot of duplication; detect the end of the crawl (queued == 0?); on end of crawl, return the statistics hash if single-threaded (could call a specified method?), or enqueue the stats hash to a specified queue; investigate using EventMachine for single-threaded crawling.
Instance Method Summary
- #deep_symbolize_keys(hash) ⇒ Object
- #get(url, options = @options) ⇒ Object
- #head(url, options = @options) ⇒ Object
- #initialize(options = {}) ⇒ Cobweb (constructor)
  A new instance of Cobweb.
- #method_missing(method_sym, *arguments, &block) ⇒ Object
- #start(base_url) ⇒ Object
Constructor Details
#initialize(options = {}) ⇒ Cobweb
Returns a new instance of Cobweb.
# File 'lib/cobweb.rb', line 35

def initialize(options = {})
  @options = options
  default_use_encoding_safe_process_job_to false
  default_follow_redirects_to true
  default_redirect_limit_to 10
  default_processing_queue_to CobwebProcessJob
  default_crawl_finished_queue_to CobwebFinishedJob
  default_quiet_to true
  default_debug_to false
  default_cache_to 300
  default_timeout_to 10
  default_redis_options_to Hash.new
  default_internal_urls_to []
  default_first_page_redirect_internal_to true
end
Dynamic Method Handling
This class handles dynamic methods through the method_missing method
#method_missing(method_sym, *arguments, &block) ⇒ Object
# File 'lib/cobweb.rb', line 26

def method_missing(method_sym, *arguments, &block)
  if method_sym.to_s =~ /^default_(.*)_to$/
    tag_name = method_sym.to_s.split("_")[1..-2].join("_").to_sym
    @options[tag_name] = arguments[0] unless @options.has_key?(tag_name)
  else
    super
  end
end
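The `default_*_to` idiom can be exercised outside Cobweb with a minimal standalone sketch (the class name `DefaultsDemo` and its option names are illustrative, not part of the gem):

```ruby
# Minimal sketch of the default_*_to idiom from Cobweb#initialize:
# any call matching default_<name>_to(value) sets @options[:<name>]
# unless the caller already supplied that key.
class DefaultsDemo
  attr_reader :options

  def initialize(options = {})
    @options = options
    default_timeout_to 10        # no-op if the caller passed :timeout
    default_redirect_limit_to 5
  end

  def method_missing(method_sym, *arguments, &block)
    if method_sym.to_s =~ /^default_(.*)_to$/
      # "default_redirect_limit_to" -> :redirect_limit
      tag_name = method_sym.to_s.split("_")[1..-2].join("_").to_sym
      @options[tag_name] = arguments[0] unless @options.has_key?(tag_name)
    else
      super
    end
  end
end

demo = DefaultsDemo.new(:timeout => 99)
puts demo.options[:timeout]         # => 99 (caller's value wins)
puts demo.options[:redirect_limit]  # => 5  (default applied)
```

The regexp gate means unrecognised method names still fall through to `super` and raise `NoMethodError` as usual.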
Class Method Details
.version ⇒ Object
TODO: redesign to have a Resque stack and a single-threaded stack; DRY up the code below, which has a lot of duplication; detect the end of the crawl (queued == 0?); on end of crawl, return the statistics hash if single-threaded (could call a specified method?), or enqueue the stats hash to a specified queue; investigate using EventMachine for single-threaded crawling.
# File 'lib/cobweb.rb', line 22

def self.version
  "0.0.39"
end
Instance Method Details
#deep_symbolize_keys(hash) ⇒ Object
# File 'lib/cobweb.rb', line 348

def deep_symbolize_keys(hash)
  hash.keys.each do |key|
    value = hash[key]
    hash.delete(key)
    hash[key.to_sym] = value
    if hash[key.to_sym].instance_of? Hash
      hash[key.to_sym] = deep_symbolize_keys(hash[key.to_sym])
    end
  end
  hash
end
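A standalone copy of the method shows its behaviour: it converts all keys of a (possibly nested) Hash to symbols, mutating the hash in place.

```ruby
# Standalone copy of Cobweb#deep_symbolize_keys for illustration.
# hash.keys is a snapshot array, so deleting and re-adding keys
# while iterating is safe.
def deep_symbolize_keys(hash)
  hash.keys.each do |key|
    value = hash[key]
    hash.delete(key)
    hash[key.to_sym] = value
    if hash[key.to_sym].instance_of? Hash
      hash[key.to_sym] = deep_symbolize_keys(hash[key.to_sym])
    end
  end
  hash
end

h = deep_symbolize_keys("a" => 1, "b" => { "c" => 2 })
puts h.inspect  # => {:a=>1, :b=>{:c=>2}}
```

Note that only Hash values recurse; arrays of hashes are left untouched, which is sufficient for the Marshal-loaded cache entries and response headers it is applied to here.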
#get(url, options = @options) ⇒ Object
# File 'lib/cobweb.rb', line 79

def get(url, options = @options)
  raise "url cannot be nil" if url.nil?
  uri = Addressable::URI.parse(url)
  uri.fragment = nil
  url = uri.to_s

  # get the unique id for this request
  unique_id = Digest::SHA1.hexdigest(url.to_s)
  if options.has_key?(:redirect_limit) and !options[:redirect_limit].nil?
    redirect_limit = options[:redirect_limit].to_i
  else
    redirect_limit = 10
  end

  # connect to redis
  if options.has_key? :crawl_id
    redis = NamespacedRedis.new(@options[:redis_options], "cobweb-#{Cobweb.version}-#{options[:crawl_id]}")
  else
    redis = NamespacedRedis.new(@options[:redis_options], "cobweb-#{Cobweb.version}")
  end

  content = {:base_url => url}

  # check if it has already been cached
  if redis.get(unique_id) and @options[:cache]
    puts "Cache hit for #{url}" unless @options[:quiet]
    content = deep_symbolize_keys(Marshal.load(redis.get(unique_id)))
  else
    # this url is valid for processing so lets get on with it
    # TODO: the @http here is different from in head. Should it be? - in head we are using a method-scoped variable.

    # retrieve data
    unless @http && @http.address == uri.host && @http.port == uri.inferred_port
      puts "Creating connection to #{uri.host}..." unless @options[:quiet]
      @http = Net::HTTP.new(uri.host, uri.inferred_port)
    end
    if uri.scheme == "https"
      @http.use_ssl = true
      @http.verify_mode = OpenSSL::SSL::VERIFY_NONE
    end

    request_time = Time.now.to_f
    @http.read_timeout = @options[:timeout].to_i
    @http.open_timeout = @options[:timeout].to_i
    begin
      print "Retrieving #{url}... " unless @options[:quiet]
      request = Net::HTTP::Get.new uri.request_uri
      response = @http.request request

      if @options[:follow_redirects] and response.code.to_i >= 300 and response.code.to_i < 400
        puts "redirected..." unless @options[:quiet]

        # get location to redirect to
        url = UriHelper.join_no_fragment(uri, response['location'])

        # decrement redirect limit
        redirect_limit = redirect_limit - 1

        # raise exception if we're being redirected to somewhere we've been redirected to in this content request
        #raise RedirectError("Loop detected in redirect for - #{url}") if content[:redirect_through].include? url

        # raise exception if redirect limit has reached 0
        raise RedirectError, "Redirect Limit reached" if redirect_limit == 0

        # get the content from redirect location
        content = get(url, options.merge(:redirect_limit => redirect_limit))
        content[:url] = uri.to_s
        content[:redirect_through] = [] if content[:redirect_through].nil?
        content[:redirect_through].insert(0, url)

        content[:response_time] = Time.now.to_f - request_time
      else
        content[:response_time] = Time.now.to_f - request_time

        puts "Retrieved." unless @options[:quiet]

        # create the content container
        content[:url] = uri.to_s
        content[:status_code] = response.code.to_i
        content[:mime_type] = ""
        content[:mime_type] = response.content_type.split(";")[0].strip unless response.content_type.nil?
        if !response["Content-Type"].nil? && response["Content-Type"].include?(";")
          charset = response["Content-Type"][response["Content-Type"].index(";")+2..-1]
          charset = charset[charset.index("=")+1..-1] if charset and charset.include?("=")
          content[:character_set] = charset
        end
        content[:length] = response.content_length
        if content[:mime_type].include?("text/html") or content[:mime_type].include?("application/xhtml+xml")
          if response["Content-Encoding"] == "gzip"
            content[:body] = Zlib::GzipReader.new(StringIO.new(response.body)).read
          else
            content[:body] = response.body
          end
        else
          content[:body] = Base64.encode64(response.body)
        end
        content[:location] = response["location"]
        content[:headers] = deep_symbolize_keys(response.to_hash)

        # parse data for links
        link_parser = ContentLinkParser.new(content[:url], content[:body])
        content[:links] = link_parser.link_data
      end

      # add content to cache if required
      if @options[:cache]
        redis.set(unique_id, Marshal.dump(content))
        redis.expire unique_id, @options[:cache].to_i
      end
    rescue RedirectError => e
      puts "ERROR: #{e.message}"

      ## generate a blank content
      content = {}
      content[:url] = uri.to_s
      content[:response_time] = Time.now.to_f - request_time
      content[:status_code] = 0
      content[:length] = 0
      content[:body] = ""
      content[:error] = e.message
      content[:mime_type] = "error/dnslookup"
      content[:headers] = {}
      content[:links] = {}
    rescue SocketError => e
      puts "ERROR: SocketError #{e.message}"

      ## generate a blank content
      content = {}
      content[:url] = uri.to_s
      content[:response_time] = Time.now.to_f - request_time
      content[:status_code] = 0
      content[:length] = 0
      content[:body] = ""
      content[:error] = e.message
      content[:mime_type] = "error/dnslookup"
      content[:headers] = {}
      content[:links] = {}
    rescue Timeout::Error => e
      puts "ERROR Timeout::Error: #{e.message}"

      ## generate a blank content
      content = {}
      content[:url] = uri.to_s
      content[:response_time] = Time.now.to_f - request_time
      content[:status_code] = 0
      content[:length] = 0
      content[:body] = ""
      content[:error] = e.message
      content[:mime_type] = "error/serverdown"
      content[:headers] = {}
      content[:links] = {}
    end
  end
  content
end
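The character-set handling in `#get` is plain string slicing on the `Content-Type` header, and can be exercised on its own. This sketch extracts the same logic into a hypothetical helper (`extract_charset` is not part of Cobweb):

```ruby
# Sketch of the Content-Type charset extraction used in Cobweb#get:
# take everything after "; ", then everything after "=".
def extract_charset(content_type)
  return nil if content_type.nil? || !content_type.include?(";")
  charset = content_type[content_type.index(";") + 2..-1]
  charset = charset[charset.index("=") + 1..-1] if charset && charset.include?("=")
  charset
end

puts extract_charset("text/html; charset=UTF-8").inspect  # => "UTF-8"
puts extract_charset("text/html").inspect                 # => nil
```

Note the `index(";") + 2` assumes a single space after the semicolon, as the original code does; a header like `text/html;charset=UTF-8` (no space) would lose its first character.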
#head(url, options = @options) ⇒ Object
# File 'lib/cobweb.rb', line 237

def head(url, options = @options)
  raise "url cannot be nil" if url.nil?
  uri = Addressable::URI.parse(url)
  uri.fragment = nil
  url = uri.to_s

  # get the unique id for this request
  unique_id = Digest::SHA1.hexdigest(url)
  if options.has_key?(:redirect_limit) and !options[:redirect_limit].nil?
    redirect_limit = options[:redirect_limit].to_i
  else
    redirect_limit = 10
  end

  # connect to redis
  if options.has_key? :crawl_id
    redis = NamespacedRedis.new(@options[:redis_options], "cobweb-#{Cobweb.version}-#{options[:crawl_id]}")
  else
    redis = NamespacedRedis.new(@options[:redis_options], "cobweb-#{Cobweb.version}")
  end

  content = {}

  # check if it has already been cached
  if redis.get("head-#{unique_id}") and @options[:cache]
    puts "Cache hit for #{url}" unless @options[:quiet]
    content = deep_symbolize_keys(Marshal.load(redis.get("head-#{unique_id}")))
  else
    print "Retrieving #{url}... " unless @options[:quiet]

    # retrieve data
    http = Net::HTTP.new(uri.host, uri.inferred_port)
    if uri.scheme == "https"
      http.use_ssl = true
      http.verify_mode = OpenSSL::SSL::VERIFY_NONE
    end

    request_time = Time.now.to_f
    http.read_timeout = @options[:timeout].to_i
    http.open_timeout = @options[:timeout].to_i
    begin
      request = Net::HTTP::Head.new uri.request_uri
      response = http.request request

      if @options[:follow_redirects] and response.code.to_i >= 300 and response.code.to_i < 400
        puts "redirected..." unless @options[:quiet]
        url = UriHelper.join_no_fragment(uri, response['location'])
        redirect_limit = redirect_limit - 1
        options = options.clone
        options[:redirect_limit] = redirect_limit
        content = head(url, options)
        content[:url] = uri.to_s
        content[:redirect_through] = [] if content[:redirect_through].nil?
        content[:redirect_through].insert(0, url)
      else
        content[:url] = uri.to_s
        content[:status_code] = response.code.to_i
        unless response.content_type.nil?
          content[:mime_type] = response.content_type.split(";")[0].strip
          if !response["Content-Type"].nil? and response["Content-Type"].include?(";")
            charset = response["Content-Type"][response["Content-Type"].index(";")+2..-1]
            charset = charset[charset.index("=")+1..-1] if charset and charset.include?("=")
            content[:character_set] = charset
          end
        end

        # add content to cache if required
        if @options[:cache]
          puts "Stored in cache [head-#{unique_id}]" if @options[:debug]
          redis.set("head-#{unique_id}", Marshal.dump(content))
          redis.expire "head-#{unique_id}", @options[:cache].to_i
        else
          puts "Not storing in cache as cache disabled" if @options[:debug]
        end
      end
    rescue SocketError => e
      puts "ERROR: #{e.message}"

      ## generate a blank content
      content = {}
      content[:url] = uri.to_s
      content[:response_time] = Time.now.to_f - request_time
      content[:status_code] = 0
      content[:length] = 0
      content[:body] = ""
      content[:error] = e.message
      content[:mime_type] = "error/dnslookup"
      content[:headers] = {}
      content[:links] = {}
    rescue Timeout::Error => e
      puts "ERROR: #{e.message}"

      ## generate a blank content
      content = {}
      content[:url] = uri.to_s
      content[:response_time] = Time.now.to_f - request_time
      content[:status_code] = 0
      content[:length] = 0
      content[:body] = ""
      content[:error] = e.message
      content[:mime_type] = "error/serverdown"
      content[:headers] = {}
      content[:links] = {}
    end
    content
  end
end
#start(base_url) ⇒ Object
# File 'lib/cobweb.rb', line 52

def start(base_url)
  raise ":base_url is required" unless base_url
  request = {
    :crawl_id => Digest::SHA1.hexdigest("#{Time.now.to_i}.#{Time.now.usec}"),
    :url => base_url
  }

  if @options[:internal_urls].empty?
    uri = Addressable::URI.parse(base_url)
    @options[:internal_urls] << [uri.scheme, "://", uri.host, "/*"].join
  end

  request.merge!(@options)

  @redis = NamespacedRedis.new(request[:redis_options], "cobweb-#{Cobweb.version}-#{request[:crawl_id]}")
  @redis.hset "statistics", "queued_at", DateTime.now
  @redis.set("crawl-counter", 0)
  @redis.set("queue-counter", 1)

  @stats = Stats.new(request)
  @stats.start_crawl(request)

  # add internal_urls into redis
  @options[:internal_urls].map{|url| @redis.sadd("internal_urls", url)}
  Resque.enqueue(CrawlJob, request)
end
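When no `:internal_urls` are supplied, `#start` derives a default wildcard pattern from the base URL's scheme and host. The derivation can be shown in isolation; Cobweb uses `Addressable::URI`, but Ruby's stdlib `URI` behaves the same for scheme and host and keeps this sketch self-contained:

```ruby
require 'uri'

# Sketch of the default internal_urls pattern built in Cobweb#start:
# scheme + "://" + host + "/*", so the whole site is treated as internal.
# (Cobweb itself uses Addressable::URI; stdlib URI is a stand-in here.)
uri = URI.parse("http://example.com/some/page")
pattern = [uri.scheme, "://", uri.host, "/*"].join
puts pattern  # => http://example.com/*
```

Any page whose URL matches this pattern is crawled; everything else is treated as external and only recorded, not followed.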