A detailed example of using Ruby and Nokogiri to write a crawler that exports an RSS feed

Y2J
Release: 2017-05-02 09:42:48

# encoding: utf-8
require 'thread'
require 'nokogiri'
require 'open-uri'
require 'rss/maker'

$result = Queue.new # thread-safe queue shared by all worker threads
# Fetch a project's frameset page, follow the second <frame> to its README,
# and queue [index, name, url, last-modified date, summary] for the RSS feed.
def extract_readme_header(no, name, url)
  frame = Nokogiri::HTML(open(url))
  return unless frame
  src = frame.css('frame')[1]
  return unless src
  readme = $url + src['src']
  open(readme) do |f|
    doc = Nokogiri::HTML(f.read)
    # Join the first five paragraphs of the README into one summary string.
    text = doc.css("div#content div#filecontents p")[0..4].map { |c| c.content }.join(" ").strip
    return if text.length == 0
    # Skip READMEs that mention Rails (or activ_).
    if text !~ /(rails)|(activ_)/i
      puts "========= #{no} #{name} : #{text[0..50]}"
      date = f.last_modified
      $result << [no, name, readme, date, text]
    end
  end
rescue
  # Network and parsing errors are logged and the project is skipped.
  puts $!.to_s
end
 
# Build an RSS 2.0 document from the collected [no, name, url, date, descr] tuples.
def make_rss(items)
  RSS::Maker.make("2.0") do |m|
    m.channel.title = "GitHub recently updated projects"
    m.channel.link = "http://localhost"
    m.channel.description = "GitHub recently updated projects"
    m.items.do_sort = true # sort feed items by date
    items.each do |no, name, url, date, descr|
      i = m.items.new_item
      i.title = name
      i.link = url
      i.description = descr
      i.date = date
    end
  end
end
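For reference, the RSS 2.0 markup that RSS::Maker emits for each item looks roughly like the following. This is a minimal hand-built sketch using only string templating, so it runs without the rss gem; the helper name `rss_item_sketch` and the sample values are ours, not part of the script above.

```ruby
require 'time' # for Time#rfc822, the date format RSS 2.0 expects

# Minimal sketch of the <item> element that make_rss produces for one
# [no, name, url, date, descr] tuple, hand-built with string templating.
def rss_item_sketch(name, url, date, descr)
  <<~XML
    <item>
      <title>#{name}</title>
      <link>#{url}</link>
      <description>#{descr}</description>
      <pubDate>#{date.rfc822}</pubDate>
    </item>
  XML
end

item = rss_item_sketch("sinatra", "http://rdoc.info/gems/sinatra",
                       Time.utc(2017, 5, 2, 9, 42, 48), "Classy web framework")
puts item
```

A real feed wraps a list of such items in `<rss version="2.0"><channel>…</channel></rss>`, which is exactly the bookkeeping RSS::Maker handles for us.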
 
############################## M A I N ########################
 
############# Scan the list of recent projects

lth = []
$url = "http://rdoc.info"
puts "get url #{$url}..."
doc = Nokogiri::HTML(open($url))
doc.css('ul.libraries')[1].css('li').each_with_index do |li, i|
  aname = li.css('a').first
  name = aname.content
  purl = $url + aname['href']
  # Pass i, name and purl as Thread.new arguments so each thread gets its
  # own copies instead of closing over the loop variables.
  lth << Thread.new(i, name, purl) { |j, n, u| extract_readme_header(j, n, u) }
end
 
################ wait until every README has been read

lth.each { |th| th.join }
 
################ dequeue results and restore the original listing order

result = []
result << $result.shift while $result.size > 0
result.sort! { |a, b| a[0] <=> b[0] } # sort by index, i.e. rdoc.info's order
 
 
################ format the results as RSS
 
File.open("RubyFeeds.rss","w") do |file|
  file.write make_rss(result)
end
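The fan-out/fan-in pattern the main section relies on — one thread per project pushing into a shared Queue, join them all, then drain and sort — can be tried in isolation with core Ruby alone. The worker body below is a stand-in for `extract_readme_header`:

```ruby
results = Queue.new # Queue is thread-safe, so workers can push concurrently

# Fan out: one thread per item; pass i as a Thread.new argument so each
# thread receives its own copy instead of sharing the loop variable.
threads = (0..4).map do |i|
  Thread.new(i) { |j| results << [j, "project-#{j}"] }
end

threads.each(&:join) # fan in: wait for every worker to finish

# Drain the queue (completion order is arbitrary) and restore listing order.
drained = []
drained << results.shift while results.size > 0
drained.sort! { |a, b| a[0] <=> b[0] }

p drained.map(&:last) # ["project-0", "project-1", "project-2", "project-3", "project-4"]
```

Because each tuple carries its original index, the sort makes the output deterministic even though the threads finish in arbitrary order — the same trick the crawler uses with `no`.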


Source: php.cn