Module: RDoc::TokenStream
- Included in: AnyMethod, Parser::Ruby
- Defined in: lib/rdoc/token_stream.rb
Overview
A TokenStream is a list of tokens, gathered during the parse of some entity (say a method). Entities populate these streams by being registered with the lexer. Any class can collect tokens by including TokenStream. From the outside, you use such an object by calling the start_collecting_tokens method, followed by calls to add_token and pop_token.
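For orientation, here is a minimal sketch of that workflow (not taken from the RDoc documentation). The MethodLike class and the hello tokens are hypothetical; the token hashes use only the :kind and :text keys that ::to_html and #tokens_to_s read from each token, while real parsers may attach additional keys.

require 'rdoc'

class MethodLike                        # hypothetical collector
  include RDoc::TokenStream
end

m = MethodLike.new
m.start_collecting_tokens :ruby         # alias of #collect_tokens
m.add_token  kind: :on_kw,    text: 'def'
m.add_tokens [{ kind: :on_sp,    text: ' ' },
              { kind: :on_ident, text: 'hello' },
              { kind: :on_sp,    text: ' ' }]
m.pop_token                             # drop the trailing space again
m.tokens_to_s                           # => "def hello"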
Class Method Summary
- .to_html(token_stream) ⇒ Object
  Converts token_stream to HTML, wrapping various tokens with <span> elements.
Instance Method Summary
- #add_token(token) ⇒ Object
  Adds one token to the collected tokens.
- #add_tokens(tokens) ⇒ Object
  Adds tokens to the collected tokens.
- #collect_tokens(language) ⇒ Object (also: #start_collecting_tokens)
  Starts collecting tokens.
- #pop_token ⇒ Object
  Removes the last token from the collected tokens.
- #source_language ⇒ Object
  Returns the source language of the token stream as a string.
- #token_stream ⇒ Object
  Current token stream.
- #tokens_to_s ⇒ Object
  Returns a string representation of the token stream.
Class Method Details
.to_html(token_stream) ⇒ Object
Converts token_stream to HTML, wrapping various tokens with <span> elements. Some token types are wrapped in spans with the given class names; other token types are not wrapped in spans.
# File 'lib/rdoc/token_stream.rb', line 17

def self.to_html(token_stream)
  starting_title = false
  token_stream.map do |t|
    next unless t

    style = case t[:kind]
            when :on_const   then 'ruby-constant'
            when :on_kw      then 'ruby-keyword'
            when :on_ivar    then 'ruby-ivar'
            when :on_cvar    then 'ruby-identifier'
            when :on_gvar    then 'ruby-identifier'
            when '=' != t[:text] && :on_op then 'ruby-operator'
            when :on_tlambda then 'ruby-operator'
            when :on_ident   then 'ruby-identifier'
            when :on_label   then 'ruby-value'
            when :on_backref, :on_dstring then 'ruby-node'
            when :on_comment then 'ruby-comment'
            when :on_embdoc  then 'ruby-comment'
            when :on_regexp  then 'ruby-regexp'
            when :on_tstring then 'ruby-string'
            when :on_int, :on_float, :on_rational, :on_imaginary,
                 :on_heredoc, :on_symbol, :on_CHAR then 'ruby-value'
            when :on_heredoc_beg, :on_heredoc_end then 'ruby-identifier'
            end

    text = t[:text]

    if :on_ident == t[:kind] && starting_title
      starting_title = false
      style = 'ruby-identifier ruby-title'
    end

    if :on_kw == t[:kind] and 'def' == t[:text]
      starting_title = true
    end

    text = CGI.escapeHTML text

    if style then
      end_with_newline = text.end_with?("\n")
      text = text.chomp if end_with_newline
      "<span class=\"#{style}\">#{text}</span>#{"\n" if end_with_newline}"
    else
      text
    end
  end.join
end
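As an illustration (hypothetical tokens, not from the RDoc documentation), feeding ::to_html a short stream of hashes in the shape the method expects produces spans for the styled tokens and plain text for the rest; the identifier that follows a def keyword additionally receives the ruby-title class:

require 'rdoc'

tokens = [
  { kind: :on_kw,    text: 'def' },
  { kind: :on_sp,    text: ' ' },    # no style: emitted as plain text
  { kind: :on_ident, text: 'hi' }
]

RDoc::TokenStream.to_html(tokens)
# => "<span class=\"ruby-keyword\">def</span> <span class=\"ruby-identifier ruby-title\">hi</span>"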
Instance Method Details
#add_token(token) ⇒ Object
Adds one token to the collected tokens
# File 'lib/rdoc/token_stream.rb', line 81

def add_token(token)
  @token_stream.push(token)
end
#add_tokens(tokens) ⇒ Object
Adds tokens to the collected tokens
# File 'lib/rdoc/token_stream.rb', line 74

def add_tokens(tokens)
  @token_stream.concat(tokens)
end
#collect_tokens(language) ⇒ Object Also known as: start_collecting_tokens
Starts collecting tokens
# File 'lib/rdoc/token_stream.rb', line 89

def collect_tokens(language)
  @token_stream = []
  @token_stream_language = language
end
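A small usage sketch (the collector object is hypothetical): calling #collect_tokens resets the stream to an empty array and records the language that #source_language later reports.

require 'rdoc'

collector = Object.new.extend RDoc::TokenStream   # hypothetical receiver
collector.collect_tokens :ruby
collector.token_stream      # => []
collector.source_language   # => "ruby"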
#pop_token ⇒ Object
Removes the last token from the collected tokens
# File 'lib/rdoc/token_stream.rb', line 99

def pop_token
  @token_stream.pop
end
#source_language ⇒ Object
Returns the source language of the token stream as a string
Returns ‘c’ or ‘ruby’
# File 'lib/rdoc/token_stream.rb', line 122

def source_language
  @token_stream_language == :c ? 'c' : 'ruby'
end
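To make the mapping concrete (hypothetical collector object again): a stream collected as :c reports 'c', and any other language symbol reports 'ruby'.

require 'rdoc'

collector = Object.new.extend RDoc::TokenStream   # hypothetical receiver
collector.collect_tokens :c
collector.source_language   # => "c"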
#token_stream ⇒ Object
Current token stream
# File 'lib/rdoc/token_stream.rb', line 106

def token_stream
  @token_stream
end
#tokens_to_s ⇒ Object
Returns a string representation of the token stream
# File 'lib/rdoc/token_stream.rb', line 113

def tokens_to_s
  (token_stream or return '').compact.map { |token| token[:text] }.join ''
end
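For example (hypothetical collector and tokens), joining the :text of each collected token reconstructs the source text; nil entries are dropped by compact, and an object that has never collected tokens returns an empty string because token_stream is nil.

require 'rdoc'

collector = Object.new.extend RDoc::TokenStream   # hypothetical receiver
collector.collect_tokens :ruby
collector.add_tokens [
  { kind: :on_kw,    text: 'def' },
  { kind: :on_sp,    text: ' ' },
  { kind: :on_ident, text: 'hi' },
  nil,                              # skipped by compact
  { kind: :on_sp,    text: '; ' },
  { kind: :on_kw,    text: 'end' }
]
collector.tokens_to_s   # => "def hi; end"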