Class: Logicality::Lexer::RegexpLexer
- Inherits: Object
- Includes: Grammar
- Defined in: lib/logicality/lexer/regexp_lexer.rb
Overview
This class is a simple lexical token analyzer based on regular expression grammar matchers.
Constant Summary
Constants included from Grammar
Grammar::AND_OP, Grammar::LEFT_PAREN, Grammar::NOT_OP, Grammar::OR_OP, Grammar::RIGHT_PAREN, Grammar::VALUE
Instance Attribute Summary
Class Method Summary
Instance Method Summary
Constructor Details
#initialize(expression) ⇒ RegexpLexer
Returns a new instance of RegexpLexer.
# File 'lib/logicality/lexer/regexp_lexer.rb', line 38
def initialize(expression)
  raise ArgumentError, 'Expression is required' unless expression &&
                                                       expression.to_s.length.positive?

  @expression = expression.to_s

  if invalid_matches.length.positive?
    raise ArgumentError, "Invalid syntax: #{invalid_matches}"
  end

  reset
end
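The `invalid_matches` helper used by the constructor is not shown on this page. One common way to implement this kind of validation, sketched below with a hypothetical `TOY_PATTERN` and helper method (the real implementation may differ), is to delete every valid-token match and all whitespace from the expression, so any characters left over constitute invalid syntax:

```ruby
# Hypothetical toy grammar: values, boolean operators, parentheses.
TOY_PATTERN = /[a-zA-Z0-9_]+|&&|\|\||!|\(|\)/

# Sketch: anything remaining after removing valid tokens and
# whitespace is unrecognized input.
def invalid_matches_for(expression)
  expression.gsub(TOY_PATTERN, '').gsub(/\s+/, '').chars.uniq
end

puts invalid_matches_for('a && b').inspect # empty: expression is valid
puts invalid_matches_for('a %% b').inspect # '%' is not in the grammar
```

With this approach, a non-empty result signals exactly which characters the grammar could not account for, which is what the `"Invalid syntax: ..."` message reports.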
Instance Attribute Details
#expression ⇒ Object
Returns the value of attribute expression.
# File 'lib/logicality/lexer/regexp_lexer.rb', line 36
def expression
  @expression
end
Class Method Details
.invalid_pattern ⇒ Object
# File 'lib/logicality/lexer/regexp_lexer.rb', line 17
def invalid_pattern
  "#{pattern}|(\\s*)"
end
.invalid_regexp ⇒ Object
# File 'lib/logicality/lexer/regexp_lexer.rb', line 21
def invalid_regexp
  Regexp.new(invalid_pattern)
end
.pattern ⇒ Object
# File 'lib/logicality/lexer/regexp_lexer.rb', line 25
def pattern
  Grammar.constants
         .map { |c| Grammar.const_get(c).source }
         .join('|')
end
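`pattern` reflects over the `Grammar` module's constants, extracts each regexp's source text, and joins them into a single alternation. The same technique can be demonstrated with a self-contained sketch using a hypothetical `ToyGrammar` module in place of the real `Grammar`:

```ruby
# Hypothetical stand-in for the Grammar module; each constant is a
# regexp whose source will become one branch of the alternation.
module ToyGrammar
  AND_OP = /&&|and/
  OR_OP  = /\|\||or/
  VALUE  = /[a-zA-Z0-9_]+/
end

# Join each constant's source text with '|' into one combined pattern.
pattern = ToyGrammar.constants
                    .map { |c| ToyGrammar.const_get(c).source }
                    .join('|')

puts pattern
combined = Regexp.new(pattern) # the equivalent of the .regexp method
```

Note that branch order in the alternation follows `Module#constants`, so the order in which the grammar constants are defined determines match precedence.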
.regexp ⇒ Object
# File 'lib/logicality/lexer/regexp_lexer.rb', line 31
def regexp
  Regexp.new(pattern)
end
Instance Method Details
#next_token ⇒ Object
# File 'lib/logicality/lexer/regexp_lexer.rb', line 51
def next_token
  return nil if index > matches.length - 1

  increment

  scan_array = matches[index]

  return nil unless scan_array

  tokens = scan_array.map.with_index do |value, index|
    const = Grammar.constants[index]

    value ? Token.new(const, value) : nil
  end.compact

  raise ArgumentError, "Too many tokens found for: #{scan_array}" if tokens.length > 1
  raise ArgumentError, "Cannot tokenize: #{scan_array}" if tokens.length.zero?

  tokens.first
end
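Each element of `matches` is a capture-group array produced by scanning the expression against the combined regexp: exactly one group should be non-nil per match, and that group's index identifies the corresponding `Grammar` constant. A self-contained sketch of this mapping, using a hypothetical `MiniGrammar` module and a simple `Struct`-based token (the real `Token` class may differ):

```ruby
# Hypothetical mini-grammar: one capture group per token type,
# in the same order as the module's constants.
module MiniGrammar
  AND_OP = /(&&)/
  VALUE  = /([a-z]+)/
end

Token = Struct.new(:type, :value)

combined = Regexp.new(MiniGrammar.constants
                                 .map { |c| MiniGrammar.const_get(c).source }
                                 .join('|'))

# String#scan returns one capture-group array per match;
# only the group that actually fired is non-nil.
matches = 'a && b'.scan(combined)

tokens = matches.map do |groups|
  i = groups.index { |g| !g.nil? }
  Token.new(MiniGrammar.constants[i], groups[i])
end

tokens.each { |t| puts "#{t.type}: #{t.value}" }
```

The two `raise` guards in `next_token` enforce exactly this invariant: a scan array with more than one non-nil group, or none at all, means the grammar and the input disagree.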
#reset ⇒ Object
# File 'lib/logicality/lexer/regexp_lexer.rb', line 72
def reset
  @index = -1

  self
end