Class: LogStash::Filters::Grok
Defined in: lib/logstash/filters/grok.rb
Overview
Parse arbitrary text and structure it.
Grok is currently the best way in logstash to parse crappy unstructured log data into something structured and queryable.
This tool is perfect for syslog logs, apache and other webserver logs, mysql logs, and in general, any log format that is generally written for humans and not computer consumption.
Logstash ships with about 120 patterns by default. You can find them here: <github.com/logstash/logstash/tree/v%VERSION%/patterns>. You can add your own trivially. (See the patterns_dir setting)
If you need help building patterns to match your logs, you will find the <grokdebug.herokuapp.com> tool quite useful!
#### Grok Basics
Grok works by combining text patterns into something that matches your logs.
The syntax for a grok pattern is `%{SYNTAX:SEMANTIC}`
The `SYNTAX` is the name of the pattern that will match your text. For example, "3.44" will be matched by the NUMBER pattern and "55.3.244.1" will be matched by the IP pattern. The syntax is how you match.
The `SEMANTIC` is the identifier you give to the piece of text being matched. For example, "3.44" could be the duration of an event, so you could call it simply 'duration'. Further, a string "55.3.244.1" might identify the 'client' making a request.
Optionally you can add a data type conversion to your grok pattern. By default all semantics are saved as strings. If you wish to convert a semantic's data type, for example to change a string to an integer, suffix it with the target data type. For example `%{NUMBER:num:int}` converts the 'num' semantic from a string to an integer. Currently the only supported conversions are `int` and `float`.
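As a minimal sketch, here is a filter applying both conversions (the field names are only illustrative):

    filter {
      grok {
        # bytes becomes an integer, duration a float; everything else stays a string
        match => [ "message", "%{NUMBER:bytes:int} %{NUMBER:duration:float}" ]
      }
    }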
#### Example
With that idea of a syntax and semantic, we can pull out useful fields from a sample log like this fictional http request log:
    55.3.244.1 GET /index.html 15824 0.043
The pattern for this could be:
    %{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}
For a more realistic example, let's read these logs from a file:
    input {
      file {
        path => "/var/log/http.log"
      }
    }

    filter {
      grok {
        match => [ "message", "%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}" ]
      }
    }
After the grok filter, the event will have a few extra fields in it:
- client: 55.3.244.1
- method: GET
- request: /index.html
- bytes: 15824
- duration: 0.043
#### Regular Expressions
Grok sits on top of regular expressions, so any regular expressions are valid in grok as well. The regular expression library is Oniguruma, and you can see the full supported regexp syntax [on the Oniguruma site](www.geocities.jp/kosako3/oniguruma/doc/RE.txt).
#### Custom Patterns
Sometimes logstash doesn’t have a pattern you need. For this, you have a few options.
First, you can use the Oniguruma syntax for ‘named capture’ which will let you match a piece of text and save it as a field:
    (?<field_name>the pattern here)
For example, postfix logs have a 'queue id' that is a 10- or 11-character hexadecimal value. I can capture that easily like this:
    (?<queue_id>[0-9A-F]{10,11})
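A minimal sketch of that capture used directly in a `match`, with no custom pattern file involved:

    filter {
      grok {
        # inline Oniguruma named capture: the matched text is saved as 'queue_id'
        match => [ "message", "(?<queue_id>[0-9A-F]{10,11})" ]
      }
    }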
Alternately, you can create a custom patterns file.
- Create a directory called `patterns` with a file in it called `extra` (the file name doesn't matter, but name it meaningfully for yourself)
- In that file, write the pattern you need as the pattern name, a space, then the regexp for that pattern.
For example, doing the postfix queue id example as above:
    # in ./patterns/postfix
    POSTFIX_QUEUEID [0-9A-F]{10,11}
Then use the `patterns_dir` setting in this plugin to tell logstash where your custom patterns directory is. Here's a full example with a sample log:
    Jan 1 06:25:43 mailserver14 postfix/cleanup[21403]: BEF25A72965: message-id=<[email protected]>
    filter {
      grok {
        patterns_dir => "./patterns"
        match => [ "message", "%{SYSLOGBASE} %{POSTFIX_QUEUEID:queue_id}: %{GREEDYDATA:syslog_message}" ]
      }
    }
The above will match and result in the following fields:
- timestamp: Jan 1 06:25:43
- logsource: mailserver14
- program: postfix/cleanup
- pid: 21403
- queue_id: BEF25A72965
- syslog_message: message-id=<[email protected]>
The `timestamp`, `logsource`, `program`, and `pid` fields come from the SYSLOGBASE pattern, which itself is defined by other patterns.
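For reference, those shipped definitions look roughly like this (abridged from the bundled patterns file; check your version for the exact regexps):

    SYSLOGTIMESTAMP %{MONTH} +%{MONTHDAY} %{TIME}
    SYSLOGPROG %{PROG:program}(?:\[%{POSINT:pid}\])?
    SYSLOGBASE %{SYSLOGTIMESTAMP:timestamp} (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource} %{SYSLOGPROG}: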
Instance Method Summary

- #filter(event) ⇒ Object
- #initialize(params) ⇒ Grok (constructor): A new instance of Grok.
- #register ⇒ Object
Methods inherited from Plugin
#eql?, #finished, #finished?, #hash, #inspect, lookup, #reload, #running?, #shutdown, #teardown, #terminating?, #to_s
Constructor Details
#initialize(params) ⇒ Grok
Returns a new instance of Grok.
    # File 'lib/logstash/filters/grok.rb', line 224

    def initialize(params)
      super(params)
      @match["message"] ||= []
      @match["message"] += @pattern if @pattern # the config 'pattern' value (array)

      # a cache of capture name handler methods.
      @handlers = {}
    end
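As the constructor shows, any patterns given via the older `pattern` setting are folded into `match` on the `message` field, so these two configurations behave the same (a sketch, assuming `pattern` is still accepted in your version):

    # older 'pattern' form
    filter { grok { pattern => "%{IP:client}" } }

    # the 'match' form it is normalized into
    filter { grok { match => [ "message", "%{IP:client}" ] } }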
Instance Method Details
#filter(event) ⇒ Object
    # File 'lib/logstash/filters/grok.rb', line 282

    def filter(event)
      return unless filter?(event)

      matched = false
      done = false

      @logger.debug? and @logger.debug("Running grok filter", :event => event);
      @patterns.each do |field, grok|
        if match(grok, field, event)
          matched = true
          break if @break_on_match
        end
        #break if done
      end # @patterns.each

      if matched
        filter_matched(event)
      else
        # Tag this event if we can't parse it. We can use this later to
        # reparse+reindex logs if we improve the patterns given.
        @tag_on_failure.each do |tag|
          event["tags"] ||= []
          event["tags"] << tag unless event["tags"].include?(tag)
        end
      end

      @logger.debug? and @logger.debug("Event now: ", :event => event)
    end
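The tagging branch above is driven by the `tag_on_failure` setting, whose default is `_grokparsefailure`. A sketch of extending it so unparsed events are easy to find later:

    filter {
      grok {
        match => [ "message", "%{IP:client} %{WORD:method}" ]
        # applied only to events that match none of the patterns
        tag_on_failure => [ "_grokparsefailure", "needs_review" ]
      }
    }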
#register ⇒ Object
    # File 'lib/logstash/filters/grok.rb', line 233

    def register
      require "grok-pure" # rubygem 'jls-grok'

      @patternfiles = []

      # Have @@patterns_path show first. Last-in pattern definitions win; this
      # will let folks redefine built-in patterns at runtime.
      @patterns_dir = @@patterns_path.to_a + @patterns_dir
      @logger.info? and @logger.info("Grok patterns path", :patterns_dir => @patterns_dir)
      @patterns_dir.each do |path|
        # Can't read relative paths from jars, try to normalize away '../'
        while path =~ /file:\/.*\.jar!.*\/\.\.\//
          # replace /foo/bar/../baz => /foo/baz
          path = path.gsub(/[^\/]+\/\.\.\//, "")
          @logger.debug? and @logger.debug("In-jar path to read", :path => path)
        end

        if File.directory?(path)
          path = File.join(path, "*")
        end

        Dir.glob(path).each do |file|
          @logger.info? and @logger.info("Grok loading patterns from file", :path => file)
          @patternfiles << file
        end
      end

      @patterns = Hash.new { |h,k| h[k] = [] }

      @logger.info? and @logger.info("Match data", :match => @match)
      @match.each do |field, patterns|
        patterns = [patterns] if patterns.is_a?(String)

        if !@patterns.include?(field)
          @patterns[field] = Grok::Pile.new
          #@patterns[field].logger = @logger

          add_patterns_from_files(@patternfiles, @patterns[field])
        end
        @logger.info? and @logger.info("Grok compile", :field => field, :patterns => patterns)
        patterns.each do |pattern|
          @logger.debug? and @logger.debug("regexp: #{@type}/#{field}", :pattern => pattern)
          @patterns[field].compile(pattern)
        end
      end # @match.each
    end
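Because `register` loads `@@patterns_path` first and last-in definitions win, a file in your own `patterns_dir` can redefine a shipped pattern. A hypothetical override file:

    # ./patterns/overrides (hypothetical file name)
    # Redefines the built-in WORD pattern to match uppercase words only;
    # loaded after the shipped patterns, so this definition wins.
    WORD \b[A-Z]+\b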