Class: Beaker::TestSuiteResult

Inherits: Object

Defined in: lib/beaker/test_suite_result.rb

Overview

Holds the output of a test suite and formats it as plain text or JUnit XML.

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(options, name) ⇒ TestSuiteResult

Create a Beaker::TestSuiteResult instance.

Parameters:

  • options (Hash{Symbol=>String})

    Options for this object

  • name (String)

    The name of the Beaker::TestSuite that the results are for

Options Hash (options):

  • :logger (Logger)

    The Logger object to report information to



# File 'lib/beaker/test_suite_result.rb', line 15

def initialize(options, name)
  @options = options
  @logger = options[:logger]
  @name = name
  @test_cases = []
end

Instance Attribute Details

#start_time ⇒ Object

Returns the value of attribute start_time.



# File 'lib/beaker/test_suite_result.rb', line 9

def start_time
  @start_time
end

#stop_time ⇒ Object

Returns the value of attribute stop_time.



# File 'lib/beaker/test_suite_result.rb', line 9

def stop_time
  @stop_time
end

#total_tests ⇒ Object

Returns the value of attribute total_tests.



# File 'lib/beaker/test_suite_result.rb', line 9

def total_tests
  @total_tests
end

Instance Method Details

#add_test_case(test_case) ⇒ Object

Add a Beaker::TestCase to this Beaker::TestSuiteResult instance, used in calculating Beaker::TestSuiteResult data.

Parameters:

  • test_case (TestCase)

    The TestCase to add to this TestSuiteResult


# File 'lib/beaker/test_suite_result.rb', line 24

def add_test_case(test_case)
  @test_cases << test_case
end
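
As a rough usage sketch (not taken from Beaker's own docs): results are built up by adding test cases and then queried through the counting helpers below. The FakeCase struct is a hypothetical stand-in for Beaker::TestCase that only provides the readers this class actually calls (test_status, runtime, path, exception), and the stdlib Logger is only a placeholder for the :logger option, since none of the counting methods log anything.

require 'logger'
require 'beaker'  # assumes the beaker gem is installed

# Hypothetical stand-in for Beaker::TestCase; only the readers that
# TestSuiteResult touches are provided.
FakeCase = Struct.new(:test_status, :runtime, :path, :exception)

result = Beaker::TestSuiteResult.new({ logger: Logger.new($stdout) }, 'smoke suite')
result.add_test_case(FakeCase.new(:pass, 1.5, 'tests/a_test.rb'))
result.add_test_case(FakeCase.new(:fail, 0.5, 'tests/b_test.rb'))

result.test_count    #=> 2
result.passed_tests  #=> 1
result.sum_failed    #=> 1  (failed_tests + errored_tests)
result.failed?       #=> true
result.elapsed_time  #=> 2.0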

#elapsed_time ⇒ Object

The sum of all Beaker::TestCase runtimes in this Beaker::TestSuiteResult



# File 'lib/beaker/test_suite_result.rb', line 74

def elapsed_time
  @test_cases.inject(0.0) { |r, t| r + t.runtime.to_f }
end

#errored_tests ⇒ Object

How many errored Beaker::TestCase instances are in this Beaker::TestSuiteResult



# File 'lib/beaker/test_suite_result.rb', line 39

def errored_tests
  @test_cases.count { |c| c.test_status == :error }
end

#failed? ⇒ Boolean

Did one or more Beaker::TestCase instances in this Beaker::TestSuiteResult fail?

Returns:

  • (Boolean)


# File 'lib/beaker/test_suite_result.rb', line 69

def failed?
  !success?
end

#failed_tests ⇒ Object

How many failed Beaker::TestCase instances are in this Beaker::TestSuiteResult



# File 'lib/beaker/test_suite_result.rb', line 44

def failed_tests
  @test_cases.count { |c| c.test_status == :fail }
end

#passed_tests ⇒ Object

How many passed Beaker::TestCase instances are in this Beaker::TestSuiteResult



# File 'lib/beaker/test_suite_result.rb', line 34

def passed_tests
  @test_cases.count { |c| c.test_status == :pass }
end

#pending_tests ⇒ Object

How many pending Beaker::TestCase instances are in this Beaker::TestSuiteResult



# File 'lib/beaker/test_suite_result.rb', line 54

def pending_tests
  @test_cases.count { |c| c.test_status == :pending }
end

#persist_test_results(filepath) ⇒ Object

Saves failure and error cases as a JSON file for only-failures processing

Parameters:

  • filepath (String)

    Where to put the results



# File 'lib/beaker/test_suite_result.rb', line 152

def persist_test_results(filepath)
  return if filepath.empty?

  results = @test_cases.select { |c| %i[fail error].include? c.test_status }.map(&:path)
  File.open(filepath, 'w') { |file| file.puts JSON.dump(results) }
end
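
A short sketch of how this might be used; the path below is hypothetical. After a run, the file contains a JSON array of the paths of the failing and erroring cases, which can be read back to re-run only those files.

result.persist_test_results('.beaker/last_failed.json')  # hypothetical path

require 'json'
JSON.parse(File.read('.beaker/last_failed.json'))
#=> ["tests/b_test.rb"]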

#print_test_result(test_case) ⇒ Object

A convenience method that formats the result of a Beaker::TestCase as a single printable line.

Parameters:

  • test_case (TestCase)

    The TestCase to examine and report on


# File 'lib/beaker/test_suite_result.rb', line 132

def print_test_result(test_case)
  if test_case.exception
    test_file_trace = ""
    test_case.exception.backtrace.each do |line|
      if line.include?(test_case.path)
        test_file_trace = "\r\n    Test line: #{line}"
        break
      end
    end if test_case.exception.backtrace && test_case.path
    test_reported = "reported: #{test_case.exception.inspect}#{test_file_trace}"
  else
    # No exception: report the bare status (e.g. :skip, :pending)
    test_reported = test_case.test_status
  end
  "  Test Case #{test_case.path} #{test_reported}"
end
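
Continuing the hypothetical FakeCase example above, the return value is a single printable line; with an exception attached it looks roughly like this:

failing = FakeCase.new(:fail, 0.4, 'tests/b_test.rb', RuntimeError.new('boom'))
result.print_test_result(failing)
#=> "  Test Case tests/b_test.rb reported: #<RuntimeError: boom>"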

#skipped_tests ⇒ Object

How many skipped Beaker::TestCase instances are in this Beaker::TestSuiteResult



# File 'lib/beaker/test_suite_result.rb', line 49

def skipped_tests
  @test_cases.count { |c| c.test_status == :skip }
end

#success? ⇒ Boolean

Did all the Beaker::TestCase instances in this Beaker::TestSuiteResult pass?

Returns:

  • (Boolean)


# File 'lib/beaker/test_suite_result.rb', line 64

def success?
  sum_failed == 0
end

#sum_failed ⇒ Object

How many Beaker::TestCase instances failed in this Beaker::TestSuiteResult



# File 'lib/beaker/test_suite_result.rb', line 59

def sum_failed
  failed_tests + errored_tests
end

#summarize(summary_logger) ⇒ Object

Plain-text summary of the test suite, written to the given logger

Parameters:

  • summary_logger (Logger)

    The logger we will print the summary to



# File 'lib/beaker/test_suite_result.rb', line 80

def summarize(summary_logger)
  summary_logger.notify <<-HEREDOC
  Test Suite: #{@name} @ #{start_time}

  - Host Configuration Summary -
  HEREDOC

  average_test_time = elapsed_time / test_count

  summary_logger.notify format(%[

          - Test Case Summary for suite '#{@name}' -
   Total Suite Time: %.2f seconds
  Average Test Time: %.2f seconds
          Attempted: #{test_count}
             Passed: #{passed_tests}
             Failed: #{failed_tests}
            Errored: #{errored_tests}
            Skipped: #{skipped_tests}
            Pending: #{pending_tests}
              Total: #{@total_tests}

  - Specific Test Case Status -
    ], elapsed_time, average_test_time)

  grouped_summary = @test_cases.group_by { |test_case| test_case.test_status }

  summary_logger.notify "Failed Tests Cases:"
  (grouped_summary[:fail] || []).each do |test_case|
    summary_logger.notify print_test_result(test_case)
  end

  summary_logger.notify "Errored Tests Cases:"
  (grouped_summary[:error] || []).each do |test_case|
    summary_logger.notify print_test_result(test_case)
  end

  summary_logger.notify "Skipped Tests Cases:"
  (grouped_summary[:skip] || []).each do |test_case|
    summary_logger.notify print_test_result(test_case)
  end

  summary_logger.notify "Pending Tests Cases:"
  (grouped_summary[:pending] || []).each do |test_case|
    summary_logger.notify print_test_result(test_case)
  end

  summary_logger.notify("\n\n")
end
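
A minimal invocation sketch. summarize writes through Logger#notify, so it needs a Beaker::Logger (or any object responding to notify) rather than a stdlib Logger; setting start_time beforehand is optional but makes the header line meaningful.

result.start_time = Time.now
result.summarize(Beaker::Logger.new)  # assumes Beaker::Logger's default constructor (logs to STDOUT)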

#test_count ⇒ Object

How many Beaker::TestCase instances are in this Beaker::TestSuiteResult



# File 'lib/beaker/test_suite_result.rb', line 29

def test_count
  @test_cases.length
end

#write_junit_xml(xml_file, file_to_link = nil, time_sort = false) ⇒ Object

This method is part of a private API. You should avoid using this method if possible, as it may be removed or be changed in the future.

Writes Junit XML of this Beaker::TestSuiteResult

Parameters:

  • xml_file (String)

    Path to the XML file (from Beaker’s running directory)

  • file_to_link (String) (defaults to: nil)

    Path to the paired file that should be linked from this one (this is relative to the XML file itself, so it would just be the different file name if they’re in the same directory)

  • time_sort (Boolean) (defaults to: false)

    Whether the test results should be output in order of time spent in the test, or in the order of test execution (default)

Returns:

  • nil



# File 'lib/beaker/test_suite_result.rb', line 172

def write_junit_xml(xml_file, file_to_link = nil, time_sort = false)
  stylesheet = File.join(@options[:project_root], @options[:xml_stylesheet])

  begin
    LoggerJunit.write_xml(xml_file, stylesheet) do |_doc, suites|
      meta_info = suites.add_element(REXML::Element.new('meta_test_info'))
      if file_to_link.nil?
        meta_info.add_attribute('page_active', 'no-links')
        meta_info.add_attribute('link_url', '')
      else
        time_sort ? meta_info.add_attribute('page_active', 'performance') : meta_info.add_attribute('page_active', 'execution')
        meta_info.add_attribute('link_url', file_to_link)
      end

      suite = suites.add_element(REXML::Element.new('testsuite'))
      suite.add_attributes(
        [
          ['name', @name],
          ['tests', test_count],
          ['errors', errored_tests],
          ['failures', failed_tests],
          ['skipped', skipped_tests],
          ['pending', pending_tests],
          ['total', @total_tests],
          ['time', format("%f", (stop_time - start_time))],
        ],
      )
      properties = suite.add_element(REXML::Element.new('properties'))
      @options.each_pair do |name, value|
        property = properties.add_element(REXML::Element.new('property'))
        property.add_attributes([['name', name], ['value', value.to_s || '']])
      end

      test_cases_to_report = @test_cases
      test_cases_to_report = @test_cases.sort { |x, y| y.runtime <=> x.runtime } if time_sort
      test_cases_to_report.each do |test|
        item = suite.add_element(REXML::Element.new('testcase'))
        item.add_attributes(
          [
            ['classname', File.dirname(test.path)],
            ['name', File.basename(test.path)],
            ['time', "%f" % test.runtime],
          ],
        )

        test.exports.each do |export|
          export.keys.each do |key|
            item.add_attribute(key.to_s.tr(" ", "_"), export[key])
          end
        end

        # Report failures
        if test.test_status == :fail || test.test_status == :error
          status = item.add_element(REXML::Element.new('failure'))
          status.add_attribute('type', test.test_status.to_s)
          if test.exception
            status.add_attribute('message', test.exception.to_s.delete("\e"))
            data = LoggerJunit.format_cdata(test.exception.backtrace.join('\n'))
            REXML::CData.new(data, true, status)
          end
        end

        if test.test_status == :skip
          status = item.add_element(REXML::Element.new('skipped'))
          status.add_attribute('type', test.test_status.to_s)
        end

        if test.test_status == :pending
          status = item.add_element(REXML::Element.new('pending'))
          status.add_attribute('type', test.test_status.to_s)
        end

        if test.sublog
          stdout = item.add_element(REXML::Element.new('system-out'))
          data = LoggerJunit.format_cdata(test.sublog)
          REXML::CData.new(data, true, stdout)
        end

        if test.last_result and test.last_result.stderr and not test.last_result.stderr.empty?
          stderr = item.add_element('system-err')
          data = LoggerJunit.format_cdata(test.last_result.stderr)
          REXML::CData.new(data, true, stderr)
        end
      end
    end
  rescue Exception => e
    @logger.error "failure in XML output: \n#{e}" + e.backtrace.join("\n")
  end
end
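
A hedged sketch of a call, based only on what the code above reads: the options hash must supply :project_root and :xml_stylesheet (joined to locate the stylesheet), and start_time/stop_time must be set since their difference becomes the suite's time attribute. The file names and option values here are assumptions, not Beaker defaults.

opts = {
  logger: Beaker::Logger.new,
  project_root: Dir.pwd,        # assumed stylesheet location
  xml_stylesheet: 'junit.xsl',  # hypothetical stylesheet file name
}
result = Beaker::TestSuiteResult.new(opts, 'smoke suite')
# ... add test cases as above ...
result.start_time = Time.now - 120
result.stop_time  = Time.now

result.write_junit_xml('junit/smoke.xml')                         # execution order
result.write_junit_xml('junit/smoke-perf.xml', 'smoke.xml', true) # sorted by runtime, linking back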