Class: AssemblyAI::AsyncLemurClient
- Inherits: Object
- Defined in: lib/assemblyai/lemur/client.rb
Instance Attribute Summary
Instance Method Summary
-
#action_items(transcript_ids: nil, input_text: nil, context: nil, final_model: nil, max_output_size: nil, temperature: nil, answer_format: nil, request_options: nil) ⇒ AssemblyAI::Lemur::LemurActionItemsResponse
Use LeMUR to generate a list of action items from a transcript.
-
#get_response(request_id:, request_options: nil) ⇒ AssemblyAI::Lemur::LemurStringResponse, AssemblyAI::Lemur::LemurQuestionAnswerResponse
Retrieve a LeMUR response that was previously generated.
- #initialize(request_client:) ⇒ AssemblyAI::AsyncLemurClient constructor
-
#purge_request_data(request_id:, request_options: nil) ⇒ AssemblyAI::Lemur::PurgeLemurRequestDataResponse
Delete the data for a previously submitted LeMUR request.
-
#question_answer(questions:, transcript_ids: nil, input_text: nil, context: nil, final_model: nil, max_output_size: nil, temperature: nil, request_options: nil) ⇒ AssemblyAI::Lemur::LemurQuestionAnswerResponse
Question & Answer allows you to ask free-form questions about a single transcript or a group of transcripts.
-
#summary(transcript_ids: nil, input_text: nil, context: nil, final_model: nil, max_output_size: nil, temperature: nil, answer_format: nil, request_options: nil) ⇒ AssemblyAI::Lemur::LemurSummaryResponse
Custom Summary allows you to distill a piece of audio into a few impactful sentences.
-
#task(prompt:, transcript_ids: nil, input_text: nil, context: nil, final_model: nil, max_output_size: nil, temperature: nil, request_options: nil) ⇒ AssemblyAI::Lemur::LemurTaskResponse
Use the LeMUR task endpoint to input your own LLM prompt.
Constructor Details
#initialize(request_client:) ⇒ AssemblyAI::AsyncLemurClient
# File 'lib/assemblyai/lemur/client.rb', line 279

def initialize(request_client:)
  @request_client = request_client
end
Instance Attribute Details
#request_client ⇒ AssemblyAI::AsyncRequestClient (readonly)
# File 'lib/assemblyai/lemur/client.rb', line 275

def request_client
  @request_client
end
Instance Method Details
#action_items(transcript_ids: nil, input_text: nil, context: nil, final_model: nil, max_output_size: nil, temperature: nil, answer_format: nil, request_options: nil) ⇒ AssemblyAI::Lemur::LemurActionItemsResponse
Use LeMUR to generate a list of action items from a transcript.
# File 'lib/assemblyai/lemur/client.rb', line 466

def action_items(transcript_ids: nil, input_text: nil, context: nil, final_model: nil, max_output_size: nil, temperature: nil, answer_format: nil, request_options: nil)
  Async do
    response = @request_client.conn.post do |req|
      req.options.timeout = request_options.timeout_in_seconds unless request_options&.timeout_in_seconds.nil?
      req.headers["Authorization"] = request_options.api_key unless request_options&.api_key.nil?
      req.headers = { **req.headers, **(request_options&.additional_headers || {}) }.compact
      req.body = {
        **(request_options&.additional_body_parameters || {}),
        transcript_ids: transcript_ids,
        input_text: input_text,
        context: context,
        final_model: final_model,
        max_output_size: max_output_size,
        temperature: temperature,
        answer_format: answer_format
      }.compact
      req.url "#{@request_client.get_url(request_options: request_options)}/lemur/v3/generate/action-items"
    end
    AssemblyAI::Lemur::LemurActionItemsResponse.from_json(json_object: response.body)
  end
end
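Each generation endpoint assembles its JSON body the same way: any `additional_body_parameters` from the request options are merged first, the named keyword arguments are layered on top, and `.compact` drops every `nil` entry so unused parameters are never sent. A standalone sketch of that merge, with illustrative values not taken from the source:

```ruby
# Plain hash standing in for request_options.additional_body_parameters
# (the key and value below are hypothetical).
additional_body_parameters = { "custom_key" => "custom_value" }

body = {
  **additional_body_parameters,
  transcript_ids: ["transcript-id-1"],      # illustrative transcript ID
  input_text: nil,                          # unused inputs stay nil...
  context: "A call about a renewal quote",
  final_model: nil,
  max_output_size: nil,
  temperature: nil,
  answer_format: "Bullet Points"
}.compact                                   # ...and .compact removes them

puts body.keys.inspect
```

Only the four populated keys survive the `.compact`, so the request body stays minimal regardless of how many optional parameters the method accepts.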
#get_response(request_id:, request_options: nil) ⇒ AssemblyAI::Lemur::LemurStringResponse, AssemblyAI::Lemur::LemurQuestionAnswerResponse
Retrieve a LeMUR response that was previously generated.
# File 'lib/assemblyai/lemur/client.rb', line 502

def get_response(request_id:, request_options: nil)
  Async do
    response = @request_client.conn.get do |req|
      req.options.timeout = request_options.timeout_in_seconds unless request_options&.timeout_in_seconds.nil?
      req.headers["Authorization"] = request_options.api_key unless request_options&.api_key.nil?
      req.headers = { **req.headers, **(request_options&.additional_headers || {}) }.compact
      req.url "#{@request_client.get_url(request_options: request_options)}/lemur/v3/#{request_id}"
    end
    AssemblyAI::Lemur::LemurResponse.from_json(json_object: response.body)
  end
end
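The return type here is a union: a string response (from summary, task, or action-items requests) or a question-answer response. One way to distinguish the two shapes in raw JSON is the type of the `response` field; the payloads below are hypothetical sketches, not examples from the API reference:

```ruby
require "json"

# Hypothetical payloads illustrating the two documented return shapes.
string_body = '{"request_id": "abc", "response": "Two routers were replaced."}'
qa_body     = '{"request_id": "def", "response": [{"question": "Was the issue resolved?", "answer": "Yes"}]}'

# The question-answer variant carries an array of question/answer pairs,
# while the string variant carries plain text.
def question_answer_payload?(raw)
  JSON.parse(raw)["response"].is_a?(Array)
end

puts question_answer_payload?(string_body) # => false
puts question_answer_payload?(qa_body)     # => true
```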
#purge_request_data(request_id:, request_options: nil) ⇒ AssemblyAI::Lemur::PurgeLemurRequestDataResponse
Delete the data for a previously submitted LeMUR request.
The LLM response data, as well as any context provided in the original request, will be removed.
# File 'lib/assemblyai/lemur/client.rb', line 529

def purge_request_data(request_id:, request_options: nil)
  Async do
    response = @request_client.conn.delete do |req|
      req.options.timeout = request_options.timeout_in_seconds unless request_options&.timeout_in_seconds.nil?
      req.headers["Authorization"] = request_options.api_key unless request_options&.api_key.nil?
      req.headers = { **req.headers, **(request_options&.additional_headers || {}) }.compact
      req.url "#{@request_client.get_url(request_options: request_options)}/lemur/v3/#{request_id}"
    end
    AssemblyAI::Lemur::PurgeLemurRequestDataResponse.from_json(json_object: response.body)
  end
end
#question_answer(questions:, transcript_ids: nil, input_text: nil, context: nil, final_model: nil, max_output_size: nil, temperature: nil, request_options: nil) ⇒ AssemblyAI::Lemur::LemurQuestionAnswerResponse
Question & Answer allows you to ask free-form questions about a single
transcript or a group of transcripts.
You can ask any questions whose answers you find useful, such as judging whether a caller is likely to become a customer or whether all items on a meeting's agenda were covered.
# File 'lib/assemblyai/lemur/client.rb', line 417

def question_answer(questions:, transcript_ids: nil, input_text: nil, context: nil, final_model: nil, max_output_size: nil, temperature: nil, request_options: nil)
  Async do
    response = @request_client.conn.post do |req|
      req.options.timeout = request_options.timeout_in_seconds unless request_options&.timeout_in_seconds.nil?
      req.headers["Authorization"] = request_options.api_key unless request_options&.api_key.nil?
      req.headers = { **req.headers, **(request_options&.additional_headers || {}) }.compact
      req.body = {
        **(request_options&.additional_body_parameters || {}),
        transcript_ids: transcript_ids,
        input_text: input_text,
        context: context,
        final_model: final_model,
        max_output_size: max_output_size,
        temperature: temperature,
        questions: questions
      }.compact
      req.url "#{@request_client.get_url(request_options: request_options)}/lemur/v3/generate/question-answer"
    end
    AssemblyAI::Lemur::LemurQuestionAnswerResponse.from_json(json_object: response.body)
  end
end
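The required `questions:` argument is an array of question objects. A sketch using plain hashes in place of `AssemblyAI::Lemur::LemurQuestion` instances; the field names (`question` plus optional `answer_format`, `answer_options`, and `context`) follow the LeMUR question schema, but treat them as assumptions and confirm against the API reference:

```ruby
# Plain hashes standing in for LemurQuestion objects (field names assumed).
questions = [
  {
    question: "Is the caller likely to become a customer?",
    answer_options: ["Yes", "No"]            # constrain the answer to fixed choices
  },
  {
    question: "Were all agenda items covered?",
    answer_format: "short sentence",         # or describe the desired format freely
    context: "This is a weekly planning meeting." # per-question context
  }
]

puts questions.length
```

Only `question` is required per entry; the other fields shape or constrain the individual answer.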
#summary(transcript_ids: nil, input_text: nil, context: nil, final_model: nil, max_output_size: nil, temperature: nil, answer_format: nil, request_options: nil) ⇒ AssemblyAI::Lemur::LemurSummaryResponse
Custom Summary allows you to distill a piece of audio into a few impactful
sentences.
You can give the model context to obtain more targeted results, and you can describe the desired output format in natural language.
# File 'lib/assemblyai/lemur/client.rb', line 361

def summary(transcript_ids: nil, input_text: nil, context: nil, final_model: nil, max_output_size: nil, temperature: nil, answer_format: nil, request_options: nil)
  Async do
    response = @request_client.conn.post do |req|
      req.options.timeout = request_options.timeout_in_seconds unless request_options&.timeout_in_seconds.nil?
      req.headers["Authorization"] = request_options.api_key unless request_options&.api_key.nil?
      req.headers = { **req.headers, **(request_options&.additional_headers || {}) }.compact
      req.body = {
        **(request_options&.additional_body_parameters || {}),
        transcript_ids: transcript_ids,
        input_text: input_text,
        context: context,
        final_model: final_model,
        max_output_size: max_output_size,
        temperature: temperature,
        answer_format: answer_format
      }.compact
      req.url "#{@request_client.get_url(request_options: request_options)}/lemur/v3/generate/summary"
    end
    AssemblyAI::Lemur::LemurSummaryResponse.from_json(json_object: response.body)
  end
end
#task(prompt:, transcript_ids: nil, input_text: nil, context: nil, final_model: nil, max_output_size: nil, temperature: nil, request_options: nil) ⇒ AssemblyAI::Lemur::LemurTaskResponse
Use the LeMUR task endpoint to input your own LLM prompt.
# File 'lib/assemblyai/lemur/client.rb', line 309

def task(prompt:, transcript_ids: nil, input_text: nil, context: nil, final_model: nil, max_output_size: nil, temperature: nil, request_options: nil)
  Async do
    response = @request_client.conn.post do |req|
      req.options.timeout = request_options.timeout_in_seconds unless request_options&.timeout_in_seconds.nil?
      req.headers["Authorization"] = request_options.api_key unless request_options&.api_key.nil?
      req.headers = { **req.headers, **(request_options&.additional_headers || {}) }.compact
      req.body = {
        **(request_options&.additional_body_parameters || {}),
        transcript_ids: transcript_ids,
        input_text: input_text,
        context: context,
        final_model: final_model,
        max_output_size: max_output_size,
        temperature: temperature,
        prompt: prompt
      }.compact
      req.url "#{@request_client.get_url(request_options: request_options)}/lemur/v3/generate/task"
    end
    AssemblyAI::Lemur::LemurTaskResponse.from_json(json_object: response.body)
  end
end
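All six methods share the same header logic: the `Authorization` header is set from the request options' API key, any `additional_headers` are merged on top, and `.compact` drops `nil`-valued entries. In isolation, with hypothetical values:

```ruby
api_key = "YOUR_API_KEY"                      # placeholder, not a real key
existing_headers = { "Content-Type" => "application/json" }
additional_headers = { "X-Trace-Id" => "req-123", "X-Unused" => nil }

headers = {
  **existing_headers,
  "Authorization" => api_key,
  **additional_headers
}.compact # the nil-valued X-Unused header is removed

puts headers.keys.sort.inspect
```

Because `additional_headers` is merged last, a caller can override any default header, and setting a header to `nil` effectively removes it.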