From c148b3777aafb5bfc6ee416239e2f7e7003d986f Mon Sep 17 00:00:00 2001 From: gbaptista Date: Sun, 23 Jun 2024 07:33:35 -0300 Subject: [PATCH 1/8] refactoring credentials --- components/errors.rb | 1 + controllers/client.rb | 19 +++++++++++-------- template.md | 15 +++------------ 3 files changed, 15 insertions(+), 20 deletions(-) diff --git a/components/errors.rb b/components/errors.rb index 4ed0d0c..5e973ea 100644 --- a/components/errors.rb +++ b/components/errors.rb @@ -10,6 +10,7 @@ def initialize(message = nil) class MissingProjectIdError < GeminiError; end class UnsupportedServiceError < GeminiError; end + class ConflictingCredentialsError < GeminiError; end class BlockWithoutServerSentEventsError < GeminiError; end class RequestError < GeminiError diff --git a/controllers/client.rb b/controllers/client.rb index 52697d1..b7d8674 100644 --- a/controllers/client.rb +++ b/controllers/client.rb @@ -27,16 +27,19 @@ def initialize(config) if config[:credentials][:api_key] @authentication = :api_key @api_key = config[:credentials][:api_key] - elsif config[:credentials][:file_path] - @authentication = :service_account - @authorizer = ::Google::Auth::ServiceAccountCredentials.make_creds( - json_key_io: File.open(config[:credentials][:file_path]), - scope: 'https://www.googleapis.com/auth/cloud-platform' - ) - elsif config[:credentials][:file_contents] + elsif config[:credentials].key?(:file_path) && config[:credentials].key?(:file_contents) + raise Errors::ConflictingCredentialsError, + "You must choose 'file_path' or 'file_contents,' not both." + elsif config[:credentials][:file_path] || config[:credentials][:file_contents] @authentication = :service_account + json_key_io = if config[:credentials][:file_path] + File.open(config[:credentials][:file_path]) + else + StringIO.new(config[:credentials][:file_contents]) + end + @authorizer = ::Google::Auth::ServiceAccountCredentials.make_creds( - json_key_io: StringIO.new(config[:credentials][:file_contents]), + json_key_io:, scope: 'https://www.googleapis.com/auth/cloud-platform' ) else diff --git a/template.md b/template.md index 729814e..3bfc20c 100644 --- a/template.md +++ b/template.md @@ -163,7 +163,7 @@ Similar to [Option 2](#option-2-service-account-credentials-file-vertex-ai-api), For local development, you can generate your default credentials using the [gcloud CLI](https://cloud.google.com/sdk/gcloud) as follows: ```sh -gcloud auth application-default login +gcloud auth application-default login ``` For more details about alternative methods and different environments, check the official documentation: @@ -201,17 +201,7 @@ Remember that hardcoding your API key in code is unsafe; it's preferable to use } ``` -**Option 3**: For the Service Account, provide the raw contents of a `google-credentials.json` file and a `REGION`: - -```ruby -{ - service: 'vertex-ai-api', - file_contents: ENV['GOOGLE_CREDENTIALS_FILE_CONTENTS'], - region: 'us-east4' -} -``` - -**Option 4**: For _Application Default Credentials_, omit both the `api_key` and the `file_path`: +**Option 3**: For _Application Default Credentials_, omit both the `api_key` and the `file_path`: ```ruby { @@ -1266,6 +1256,7 @@ GeminiError MissingProjectIdError UnsupportedServiceError +ConflictingCredentialsError BlockWithoutServerSentEventsError RequestError From 3e6b776f951a88df76b4f16999a790cf65500ffc Mon Sep 17 00:00:00 2001 From: gbaptista Date: Sun, 23 Jun 2024 07:42:01 -0300 Subject: [PATCH 2/8] updating README --- template.md | 52 +++++++++++++++++++++++++++++++++++++++++++++++++++- 
1 file changed, 51 insertions(+), 1 deletion(-) diff --git a/template.md b/template.md index 3bfc20c..a58ca87 100644 --- a/template.md +++ b/template.md @@ -34,6 +34,17 @@ client = Gemini.new( options: { model: 'gemini-pro', server_sent_events: true } ) +# With a Service Account Credentials File Contents +client = Gemini.new( + credentials: { + service: 'vertex-ai-api', + file_contents: File.read('google-credentials.json'), + # file_contents: ENV['GOOGLE_CREDENTIALS_FILE_CONTENTS'], + region: 'us-east4' + }, + options: { model: 'gemini-pro', server_sent_events: true } +) + # With Application Default Credentials client = Gemini.new( credentials: { @@ -163,7 +174,7 @@ Similar to [Option 2](#option-2-service-account-credentials-file-vertex-ai-api), For local development, you can generate your default credentials using the [gcloud CLI](https://cloud.google.com/sdk/gcloud) as follows: ```sh -gcloud auth application-default login +gcloud auth application-default login ``` For more details about alternative methods and different environments, check the official documentation: @@ -201,6 +212,23 @@ Remember that hardcoding your API key in code is unsafe; it's preferable to use } ``` +Alternatively, you can pass the file contents instead of the path: +```ruby +{ + service: 'vertex-ai-api', + file_contents: File.read('google-credentials.json'), + region: 'us-east4' +} +``` + +```ruby +{ + service: 'vertex-ai-api', + file_contents: ENV['GOOGLE_CREDENTIALS_FILE_CONTENTS'], + region: 'us-east4' +} +``` + **Option 3**: For _Application Default Credentials_, omit both the `api_key` and the `file_path`: ```ruby @@ -259,6 +287,17 @@ client = Gemini.new( options: { model: 'gemini-pro', server_sent_events: true } ) +# With a Service Account Credentials File Contents +client = Gemini.new( + credentials: { + service: 'vertex-ai-api', + file_contents: File.read('google-credentials.json'), + # file_contents: ENV['GOOGLE_CREDENTIALS_FILE_CONTENTS'], + region: 'us-east4' + }, + options: { model: 'gemini-pro', server_sent_events: true } +) + # With Application Default Credentials client = Gemini.new( credentials: { @@ -340,6 +379,17 @@ client = Gemini.new( options: { model: 'gemini-pro', server_sent_events: true } ) +# With a Service Account Credentials File Contents +client = Gemini.new( + credentials: { + service: 'vertex-ai-api', + file_contents: File.read('google-credentials.json'), + # file_contents: ENV['GOOGLE_CREDENTIALS_FILE_CONTENTS'], + region: 'us-east4' + }, + options: { model: 'gemini-pro', server_sent_events: true } +) + # With Application Default Credentials client = Gemini.new( credentials: { From 13801a8ac9da1d65dea549ca3ee46f9245a8230d Mon Sep 17 00:00:00 2001 From: gbaptista Date: Sun, 23 Jun 2024 07:43:45 -0300 Subject: [PATCH 3/8] updating README --- template.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/template.md b/template.md index a58ca87..e879a1f 100644 --- a/template.md +++ b/template.md @@ -34,7 +34,7 @@ client = Gemini.new( options: { model: 'gemini-pro', server_sent_events: true } ) -# With a Service Account Credentials File Contents +# With the Service Account Credentials File contents client = Gemini.new( credentials: { service: 'vertex-ai-api', @@ -287,7 +287,7 @@ client = Gemini.new( options: { model: 'gemini-pro', server_sent_events: true } ) -# With a Service Account Credentials File Contents +# With the Service Account Credentials File contents client = Gemini.new( credentials: { service: 'vertex-ai-api', @@ -379,7 +379,7 @@ client = Gemini.new( 
options: { model: 'gemini-pro', server_sent_events: true } ) -# With a Service Account Credentials File Contents +# With the Service Account Credentials File contents client = Gemini.new( credentials: { service: 'vertex-ai-api', From d4fd482fb7e7177471f97a64f8d8e6c61a13216e Mon Sep 17 00:00:00 2001 From: gbaptista Date: Sun, 23 Jun 2024 08:10:04 -0300 Subject: [PATCH 4/8] adding tests to credentials --- .rspec | 1 + .rubocop.yml | 3 +++ Gemfile | 4 ++- components/errors.rb | 2 +- controllers/client.rb | 22 ++++++++++++--- spec/controllers/client_spec.rb | 48 +++++++++++++++++++++++++++++++++ spec/tasks/run-generate.rb | 15 ----------- 7 files changed, 74 insertions(+), 21 deletions(-) create mode 100644 .rspec create mode 100644 spec/controllers/client_spec.rb diff --git a/.rspec b/.rspec new file mode 100644 index 0000000..c99d2e7 --- /dev/null +++ b/.rspec @@ -0,0 +1 @@ +--require spec_helper diff --git a/.rubocop.yml b/.rubocop.yml index 0a44d98..81c5e10 100644 --- a/.rubocop.yml +++ b/.rubocop.yml @@ -4,3 +4,6 @@ AllCops: Style/Documentation: Enabled: false + +require: + - rubocop-rspec diff --git a/Gemfile b/Gemfile index 9d322d0..df66f53 100644 --- a/Gemfile +++ b/Gemfile @@ -7,5 +7,7 @@ gemspec group :test, :development do gem 'dotenv', '~> 3.1', '>= 3.1.2' gem 'pry-byebug', '~> 3.10', '>= 3.10.1' - gem 'rubocop', '~> 1.63', '>= 1.63.5' + gem 'rspec', '~> 3.13' + gem 'rubocop', '~> 1.64', '>= 1.64.1' + gem 'rubocop-rspec', '~> 3.0', '>= 3.0.1' end diff --git a/components/errors.rb b/components/errors.rb index 5e973ea..f35363c 100644 --- a/components/errors.rb +++ b/components/errors.rb @@ -4,7 +4,7 @@ module Gemini module Errors class GeminiError < StandardError def initialize(message = nil) - super(message) + super end end diff --git a/controllers/client.rb b/controllers/client.rb index b7d8674..1f2c419 100644 --- a/controllers/client.rb +++ b/controllers/client.rb @@ -21,15 +21,14 @@ def initialize(config) @service = config[:credentials][:service] unless %w[vertex-ai-api generative-language-api].include?(@service) - raise Errors::UnsupportedServiceError, "Unsupported service: #{@service}" + raise Errors::UnsupportedServiceError, "Unsupported service: '#{@service}'." end + avoid_conflicting_credentials!(config[:credentials]) + if config[:credentials][:api_key] @authentication = :api_key @api_key = config[:credentials][:api_key] - elsif config[:credentials].key?(:file_path) && config[:credentials].key?(:file_contents) - raise Errors::ConflictingCredentialsError, - "You must choose 'file_path' or 'file_contents,' not both." elsif config[:credentials][:file_path] || config[:credentials][:file_contents] @authentication = :service_account json_key_io = if config[:credentials][:file_path] @@ -84,6 +83,21 @@ def initialize(config) end end + def avoid_conflicting_credentials!(credentials) + conflicting_keys = %i[api_key file_path file_contents] + + found = credentials.keys.filter { |key| conflicting_keys.include?(key) } + + return unless found.size > 1 + + message = found.sort.each_with_index.map do |key, i| + i == found.size - 1 ? "or '#{key}'" : "'#{key}'" + end.join(', ') + + raise Errors::ConflictingCredentialsError, + "You must choose either #{message}." 
+ end + def predict(payload, server_sent_events: nil, &callback) result = request( "#{@model_address}:predict", payload, diff --git a/spec/controllers/client_spec.rb b/spec/controllers/client_spec.rb new file mode 100644 index 0000000..6c7df9c --- /dev/null +++ b/spec/controllers/client_spec.rb @@ -0,0 +1,48 @@ +# frozen_string_literal: true + +require_relative '../../ports/dsl/gemini-ai' +require_relative '../../components/errors' + +RSpec.describe Gemini do + it 'avoids unsupported services' do + expect do + described_class.new( + credentials: { + service: 'unknown-service' + } + ) + end.to raise_error( + Gemini::Errors::UnsupportedServiceError, + "Unsupported service: 'unknown-service'." + ) + end + + it 'avoids conflicts with credential keys' do + expect do + described_class.new( + credentials: { + service: 'vertex-ai-api', + api_key: 'key', + file_path: 'path', + file_contents: 'contents' + } + ) + end.to raise_error( + Gemini::Errors::ConflictingCredentialsError, + "You must choose either 'api_key', 'file_contents', or 'file_path'." + ) + + expect do + described_class.new( + credentials: { + service: 'vertex-ai-api', + file_path: 'path', + file_contents: 'contents' + } + ) + end.to raise_error( + Gemini::Errors::ConflictingCredentialsError, + "You must choose either 'file_contents', or 'file_path'." + ) + end +end diff --git a/spec/tasks/run-generate.rb b/spec/tasks/run-generate.rb index 1d4ce6b..8941d1c 100644 --- a/spec/tasks/run-generate.rb +++ b/spec/tasks/run-generate.rb @@ -4,21 +4,6 @@ require_relative '../../ports/dsl/gemini-ai' -begin - client = Gemini.new( - credentials: { - service: 'unknown-service' - }, - options: { model: 'gemini-pro', server_sent_events: true } - ) - - client.stream_generate_content( - { contents: { role: 'user', parts: { text: 'hi!' 
} } } - ) -rescue StandardError => e - raise "Unexpected error: #{e.class}" unless e.instance_of?(Gemini::Errors::UnsupportedServiceError) -end - client = Gemini.new( credentials: { service: 'generative-language-api', From 2b02bc20cc9524f89e381be987b9115d112486f4 Mon Sep 17 00:00:00 2001 From: gbaptista Date: Sun, 23 Jun 2024 08:31:52 -0300 Subject: [PATCH 5/8] fixing spec tasks --- Gemfile.lock | 32 +++++++++++++++++++++++++------- spec/tasks/run-generate.rb | 8 ++++---- spec/tasks/run-safety.rb | 4 ++-- spec/tasks/run-system.rb | 4 ++-- template.md | 9 ++++++++- 5 files changed, 41 insertions(+), 16 deletions(-) diff --git a/Gemfile.lock b/Gemfile.lock index 74a0a9d..892b905 100644 --- a/Gemfile.lock +++ b/Gemfile.lock @@ -17,6 +17,7 @@ GEM base64 (0.2.0) byebug (11.1.3) coderay (1.1.3) + diff-lcs (1.5.1) dotenv (3.1.2) ethon (0.16.0) ffi (>= 1.15.0) @@ -47,8 +48,8 @@ GEM net-http (0.4.1) uri os (1.1.4) - parallel (1.24.0) - parser (3.3.1.0) + parallel (1.25.1) + parser (3.3.3.0) ast (~> 2.4.1) racc pry (0.14.2) @@ -58,12 +59,25 @@ GEM byebug (~> 11.0) pry (>= 0.13, < 0.15) public_suffix (5.0.5) - racc (1.7.3) + racc (1.8.0) rainbow (3.1.1) regexp_parser (2.9.2) - rexml (3.2.8) - strscan (>= 3.0.9) - rubocop (1.63.5) + rexml (3.3.0) + strscan + rspec (3.13.0) + rspec-core (~> 3.13.0) + rspec-expectations (~> 3.13.0) + rspec-mocks (~> 3.13.0) + rspec-core (3.13.0) + rspec-support (~> 3.13.0) + rspec-expectations (3.13.1) + diff-lcs (>= 1.2.0, < 2.0) + rspec-support (~> 3.13.0) + rspec-mocks (3.13.1) + diff-lcs (>= 1.2.0, < 2.0) + rspec-support (~> 3.13.0) + rspec-support (3.13.1) + rubocop (1.64.1) json (~> 2.3) language_server-protocol (>= 3.17.0) parallel (~> 1.10) @@ -76,6 +90,8 @@ GEM unicode-display_width (>= 2.4.0, < 3.0) rubocop-ast (1.31.3) parser (>= 3.3.1.0) + rubocop-rspec (3.0.1) + rubocop (~> 1.61) ruby-progressbar (1.13.0) signet (0.19.0) addressable (~> 2.8) @@ -95,7 +111,9 @@ DEPENDENCIES dotenv (~> 3.1, >= 3.1.2) gemini-ai! pry-byebug (~> 3.10, >= 3.10.1) - rubocop (~> 1.63, >= 1.63.5) + rspec (~> 3.13) + rubocop (~> 1.64, >= 1.64.1) + rubocop-rspec (~> 3.0, >= 3.0.1) BUNDLED WITH 2.4.22 diff --git a/spec/tasks/run-generate.rb b/spec/tasks/run-generate.rb index 8941d1c..0d6ea15 100644 --- a/spec/tasks/run-generate.rb +++ b/spec/tasks/run-generate.rb @@ -15,12 +15,12 @@ result = client.stream_generate_content( { contents: { role: 'user', parts: { text: 'hi!' } } } ) do |event, _parsed, _raw| - print event['candidates'][0]['content']['parts'][0]['text'] + print event.dig('candidates', 0, 'content', 'parts', 0, 'text') end puts "\n#{'-' * 20}" -puts result.map { |event| event['candidates'][0]['content']['parts'][0]['text'] }.join +puts result.map { |event| event.dig('candidates', 0, 'content', 'parts', 0, 'text') }.join puts '-' * 20 @@ -35,9 +35,9 @@ result = client.stream_generate_content( { contents: { role: 'user', parts: { text: 'hi!' 
} } } ) do |event, _parsed, _raw| - print event['candidates'][0]['content']['parts'][0]['text'] + print event.dig('candidates', 0, 'content', 'parts', 0, 'text') end puts "\n#{'-' * 20}" -puts result.map { |event| event['candidates'][0]['content']['parts'][0]['text'] }.join +puts result.map { |event| event.dig('candidates', 0, 'content', 'parts', 0, 'text') }.join diff --git a/spec/tasks/run-safety.rb b/spec/tasks/run-safety.rb index 47623e7..97e8a00 100644 --- a/spec/tasks/run-safety.rb +++ b/spec/tasks/run-safety.rb @@ -20,9 +20,9 @@ { category: 'HARM_CATEGORY_SEXUALLY_EXPLICIT', threshold: 'BLOCK_ONLY_HIGH' }, { category: 'HARM_CATEGORY_DANGEROUS_CONTENT', threshold: 'BLOCK_ONLY_HIGH' }] } ) do |event, _parsed, _raw| - print event['candidates'][0]['content']['parts'][0]['text'] + print event.dig('candidates', 0, 'content', 'parts', 0, 'text') end puts "\n#{'-' * 20}" -puts result.map { |event| event['candidates'][0]['content']['parts'][0]['text'] }.join +puts result.map { |event| event.dig('candidates', 0, 'content', 'parts', 0, 'text') }.join diff --git a/spec/tasks/run-system.rb b/spec/tasks/run-system.rb index fb02017..8f40c6e 100644 --- a/spec/tasks/run-system.rb +++ b/spec/tasks/run-system.rb @@ -16,9 +16,9 @@ { contents: { role: 'user', parts: { text: 'Hi! Who are you?' } }, system_instruction: { role: 'user', parts: [{ text: 'You are a cat.' }, { text: 'Your name is Neko.' }] } } ) do |event, _parsed, _raw| - print event['candidates'][0]['content']['parts'][0]['text'] + print event.dig('candidates', 0, 'content', 'parts', 0, 'text') end puts "\n#{'-' * 20}" -puts result.map { |event| event['candidates'][0]['content']['parts'][0]['text'] }.join +puts result.map { |event| event.dig('candidates', 0, 'content', 'parts', 0, 'text') }.join diff --git a/template.md b/template.md index e879a1f..a73cf7a 100644 --- a/template.md +++ b/template.md @@ -1318,7 +1318,14 @@ RequestError bundle rubocop -A -bundle exec ruby spec/tasks/run-client.rb +rspec + +bundle exec ruby spec/tasks/run-available-models.rb +bundle exec ruby spec/tasks/run-embed.rb +bundle exec ruby spec/tasks/run-generate.rb +bundle exec ruby spec/tasks/run-json.rb +bundle exec ruby spec/tasks/run-safety.rb +bundle exec ruby spec/tasks/run-system.rb ``` ### Purpose From 45b7779c3ca7842d82fba91cc81727cd2bb44500 Mon Sep 17 00:00:00 2001 From: gbaptista Date: Sun, 23 Jun 2024 09:04:19 -0300 Subject: [PATCH 6/8] adding json tests --- spec/tasks/run-json.rb | 172 +++++++++++++++++++++++++++++++++++++++++ 1 file changed, 172 insertions(+) create mode 100644 spec/tasks/run-json.rb diff --git a/spec/tasks/run-json.rb b/spec/tasks/run-json.rb new file mode 100644 index 0000000..ac7083e --- /dev/null +++ b/spec/tasks/run-json.rb @@ -0,0 +1,172 @@ +# frozen_string_literal: true + +require 'dotenv/load' + +require_relative '../../ports/dsl/gemini-ai' + +# # References: +# # - https://cloud.google.com/vertex-ai/generative-ai/docs/learn/model-versioning +# # - https://cloud.google.com/vertex-ai/generative-ai/docs/learn/models + +CACHE_FILE_PATH = 'available-models-json.tmp' + +models = [ + 'gemini-pro-vision', + 'gemini-pro', + 'gemini-1.5-pro-preview-0514', + 'gemini-1.5-pro-preview-0409', + 'gemini-1.5-pro', + 'gemini-1.5-flash-preview-0514', + 'gemini-1.5-flash', + 'gemini-1.0-pro-vision-latest', + 'gemini-1.0-pro-vision-001', + 'gemini-1.0-pro-vision', + 'gemini-1.0-pro-latest', + 'gemini-1.0-pro-002', + 'gemini-1.0-pro-001', + 'gemini-1.0-pro', + 'gemini-ultra', + 'gemini-1.0-ultra', + 'gemini-1.0-ultra-001' +] + +def 
client_for(service, model) + credentials = if service == 'vertex-ai-api' + { service: 'vertex-ai-api', region: 'us-east4' } + else + { service: 'generative-language-api', + api_key: ENV.fetch('GOOGLE_API_KEY', nil) } + end + + Gemini.new(credentials:, options: { model:, server_sent_events: true }) +end + +if File.exist?(CACHE_FILE_PATH) + results = Marshal.load(File.read(CACHE_FILE_PATH)) +else + results = {} + + models.each do |model| + %w[vertex-ai-api generative-language-api].each do |service| + key = "#{service}/#{model}" + + client = client_for(service, model) + + begin + sleep 1 + client.stream_generate_content( + { contents: { role: 'user', parts: { text: 'hi!' } } } + ) + rescue Faraday::BadRequestError, Faraday::ResourceNotFound => e + results[key] = { + service:, model:, + result: 'access-error', output: e.message + } + + print '-' + next + end + + begin + sleep 1 + client.stream_generate_content( + { + contents: { + role: 'user', + parts: { + text: 'List 3 random colors.' + } + }, + generation_config: { + response_mime_type: 'application/json' + } + } + ) + rescue Faraday::BadRequestError, Faraday::ResourceNotFound => e + results[key] = { + service:, model:, + result: 'json-error', output: e.message + } + + print '*' + next + end + + begin + sleep 1 + output = client.stream_generate_content( + { + contents: { + role: 'user', + parts: { + text: 'List 3 random colors.' + } + }, + generation_config: { + response_mime_type: 'application/json', + response_schema: { + type: 'object', + properties: { + colors: { + type: 'array', + items: { + type: 'object', + properties: { + name: { + type: 'string' + } + } + } + } + } + } + } + } + ) + + results[key] = { + service:, model:, + result: 'success', output: + } + + print '.' + rescue Faraday::BadRequestError, Faraday::ResourceNotFound => e + results[key] = { + service:, model:, + result: 'schema-error', output: e.message + } + + print '/' + end + end + end + + puts '' + + File.write(CACHE_FILE_PATH, Marshal.dump(results)) +end + +puts '| Model | Vertex AI | Generative Language |' +puts '|------------------------------------------|:---------:|:-------------------:|' + +table = {} + +results.each_value do |result| + table[result[:model]] = { model: result[:model] } unless table.key?(result[:model]) + table[result[:model]][result[:service]] = case result[:result] + when 'success' + '✅' + when 'access-error' + '🔒' + when 'schema-error' + '🟡' + when 'json-error' + '❌' + else + '?' 
+ end +end + +table.values.sort_by { |row| models.index(row[:model]) }.each do |row| + puts "| #{row[:model].ljust(40)} | #{row['vertex-ai-api'].rjust(4).ljust(8)} | #{row['generative-language-api'].rjust(10).ljust(18)} |" +end From 4432daa407d8f91e96c43c8cc5818032c5cd6a56 Mon Sep 17 00:00:00 2001 From: gbaptista Date: Sun, 23 Jun 2024 09:13:17 -0300 Subject: [PATCH 7/8] updating gems --- Gemfile.lock | 14 +++---- gemini-ai.gemspec | 2 +- spec/spec_helper.rb | 98 +++++++++++++++++++++++++++++++++++++++++++++ 3 files changed, 106 insertions(+), 8 deletions(-) create mode 100644 spec/spec_helper.rb diff --git a/Gemfile.lock b/Gemfile.lock index 892b905..ca34026 100644 --- a/Gemfile.lock +++ b/Gemfile.lock @@ -3,7 +3,7 @@ PATH specs: gemini-ai (4.0.0) event_stream_parser (~> 1.0) - faraday (~> 2.9) + faraday (~> 2.9, >= 2.9.2) faraday-typhoeus (~> 1.1) googleauth (~> 1.8) typhoeus (~> 1.4, >= 1.4.1) @@ -11,8 +11,8 @@ PATH GEM remote: https://rubygems.org/ specs: - addressable (2.8.6) - public_suffix (>= 2.0.2, < 6.0) + addressable (2.8.7) + public_suffix (>= 2.0.2, < 7.0) ast (2.4.2) base64 (0.2.0) byebug (11.1.3) @@ -22,14 +22,14 @@ GEM ethon (0.16.0) ffi (>= 1.15.0) event_stream_parser (1.0.0) - faraday (2.9.0) + faraday (2.9.2) faraday-net_http (>= 2.0, < 3.2) faraday-net_http (3.1.0) net-http faraday-typhoeus (1.1.0) faraday (~> 2.0) typhoeus (~> 1.4) - ffi (1.16.3) + ffi (1.17.0) google-cloud-env (2.1.1) faraday (>= 1.0, < 3.a) googleauth (1.11.0) @@ -40,7 +40,7 @@ GEM os (>= 0.9, < 2.0) signet (>= 0.16, < 2.a) json (2.7.2) - jwt (2.8.1) + jwt (2.8.2) base64 language_server-protocol (3.17.0.3) method_source (1.1.0) @@ -58,7 +58,7 @@ GEM pry-byebug (3.10.1) byebug (~> 11.0) pry (>= 0.13, < 0.15) - public_suffix (5.0.5) + public_suffix (6.0.0) racc (1.8.0) rainbow (3.1.1) regexp_parser (2.9.2) diff --git a/gemini-ai.gemspec b/gemini-ai.gemspec index 009fbb3..d5cd720 100644 --- a/gemini-ai.gemspec +++ b/gemini-ai.gemspec @@ -30,7 +30,7 @@ Gem::Specification.new do |spec| spec.require_paths = ['ports/dsl'] spec.add_dependency 'event_stream_parser', '~> 1.0' - spec.add_dependency 'faraday', '~> 2.9' + spec.add_dependency 'faraday', '~> 2.9', '>= 2.9.2' spec.add_dependency 'faraday-typhoeus', '~> 1.1' # Before upgrading, check this: diff --git a/spec/spec_helper.rb b/spec/spec_helper.rb new file mode 100644 index 0000000..4f8c8d9 --- /dev/null +++ b/spec/spec_helper.rb @@ -0,0 +1,98 @@ +# frozen_string_literal: true + +# This file was generated by the `rspec --init` command. Conventionally, all +# specs live under a `spec` directory, which RSpec adds to the `$LOAD_PATH`. +# The generated `.rspec` file contains `--require spec_helper` which will cause +# this file to always be loaded, without a need to explicitly require it in any +# files. +# +# Given that it is always loaded, you are encouraged to keep this file as +# light-weight as possible. Requiring heavyweight dependencies from this file +# will add to the boot time of your test suite on EVERY test run, even for an +# individual file that may not need all of that loaded. Instead, consider making +# a separate helper file that requires the additional dependencies and performs +# the additional setup, and require it from the spec files that actually need +# it. +# +# See https://rubydoc.info/gems/rspec-core/RSpec/Core/Configuration +RSpec.configure do |config| + # rspec-expectations config goes here. You can use an alternate + # assertion/expectation library such as wrong or the stdlib/minitest + # assertions if you prefer. 
+ config.expect_with :rspec do |expectations| + # This option will default to `true` in RSpec 4. It makes the `description` + # and `failure_message` of custom matchers include text for helper methods + # defined using `chain`, e.g.: + # be_bigger_than(2).and_smaller_than(4).description + # # => "be bigger than 2 and smaller than 4" + # ...rather than: + # # => "be bigger than 2" + expectations.include_chain_clauses_in_custom_matcher_descriptions = true + end + + # rspec-mocks config goes here. You can use an alternate test double + # library (such as bogus or mocha) by changing the `mock_with` option here. + config.mock_with :rspec do |mocks| + # Prevents you from mocking or stubbing a method that does not exist on + # a real object. This is generally recommended, and will default to + # `true` in RSpec 4. + mocks.verify_partial_doubles = true + end + + # This option will default to `:apply_to_host_groups` in RSpec 4 (and will + # have no way to turn it off -- the option exists only for backwards + # compatibility in RSpec 3). It causes shared context metadata to be + # inherited by the metadata hash of host groups and examples, rather than + # triggering implicit auto-inclusion in groups with matching metadata. + config.shared_context_metadata_behavior = :apply_to_host_groups + + # The settings below are suggested to provide a good initial experience + # with RSpec, but feel free to customize to your heart's content. + # # This allows you to limit a spec run to individual examples or groups + # # you care about by tagging them with `:focus` metadata. When nothing + # # is tagged with `:focus`, all examples get run. RSpec also provides + # # aliases for `it`, `describe`, and `context` that include `:focus` + # # metadata: `fit`, `fdescribe` and `fcontext`, respectively. + # config.filter_run_when_matching :focus + # + # # Allows RSpec to persist some state between runs in order to support + # # the `--only-failures` and `--next-failure` CLI options. We recommend + # # you configure your source control system to ignore this file. + # config.example_status_persistence_file_path = "spec/examples.txt" + # + # # Limits the available syntax to the non-monkey patched syntax that is + # # recommended. For more details, see: + # # https://rspec.info/features/3-12/rspec-core/configuration/zero-monkey-patching-mode/ + # config.disable_monkey_patching! + # + # # This setting enables warnings. It's recommended, but in some cases may + # # be too noisy due to issues in dependencies. + # config.warnings = true + # + # # Many RSpec users commonly either run the entire suite or an individual + # # file, and it's useful to allow more verbose output when running an + # # individual spec file. + # if config.files_to_run.one? + # # Use the documentation formatter for detailed output, + # # unless a formatter has already been configured + # # (e.g. via a command-line flag). + # config.default_formatter = "doc" + # end + # + # # Print the 10 slowest examples and example groups at the + # # end of the spec run, to help surface which specs are running + # # particularly slow. + # config.profile_examples = 10 + # + # # Run specs in random order to surface order dependencies. If you find an + # # order dependency and want to debug it, you can fix the order by providing + # # the seed, which is printed after each run. + # # --seed 1234 + # config.order = :random + # + # # Seed global randomization in this process using the `--seed` CLI option. 
+  # # Setting this allows you to use `--seed` to deterministically reproduce
+  # # test failures related to randomization by passing the same `--seed` value
+  # # as the one that triggered the failure.
+  # Kernel.srand config.seed
+end

From 7b228687d81b64a7421773c40bc22c073df9d552 Mon Sep 17 00:00:00 2001
From: gbaptista
Date: Sun, 23 Jun 2024 09:13:35 -0300
Subject: [PATCH 8/8] updating README

---
 template.md | 38 ++++++++++++++++++++++++++++++++++----
 1 file changed, 34 insertions(+), 4 deletions(-)

diff --git a/template.md b/template.md
index a73cf7a..aeb2e66 100644
--- a/template.md
+++ b/template.md
@@ -311,7 +311,7 @@ client = Gemini.new(
 
 ## Available Models
 
-These models are accessible to the repository **author** as of May 2024 in the `us-east4` region. Access to models may vary by region, user, and account. All models here are expected to work, if you can access them. This is just a reference of what a "typical" user may expect to have access to right away:
+These models are accessible to the repository **author** as of June 2024 in the `us-east4` region. Access to models may vary by region, user, and account. All models here are expected to work, if you can access them. This is just a reference of what a "typical" user may expect to have access to right away:
 
 | Model | Vertex AI | Generative Language |
 |------------------------------------------|:---------:|:-------------------:|
@@ -319,9 +319,9 @@ These models are accessible to the repository **author** as of May 2024 in the `
 | gemini-pro | ✅ | ✅ |
 | gemini-1.5-pro-preview-0514 | ✅ | 🔒 |
 | gemini-1.5-pro-preview-0409 | ✅ | 🔒 |
-| gemini-1.5-pro | 🔒 | 🔒 |
+| gemini-1.5-pro | ✅ | ✅ |
 | gemini-1.5-flash-preview-0514 | ✅ | 🔒 |
-| gemini-1.5-flash | 🔒 | 🔒 |
+| gemini-1.5-flash | ✅ | ✅ |
 | gemini-1.0-pro-vision-latest | 🔒 | 🔒 |
 | gemini-1.0-pro-vision-001 | ✅ | 🔒 |
 | gemini-1.0-pro-vision | ✅ | 🔒 |
@@ -981,7 +981,7 @@ Output:
 
 #### JSON Schema
 
-> _As of the writing of this README, only the `vertex-ai-api` service and `gemini` models version `1.5` support this feature._
+> _While Gemini 1.5 Flash models only accept a text description of the JSON schema you want returned, the Gemini 1.5 Pro models let you pass a schema object (or a Python type equivalent), and the model output will strictly follow that schema. This is also known as controlled generation or constrained decoding._
 
 You can also provide a [JSON Schema](https://json-schema.org) for the expected JSON output:
 
@@ -1036,6 +1036,36 @@ Output:
 ]
 }
 ```
+#### Models That Support JSON
+
+These models are accessible to the repository **author** as of June 2024 in the `us-east4` region. Access to models may vary by region, user, and account.
+
+- ❌ Does not support JSON mode.
+- 🟡 Supports JSON mode but not Schema.
+- ✅ Supports JSON mode and Schema.
+- 🔒 I don't have access to the model. 
+ +| Model | Vertex AI | Generative Language | +|------------------------------------------|:---------:|:-------------------:| +| gemini-pro-vision | ❌ | 🔒 | +| gemini-pro | 🟡 | ❌ | +| gemini-1.5-pro-preview-0514 | ✅ | 🔒 | +| gemini-1.5-pro-preview-0409 | ✅ | 🔒 | +| gemini-1.5-pro | ✅ | ❌ | +| gemini-1.5-flash-preview-0514 | 🟡 | 🔒 | +| gemini-1.5-flash | 🟡 | ❌ | +| gemini-1.0-pro-vision-latest | 🔒 | 🔒 | +| gemini-1.0-pro-vision-001 | ❌ | 🔒 | +| gemini-1.0-pro-vision | ❌ | 🔒 | +| gemini-1.0-pro-latest | 🔒 | ❌ | +| gemini-1.0-pro-002 | 🟡 | 🔒 | +| gemini-1.0-pro-001 | ❌ | ❌ | +| gemini-1.0-pro | 🟡 | ❌ | +| gemini-ultra | 🔒 | 🔒 | +| gemini-1.0-ultra | 🔒 | 🔒 | +| gemini-1.0-ultra-001 | 🔒 | 🔒 | + + ### Tools (Functions) Calling > As of the writing of this README, only the `vertex-ai-api` service and the `gemini-pro` model [supports](https://cloud.google.com/vertex-ai/docs/generative-ai/multimodal/function-calling#supported_models) tools (functions) calls.
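Taken together, this series leaves the client with three mutually exclusive credential sources (`api_key`, `file_path`, `file_contents`) plus Application Default Credentials, and a dedicated error when more than one is supplied. Below is a minimal usage sketch of that surface, assuming only the constructor and error messages shown in the diffs above; the environment variable is the same illustrative `GOOGLE_CREDENTIALS_FILE_CONTENTS` used in the README examples.

```ruby
require 'gemini-ai'

# Vertex AI Service Account credentials passed as raw file contents
# (the option this series documents and tests), streaming enabled.
client = Gemini.new(
  credentials: {
    service: 'vertex-ai-api',
    file_contents: ENV['GOOGLE_CREDENTIALS_FILE_CONTENTS'],
    region: 'us-east4'
  },
  options: { model: 'gemini-pro', server_sent_events: true }
)

# Supplying more than one credential source raises the new error,
# mirroring the expectation in spec/controllers/client_spec.rb.
begin
  Gemini.new(
    credentials: {
      service: 'vertex-ai-api',
      file_path: 'google-credentials.json',
      file_contents: ENV['GOOGLE_CREDENTIALS_FILE_CONTENTS'],
      region: 'us-east4'
    }
  )
rescue Gemini::Errors::ConflictingCredentialsError => e
  puts e.message # => "You must choose either 'file_contents', or 'file_path'."
end
```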