```ruby
require 'ruby_llm'

RubyLLM.configure do |config|
  config.openai_api_base = 'http://localhost:11434/v1'
  config.openai_api_key = 'ollama'
  config.default_embedding_model = 'nomic-embed-text'
end

embedding = RubyLLM.embed(
  "Ruby is a programmer's best friend",
  provider: 'openai',
  assume_model_exists: true
)
vector = embedding.vectors
puts "Vector dimension: #{vector.length}"
```
Runtime error

```
xxxxxxxxxxx/vendor/bundle/ruby/3.4.0/bundler/gems/ruby_llm-347e630e9c9b/lib/ruby_llm/embedding.rb:26:in 'RubyLLM::Embedding.embed': undefined method 'connection' for an instance of String (NoMethodError)

connection = context ? context.connection_for(provider) : provider.connection(config)
                                                                   ^^^^^^^^^^^

        from xxxxxxxxxxx/vendor/bundle/ruby/3.4.0/bundler/gems/ruby_llm-347e630e9c9b/lib/ruby_llm.rb:45:in 'RubyLLM.embed'
        from app/rubyllm/embed.rb:10:in '<main>'
```
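The failure mechanism can be shown in plain Ruby: when no model argument is given, resolution is skipped, so `provider` is still the raw string `'openai'` passed by the caller rather than a provider object, and a String has no `connection` method (this is a sketch of the mechanism, not the library's code path verbatim):

```ruby
# With model resolution skipped, `provider` remains the raw string the
# caller passed in, so calling `.connection` on it is what raises the
# NoMethodError seen in the trace above.
provider = 'openai'
puts provider.respond_to?(:connection) # => false
```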
Possible Fix

```ruby
config = context&.config || RubyLLM.config
model ||= config.default_embedding_model # <--- this is a quick fix
model, provider = Models.resolve(model, provider: provider, assume_exists: assume_model_exists) if model
```
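The intent of the quick fix can be sketched in plain Ruby (the names below are stand-ins for illustration, not the library's actual internals): falling back to the configured default before the resolve step means the `if model` guard no longer skips resolution:

```ruby
# Stand-in value illustrating the quick-fix flow; `config_default`
# mimics config.default_embedding_model from the configuration above.
config_default = 'nomic-embed-text'

model = nil              # caller did not pass a model name
model ||= config_default # the quick fix: fall back to the default
# With a model now present, the `if model` guard lets resolution run.
puts model # => nomic-embed-text
```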
Hola @crmne, if you like I can submit a PR for the quick fix.
I took a quick look at the test specs, and all the current #embed tests specify the model explicitly. I can add tests if you like, but one caveat: I have zero SaaS LLM API keys, all my model use is local, and I've never used VCR. 😄
I'm working with `ollama` via the OpenAI API proxy. Essentially, `#embed` doesn't handle the default embedding model, leading to a runtime error. See the end of this post for the suggested fix.

Using `main`.

My `Gemfile` has the following:

Test program