Model as string ollama#463

Merged
onuratakan merged 2 commits into master from model_as_string
Nov 6, 2025
Conversation

@onuratakan
Member

@onuratakan onuratakan commented Nov 6, 2025

Note

Updates models package init and project configuration.

  • Library:
    • Modify src/upsonic/models/__init__.py.
  • Config:
    • Update pyproject.toml.

Written by Cursor Bugbot for commit 6da3fd2. This will update automatically on new commits.

@onuratakan onuratakan changed the title Model as string Model as string ollama Nov 6, 2025
@sentry

sentry bot commented Nov 6, 2025

🔍 Existing Issues For Review

Your pull request is modifying functions with the following pre-existing issues:

📄 File: src/upsonic/models/__init__.py

Function: infer_model — pre-existing unhandled issues:

  • UserError: Set the GOOGLE_API_KEY environment variable or pass it via GoogleProvider(api_key=...) to use the Google Generative Language API. ... (Event Count: 2)
  • UserError: Set the ANTHROPIC_API_KEY environment variable or pass it via AnthropicProvider(api_key=...) to use the Anthropic provider. ... (Event Count: 1)
  • UserError: Unknown model: google/gemini-2.5-pro' ... (Event Count: 1)
  • ImportError: Missing required package: google-genai (upsonic.utils.printing in import_er...) (Event Count: 1)

@claude

claude bot commented Nov 6, 2025

Pull Request Review: Model as String Ollama Support

Summary

This PR adds support for Ollama models through the string-based model specification system by routing ollama provider requests to use the OpenAIChatModel class. The change is minimal and leverages existing OpenAI-compatible infrastructure.
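As a rough illustration of the routing pattern described above, a string-based provider dispatch might look like the sketch below. All names here (split_model_string, route_provider, OPENAI_COMPATIBLE) are hypothetical stand-ins for illustration, not the actual upsonic internals:

```python
# Hypothetical sketch of "provider/model" string routing; the real logic
# lives in src/upsonic/models/__init__.py:infer_model() and differs in detail.
def split_model_string(model: str) -> tuple[str, str]:
    """Split a "provider/model" string into (provider, model_name)."""
    provider, _, model_name = model.partition("/")
    return provider, model_name

# Providers that expose an OpenAI-compatible API and can reuse the same
# chat-model class; this PR adds "ollama" to such a branch.
OPENAI_COMPATIBLE = {"deepseek", "cerebras", "fireworks", "ollama"}

def route_provider(model: str) -> str:
    provider, model_name = split_model_string(model)
    if provider in OPENAI_COMPATIBLE:
        # In the real code this would construct an OpenAIChatModel instance.
        return f"OpenAIChatModel({model_name!r})"
    raise ValueError(f"Unknown model: {model}")
```

The point of the pattern is that adding a new OpenAI-compatible backend is a one-line change to the dispatch, which is why the diff in this PR is so small.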


Positive Aspects

  1. Minimal, Focused Change: The implementation is clean and follows the existing pattern for other OpenAI-compatible providers (e.g., deepseek, cerebras, fireworks).

  2. Leverages Existing Infrastructure: Reusing OpenAIChatModel for Ollama is appropriate since Ollama provides an OpenAI-compatible API, avoiding code duplication.

  3. Consistent with Architecture: The change aligns with the project's model inference system in src/upsonic/models/__init__.py:infer_model().

  4. Version Bump: Appropriately incremented patch version from 0.66.0 to 0.66.1.


⚠️ Issues and Concerns

1. Missing Test Coverage (High Priority)

The PR lacks tests for the new Ollama functionality. The existing test suite in tests/unit_tests/test_model_specification.py tests other providers but not Ollama.

Recommendation: Add a test case similar to the existing provider tests:

```python
@patch('upsonic.providers.openai.AsyncOpenAI')
def test_ollama_specification(self, mock_openai):
    # Mock the OpenAI client (since Ollama uses an OpenAI-compatible API)
    mock_openai.return_value = MagicMock()

    agent = Agent(name="Ollama Agent", model="ollama/llama2")
    self.assertEqual(agent.model.model_name, "llama2")
    self.assertIsInstance(agent.model._provider, OpenAIProvider)
```

Also add an inference test:

```python
@patch('upsonic.providers.openai.AsyncOpenAI')
def test_ollama_model_inference(self, mock_openai):
    mock_openai.return_value = MagicMock()

    ollama_model = infer_model("ollama/llama2")
    self.assertEqual(ollama_model.model_name, "llama2")
```

2. Documentation Gap (Medium Priority)

The PR description is minimal and does not explain:

  • How to configure Ollama (base URL, API key requirements)
  • Example usage patterns
  • Which Ollama models are supported
  • Any limitations or known issues

Recommendation: Update the PR description or add documentation showing:

```python
# Example usage
agent = Agent(
    name="Ollama Agent",
    model="ollama/llama2"
)
```

3. Lock File Changes (Low Priority - Informational)

The uv.lock file shows extensive changes (removal of revision = 2 and reformatting of package metadata). These appear to be from a uv version update or lock file regeneration.

Question: Was this intentional? If these changes are unrelated to Ollama support, consider separating them into a different PR for cleaner history.


🔍 Code Quality Assessment

✅ Best Practices:

  • Follows existing code patterns
  • Proper provider string literal typing
  • Maintains backward compatibility

⚠️ Potential Issues:

  • No validation that the Ollama model name is valid
  • No specific error handling for Ollama-specific issues
  • Missing integration test with actual Ollama server (though unit tests with mocks are acceptable)

🔒 Security Considerations

No significant security concerns identified, but consider:

  • Ollama typically runs locally, so ensure documentation clarifies how to configure the base URL
  • If users configure Ollama with external endpoints, standard API security practices apply
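To make the base-URL point concrete, here is a minimal configuration sketch. It assumes Ollama's conventional local endpoint (http://localhost:11434/v1, its default OpenAI-compatible path) and a placeholder API key; the helper name is hypothetical and not part of upsonic:

```python
# Hypothetical helper: Ollama serves an OpenAI-compatible API locally on
# port 11434 by default. Ollama ignores the API key, but OpenAI-compatible
# clients generally require a non-empty one, so a placeholder is common.
def ollama_client_config(base_url: str = "http://localhost:11434/v1") -> dict:
    return {"base_url": base_url, "api_key": "ollama"}
```

Documentation along these lines would let users point the provider at a remote Ollama instance simply by overriding base_url, at which point the standard API security practices mentioned above apply.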

Performance Considerations

No performance impact expected. The change adds a single conditional branch in the model inference logic, which is negligible.


📋 Recommendations

Before Merging:

  1. Add unit tests for Ollama model specification (required)
  2. Add documentation or examples in the PR description or README
  3. Clarify the lock file changes - are they intentional?

Optional Improvements:

  4. Add an integration test with Ollama if the CI infrastructure supports it
  5. Document Ollama configuration in CLAUDE.md or the main documentation
  6. Consider adding Ollama to the model registry if applicable


📝 Summary

The core implementation is solid and follows the project's conventions. However, this PR should not be merged without adding tests. The lack of test coverage is a blocker for a production-ready framework focused on reliability.

Recommended Action: Add tests, update documentation, then approve for merge.


Overall Assessment: ⚠️ Needs Work - Implementation is good, but missing critical test coverage.

🤖 Generated with Claude Code

@onuratakan onuratakan merged commit 71a915d into master Nov 6, 2025
5 checks passed
@onuratakan onuratakan deleted the model_as_string branch November 6, 2025 15:29
DoganK01 pushed a commit that referenced this pull request Jan 5, 2026