feat(ollama): support calling the Ollama local process by stevenh · Pull Request #2923 · testcontainers/testcontainers-go
stevenh changed the title from refactor(ollama): local process to feat(ollama): local process
stevenh changed the title from feat(ollama): local process to feat(ollama): refactor local process
stevenh marked this pull request as ready for review
mdelapenya changed the title from feat(ollama): refactor local process to feat(ollama): support calling the Ollama local process
Refactor local process handling for Ollama using a container implementation, avoiding the wrapping methods. This defaults to running the binary on an ephemeral port to avoid port conflicts. This behaviour can be overridden by setting OLLAMA_HOST, either in the parent environment or in the values passed via WithUseLocal.

Improve API compatibility with:
- Multiplexed output streams
- State reporting
- Exec option processing
- WaitingFor customisation

Fix Container implementation:
- Port management
- Running checks
- Terminate processing
- Endpoint argument definition
- Add missing methods
- Consistent environment handling
Validate the container request to ensure the user configuration can be processed and no fields that would be ignored are present.
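The validation step above can be sketched as follows. The `request` struct and field names here are hypothetical stand-ins (the real type lives in testcontainers-go); the point is the pattern: collect an error for every configured field the local process cannot honour, rather than silently ignoring it.

```go
package main

import (
	"errors"
	"fmt"
)

// request is a hypothetical stand-in for the container request type.
type request struct {
	Image        string
	ExposedPorts []string
	Networks     []string // a local process has no container networks
	Privileged   bool     // meaningless for a local process
}

// validateRequest rejects configurations whose fields would be
// silently ignored when running as a local process.
func validateRequest(req request) error {
	var errs []error
	if len(req.Networks) > 0 {
		errs = append(errs, errors.New("networks are not supported for local process"))
	}
	if req.Privileged {
		errs = append(errs, errors.New("privileged is not supported for local process"))
	}
	return errors.Join(errs...)
}

func main() {
	// A request using an unsupported field fails validation.
	fmt.Println(validateRequest(request{Privileged: true}))
}
```

Joining the errors reports every unsupported field at once, so the user fixes their configuration in one pass instead of replaying trial and error.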
This was referenced Jan 2, 2025
mdelapenya added a commit to mdelapenya/testcontainers-go that referenced this pull request on Jan 8, 2025
mdelapenya added a commit to mdelapenya/testcontainers-go that referenced this pull request on Feb 3, 2025