We’re currently in research preview! We’re excited to share our system with you, and if you have thoughts, we’d love to hear your feedback.

Model Offerings

| Name | API Name | Description |
| --- | --- | --- |
| TIM-small-preview | tim-small-preview | Core fine-tuned TIM-8b model running on TIMRUN, optimized for long horizon reasoning and tool use with search tools |
| TIM-large | tim-large | Highly capable generalized inference engine built on TIM principles, powered by OpenAI GPT-4.1 |
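
To give a sense of how the API names above are used, here is a minimal sketch of a request that selects a model by its API name. The endpoint URL, authentication header, and request schema shown are assumptions for illustration only; refer to the API reference for the actual interface.

```python
# Hypothetical sketch: selecting a model by its API name.
# The endpoint URL, auth header, and request body below are placeholders,
# not the documented interface.
import requests

API_URL = "https://api.example.com/v1/chat/completions"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"  # placeholder credential

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "tim-large",  # or "tim-small-preview"
        "messages": [
            {"role": "user", "content": "Plan a multi-step research task."}
        ],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json())
```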

TIM-small-preview

TIM-small-preview is our core fine-tuned TIM-8b model running on TIMRUN. It enables long horizon reasoning and tool use, and is tuned specifically for tasks with search tools. This model represents our foundational approach to building intelligent agents that can reason through complex, multi-step problems while effectively utilizing available tools.

TIM-large

TIM-large is a highly capable generalized inference engine built on the same principles as TIM-small but backed by the power of OpenAI GPT-4.1. We talked to teams around the world building agents, and many won’t use anything but the most powerful generalized models. We listened, and we built a way to deliver the long horizon reasoning and agent developer experience we imagined, backed by the power of models you’re already familiar with.

Terminology

We’ll use the terms model and inference engine interchangeably, but they refer to distinct pieces. The models are transformer models (such as TIM-8b). The runtime is TIMRUN, our custom runtime that serves the TIM models. An inference engine is the combination of a model and the runtime.