EbbotGPT LLMs

Read about the technical details and practical implications of Ebbot's LLMs.

GPT models

  • EbbotGPT 2 - Released 12/12-2024

  • EbbotGPT 3 - Released 3/11-2025

EbbotGPT 3

Knowledge comprehension

EbbotGPT 3 has almost twice as many parameters as EbbotGPT 2, which improves its knowledge comprehension and problem-solving capabilities.

Speed

Despite being larger than EbbotGPT 2, EbbotGPT 3 is twice as fast because it uses a Mixture of Experts (MoE) architecture: for each user query, the model activates only the roughly 5% of its parameters that are most relevant.
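For intuition, here is a minimal sketch of the MoE idea: a router scores a set of expert sub-networks and only the top-scoring few are run for a given input, so most parameters stay idle. The expert count, layer sizes, and top-k value below are illustrative assumptions, not EbbotGPT 3's actual configuration.

```python
import numpy as np

def moe_layer(x, experts, router_weights, top_k=2):
    """Toy Mixture-of-Experts layer: route the input to the top_k
    highest-scoring experts and combine their outputs.
    Only top_k of len(experts) expert networks run per input, which is
    how a large MoE model stays fast at inference time."""
    scores = x @ router_weights                       # one score per expert
    top = np.argsort(scores)[-top_k:]                 # indices of the best experts
    gates = np.exp(scores[top]) / np.exp(scores[top]).sum()  # softmax over the chosen experts
    return sum(g * experts[i](x) for g, i in zip(gates, top))

# Illustrative setup: 16 experts, only 2 activated per input (the ~5% figure
# for EbbotGPT 3 would correspond to a much larger expert pool).
rng = np.random.default_rng(0)
dim, n_experts = 8, 16
experts = [lambda x, W=rng.normal(size=(dim, dim)): x @ W for _ in range(n_experts)]
router_weights = rng.normal(size=(dim, n_experts))

print(moe_layer(rng.normal(size=dim), experts, router_weights))
```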

Integrated Tool Calling

EbbotGPT 3 features integrated tool calling, enabling it to seamlessly decide whether to use a tool or generate an answer from the uploaded knowledge.
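Conceptually, the model's output either requests a tool or carries a knowledge-based answer, and the application acts on whichever it receives. The response shape and the "track_order" tool in this sketch are hypothetical, not Ebbot's actual API.

```python
import json

def handle_response(response: dict) -> str:
    """Either execute the tool the model chose, or return its knowledge-based answer."""
    if response.get("tool_call"):
        call = response["tool_call"]
        if call["name"] == "track_order":          # hypothetical tool
            result = {"status": "shipped"}         # stand-in for a real lookup
            return f"Tool result: {json.dumps(result)}"
        return f"Unknown tool: {call['name']}"
    return response["answer"]                      # answer generated from uploaded knowledge

# The model decides which branch applies:
print(handle_response({"tool_call": {"name": "track_order", "arguments": {"order_id": "123"}}}))
print(handle_response({"tool_call": None, "answer": "Our return window is 30 days."}))
```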

Context Window

With a context window of 31,000 tokens, EbbotGPT 3 can process and manage nearly four times as much information per request as EbbotGPT 2's 8,000 tokens.
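A practical consequence is that more knowledge and conversation history can be sent with each request before the window is exhausted. The sketch below estimates whether content fits in a 31,000-token window using a rough 4-characters-per-token heuristic; that ratio is a common rule of thumb, not EbbotGPT's actual tokenizer.

```python
CONTEXT_WINDOW = 31_000   # EbbotGPT 3 context window, in tokens
CHARS_PER_TOKEN = 4       # heuristic assumption, not the real tokenizer

def estimate_tokens(text: str) -> int:
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_in_context(knowledge_chunks: list[str], conversation: str,
                    reserve_for_answer: int = 1_000) -> bool:
    """Check whether retrieved knowledge plus conversation history fits,
    keeping some room for the model's answer."""
    used = estimate_tokens(conversation) + sum(estimate_tokens(c) for c in knowledge_chunks)
    return used + reserve_for_answer <= CONTEXT_WINDOW

print(fits_in_context(["Return policy: items can be returned within 30 days."] * 50,
                      "Customer asks about returns."))
```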

| EbbotGPT LLMs | Speed (1-10) | Knowledge (1-10) | Integrated tool calling | Context window (tokens) |
| --- | --- | --- | --- | --- |
| EbbotGPT 2 | 6 | 7 | No | 8,000 |
| EbbotGPT 3 | 8 | 9 | Yes | 31,000 |
