EbbotGPT LLMs

Read about the technical details and practical implications of Ebbot's LLMs.

GPT models

  • EbbotGPT 3 Preview - Released 29/08-2025

  • EbbotGPT 2 - Released 12/12-2024

  • EbbotGPT 0.6.4 - Released 23/05-2024

EbbotGPT 3

Knowledge comprehension

EGPT3 has three times the parameters of EGPT2, which vastly improves its knowledge comprehension and problem-solving capabilities.

Speed

Despite being three times the size of EGPT2, EGPT3 is twice as fast because it uses a Mixture of Experts (MoE) architecture: for each query, the model activates only the roughly 10% of its parameters that are most relevant to answering it.
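To illustrate the idea, here is a minimal sketch of sparse MoE routing with top-k gating. All names, sizes, and the NumPy implementation are illustrative assumptions, not Ebbot's actual architecture; the point is that only the selected experts' parameters do any work per token.

```python
import numpy as np

rng = np.random.default_rng(0)

def moe_forward(x, experts, gate_w, top_k=2):
    """Route input x through only the top_k most relevant experts.

    experts: list of (W, b) weight pairs (illustrative); gate_w: gating matrix.
    Experts outside the top_k are never evaluated, so most parameters stay idle.
    """
    logits = x @ gate_w                       # one gating score per expert
    top = np.argsort(logits)[-top_k:]         # indices of the best-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                  # softmax over the selected experts only
    out = np.zeros_like(x)
    for w, i in zip(weights, top):
        W, b = experts[i]
        out += w * (x @ W + b)                # weighted sum of active expert outputs
    return out

# 8 experts with top-2 routing: ~25% of expert parameters active per token
dim, n_experts = 16, 8
experts = [(rng.standard_normal((dim, dim)) * 0.1, np.zeros(dim))
           for _ in range(n_experts)]
gate_w = rng.standard_normal((dim, n_experts)) * 0.1
y = moe_forward(rng.standard_normal(dim), experts, gate_w, top_k=2)
```

With 8 experts and top-2 routing, each token touches a quarter of the expert weights; EGPT3's reported ~10% activation would correspond to a sparser ratio.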

Integrated Tool Calling

EGPT3 features integrated tool calling, enabling it to seamlessly decide whether to use a tool or generate an answer from the uploaded knowledge.
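Conceptually, the decision looks like the dispatcher below. This is a hypothetical sketch with made-up names and a crude keyword match; in EGPT3 the choice is made inside the model itself, not by rules.

```python
def decide(query, tools, answer_from_knowledge):
    """Illustrative dispatcher (names hypothetical): call a matching tool,
    or fall back to generating an answer from the uploaded knowledge."""
    for name, (keywords, run) in tools.items():
        if any(k in query.lower() for k in keywords):
            return {"type": "tool_call", "tool": name, "result": run(query)}
    return {"type": "answer", "text": answer_from_knowledge(query)}

# Hypothetical tool registry: keywords plus a callable per tool
tools = {
    "order_lookup": (["order", "tracking"], lambda q: {"status": "shipped"}),
}

decide("Where is my order?", tools, lambda q: "Answered from knowledge.")
```

The key property, as in EGPT3, is that tool use and knowledge-based answering share a single decision point rather than being separate pipelines.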

Context Window

With a context window of 32,000 tokens, EGPT3 can process and manage four times as much information as EGPT2 (8,000 tokens).
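A practical consequence of the context window is that an application must trim whatever it sends to the model to fit the token budget. The sketch below is an illustrative helper, not Ebbot's implementation, and uses a crude whitespace count in place of a real tokenizer.

```python
def fit_context(chunks, max_tokens=32_000, count=lambda s: len(s.split())):
    """Keep the most recent chunks that fit within a token budget.

    `count` is a stand-in tokenizer (whitespace split); real token counts differ.
    """
    kept, used = [], 0
    for chunk in reversed(chunks):        # walk from newest to oldest
        n = count(chunk)
        if used + n > max_tokens:
            break                         # oldest material is dropped first
        kept.append(chunk)
        used += n
    return list(reversed(kept)), used

# With an 8,000-token budget (EGPT2), older history falls out of the window;
# EGPT3's 32,000-token window would keep all of it.
history = ["old " * 5000, "recent " * 3000, "latest question"]
kept, used = fit_context(history, max_tokens=8_000)
```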

| EbbotGPT LLMs | Speed (1-10) | Knowledge (1-10) | Integrated tool calling | Context window (tokens) |
| --- | --- | --- | --- | --- |
| 2 | 6 | 7 | No | 8,000 |
| 0.6.4 | 3 | 6 | No | 8,000 |
| 3 | 8 | 9 | Yes | 32,000 |
