
Introducing Typhoon 2 API Pro: Accessible, Production-grade Thai LLMs


API
Krisanapong Jirayoot
June 18, 2025

Table of Contents

  • 🚀 What’s New
  • Key Features
  • Pricing
  • Quickstart Guide
    • Try out Typhoon Models in Together’s playground
    • Get API Access
  • Using Typhoon API Pro with Official Together Libraries
  • OpenAI Compatibility
    • Using Together APIs with OpenAI Client Libraries
  • Join the Typhoon Community

Originally published March 17, 2025 – Updated June 18, 2025

We’re excited to announce major updates to Typhoon 2 API Pro, our production-ready API offering in collaboration with Together AI. Most notably, we’ve launched Typhoon 2.1 Gemma—a more powerful, more scalable Thai LLM now accessible via Together’s serverless API infrastructure.

🚀 What’s New

  • Typhoon 2.1 Gemma is now live on Together AI

Typhoon 2.1 Gemma is our latest text model, now available via API Pro at the endpoint scb10x/scb10x-typhoon-2-1-gemma3-12b. Developed with Together AI, it delivers production-grade performance and lower latency while scaling seamlessly.

  • Deprecation notice:
    • Typhoon-2-8b model scb10x/scb10x-llama3-1-typhoon2-8b-instruct is no longer available
    • Typhoon-2-70b model scb10x/scb10x-llama3-1-typhoon2-70b-instruct will be deprecated on August 20, 2025 — we encourage all users to migrate to Typhoon 2.1 Gemma (12B) as soon as possible.

Typhoon-v2-70b-instruct remains accessible only until August 20, 2025. We strongly recommend switching to Typhoon 2.1 Gemma before this date to avoid service disruption.
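
Because migration is just a change of model identifier, it can be handled in one place in your codebase. A minimal sketch (the helper name `resolve_model` is our illustration, not part of any SDK; the IDs come from the deprecation notice above):

```python
# Replacement model per the deprecation notice: Typhoon 2.1 Gemma (12B).
REPLACEMENT_MODEL = "scb10x/scb10x-typhoon-2-1-gemma3-12b"

# Model IDs that are deprecated or already unavailable.
DEPRECATED_MODELS = {
    "scb10x/scb10x-llama3-1-typhoon2-8b-instruct",
    "scb10x/scb10x-llama3-1-typhoon2-70b-instruct",
}

def resolve_model(model_id: str) -> str:
    """Map a deprecated Typhoon 2 model ID to Typhoon 2.1 Gemma; pass others through."""
    return REPLACEMENT_MODEL if model_id in DEPRECATED_MODELS else model_id
```

Routing every request's model ID through a helper like this lets you flip the whole application over before the August 20, 2025 cutoff without touching call sites.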

Key Features

  • Leverage Together AI’s serverless inference infrastructure to deliver fast responses while minimizing costs through per-token billing
  • Compatible with popular frameworks and easy to integrate with existing AI pipelines
  • Built for production-grade Enterprise AI applications, offering reliability, uptime, and scalability
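
Production-grade clients typically also add retries with exponential backoff around API calls to ride out transient errors. A minimal, client-agnostic sketch (the retry schedule and helper name are our assumptions, not a Together AI requirement):

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def with_retries(call: Callable[[], T], max_retries: int = 4,
                 base: float = 0.5, cap: float = 8.0) -> T:
    """Invoke call(), retrying on exception with capped exponential backoff."""
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error to the caller
            # 0.5s, 1s, 2s, ... capped at `cap` seconds
            time.sleep(min(cap, base * (2 ** attempt)))
    raise RuntimeError("unreachable")
```

You would wrap the `client.chat.completions.create(...)` call shown later in this post, e.g. `with_retries(lambda: client.chat.completions.create(...))`.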

Pricing

Typhoon 2 API Pro is available at the following prices:

  • Typhoon 2.1 Gemma 12b: $0.20 per 1 million tokens
  • Typhoon 2 Instruct 70b: $0.88 per 1 million tokens
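
With per-token billing, request cost is straightforward to estimate from the rates above. A small sketch, assuming a single blended rate per model as listed (actual billing may distinguish input and output tokens; check Together AI's pricing page):

```python
# USD per 1 million tokens, from the pricing list above.
PRICE_PER_M_TOKENS = {
    "scb10x/scb10x-typhoon-2-1-gemma3-12b": 0.20,          # Typhoon 2.1 Gemma 12b
    "scb10x/scb10x-llama3-1-typhoon2-70b-instruct": 0.88,  # Typhoon 2 Instruct 70b
}

def estimate_cost(model: str, total_tokens: int) -> float:
    """Estimate USD cost of a request from its total token count."""
    return PRICE_PER_M_TOKENS[model] * total_tokens / 1_000_000
```

For example, a 500,000-token workload on Typhoon 2.1 Gemma 12b comes to about $0.10.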

Quickstart Guide

Getting started with Typhoon 2 API Pro is easy! Our models are hosted on Together AI for seamless deployment. Follow these simple steps to start using Typhoon.

Try out Typhoon Models in Together’s playground

Register for a Together.ai account. New accounts get $1 free credit to help you get started. You can test the Typhoon models on the Together Playground.

Get API Access

  1. Set your API key as an environment variable named TOGETHER_API_KEY

export TOGETHER_API_KEY="YOUR_API_KEY"

  2. Make your first API call

curl -X POST "https://api.together.xyz/v1/chat/completions" \
     -H "Authorization: Bearer $TOGETHER_API_KEY" \
     -H "Content-Type: application/json" \
     -d '{
      "model": "scb10x/scb10x-typhoon-2-1-gemma3-12b",
      "messages": [
          {"role": "user", "content": "ขอสูตรไก่ย่างหน่อย"}
      ]
     }'

Using Typhoon API Pro with Official Together Libraries

Together AI provides official libraries for Python and TypeScript. Install them with pip install together for Python or npm install together-ai for TypeScript.

Send your first Typhoon 2 API Pro request and start generating a response:

Python

from together import Together

# Initialize client
client = Together()

# Stream response from Typhoon 2.1 Gemma 12b
stream = client.chat.completions.create(
  model="scb10x/scb10x-typhoon-2-1-gemma3-12b",
  messages=[{"role": "user", "content": "ขอสูตรไก่ย่างหน่อย"}],
  stream=True,
)

# Print the streamed response
for chunk in stream:
  print(chunk.choices[0].delta.content or "", end="", flush=True)

TypeScript

import Together from "together-ai"

const together = new Together()

const stream = await together.chat.completions.create({
  model: "scb10x/scb10x-typhoon-2-1-gemma3-12b",
  messages: [{ role: "user", content: "ขอสูตรไก่ย่างหน่อย" }],
  stream: true,
})

for await (const chunk of stream) {
  // use process.stdout.write instead of console.log to avoid newlines
  process.stdout.write(chunk.choices[0]?.delta?.content || "")
}

OpenAI Compatibility

Together API endpoints are fully compatible with the OpenAI API. If your application is already using OpenAI’s client libraries, you can configure it to point to Together API servers and start using Typhoon models.

Using Together APIs with OpenAI Client Libraries

  1. Set the api_key to your Together API key. You can find your API key in your Settings page.
  2. Update the base_url to https://api.together.xyz/v1

Python

import os
import openai

client = openai.OpenAI(
  api_key=os.environ.get("TOGETHER_API_KEY"),
  base_url="https://api.together.xyz/v1",
)
# Use Typhoon 2.1 Gemma 12b
response = client.chat.completions.create(
  model="scb10x/scb10x-typhoon-2-1-gemma3-12b",
  messages=[
    {"role": "system", "content": "You are a helpful assistant named Typhoon created by SCB 10X. You always respond to the user in the language they use or request."},
    {"role": "user", "content": "ขอสูตรไก่ย่างหน่อย"},
  ]
)

print(response.choices[0].message.content)

TypeScript

import OpenAI from "openai"

const client = new OpenAI({
  apiKey: process.env.TOGETHER_API_KEY,
  baseURL: "https://api.together.xyz/v1",
})
// Use Typhoon 2.1 Gemma 12b
const response = await client.chat.completions.create({
  model: "scb10x/scb10x-typhoon-2-1-gemma3-12b",
  messages: [{ role: "user", content: "ขอสูตรไก่ย่างหน่อย" }],
})

console.log(response.choices[0].message.content)

Join the Typhoon Community

We’re excited to see how developers and businesses use Typhoon 2 API Pro to build innovative Thai AI solutions. Join our Discord community to share your experiences, ask questions, and stay updated on new releases.

Ready to build with Typhoon API Pro? Get started today! 🚀

Read the full documentation at Together AI. For more information on Typhoon, please visit https://docs.opentyphoon.ai/.

For support on Together AI, please visit their Support page.

Previous
Typhoon’s Paper Acceptance at Interspeech 2025: Enhancing Low-Resource Language and Instruction Following Capabilities of Audio Language Models

Next
Meet Typhoon Translate: A Small, Fast, High-Performance Model Purpose-Built for Thai-English Translation

© 2025 SCB 10X Co., Ltd. All rights reserved.