
Introducing Typhoon 2 API Pro: Accessible, Production-grade Thai LLMs


API · New Release · Typhoon 2
Krisanapong Jirayoot
March 17, 2025

Table of Contents

  • Key Features
  • Pricing
  • Quickstart Guide
    • Try out Typhoon Models in Together’s playground
    • Get API Access
  • Using Typhoon API Pro with Official Together Libraries
  • OpenAI Compatibility
    • Using Together APIs with OpenAI Client Libraries
  • Join the Typhoon Community
  • Typhoon 1.5 API Pro Deprecation

We are pleased to announce the launch of Typhoon 2 API Pro in collaboration with Together AI. These production-grade APIs for the typhoon-v2-8b-instruct and typhoon-v2-70b-instruct models deliver improved performance, faster response times, and greater scalability, making it easier than ever to integrate powerful Thai LLM capabilities into your applications.

Key Features

  • Leverage Together AI’s serverless inference infrastructure to deliver fast responses while minimizing costs through per-token billing
  • Compatible with popular frameworks and easy to integrate with existing AI pipelines
  • Built for production-grade Enterprise AI applications, offering reliability, uptime, and scalability

Pricing

Typhoon 2 API Pro is available at the following prices:

  • Typhoon 2 8b Instruct: $0.18 per 1 million tokens
  • Typhoon 2 70b Instruct: $0.88 per 1 million tokens
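As a back-of-the-envelope illustration (a sketch based on the rates above, not an official billing calculator; actual billing is handled by Together AI), per-request cost scales linearly with total tokens:

```python
# Published per-million-token rates from this post (USD).
PRICE_PER_M = {
    "typhoon-v2-8b-instruct": 0.18,
    "typhoon-v2-70b-instruct": 0.88,
}

def estimate_cost(model: str, total_tokens: int) -> float:
    """Estimate USD cost for a request using total_tokens (input + output)."""
    return total_tokens / 1_000_000 * PRICE_PER_M[model]

# e.g. a 2,000-token exchange on the 70b model:
print(f"${estimate_cost('typhoon-v2-70b-instruct', 2_000):.5f}")  # $0.00176
```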

Quickstart Guide

Getting started with Typhoon 2 API Pro is easy! Our models are hosted on Together AI for seamless deployment. Follow these simple steps to start using Typhoon.

Try out Typhoon Models in Together’s playground

Register for a Together.ai account. New accounts get $1 free credit to help you get started. You can test the Typhoon models on the Together Playground.

Get API Access

  1. Set your API key as an environment variable named TOGETHER_API_KEY
export TOGETHER_API_KEY="YOUR_API_KEY"
  2. Make your first API call
curl -X POST "https://api.together.xyz/v1/chat/completions" \
     -H "Authorization: Bearer $TOGETHER_API_KEY" \
     -H "Content-Type: application/json" \
     -d '{
      "model": "scb10x/scb10x-llama3-1-typhoon-18370",
      "messages": [
          {"role": "user", "content": "ขอสูตรไก่ย่างหน่อย"}
      ]
     }'
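The call returns a chat-completions JSON body in the standard OpenAI-style schema. A minimal sketch of extracting the assistant's reply, shown here against a stubbed response body (the placeholder content is illustrative; a real call returns the same structure with the model's actual answer):

```python
import json

# Stubbed response in the standard chat-completions shape.
body = json.loads("""
{
  "choices": [
    {"message": {"role": "assistant", "content": "ได้เลยครับ นี่คือสูตรไก่ย่าง..."}}
  ]
}
""")

# The reply text lives at choices[0].message.content.
reply = body["choices"][0]["message"]["content"]
print(reply)
```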

Using Typhoon API Pro with Official Together Libraries

Together AI provides official libraries for Python and TypeScript. You can install them using pip install together for Python and npm install together-ai for TypeScript.

Send your first Typhoon 2 API Pro request and start generating a response:

Python

from together import Together

# Initialize client
client = Together()

# Stream response from Typhoon2-8b-instruct
stream = client.chat.completions.create(
  model="scb10x/scb10x-llama3-1-typhoon-18370",
  messages=[{"role": "user", "content": "ขอสูตรไก่ย่างหน่อย"}],
  stream=True,
)

# Print the streamed response
for chunk in stream:
  print(chunk.choices[0].delta.content or "", end="", flush=True)

TypeScript

import Together from "together-ai"

const together = new Together()

const stream = await together.chat.completions.create({
  model: "scb10x/scb10x-llama3-1-typhoon-18370",
  messages: [{ role: "user", content: "ขอสูตรไก่ย่างหน่อย" }],
  stream: true,
})

for await (const chunk of stream) {
  // use process.stdout.write instead of console.log to avoid newlines
  process.stdout.write(chunk.choices[0]?.delta?.content || "")
}

OpenAI Compatibility

Together API endpoints are fully compatible with the OpenAI API. If your application is already using OpenAI’s client libraries, you can configure it to point to Together API servers and start using Typhoon 2.

Using Together APIs with OpenAI Client Libraries

  1. Set the api_key to your Together API key. You can find your API key in your Settings page.
  2. Update the base_url to https://api.together.xyz/v1

Python

import os
import openai

client = openai.OpenAI(
  api_key=os.environ.get("TOGETHER_API_KEY"),
  base_url="https://api.together.xyz/v1",
)
# Use Typhoon 2 8b Instruct
response = client.chat.completions.create(
  model="scb10x/scb10x-llama3-1-typhoon-18370",
  messages=[
    {"role": "system", "content": "You are a helpful assistant named Typhoon created by SCB 10X. You always respond to the user in the language they use or request."},
    {"role": "user", "content": "ขอสูตรไก่ย่างหน่อย"},
  ]
)

print(response.choices[0].message.content)

TypeScript

import OpenAI from "openai"

const client = new OpenAI({
  apiKey: process.env.TOGETHER_API_KEY,
  baseURL: "https://api.together.xyz/v1",
})
// Use Typhoon 2 8B Instruct
const response = await client.chat.completions.create({
  model: "scb10x/scb10x-llama3-1-typhoon-18370",
  messages: [{ role: "user", content: "ขอสูตรไก่ย่างหน่อย" }],
})

console.log(response.choices[0].message.content)

Join the Typhoon Community

We’re excited to see how developers and businesses use Typhoon 2 API Pro to build innovative Thai AI solutions. Join our Discord community to share your experiences, ask questions, and stay updated on new releases.

Ready to build with Typhoon 2 API Pro? Get started today! 🚀

Read the full documentation at Together AI. For more information on Typhoon, please visit https://docs.opentyphoon.ai/.

For support on Together AI, please visit their Support page.

Typhoon 1.5 API Pro Deprecation

The following Typhoon 1.5 API Pro endpoints on Together AI are now deprecated and will be shut down in 30 days:

  1. scb10x/scb10x-llama3-typhoon-v1-5-8b-instruct
  2. scb10x/scb10x-llama3-typhoon-v1-5x-4f316

Please migrate to the new Typhoon 2 endpoints by 17 April 2025.
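In most cases migration is just swapping the model identifier in your existing calls. A minimal sketch of a lookup helper (the replacement id below is the Typhoon 2 endpoint used in this post's examples; which Typhoon 2 model best fits each deprecated endpoint is your choice, so treat this mapping as an assumption):

```python
# Map deprecated Typhoon 1.5 endpoints to a Typhoon 2 replacement.
# Replacement id taken from the examples in this post; adjust to the
# Typhoon 2 model that suits your workload.
MIGRATION = {
    "scb10x/scb10x-llama3-typhoon-v1-5-8b-instruct": "scb10x/scb10x-llama3-1-typhoon-18370",
    "scb10x/scb10x-llama3-typhoon-v1-5x-4f316": "scb10x/scb10x-llama3-1-typhoon-18370",
}

def migrate_model(model: str) -> str:
    """Return the replacement model id; non-deprecated ids pass through unchanged."""
    return MIGRATION.get(model, model)
```

You can then wrap your existing client calls, e.g. `client.chat.completions.create(model=migrate_model(model), ...)`, without touching the rest of the request.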

If you have any questions or concerns, please contact us at contact@opentyphoon.ai.


© 2025 SCB 10X Co., Ltd. All rights reserved.