SeaLink

LangChain integration

Point LangChain's ChatOpenAI to SeaLink. Works with Python and JS/TS LangChain.

One-click LangChain config

The downloaded config has the base URL pre-filled. Once you sign in, your real API key is inlined automatically.

Download LangChain config

Steps

  1. Install langchain-openai (Python: pip install langchain-openai; Node: npm i @langchain/openai).
  2. Create a ChatOpenAI instance with SeaLink's base_url and your API key.
  3. Use any SeaLink model ID as the model name. Invoke, stream, or batch as usual.
  4. For tool-calling agents, set model_kwargs or bind tools — SeaLink passes through OpenAI-compatible tool definitions.

Configuration

Python
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage
llm = ChatOpenAI(
    base_url="https://api.sealink.asia/v1",
    api_key="<your-sealink-key>",
    model="claude-sonnet-4-6",
    temperature=0.7,
)
resp = llm.invoke([HumanMessage(content="Explain SeaLink in one sentence.")])
print(resp.content)
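
Streaming follows the same pattern. A minimal sketch, assuming the same endpoint and placeholder key as the example above (requires langchain-openai and a valid SeaLink key to run):

```python
from langchain_openai import ChatOpenAI

# Same SeaLink endpoint and placeholder key as the example above.
llm = ChatOpenAI(
    base_url="https://api.sealink.asia/v1",
    api_key="<your-sealink-key>",
    model="claude-sonnet-4-6",
)

# .stream() yields message chunks as tokens arrive from the server.
for chunk in llm.stream("Explain SeaLink in one sentence."):
    print(chunk.content, end="", flush=True)
print()
```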

SeaLink is compatible with any LangChain feature that uses OpenAI-compatible chat models. For streaming, pass streaming=True or call stream(). For tool calling, use bind_tools() or with_structured_output(). See /docs/compatibility for model-by-model capability support.
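
As a sketch of the tool-calling path, here is a hedged example using bind_tools() with a hypothetical get_fx_rate tool (the tool body is a stub; swap in a real data source, your key, and any SeaLink model that supports tools per /docs/compatibility):

```python
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool

@tool
def get_fx_rate(base: str, quote: str) -> str:
    """Return the FX rate for a currency pair."""
    # Hypothetical stub for illustration — replace with a real data source.
    return "1 SGD = 0.74 USD"

llm = ChatOpenAI(
    base_url="https://api.sealink.asia/v1",
    api_key="<your-sealink-key>",
    model="claude-sonnet-4-6",
)

# bind_tools() attaches OpenAI-compatible tool definitions to every request;
# SeaLink forwards them to the underlying model unchanged.
llm_with_tools = llm.bind_tools([get_fx_rate])

# When the model decides to call a tool, the response carries tool_calls
# (name + parsed arguments) instead of plain text.
msg = llm_with_tools.invoke("What is the SGD/USD rate right now?")
print(msg.tool_calls)
```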