
Add Claude Opus 4.6 (1M context) for GitHub Copilot #897

Open
zhzy0077 wants to merge 1 commit into anomalyco:dev from zhzy0077:fix/opus1m

Conversation

@zhzy0077 (Contributor)

Seems to be available soon: github/copilot-cli#1395

@nkhoit commented Feb 14, 2026

For reference, I have access to this model, and the metadata API response for claude-opus-4.6-1m from the GitHub Copilot CLI looks like this:

{
  "billing": {
    "is_premium": true,
    "multiplier": 6,
    "restricted_to": [
      "pro",
      "edu",
      "pro_plus",
      "business",
      "enterprise"
    ]
  },
  "capabilities": {
    "family": "claude-opus-4.6-1m",
    "limits": {
      "max_context_window_tokens": 1000000,
      "max_non_streaming_output_tokens": 16000,
      "max_output_tokens": 64000,
      "max_prompt_tokens": 936000,
      "vision": {
        "max_prompt_image_size": 3145728,
        "max_prompt_images": 1,
        "supported_media_types": [
          "image/jpeg",
          "image/png",
          "image/webp"
        ]
      }
    },
    "object": "model_capabilities",
    "supports": {
      "adaptive_thinking": true,
      "max_thinking_budget": 32000,
      "min_thinking_budget": 1024,
      "parallel_tool_calls": true,
      "streaming": true,
      "structured_outputs": true,
      "tool_calls": true,
      "vision": true
    },
    "tokenizer": "o200k_base",
    "type": "chat"
  },
  "id": "claude-opus-4.6-1m",
  "is_chat_default": false,
  "is_chat_fallback": false,
  "model_picker_category": "powerful",
  "model_picker_enabled": true,
  "name": "Claude Opus 4.6 (1M context)",
  "object": "model",
  "policy": {
    "state": "enabled",
    "terms": "Enable access to the latest Claude Opus 4.6 (1M context) model from Anthropic. [Learn more about how GitHub Copilot serves Claude Opus 4.6 (1M context)](https://gh.io/copilot-claude-opus)."
  },
  "preview": false,

Comment on lines +17 to +18
context = 936_000
output = 64_000

I think from the API response (#897 (comment)) it should be:

context = 1_000_000
input = 936_000
output = 64_000
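
Quick sanity check on those numbers (plain Python, nothing repo-specific): the input budget is the full window minus the output budget, which matches the max_prompt_tokens value in the metadata above.

# Values taken from the claude-opus-4.6-1m metadata above.
context = 1_000_000      # max_context_window_tokens
output = 64_000          # max_output_tokens
input_budget = 936_000   # max_prompt_tokens

# max_prompt_tokens == max_context_window_tokens - max_output_tokens
assert input_budget == context - output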

