
Releases: eyaltoledano/claude-task-master

[email protected]

18 Nov 23:39
fbb5ee4


Minor Changes

  • #1427 122c23a Thanks @Crunchyman-ralph! - Added Gemini 3 Pro Preview to the supported Task Master AI providers

    • Added to Google providers
    • Added to Gemini CLI providers
      • Attention: Gemini 3 Pro is available for:
        • Google AI Ultra Subscribers
        • Users who have access via a paid Gemini API key
          • If you want to use the Gemini API key, make sure it is defined in your .env or mcp.json env variables: GEMINI_API_KEY=xxxx
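
    For reference, a minimal mcp.json sketch (the server name, command, and args below are illustrative; only the env entry is the point):

    ```json
    {
      "mcpServers": {
        "task-master-ai": {
          "command": "npx",
          "args": ["-y", "task-master-ai"],
          "env": {
            "GEMINI_API_KEY": "your-gemini-api-key-here"
          }
        }
      }
    }
    ```
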
  • #1398 e59c16c Thanks @Crunchyman-ralph! - Claude Code provider now respects your global, project, and local Claude Code configuration files.

    When using the Claude Code AI provider, Task Master now automatically loads your Claude Code settings from:

    • Global config (~/.claude/ directory) - Your personal preferences across all projects
    • Project config (.claude/ directory) - Project-specific settings like CLAUDE.md instructions
    • Local config - Workspace-specific overrides

    This means your CLAUDE.md files, custom instructions, and Claude Code settings will now be properly applied when Task Master uses Claude Code as an AI provider. Previously, these settings were being ignored.

    What's improved:

    • ✅ CLAUDE.md files are now automatically loaded and applied (global and local)
    • ✅ Your custom Claude Code settings are respected
    • ✅ Project-specific instructions work as expected
    • ✅ No manual configuration needed - works out of the box


Patch Changes

  • #1400 c62cf84 Thanks @Crunchyman-ralph! - Fix subtasks not showing their parent task when displayed in the CLI (e.g. tm show 10)

  • #1393 da8ed6a Thanks @bjcoombs! - Fix completion percentage and dependency resolution to treat cancelled tasks as complete. Cancelled tasks now correctly count toward project completion (e.g., 14 done + 1 cancelled = 100%, not 93%) and satisfy dependencies for dependent tasks, preventing permanent blocks.
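
    The new accounting can be sketched as follows (a minimal illustration, not the actual Task Master source; the type and function names are hypothetical):

    ```typescript
    // Completion percentage treating both "done" and "cancelled" as complete.
    type Status = "pending" | "in-progress" | "done" | "cancelled";

    function completionPercent(statuses: Status[]): number {
      if (statuses.length === 0) return 0;
      const complete = statuses.filter(
        (s) => s === "done" || s === "cancelled"
      ).length;
      return Math.round((complete / statuses.length) * 100);
    }

    // 14 done + 1 cancelled out of 15 tasks now yields 100, not 93.
    ```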

  • #1407 0003b6f Thanks @Crunchyman-ralph! - Fix complexity analysis prompt to ensure consistent JSON output format

  • #1351 37aee78 Thanks @bjcoombs! - fix: prioritize .taskmaster in parent directories over other project markers

    When running task-master commands from subdirectories containing other project markers (like .git, go.mod, package.json), findProjectRoot() now correctly finds and uses .taskmaster directories in parent folders instead of stopping at the first generic project marker found.

    This enables multi-repo monorepo setups where a single .taskmaster at the root tracks work across multiple sub-repositories.
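
    The lookup order can be sketched as follows (a simplified model of the behavior described above, not the real implementation; the Dir shape and names are hypothetical, and the real code walks the filesystem):

    ```typescript
    // Prefer the nearest ancestor containing .taskmaster over ancestors
    // that only contain generic project markers (.git, go.mod, package.json).
    interface Dir {
      path: string;
      markers: string[]; // directory entries present at this level
    }

    const GENERIC_MARKERS = [".git", "go.mod", "package.json"];

    // `ancestors` is ordered from the current directory up to the root.
    function findProjectRoot(ancestors: Dir[]): string | null {
      const withTaskmaster = ancestors.find((d) =>
        d.markers.includes(".taskmaster")
      );
      if (withTaskmaster) return withTaskmaster.path;
      const withGeneric = ancestors.find((d) =>
        d.markers.some((m) => GENERIC_MARKERS.includes(m))
      );
      return withGeneric ? withGeneric.path : null;
    }
    ```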

  • #1406 9079d04 Thanks @Crunchyman-ralph! - Fix MCP server compatibility with Cursor IDE's latest update by upgrading to fastmcp v3.20.1 with Zod v4 support

    • This resolves connection failures where the MCP server was unable to establish proper capability negotiation.
    • The issue typically included wording like: "Server does not support completions"
  • #1382 ac4328a Thanks @JJVvV! - Added opt-in proxy support for all AI providers - respects http_proxy/https_proxy environment variables when enabled.

    When using Task Master in corporate or restricted network environments that require HTTP/HTTPS proxies, API calls to AI providers (OpenAI, Anthropic, Google, AWS Bedrock, etc.) would previously fail with ECONNRESET errors. This update adds seamless proxy support that can be enabled via environment variable or configuration file.

    How to enable:

    Proxy support is opt-in. Enable it using either method:

    Method 1: Environment Variable

    export TASKMASTER_ENABLE_PROXY=true
    export http_proxy=http://your-proxy:port
    export https_proxy=http://your-proxy:port
    export no_proxy=localhost,127.0.0.1  # Optional: bypass proxy for specific hosts
    
    # Then use Task Master normally
    task-master add-task "Create a new feature"

    Method 2: Configuration File

    Add to .taskmaster/config.json:

    {
      "global": {
        "enableProxy": true
      }
    }

    Then set your proxy environment variables:

    export http_proxy=http://your-proxy:port
    export https_proxy=http://your-proxy:port

    Technical details:

    • Uses undici's EnvHttpProxyAgent for automatic proxy detection
    • Centralized implementation in BaseAIProvider for consistency across all providers
    • Supports all AI providers: OpenAI, Anthropic, Perplexity, Azure OpenAI, Google AI, Google Vertex AI, AWS Bedrock, and OpenAI-compatible providers
    • Opt-in design ensures users without proxy requirements are not affected
    • Priority: TASKMASTER_ENABLE_PROXY environment variable > config.json setting
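
    The documented precedence can be sketched as follows (illustrative only, assuming the stated behavior; the real BaseAIProvider code may differ):

    ```typescript
    // TASKMASTER_ENABLE_PROXY env var takes priority over config.json's
    // global.enableProxy; with neither set, proxying stays off (opt-in).
    function isProxyEnabled(
      env: Record<string, string | undefined>,
      config: { global?: { enableProxy?: boolean } }
    ): boolean {
      const envValue = env.TASKMASTER_ENABLE_PROXY;
      if (envValue !== undefined) return envValue === "true";
      return config.global?.enableProxy ?? false;
    }
    ```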
  • #1408 10ec025 Thanks @Crunchyman-ralph! - Add --json back to task-master list and task-master show for use with AI agents (less context)

[email protected]

17 Nov 15:04
838f0a2


Patch Changes

  • #1421 e75946b Thanks @Crunchyman-ralph! - Upgrade the fastmcp dependency to resolve "Server does not support completions (required for completion/complete)"
    • This fixes errors where MCP clients (like Cursor) failed to connect to the Task Master MCP server.

[email protected]

16 Nov 21:22
31965ab


Patch Changes

  • #1396 9883e83 Thanks @bjcoombs! - Fix box title alignment by adding emoji variant selector to warning sign

[email protected]

14 Nov 20:21
a522287


[email protected] Pre-release

Minor Changes

  • #1398 e59c16c Thanks @Crunchyman-ralph! - Claude Code provider now respects your global, project, and local Claude Code configuration files.

    When using the Claude Code AI provider, Task Master now automatically loads your Claude Code settings from:

    • Global config (~/.claude/ directory) - Your personal preferences across all projects
    • Project config (.claude/ directory) - Project-specific settings like CLAUDE.md instructions
    • Local config - Workspace-specific overrides

    This means your CLAUDE.md files, custom instructions, and Claude Code settings will now be properly applied when Task Master uses Claude Code as an AI provider. Previously, these settings were being ignored.

    What's improved:

    • ✅ CLAUDE.md files are now automatically loaded and applied (global and local)
    • ✅ Your custom Claude Code settings are respected
    • ✅ Project-specific instructions work as expected
    • ✅ No manual configuration needed - works out of the box


Patch Changes

  • #1400 c62cf84 Thanks @Crunchyman-ralph! - Fix subtasks not showing their parent task when displayed in the CLI (e.g. tm show 10)

  • #1393 da8ed6a Thanks @bjcoombs! - Fix completion percentage and dependency resolution to treat cancelled tasks as complete. Cancelled tasks now correctly count toward project completion (e.g., 14 done + 1 cancelled = 100%, not 93%) and satisfy dependencies for dependent tasks, preventing permanent blocks.

  • #1407 0003b6f Thanks @Crunchyman-ralph! - Fix complexity analysis prompt to ensure consistent JSON output format

  • #1351 37aee78 Thanks @bjcoombs! - fix: prioritize .taskmaster in parent directories over other project markers

    When running task-master commands from subdirectories containing other project markers (like .git, go.mod, package.json), findProjectRoot() now correctly finds and uses .taskmaster directories in parent folders instead of stopping at the first generic project marker found.

    This enables multi-repo monorepo setups where a single .taskmaster at the root tracks work across multiple sub-repositories.

  • #1406 9079d04 Thanks @Crunchyman-ralph! - Fix MCP server compatibility with Cursor IDE's latest update by upgrading to fastmcp v3.20.1 with Zod v4 support

    • This resolves connection failures where the MCP server was unable to establish proper capability negotiation.
    • The issue typically included wording like: "Server does not support completions"
  • #1382 ac4328a Thanks @JJVvV! - Added opt-in proxy support for all AI providers - respects http_proxy/https_proxy environment variables when enabled.

    When using Task Master in corporate or restricted network environments that require HTTP/HTTPS proxies, API calls to AI providers (OpenAI, Anthropic, Google, AWS Bedrock, etc.) would previously fail with ECONNRESET errors. This update adds seamless proxy support that can be enabled via environment variable or configuration file.

    How to enable:

    Proxy support is opt-in. Enable it using either method:

    Method 1: Environment Variable

    export TASKMASTER_ENABLE_PROXY=true
    export http_proxy=http://your-proxy:port
    export https_proxy=http://your-proxy:port
    export no_proxy=localhost,127.0.0.1  # Optional: bypass proxy for specific hosts
    
    # Then use Task Master normally
    task-master add-task "Create a new feature"

    Method 2: Configuration File

    Add to .taskmaster/config.json:

    {
      "global": {
        "enableProxy": true
      }
    }

    Then set your proxy environment variables:

    export http_proxy=http://your-proxy:port
    export https_proxy=http://your-proxy:port

    Technical details:

    • Uses undici's EnvHttpProxyAgent for automatic proxy detection
    • Centralized implementation in BaseAIProvider for consistency across all providers
    • Supports all AI providers: OpenAI, Anthropic, Perplexity, Azure OpenAI, Google AI, Google Vertex AI, AWS Bedrock, and OpenAI-compatible providers
    • Opt-in design ensures users without proxy requirements are not affected
    • Priority: TASKMASTER_ENABLE_PROXY environment variable > config.json setting
  • #1408 10ec025 Thanks @Crunchyman-ralph! - Add --json back to task-master list and task-master show for use with AI agents (less context)

[email protected]

04 Nov 10:28
91e76b1


Patch Changes

  • #1377 3c22875 Thanks @Crunchyman-ralph! - Fix parse-prd schema to accept responses from models that omit optional fields (like Z.ai/GLM). Changed metadata field to use union pattern with .default(null) for better structured outputs compatibility.
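
    The effect of the union-with-default pattern can be sketched in plain TypeScript (hypothetical names; the actual schema uses Zod):

    ```typescript
    // Models like Z.ai/GLM may omit optional fields entirely; the fix
    // normalizes a missing `metadata` into an explicit null instead of
    // failing schema validation.
    interface ParsedTask {
      title: string;
      metadata: Record<string, unknown> | null;
    }

    function normalizeTask(raw: {
      title: string;
      metadata?: Record<string, unknown> | null;
    }): ParsedTask {
      return { title: raw.title, metadata: raw.metadata ?? null };
    }
    ```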

  • #1377 3c22875 Thanks @Crunchyman-ralph! - Fix ai response not showing price after its json was repaired

  • #1377 3c22875 Thanks @Crunchyman-ralph! - Enable structured outputs for Z.ai providers. Added supportsStructuredOutputs: true to use json_schema mode for more reliable JSON generation in operations like parse-prd.

[email protected]

01 Nov 19:09
8905cae


Patch Changes

  • #1370 9c3b273 Thanks @Crunchyman-ralph! - Add support for ZAI (GLM) Coding Plan subscription endpoint as a separate provider. Users can now select between two ZAI providers:

    • zai: Standard ZAI endpoint (https://api.z.ai/api/paas/v4/)
    • zai-coding: Coding Plan endpoint (https://api.z.ai/api/coding/paas/v4/)

    Both providers use the same model IDs (glm-4.6, glm-4.5) but route to different API endpoints based on your subscription. When running tm models --setup, you'll see both providers listed separately:

    • zai / glm-4.6 - Standard endpoint
    • zai-coding / glm-4.6 - Coding Plan endpoint
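
    The routing can be sketched as follows (endpoint URLs are taken from the notes above; the function shape is illustrative, not the actual source):

    ```typescript
    // Same model IDs, different base URL depending on the selected provider.
    const ZAI_BASE_URLS: Record<string, string> = {
      "zai": "https://api.z.ai/api/paas/v4/",
      "zai-coding": "https://api.z.ai/api/coding/paas/v4/",
    };

    function zaiBaseUrl(provider: "zai" | "zai-coding"): string {
      return ZAI_BASE_URLS[provider];
    }
    ```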
  • #1371 abf46b8 Thanks @Crunchyman-ralph! - Improved auto-update experience:

    • Updates now happen before your CLI command runs, and Task Master automatically restarts to execute your command with the new version.
    • No more manual restarts needed!

[email protected]

01 Nov 10:12
47c5b1e


Minor Changes

  • #1360 819d5e1 Thanks @Crunchyman-ralph! - Add support for custom OpenAI-compatible providers, allowing you to connect Task Master to any service that implements the OpenAI API specification

    How to use:

    Configure your custom provider with the models command:

    task-master models --set-main <your-model-id> --openai-compatible --baseURL <your-api-endpoint>

    Example:

    task-master models --set-main llama-3-70b --openai-compatible --baseURL http://localhost:8000/v1
    # Or for an interactive view
    task-master models --setup

    Set your API key (if required by your provider) in mcp.json, your .env file, or in your env exports:

    OPENAI_COMPATIBLE_API_KEY="your-key-here"

    This gives you the flexibility to use virtually any LLM service with Task Master, whether it's self-hosted, a specialized provider, or a custom inference server.

  • #1360 819d5e1 Thanks @Crunchyman-ralph! - Add native support for Z.ai (GLM models), giving you access to high-performance Chinese models including glm-4.6 with massive 200K+ token context windows at competitive pricing

    How to use:

    1. Get your Z.ai API key from https://z.ai/manage-apikey/apikey-list

    2. Set your API key in .env, mcp.json or in env exports:

      ZAI_API_KEY="your-key-here"
    3. Configure Task Master to use GLM models:

      task-master models --set-main glm-4.6
      # Or for an interactive view
      task-master models --setup

    Available models:

    • glm-4.6 - Latest model with 200K+ context, excellent for complex projects
    • glm-4.5 - Previous generation, still highly capable
    • Additional GLM variants for different use cases: glm-4.5-air, glm-4.5v

    GLM models offer strong performance on software engineering tasks, with particularly good results on code generation and technical reasoning. The large context window makes them ideal for analyzing entire codebases or working with extensive documentation.

  • #1360 819d5e1 Thanks @Crunchyman-ralph! - Add LM Studio integration, enabling you to run Task Master completely offline with local models at zero API cost.

    How to use:

    1. Download and install LM Studio

    2. Launch LM Studio and download a model (e.g., Llama 3.2, Mistral, Qwen)

    3. Optional: Add an API key to mcp.json or .env (LMSTUDIO_API_KEY)

    4. Go to the "Local Server" tab and click "Start Server"

    5. Configure Task Master:

      task-master models --set-main <model-name> --lmstudio

      Example:

      task-master models --set-main llama-3.2-3b --lmstudio

Patch Changes

  • #1362 3e70edf Thanks @Crunchyman-ralph! - Improve the parse-PRD schema for better LLM model compatibility

  • #1358 0c639bd Thanks @Crunchyman-ralph! - Fix subtask ID display to show full compound notation

    When displaying a subtask via tm show 104.1, the header and properties table showed only the subtask's local ID (e.g., "1") instead of the full compound ID (e.g., "104.1"). The CLI now preserves and displays the original requested task ID throughout the display chain, ensuring subtasks are clearly identified with their parent context. Also improved TypeScript typing by using discriminated unions for Task/Subtask returns from tasks.get(), eliminating unsafe type coercions.
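
    The discriminated-union idea can be sketched as follows (assumed shapes, not the actual Task Master types):

    ```typescript
    // Narrowing on `kind` lets the display code build the full compound ID
    // ("104.1") for subtasks without unsafe type coercion.
    interface Task {
      kind: "task";
      id: string; // e.g. "104"
      title: string;
    }

    interface Subtask {
      kind: "subtask";
      parentId: string; // e.g. "104"
      localId: number;  // e.g. 1
      title: string;
    }

    type TaskResult = Task | Subtask;

    function fullId(result: TaskResult): string {
      return result.kind === "subtask"
        ? `${result.parentId}.${result.localId}`
        : result.id;
    }
    ```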

  • #1339 3b09b5d Thanks @Crunchyman-ralph! - Fixed the MCP server sometimes crashing at the commit step of autopilot

    • Autopilot now persists state consistently through the whole flow
  • #1326 9d5812b Thanks @SharifMrCreed! - Improve Gemini CLI integration

    When initializing Task Master with the gemini profile, you now get properly configured context files tailored specifically for Gemini CLI, including MCP configuration and Gemini-specific features like file references, session management, and headless mode.

[email protected]

31 Oct 22:49


[email protected] Pre-release

Minor Changes

  • #1360 819d5e1 Thanks @Crunchyman-ralph! - Add support for custom OpenAI-compatible providers, allowing you to connect Task Master to any service that implements the OpenAI API specification

    How to use:

    Configure your custom provider with the models command:

    task-master models --set-main <your-model-id> --openai-compatible --baseURL <your-api-endpoint>

    Example:

    task-master models --set-main llama-3-70b --openai-compatible --baseURL http://localhost:8000/v1
    # Or for an interactive view
    task-master models --setup

    Set your API key (if required by your provider) in mcp.json, your .env file, or in your env exports:

    OPENAI_COMPATIBLE_API_KEY="your-key-here"

    This gives you the flexibility to use virtually any LLM service with Task Master, whether it's self-hosted, a specialized provider, or a custom inference server.

  • #1360 819d5e1 Thanks @Crunchyman-ralph! - Add native support for Z.ai (GLM models), giving you access to high-performance Chinese models including glm-4.6 with massive 200K+ token context windows at competitive pricing

    How to use:

    1. Get your Z.ai API key from https://z.ai/manage-apikey/apikey-list

    2. Set your API key in .env, mcp.json or in env exports:

      ZAI_API_KEY="your-key-here"
    3. Configure Task Master to use GLM models:

      task-master models --set-main glm-4.6
      # Or for an interactive view
      task-master models --setup

    Available models:

    • glm-4.6 - Latest model with 200K+ context, excellent for complex projects
    • glm-4.5 - Previous generation, still highly capable
    • Additional GLM variants for different use cases: glm-4.5-air, glm-4.5v

    GLM models offer strong performance on software engineering tasks, with particularly good results on code generation and technical reasoning. The large context window makes them ideal for analyzing entire codebases or working with extensive documentation.

  • #1360 819d5e1 Thanks @Crunchyman-ralph! - Add LM Studio integration, enabling you to run Task Master completely offline with local models at zero API cost.

    How to use:

    1. Download and install LM Studio

    2. Launch LM Studio and download a model (e.g., Llama 3.2, Mistral, Qwen)

    3. Optional: Add an API key to mcp.json or .env (LMSTUDIO_API_KEY)

    4. Go to the "Local Server" tab and click "Start Server"

    5. Configure Task Master:

      task-master models --set-main <model-name> --lmstudio

      Example:

      task-master models --set-main llama-3.2-3b --lmstudio

Patch Changes

  • #1362 3e70edf Thanks @Crunchyman-ralph! - Improve the parse-PRD schema for better LLM model compatibility

  • #1358 0c639bd Thanks @Crunchyman-ralph! - Fix subtask ID display to show full compound notation

    When displaying a subtask via tm show 104.1, the header and properties table showed only the subtask's local ID (e.g., "1") instead of the full compound ID (e.g., "104.1"). The CLI now preserves and displays the original requested task ID throughout the display chain, ensuring subtasks are clearly identified with their parent context. Also improved TypeScript typing by using discriminated unions for Task/Subtask returns from tasks.get(), eliminating unsafe type coercions.

  • #1339 3b09b5d Thanks @Crunchyman-ralph! - Fixed the MCP server sometimes crashing at the commit step of autopilot

    • Autopilot now persists state consistently through the whole flow
  • #1326 9d5812b Thanks @SharifMrCreed! - Improve Gemini CLI integration

    When initializing Task Master with the gemini profile, you now get properly configured context files tailored specifically for Gemini CLI, including MCP configuration and Gemini-specific features like file references, session management, and headless mode.

[email protected]

28 Oct 16:24
72b01f0


Patch Changes

[email protected]

27 Oct 17:49
00d600a


Patch Changes

  • #1305 a98d96e Thanks @bjcoombs! - Fix warning message box width to match dashboard box width for consistent UI alignment

  • #1346 25addf9 Thanks @Crunchyman-ralph! - Remove the file and complexity-report parameters from the get-tasks and get-task MCP tools

    • In an effort to reduce complexity and context bloat for AI coding agents, we simplified the parameters of these tools