
Conversation

@ParamThakkar123
Contributor

No description provided.

"Safetensors"
],
"huggingface_repo": "mistralai/Ministral-3-14B-Instruct-2512",
"transformers_version": "5.0.0.dev0",
Member


Does this work with the version of transformers we have?

Contributor Author


I tried it with the vLLM server plugin and it worked; the FastChat server doesn't support it yet.
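The reviewer's question above (whether the installed transformers version satisfies the `"transformers_version": "5.0.0.dev0"` requirement in the config excerpt) could be checked programmatically. A minimal sketch using the `packaging` library; the installed-version value below is hypothetical:

```python
from packaging.version import parse

def is_compatible(installed: str, required: str) -> bool:
    # Dev releases (e.g. 5.0.0.dev0) compare lower than the
    # corresponding final release, so 5.0.0 satisfies 5.0.0.dev0.
    return parse(installed) >= parse(required)

required = "5.0.0.dev0"   # from the model's config entry above
installed = "4.46.0"      # hypothetical currently installed version

print(is_compatible(installed, required))  # → False
```

Note that `5.0.0.dev0` is a pre-release under PEP 440 ordering, so any final `5.0.0` install would also pass this check.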

@deep1401 deep1401 marked this pull request as draft January 16, 2026 14:57
@deep1401
Member

Marking this as draft since we're blocking this until we upgrade transformers.

