Hi, I see that model parallelism and tensor parallelism are currently supported. Since Mixtral 8x7B (a mixture-of-experts model) is listed as supported, I'm wondering whether expert parallelism is supported as well, or whether there are plans to support it in the future? (See the sketch below for what I mean by expert parallelism.)
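
To clarify the distinction I have in mind: under tensor parallelism every rank holds a shard of every expert's weights, whereas under expert parallelism each rank owns whole experts and tokens are routed to the owning rank. Here is a minimal single-process sketch of the latter, not based on this project's code; `NUM_EXPERTS`, `NUM_RANKS`, and `experts_on_rank` are hypothetical names, and routing is top-1 for simplicity (Mixtral itself routes top-2):

```python
import torch
import torch.nn as nn

NUM_EXPERTS = 8   # e.g. Mixtral 8x7B has 8 experts per MoE layer
NUM_RANKS = 4     # ranks participating in expert parallelism
HIDDEN = 16

# Each rank owns a disjoint slice of the experts (expert parallelism),
# instead of every rank holding a shard of every expert (tensor parallelism).
experts = [nn.Linear(HIDDEN, HIDDEN) for _ in range(NUM_EXPERTS)]
experts_on_rank = {
    r: list(range(r * NUM_EXPERTS // NUM_RANKS, (r + 1) * NUM_EXPERTS // NUM_RANKS))
    for r in range(NUM_RANKS)
}

router = nn.Linear(HIDDEN, NUM_EXPERTS)
tokens = torch.randn(10, HIDDEN)

# Top-1 routing: each token picks the expert with the highest router logit.
expert_ids = router(tokens).argmax(dim=-1)

# In a real multi-device setup, an all-to-all collective would move each
# token to the rank owning its expert; here we just group tokens per rank.
for rank, owned in experts_on_rank.items():
    mask = torch.isin(expert_ids, torch.tensor(owned))
    print(f"rank {rank} (experts {owned}) processes {int(mask.sum())} tokens")
```

The appeal for MoE models is that expert weights never need to be replicated or sharded column/row-wise; only token activations cross ranks, which is why I'm curious whether this scheme is on the roadmap.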