Hello, I noticed that the model in Modelscope is m2_encoder_0.2B.ckpt. Is this the 0.4B parameter model mentioned in the paper? Will there be larger models open sourced in the future?
Indeed, the file 'm2_encoder_0.2B.ckpt' corresponds to the model with 0.4 billion parameters referenced in our publication. Apologies for the discrepancy in naming.
Certainly, we are committed to making our research as accessible as possible. In line with this commitment, we plan to open source the 1 billion and 10 billion parameter models within the upcoming months.
Thanks for the detailed response and for sharing your future plans. I'm excited to hear about your team's intention to open source models at the 1 billion and 10 billion parameter scales. I appreciate your contributions in open sourcing valuable AI resources.