
Inquiry about Model Size and Plans for Open Sourcing Larger Models #3

Open
209ye opened this issue Mar 5, 2024 · 4 comments

@209ye

209ye commented Mar 5, 2024

Hello, I noticed that the model in Modelscope is m2_encoder_0.2B.ckpt. Is this the 0.4B parameter model mentioned in the paper? Will there be larger models open sourced in the future?

@LandyGuo
Collaborator

LandyGuo commented Mar 6, 2024

1. Indeed, the file `m2_encoder_0.2B.ckpt` corresponds to the model with 0.4 billion parameters referenced in our publication. Apologies for the discrepancy in naming.
2. Certainly, we are committed to making our research as accessible as possible. In line with this commitment, we plan to open source the 1 billion and 10 billion parameter models within the upcoming months.

@LandyGuo
Collaborator

LandyGuo commented Mar 7, 2024

Fixed the naming issue in this commit: b38e199, and updated the naming on ModelScope: https://www.modelscope.cn/models/M2Cognition/M2-Encoder/files
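
For anyone who wants to pull the renamed checkpoint locally, here is a minimal sketch using ModelScope's `snapshot_download` API. The model ID comes from the link above; having the `modelscope` package installed (`pip install modelscope`) is an assumption, and the exact checkpoint filename on the hub may differ from what is shown in this thread.

```python
# Minimal sketch: download the M2-Encoder model files from ModelScope.
# The model ID follows the repository link above; the local cache path
# is managed by ModelScope and returned by snapshot_download.
from modelscope.hub.snapshot_download import snapshot_download

model_dir = snapshot_download("M2Cognition/M2-Encoder")
print(f"Model files downloaded to: {model_dir}")
```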

@209ye
Author

209ye commented Mar 7, 2024

Thanks for the detailed response and for sharing your future plans. I'm excited to hear about your team's intention to open source models at the 1 billion and 10 billion parameter scales. I appreciate your contributions in open sourcing valuable AI resources.

@LandyGuo
Collaborator

We have released the 1B and 10B models in this PR: #14
