I hope this message finds you well. First of all, I would like to express my appreciation for your contributions and hard work on Generative Language Models for Paragraph-Level Question Generation. Your efforts have greatly facilitated research and development in QG, and I am truly grateful for the tools and resources you have provided.
I am writing to inquire about the availability of a model based on mBART within the context of this project.
I found that you have released BART, T5, and mT5 models in various versions on Hugging Face. Could you kindly share your thoughts on whether there are plans to release an mBART-based model? If not, might there be any particular challenges or considerations that led to that decision?
Thank you very much for taking the time to read my inquiry. I look forward to your response and appreciate any insights you can provide.