Improve model card: Add `transformers` library tag and Chronos-2 paper link
#1 opened by nielsr (HF Staff)
This PR improves the model card for `amazon/chronos-bolt-mini` by making the following updates:
- Added `library_name: transformers` metadata: The `config.json` indicates the model is based on a T5 architecture and includes `transformers_version`, confirming its compatibility with the Hugging Face `transformers` library. This addition enables an automated, predefined code snippet on the Hub showcasing how to use the model.
- Added prominent link to the Chronos-2 paper: The model card now explicitly links to the associated paper, "Chronos-2: From Univariate to Universal Forecasting", providing immediate context and discoverability for researchers.
- Updated Citation section: The citation section has been expanded to include the BibTeX entry for the Chronos-2 paper, aligning with the comprehensive citation provided in the official GitHub repository.
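The metadata addition above lives in the model card's YAML front matter. A minimal sketch of the change (only `library_name` is the field added by this PR; any other fields in the real card are left untouched):

```yaml
---
# Model card front matter (sketch): the line below is the
# addition proposed in this PR; other existing fields are omitted.
library_name: transformers
---
```

With this field present, the Hub can render its standard `transformers` usage snippet on the model page.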
These updates enhance the model card's accuracy, utility, and adherence to documentation best practices.