Improve model card: Add `transformers` library tag and Chronos-2 paper link

#1 opened by nielsr (HF Staff)

This PR improves the model card for amazon/chronos-bolt-mini by making the following updates:

  • Added `library_name: transformers` metadata: The `config.json` indicates the model is based on a T5 architecture and includes a `transformers_version` field, confirming its compatibility with the Hugging Face `transformers` library. This addition enables an automated, predefined code snippet on the Hub showing how to use the model.
  • Added prominent link to Chronos-2 paper: The model card now explicitly links to the associated paper, "Chronos-2: From Univariate to Universal Forecasting", providing immediate context and discoverability for researchers.
  • Updated Citation section: The citation section has been expanded to include the BibTeX entry for the Chronos-2 paper, aligning with the comprehensive citation provided in the official GitHub repository.
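The metadata change above amounts to a one-line addition to the YAML front matter at the top of README.md. A minimal sketch of what that front matter could look like (the neighboring fields are illustrative assumptions, not copied from the actual model card):

```yaml
---
# Added by this PR: tells the Hub the model is loadable with the
# transformers library, enabling the auto-generated usage snippet.
library_name: transformers
# Illustrative neighboring fields; the real card's values may differ.
pipeline_tag: time-series-forecasting
license: apache-2.0
---
```

The Hub reads `library_name` from this block to pick which "Use this model" snippet to render on the model page.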

These updates enhance the model card's accuracy, utility, and adherence to documentation best practices.

Cannot merge: this branch has merge conflicts in README.md.