Long-context pretrained encoder-decoder models (16 January 2023)
Code to pre-train T5 (Text-to-Text Transfer Transformer) models on Japanese web texts (13 December 2021)
Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (19 October 2021)
mT5: A Massively Multilingual Pre-trained Text-to-Text Transformer (18 October 2021)