Peter Albert · Guide: Finetune GPT-NEO (2.7 Billion Parameters) on one GPU · GPT-NEO is a series of language models from EleutherAI that tries to replicate OpenAI's GPT-3 language model. EleutherAI's current models… · Apr 10, 2021
In TDS Archive by Moshe Wasserblat · Sentence Transformer Fine-Tuning (SetFit): Outperforms GPT-3 on few-shot Text-Classification while… · The GPT-n series shows very promising results for few-shot NLP classification tasks and keeps improving as model size increases… · Dec 14, 2021
In TDS Archive by Dhrumil Patel · Install Hugging Face Transformers on Apple M1 · Along with the Tensorflow and Tokenizers packages · Oct 27, 2021
In Analytics Vidhya by Eduardo Muñoz · Create a Tokenizer and Train a Huggingface RoBERTa model from scratch · Part 1: A product names generator using an Encoder Decoder Transformer · Aug 16, 2021
In PyTorch Lightning Developer Blog by PyTorch Lightning team · Training Transformers at Scale With PyTorch Lightning · Introducing Lightning Transformers, a new library that seamlessly integrates PyTorch Lightning, HuggingFace Transformers and Hydra · Apr 21, 2021