Guide: Fine-tune GPT-2-XL (1.5 billion parameters) and GPT-Neo (2.7B) on a single GPU with Hugging Face Transformers using DeepSpeed
Updated Jun 14, 2023 · Python
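The guide above fits multi-billion-parameter models on a single GPU by using DeepSpeed's ZeRO optimizer sharding with CPU offload through the Hugging Face Transformers integration. A minimal sketch of the kind of DeepSpeed config involved (the specific stage, bucket sizes, and "auto" batch settings here are illustrative assumptions, not the guide's exact values):

```json
{
  "fp16": { "enabled": true },
  "zero_optimization": {
    "stage": 2,
    "offload_optimizer": { "device": "cpu" },
    "allgather_bucket_size": 2e8,
    "reduce_bucket_size": 2e8
  },
  "train_batch_size": "auto",
  "gradient_accumulation_steps": "auto"
}
```

With the Transformers integration, a config file like this is typically passed to the training script via the `--deepspeed ds_config.json` argument; the `"auto"` values are filled in from the Trainer's own arguments.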
📝 Amazon product description generator using GPT-Neo for Texta.ai
AI Text Generator: Friedrich Nietzsche