  1. openai-community/gpt2 · Hugging Face

    A model pretrained on English text using a causal language modeling (CLM) objective. It was introduced in this paper and first released at this page. Disclaimer: The team releasing GPT-2 …

  2. GPT-2 - Wikipedia

    Like its predecessor GPT-1 and its successors GPT-3, GPT-4, and GPT-5, GPT-2 has a generative pre-trained transformer architecture, implementing a deep neural network, …

  3. GitHub - openai/gpt-2: Code for the paper "Language Models are ...

    Code and models from the paper "Language Models are Unsupervised Multitask Learners". You can read about GPT-2 and its staged release in our original blog post, 6-month follow-up post, …

  4. OpenAI GPT2 — TF Transformers documentation

    GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset [1] of 8 million web pages. GPT-2 is trained with a simple objective: predict the next … (a minimal sketch of this objective appears after the results list).

  5. LLM: Understanding GPT-2 — Generative Pre-Trained Transformer

    Mar 16, 2025 · Explore the architecture of the first GPT model, how it was built, and key elements that shaped the foundation of… In this article, we take a deep dive into GPT-2, explaining its …

  6. GPT-2: OpenAI’s Breakthrough in Generative AI - w3resource

    Jan 30, 2025 · GPT-2 transformed AI by proving the potential of large-scale unsupervised learning. Despite ethical challenges, its contributions to NLP—from creative writing to zero …

  7. GPT-2: 1.5B release - OpenAI

    Nov 5, 2019 · As the final model release of GPT‑2's staged release, we're releasing the largest version (1.5B parameters) of GPT‑2 along with code and model weights to facilitate detection …

  8. The Illustrated GPT-2 (Visualizing Transformer Language Models)

    Aug 12, 2019 · GPT-2 was, however, a very large transformer-based language model trained on a massive dataset. In this post, we'll look at the architecture that enabled the model to …

  9. Text Generation with GPT-2 Model - MachineLearningMastery.com

    May 15, 2025 · In this tutorial, you'll discover how to implement text generation using GPT-2. You'll learn through hands-on examples that you can run right away, and by the end of this … (see the generation sketch after this list).

  10. OpenAI GPT2 — transformers 3.0.2 documentation - Hugging Face

    GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset [1] of 8 million web pages. GPT-2 is trained with a simple objective: predict the next …
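
Results 4 and 10 describe GPT-2's training objective: causal language modeling, i.e. predicting the next token given all previous tokens. Below is a minimal sketch of that objective, assuming the Hugging Face `transformers` library and the public "gpt2" checkpoint listed in result 1; it scores a prompt under the CLM loss and reads off the model's most likely next token.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Public checkpoint from result 1 (openai-community/gpt2, aliased as "gpt2").
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

text = "GPT-2 is trained with a simple objective: predict the next"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # Passing labels=input_ids makes the model shift the targets internally
    # and return the average next-token cross-entropy loss (the CLM objective).
    outputs = model(**inputs, labels=inputs["input_ids"])

print(f"CLM loss: {outputs.loss.item():.3f}")

# The model's most likely continuation of the prompt:
next_id = outputs.logits[0, -1].argmax().item()
print("Predicted next token:", tokenizer.decode(next_id))
```

During pretraining the same loss is minimized over the 8 million web pages of WebText (the dataset cited as [1] in the snippets); at inference time, sampling from these next-token distributions is what produces generated text.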
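Result 9 walks through implementing text generation with GPT-2. Its code is not visible in the snippet, so the sketch below is a stand-in using the Hugging Face `pipeline` API rather than the tutorial's own method; the prompt and parameters are illustrative choices, not the tutorial's.

```python
from transformers import pipeline, set_seed

# Assumed setup; the tutorial may use a different API or checkpoint.
generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # make the sampling below reproducible

samples = generator(
    "Hello, I'm a language model,",
    max_new_tokens=30,       # cap the length of each continuation
    num_return_sequences=2,  # draw two independent samples
    do_sample=True,          # sample instead of greedy decoding
)
for sample in samples:
    print(sample["generated_text"])
```

Adjusting sampling parameters such as `temperature` or `top_k`, or switching to beam search, changes the trade-off between diversity and coherence, which is typically what such tutorials explore.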