As the final model release of GPT-2’s staged release, we’re releasing the largest version (1.5B parameters) of GPT-2 along with code and model weights to facilitate detection of outputs of GPT-2 models.
Today, I am releasing the first edition of Bootstrapping Machine Learning.
Prediction APIs are making machine learning accessible to everyone, and this book is the first to teach how to use them.
We’re releasing the 774 million parameter GPT-2 language model after the release of our small 124M model in February, the staged release of our medium 355M model in May, and subsequent research with partners and the AI community into the model’s potential for misuse and societal benefit.
Together with Rosetta Stone, we’re releasing our first Language of Love Report, just in time for Valentine’s Day.

Contrary to what many of us predicted, the world did not implode in 2017. There’s always this year for a worldwide shutdown, but despite everything that’s going on, we’re feeling pretty hopeful that 2018 may be the year of unity rather than division.