A template for fine-tuning your own GPT-2 model.

GPT-3 has dominated the NLP news cycle recently with its borderline-magical performance in text generation, but for everyone without $1,000,000,000 of Azure compute credits, there are still plenty of ways to experiment with language models on your own. Hugging Face is…
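The excerpt breaks off at Hugging Face, but as a rough illustration of the kind of experimentation it points to (an assumed sketch, not code from the article), the snippet below loads the pretrained gpt2 checkpoint with the transformers library and samples a short continuation:

```python
# Minimal sketch: load GPT-2 via Hugging Face transformers and generate text.
# Model name ("gpt2") and sampling settings are illustrative assumptions.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Language models can"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=40,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # silences the missing-pad-token warning
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Fine-tuning would follow the same pattern, swapping generation for a training loop over your own corpus.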


Language agnostic BERT sentence encoding, SCANN and neural collaborative filtering: a combined approach.

Introduction

Language models power thousands of products and reach millions of people through features like predictive text, text generation, and classification. For instance, many companies use NLP in customer service interactions to help route users to solutions more quickly, and GitHub's Copilot generates code from developers' comments. Maybe most subtly, but…
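The excerpt doesn't reach the technical material, but the title names the pieces. As a hedged sketch of the first two stages only (LaBSE embeddings searched with ScaNN; the neural collaborative filtering stage is not shown, and the corpus and model names below are illustrative assumptions, not the article's code):

```python
import numpy as np
import scann
from sentence_transformers import SentenceTransformer

# Encode a small multilingual corpus with LaBSE (language-agnostic embeddings)
encoder = SentenceTransformer("sentence-transformers/LaBSE")
corpus = [
    "How do I reset my password?",
    "J'ai oublié mon mot de passe",
    "I have a question about my bill",
]
embeddings = encoder.encode(corpus, normalize_embeddings=True).astype(np.float32)

# Build a brute-force ScaNN searcher; dot product on normalised vectors
# is equivalent to cosine similarity
searcher = (
    scann.scann_ops_pybind.builder(embeddings, 2, "dot_product")
    .score_brute_force()
    .build()
)

query = encoder.encode(["forgot my password"], normalize_embeddings=True).astype(np.float32)
neighbors, distances = searcher.search_batched(query)
print([corpus[i] for i in neighbors[0]])
```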


Logistic Regression from scratch with NumPy.

Preamble

In my previous article, I wrote about linear regression, starting from linear equations and analytical solutions for fitting your data, moving on to gradient-descent-optimized models, and finishing with PyTorch primitives used to create a single-layer neural network that solves a continuous linear regression problem. …
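The excerpt stops before the new material, but the title gives the destination. A minimal sketch of what a from-scratch NumPy logistic regression typically looks like (an assumption for illustration, not the article's own code): the sigmoid, the cross-entropy gradient, and plain batch gradient descent.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=1000):
    # X: (n_samples, n_features), y: (n_samples,) with values in {0, 1}
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)           # predicted probabilities
        grad_w = X.T @ (p - y) / len(y)  # gradient of mean cross-entropy w.r.t. w
        grad_b = np.mean(p - y)          # gradient w.r.t. the bias
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Tiny usage example on separable toy data
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
w, b = fit_logistic(X, y)
print(sigmoid(X @ w + b).round(2))
```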


Linear regression from scratch using PyTorch and Autograd

Neural network frameworks, AutoML solutions, and staple numerical libraries like scikit-learn and SciPy have abstracted away much of the logic and math behind the implementation of workhorse algorithms. Linear regression falls into this category. Regressions are fundamental techniques that are often as performant as more complicated models, but we sometimes…
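The excerpt is cut short, but the title names the technique. A minimal sketch of linear regression with raw PyTorch tensors and autograd (assumed for illustration, not the article's code): manually defined parameters, a mean-squared-error loss, and plain gradient-descent updates.

```python
import torch

# Synthetic data: y = 2x + 1 plus a little noise
X = torch.linspace(0, 1, 100).unsqueeze(1)
y = 2 * X + 1 + 0.05 * torch.randn_like(X)

w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lr = 0.5

for _ in range(500):
    pred = X * w + b
    loss = ((pred - y) ** 2).mean()   # mean squared error
    loss.backward()                   # autograd populates w.grad and b.grad
    with torch.no_grad():
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(w.item(), b.item())  # should land close to 2 and 1
```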

Richard Bownes

BBC Data Scientist
