A PyTorch re-implementation of GPT, both training and inference. minGPT tries to be small, clean, interpretable and educational, as most of the currently available GPT model implementations can be a bit sprawling. GPT is not a complicated model and this implementation is appropriately about 300 lines of code. All that's going on is that a sequence of indices feeds into a Transformer, and a probability distribution over the next index in the sequence comes out. The majority of the complexity is just being clever with batching (both across examples and over sequence length) for efficiency. Includes some sample code for training a blank copy of the model.
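The "indices in, distribution over the next index out" shape of the model can be sketched in a few lines. This is a toy stand-in with random NumPy weights, not minGPT's actual code: the real model stacks Transformer blocks between the embedding and the output projection, and the vocabulary/model sizes here are made up for illustration.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax along the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def next_token_distribution(idx, vocab_size=16, d_model=8, seed=0):
    """Toy GPT-shaped forward pass: a sequence of token indices goes in,
    a probability distribution over the next index comes out.
    (Random weights; a real GPT puts Transformer blocks in the middle.)"""
    rng = np.random.default_rng(seed)
    wte = rng.normal(size=(vocab_size, d_model))   # token embedding table
    head = rng.normal(size=(d_model, vocab_size))  # output projection
    h = wte[np.array(idx)]   # (T, d_model): embed each index
    logits = h[-1] @ head    # predict from the last position only
    return softmax(logits)   # (vocab_size,) probability distribution

probs = next_token_distribution([3, 1, 4, 1, 5])
```

Sampling the next index from `probs` and appending it to the sequence, in a loop, is all that autoregressive generation is.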
An excellent step-by-step article on how to partition and format hard drives larger than 2TB on Linux machines. It walks through using a GPT partition table instead of a legacy DOS (MBR) partition table, and uses GNU Parted to do it. Note that you'll need a fairly recent version of parted (I had to install v3.0.0 on my boxen).
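The core of the procedure boils down to a few parted invocations. A sketch, assuming a hypothetical device `/dev/sdb` and an ext4 target filesystem (substitute your own disk after checking with `lsblk` — these commands destroy any existing data on it):

```shell
DISK=/dev/sdb   # hypothetical device; verify before running

# Write a GPT label (the DOS/MBR label can't address past 2TiB)
parted --script "$DISK" mklabel gpt

# One partition spanning the whole disk, starting at 1MiB for alignment
parted --script "$DISK" mkpart primary 1MiB 100%

# Format and mount the new partition
mkfs.ext4 "${DISK}1"
mount "${DISK}1" /mnt
```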
Newer versions of fdisk will do this for you.