How to Bake a Deep Network

   CAUSERIE
By Andy Kitchen
Illustration by Fauzy Lukman
Like many people from my generation, I remember returning to our small apartment to find my mother slaving away over a hot GPU. If you’re like me, you probably get nostalgic over the smell of hot PCB and the ping of notifications as Mum excitedly chatted to her friends over IRC.
I’m going to teach you a great recipe for a deep network, just like my mother used to make. You’ll be a hit in user groups, at science fairs, and at potlucks if you whip out one of these classics.
A lot of people these days get their models premade from cloud or SaaS and that’s OK (hey, we all get busy sometimes), but you can make a classic model at home, and isn’t that more satisfying?
You will need:
• 50,000 data examples (with labels)
• 1 GPU (~2 teraflops)
• 1 deep learning framework
The key to a great neural network is good ingredients. That’s why I always use high-quality, hand-labeled data. Call me a traditionalist, but as my mother would say: “A great model needs great inputs! Simple as that!”
Start by splitting your data into training, validation, and test sets. A lot of people ask me how to split it up. Traditionally it depends on the school and region you’re from, but go with a 60/20/20 split and you should be fine!
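If you want to follow along at home, here is a minimal sketch of that 60/20/20 split in plain Python (standard library only; the `split_dataset` helper and its seed are my own invention, not part of any framework):

```python
import random

def split_dataset(examples, seed=0):
    """Shuffle, then carve off 60% train, 20% validation, 20% test."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = examples[:]
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(0.6 * n)
    n_val = int(0.2 * n)
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]
    return train, val, test

# 50,000 examples, just like the ingredient list says
train, val, test = split_dataset(list(range(50_000)))
```

Shuffling before splitting matters: if your examples arrive sorted by label, an unshuffled split leaves the test set full of classes the model never saw.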
Now normalize your data. This is a small step but a lot of people forget it nowadays. Just like salting an eggplant, you want to get that nasty, high-variance bitterness out and ensure a nice, smooth zero mean. Mm, mm, that’s deep learning!
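The eggplant-salting step above is plain standardization: shift each feature to zero mean and scale it to unit variance. A stdlib-only sketch (the `standardize` helper is illustrative, not from any particular framework):

```python
import math

def standardize(column):
    """Shift a feature column to zero mean and unit variance."""
    mean = sum(column) / len(column)
    var = sum((x - mean) ** 2 for x in column) / len(column)
    std = math.sqrt(var) or 1.0  # guard against constant features
    return [(x - mean) / std for x in column]
```

Remember to compute the mean and variance on the training set only, then apply the same shift and scale to validation and test, or you will season the test set with information it shouldn’t have.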
Layers, oh boy. Back when I was in university there was one kind of layer we all used and it was fine. Nowadays, kids have their Inceptions, ResNets, SqueezeNets, YOLOs...it’s hard to keep up! I like a classic multilayer perceptron (MLP) myself, nothing more fancy than matrix multiplication and sigmoids. Stack five to seven layers with sigmoid activation, but feel free to use ReLU if you want something a bit more spicy.
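Since the classic MLP really is nothing more than matrix multiplication and sigmoids, the forward pass fits in a few lines of plain Python. A sketch (layer sizes and the tiny random weights are illustrative choices, not gospel):

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def init_layer(n_in, n_out, rng):
    """One layer: a weight matrix plus a bias vector, small random weights."""
    weights = [[rng.uniform(-0.1, 0.1) for _ in range(n_in)] for _ in range(n_out)]
    biases = [0.0] * n_out
    return weights, biases

def forward(x, layers):
    """Each layer: matrix multiplication, then a sigmoid. That's the recipe."""
    for weights, biases in layers:
        x = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(weights, biases)]
    return x

rng = random.Random(0)
sizes = [4, 8, 8, 8, 8, 8, 2]  # five hidden layers, as the recipe suggests
layers = [init_layer(a, b, rng) for a, b in zip(sizes, sizes[1:])]
output = forward([0.5, -0.2, 0.1, 0.9], layers)
```

For the spicier ReLU variant, swap `sigmoid` for `lambda z: max(0.0, z)` on the hidden layers; in practice ReLU also trains faster in deep stacks, since sigmoids saturate.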
Bake for 20–40 epochs at a high learning rate, then reduce the learning rate and let the model simmer for another 20–40 epochs. Check validation performance regularly. Add regularization to taste.
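The bake-then-simmer schedule looks like this in code. This toy sketch fits a single weight with stochastic gradient descent rather than a full network (hyperparameters and the `train` helper are my own seasoning), but the shape of the loop is the same:

```python
def train(data, epochs_hot=30, epochs_simmer=30):
    """Toy SGD on y = w * x: a hot phase, then a reduced learning rate."""
    w = 0.0
    lr = 0.5  # bake at a high learning rate...
    loss = float("inf")
    for epoch in range(epochs_hot + epochs_simmer):
        if epoch == epochs_hot:
            lr = 0.05  # ...then reduce it and let the model simmer
        for x, y in data:
            pred = w * x
            grad = 2 * (pred - y) * x  # gradient of squared error
            w -= lr * grad
            # add regularization to taste, e.g. weight decay: w -= lr * wd * w
        # check validation performance regularly (here: mean squared error)
        loss = sum((w * x - y) ** 2 for x, y in data) / len(data)
    return w, loss

data = [(x, 3.0 * x) for x in [0.1, 0.2, 0.5, 1.0]]
w, loss = train(data)
```

With real models you would watch the validation loss, not the training loss, and stop early when it stops improving; that is the regularization most worth adding to taste.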
Now invite four friends over and build a web interface or API, and you’re halfway to being a startup. I leave you, dear reader, with some parting advice from my mother’s wisdom:
“Son, wash behind your ears and always make off-site backups.”
Bonne epoch! hello@humanreadablemag.com
Human Readable Magazine