# R

## Fermat and his library

This morning I woke up to a delightful tweet from Fermat's Library about sampling uniform random numbers and how many it takes, on average, for them to sum to more than 1.
> Pick a uniformly random number in [0,1] and repeat until the sum of the numbers picked is >1. You'll on average pick e≈2.718… numbers! pic.twitter.com/8ak1hYENCi
>
> — Fermat's Library (@fermatslibrary), October 28, 2017

If you look at the embedded picture, you can see the math sketched out, but of course it's always more fun to simulate.
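The claim is easy to check empirically. Here is a minimal simulation sketch in base R (the seed and the 100,000 replications are arbitrary choices, not from the original post):

```r
# How many U(0,1) draws does it take, on average, for the sum to exceed 1?
set.seed(42)

draws_until_sum_exceeds_one <- function() {
  total <- 0
  n <- 0
  while (total <= 1) {
    total <- total + runif(1)  # add one more uniform draw
    n <- n + 1
  }
  n
}

counts <- replicate(100000, draws_until_sum_exceeds_one())
mean(counts)  # should be close to exp(1)
```

With this many replications the sample mean lands within a couple of hundredths of e ≈ 2.718.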

## Learning the hard way

About a month ago David Robinson wrote a tweet that I both agree and disagree with.
> New blog post: "Don't teach students the hard way first" https://t.co/X2drh1tQe5 #rstats pic.twitter.com/GXAEpx5eET
>
> — David Robinson (@drob), September 21, 2017

His example is simple enough: you're going to a friend's new house and are given directions involving a lot of back roads, twisting and turning. When you arrive, you're told to just take the highway back because it's easier.

## All Subsets Regression

What is all subsets regression? It's a technique for model building that takes a set of independent variables \(X_{1 \dots i}\) and regresses them in sets of size \(k\), where \(k \in \{1, 2, \dots, i\}\), against the response variable \(Y\). The 'all' part of 'all subsets' means every combination of \(X_{1 \dots i}\), drawn \(k\) at a time, is fit.
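To make the idea concrete, here is a minimal sketch in base R, enumerating every subset of a few predictors with `combn()` and ranking the fits by adjusted R². The `mtcars` columns and the use of adjusted R² as the comparison criterion are my illustrative choices, not prescribed by the technique:

```r
# All-subsets sketch: fit lm() for every non-empty subset of the predictors
# and compare the models by adjusted R^2.
predictors <- c("wt", "hp", "disp", "qsec")
response   <- "mpg"

fits <- do.call(rbind, lapply(seq_along(predictors), function(k) {
  combos <- combn(predictors, k, simplify = FALSE)  # all subsets of size k
  do.call(rbind, lapply(combos, function(vars) {
    f <- reformulate(vars, response)                # e.g. mpg ~ wt + hp
    data.frame(model  = deparse(f),
               adj_r2 = summary(lm(f, data = mtcars))$adj.r.squared)
  }))
}))

fits[order(-fits$adj_r2), ]  # best subset by adjusted R^2 at the top
```

With four predictors this fits \(2^4 - 1 = 15\) models; the count grows exponentially, which is why dedicated implementations such as `leaps::regsubsets()` exist.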
But…why? You're probably familiar with forward, backward, or stepwise model building, where terms are added to (or removed from) a model one at a time while attempting to maximize or minimize some 'goodness' criterion.