Gradient Descent in real life

Sriram Ramanathan
2 min read · Jul 7, 2018

You might have seen child prodigies like Bobby Fischer and wondered how they break Malcolm Gladwell’s popular 10,000-hour rule, which says that you have to invest at least 10,000 hours of deliberate practice in a field to become an expert.

The 10,000-hour rule actually holds true for most experts in any field: the Beatles in music, Bill Gates in programming, and so on.

To understand why it takes 10,000 hours to reach a decent level of expertise, let’s look at gradient descent, one of the elegant algorithms in Artificial Intelligence.

Gradient Descent — Reduce mistakes incrementally

- Typically, we start at a random point on the error curve.

- At every step, the mistake or error is reduced by moving either left or right, in whichever direction the curve slopes downward.

- When we reach the lowest point on the curve, where the ground is flat and the slope is zero, we have reached the global minimum error that we all aim for (see the sketch below).
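To make those steps concrete, here is a minimal sketch in Python. It assumes a toy error curve f(x) = (x − 3)², whose lowest point sits at x = 3; the curve, the starting point, and the learning rate are all illustrative choices, not from the post:

```python
# A minimal sketch of 1-D gradient descent on a toy error curve
# f(x) = (x - 3)**2, whose global minimum sits at x = 3.
def f(x):
    return (x - 3) ** 2

def f_prime(x):
    # Slope of the curve at x.
    return 2 * (x - 3)

x = 10.0             # start at a random point on the curve
learning_rate = 0.1  # how big a step we take each iteration

for step in range(100):
    slope = f_prime(x)
    if abs(slope) < 1e-6:         # ground is flat: slope is ~zero
        break
    x -= learning_rate * slope    # move downhill, against the slope

print(f"reached x = {x:.4f}, error = {f(x):.6f}")  # x ~ 3, error ~ 0
```

Run it and x slides from 10 down to roughly 3, with the slope (and so the step size) shrinking as the ground flattens near the bottom.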

Each task that we do might have a different error function, but the process of reaching the global minimum remains the same.

Most people start at a random point far away from the global minimum and may be stuck at local minima for some years. So, on average, it takes 10,000 hours to reach this golden spot of the global minimum.

Golden Global Minima
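To see how the starting point decides where you land, here is a sketch using a hypothetical double-well curve with both a shallow local minimum and a deeper global one; the curve and starting points are invented for illustration:

```python
# An illustrative curve with two valleys:
# a shallow local minimum near x = +0.96 and a deeper
# global minimum near x = -1.04.
def f(x):
    return x**4 - 2 * x**2 + 0.3 * x

def f_prime(x):
    return 4 * x**3 - 4 * x + 0.3

def descend(x, learning_rate=0.01, steps=10_000):
    # Plain gradient descent: repeatedly step downhill.
    for _ in range(steps):
        x -= learning_rate * f_prime(x)
    return x

# Where you start decides where you end up:
for start in (-2.0, 2.0):
    end = descend(start)
    print(f"start {start:+.1f} -> x = {end:+.4f}, error = {f(end):+.4f}")
# Starting at -2.0 lands in the deeper global minimum;
# starting at +2.0 gets stuck in the shallower local minimum.
```

Both runs follow exactly the same rule, yet only one of them finds the golden spot; the other settles for a local minimum and stays there.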

But a very few people, whom we call child prodigies, end up very close to the golden spot of the global minimum at an early age. Still, you might have seen many child prodigies lose their talent as they age, once they stop their deliberate practice. This can be due to two reasons:

i) Human memory is volatile; the optimal weights they reached may not be sustained forever unless they keep training. Use it or lose it.

ii) They might never have reached the global optimum in the first place; since they stopped training, they are still stuck in a local minimum.

So, to the other folks who are stuck at a local minimum: do not lose hope if you feel stuck in your field. Each hour that you spend moves you an inch closer to mastery!
