
4 Scale drives machine learning progress




             Many of the ideas of deep learning (neural networks) have been around for decades. Why are
             these ideas taking off now?

             Two of the biggest drivers of recent progress have been:


             • Data availability.​ People are now spending more time on digital devices (laptops, mobile
               devices). Their digital activities generate huge amounts of data that we can feed to our
               learning algorithms.

             • Computational scale. ​Only in the last few years have we been able to train neural
               networks big enough to take advantage of the huge datasets we now have.


             Specifically, even as you accumulate more data, the performance of older learning
             algorithms, such as logistic regression, usually “plateaus.” This means its learning curve
             “flattens out,” and the algorithm stops improving even as you give it more data:



             [Figure: learning curve of an older algorithm, with performance flattening out as the
             amount of data increases]

             It was as if the older algorithms didn’t know what to do with all the data we now have.


             If you train a small neural network (NN) on the same supervised learning task, you might
             get slightly better performance:
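
             The comparison above can be sketched in code. This is a minimal illustration, not the
             book's own experiment: the dataset is synthetic, and the model sizes and training-set
             sizes are arbitrary choices. It measures test accuracy for logistic regression and a
             small neural network (an MLP) as the amount of training data grows, which is exactly
             the kind of learning curve the figures depict.

             ```python
             # Sketch: learning curves for logistic regression vs. a small neural network.
             # Uses scikit-learn; dataset and sizes are illustrative assumptions.
             from sklearn.datasets import make_classification
             from sklearn.linear_model import LogisticRegression
             from sklearn.model_selection import train_test_split
             from sklearn.neural_network import MLPClassifier

             # Synthetic binary classification problem standing in for "huge data".
             X, y = make_classification(n_samples=5000, n_features=20,
                                        n_informative=10, random_state=0)
             X_train, X_test, y_train, y_test = train_test_split(
                 X, y, test_size=0.2, random_state=0)

             def learning_curve(model, sizes):
                 """Fit the model on growing slices of the training set;
                 return test accuracy at each size."""
                 scores = []
                 for n in sizes:
                     model.fit(X_train[:n], y_train[:n])
                     scores.append(model.score(X_test, y_test))
                 return scores

             sizes = [100, 500, 1000, 4000]
             logreg_scores = learning_curve(LogisticRegression(max_iter=1000), sizes)
             mlp_scores = learning_curve(
                 MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
                 sizes)

             for n, lr_acc, nn_acc in zip(sizes, logreg_scores, mlp_scores):
                 print(f"n={n:5d}  logreg={lr_acc:.3f}  small NN={nn_acc:.3f}")
             ```

             On data like this, you would typically see the logistic regression curve flatten while
             the network keeps gaining a little longer, though the exact numbers depend on the
             dataset and model sizes chosen.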


             [Figure: learning curve of a small neural network, sitting slightly above the older
             algorithm's curve]

             Page 10                            Machine Learning Yearning-Draft                       Andrew Ng