Thoughts on the Coursera Deep Learning Specialization

I recently completed the Deep Learning Specialization on Coursera. Over five courses, it covers generic neural networks, regularization, convolutional neural networks, and recurrent neural networks.

Having completed it, I would say the specialization is a great overview and a jumping-off point for learning more about particular techniques. I wouldn’t say I have an in-depth understanding of all the material, but I do feel like I could go off and read papers and understand them, which is maybe all I could expect.

Where to go from here?

There are various aspects of deep learning that I’d like to learn more about. Certain aspects, like optimization methods and regularization techniques, have quite a bit of overlap with other machine learning approaches, so I don’t feel the need to dig in more there except as the need arises.

I do think I need to have more hands-on time with particular architectures like ResNets (Residual Networks), the Inception network, LSTMs (Long Short-Term Memory), GRUs (Gated Recurrent Units), encoder/decoder models, models with attention, and BERT.
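To make this concrete, here is a minimal sketch (not from the course materials) of the key idea behind ResNets: a residual block that adds its input back to its output, so the layers learn a correction to the identity rather than the full mapping. The class and dimensions here are my own illustrative choices, written in PyTorch.

```python
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """A basic residual block: output = ReLU(F(x) + x)."""

    def __init__(self, channels: int):
        super().__init__()
        # Two 3x3 convolutions that preserve spatial size and channel count,
        # so the skip connection can be a plain addition.
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # the skip connection


block = ResidualBlock(16)
x = torch.randn(1, 16, 32, 32)  # (batch, channels, height, width)
y = block(x)
print(y.shape)  # same shape as the input
```

The skip connection is what lets very deep networks train: if a block has nothing useful to learn, it can drive F(x) toward zero and simply pass x through.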

And I’d like to get more practice with some specific algorithms, like the YOLO (You Only Look Once) algorithm for object detection, neural style transfer (I had some ideas on how to improve this), and word embeddings. There are a variety of pre-trained models available, and I’d like to develop a familiarity with them. There are also the deep learning frameworks themselves: TensorFlow and PyTorch seem to be the two major ones.

Finally, there are a ton of particular application domains like object detection, facial recognition, machine translation, and sentiment analysis, as well as special techniques like adversarial training, deep reinforcement learning, model explainability/interpretability, and debiasing.

The views expressed on this blog are Bob’s alone and do not necessarily reflect the positions of current or previous employers.