By JH
•Oct 4, 2020
Could the instructors maybe make a video explaining the ungraded labs? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks
By SB
•Nov 20, 2020
The course is a very comprehensive one and covers all the state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.
By Eduardo S
•Aug 9, 2023
Great course!
By Prakash K
•Apr 29, 2023
easy to study
By Mohammad B A
•Feb 26, 2021
I am so happy
By Parma R R
•May 21, 2023
Good course!
By John M
•Apr 14, 2023
It was great
By Teng L
•Feb 18, 2023
Thank you!
By yuzhuobai
•Dec 12, 2022
Thank you!
By Chen
•Oct 27, 2021
Thank you!
By Sohail Z
•Oct 17, 2020
AWESOME!!!
By Ehsan F
•Sep 24, 2023
Fantastic
By LK Z
•Oct 20, 2020
very good
By अनुभव त
•Sep 27, 2020
Very good
By Hoang Q T
•Jul 25, 2022
Awesome!
By Justin H
•Jul 12, 2023
Brutal.
By M n n
•Nov 22, 2020
Awesome
By Alphin G I
•Sep 14, 2023
Awesome
By Jeff D
•Nov 15, 2020
Thanks
By Ayush S
•May 25, 2023
good
By Md P
•Apr 19, 2023
nice
By Pema W
•Nov 11, 2022
good
By Rifat R
•Sep 30, 2020
Best
By Thierry H
•Oct 25, 2020
I was a bit disappointed that although three instructors are listed, in practice we only see Younes. Lukasz just gives a short intro and conclusion for each lesson; I would have liked to see him really teach. And we don't see Eddy at all.
There are also some typos in the text of some notebooks and slides, but they don't hurt the quality of the course.
Overall the course is well made. I like that it teaches recent architectures like the Reformer. I was surprised that trax is used at a time when the community is only starting to tame TensorFlow 2. It would have been nice to have some words about where we stand in the framework landscape: why trax, how it compares to TensorFlow 2, and what the trends and priorities are between TF and trax (feature support, flexibility, target audience, production readiness, etc.).
Some notebooks only involve filling in the blanks, with a big hint a couple of lines earlier, but I don't know how to make them more challenging without leaving many people stuck, especially with a new framework like trax. I also liked the diagrams very much, especially in the last week with the complex transformations for LSH.
Quite a good course overall. Thanks!
By Brian G
•Oct 31, 2020
I really enjoyed the course, thank you Younes, Lukasz, Eddy and all the staff of Deeplearning.ai and Coursera. I applaud your effort in teaching what seem to me to be cutting-edge NLP techniques like Transformers, even though it is a very new and complex topic. The reason I didn't give five stars is that the final course on Transformers and Reformers doesn't seem self-contained; it feels incomplete, a bit too haphazard for me, unlike the first three courses in the specialization. I don't feel enough foundation was covered for students to appreciate the topic being discussed or the choices that led to the current design, e.g. why Q, K, V and not just Q, V? Why not feed NER output as context instead of just the source input? I had to search for supplementary content on the internet to round out my understanding.
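For anyone stuck on the same Q/K/V question, here is a minimal NumPy sketch of scaled dot-product attention. It is illustrative only; the function and variable names are made up for this sketch and are not from the course notebooks. Keys determine how strongly each position is attended to, while values carry the content that actually gets mixed together, which is why K and V are kept separate.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product attention sketch.

    Q: (n_q, d) queries, K: (n_kv, d) keys, V: (n_kv, d_v) values.
    Keys decide *how much* each position contributes (via Q @ K.T);
    values decide *what* is contributed -- hence separate K and V.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V  # weighted mix of values

# Toy example: 2 queries attending over 3 key/value pairs.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 4)
```

Collapsing K and V into one matrix would force the same vectors to serve both roles; giving each its own learned projection lets the model decouple "where to look" from "what to retrieve".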
By David M
•Apr 2, 2023
I think the labs could either be a bit less cook-book (though it was satisfying to work through them successfully) or go into greater depth about the mechanisms at work. The ungraded labs in the last course of the specialization were *great*, and I think they captured the right balance. More labs like that, please.
I come away with a decent understanding of what attention models are, but I'm not really sure how they do what they do, with the end result that they seem a bit magical. The same is true of Locality Sensitive Hashing: *why* should similar items hash to the same bucket? (A small sketch follows this review.)
Well, I'll spend some time investigating these on my own.
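On the LSH question above, a minimal sketch of random-hyperplane (angular) LSH may help. This is the classic scheme in the same family as what the Reformer's LSH attention builds on; the code below is a toy illustration with made-up names, not the course's implementation. Each hyperplane contributes one sign bit, and two vectors with a small angle between them fall on the same side of most random hyperplanes, so their bit patterns, and therefore their buckets, usually agree.

```python
import numpy as np

def lsh_bucket(x, planes):
    """Hash a vector to a bucket via random hyperplanes.

    Each hyperplane contributes one bit: which side of the plane
    x falls on. Nearby vectors (small angle between them) fall on
    the same side of most planes, so they tend to share a bucket.
    """
    bits = (x @ planes) >= 0  # one sign bit per hyperplane
    return int("".join("1" if b else "0" for b in bits), 2)

rng = np.random.default_rng(42)
planes = rng.normal(size=(8, 4))  # 8 dims, 4 hyperplanes -> 16 buckets

a = rng.normal(size=8)
b = a + 0.05 * rng.normal(size=8)  # a slightly perturbed copy of a
c = rng.normal(size=8)             # an unrelated vector

print(lsh_bucket(a, planes), lsh_bucket(b, planes), lsh_bucket(c, planes))
# a and b usually land in the same bucket; c usually does not.
```

Running this, a and its slightly perturbed copy b will usually share a bucket while the unrelated vector c usually will not, which is exactly the "similar items collide" property that lets LSH attention restrict each query to a small bucket of candidate keys.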