
Udemy - Deep Learning for NLP - Part 1


MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz
Language: English | Size: 1.14 GB | Duration: 3h 16m
What you'll learn

Deep Learning for Natural Language Processing
Multi-Layered Perceptrons (MLPs)
Word embeddings
Recurrent Models: RNNs, LSTMs, GRUs and variants
DL for NLP
Requirements
Basics of machine learning
Description
This course is part of the "Deep Learning for NLP" series. In this course, I will introduce basic deep learning concepts such as multi-layered perceptrons, word embeddings, and recurrent neural networks. These concepts form the basis for a good understanding of advanced deep learning models for Natural Language Processing.
The course consists of three sections.
In the first section, I will talk about basic concepts in artificial neural networks, such as activation functions (ramp, step, sigmoid, tanh, ReLU, leaky ReLU), integration functions, the perceptron, and the back-propagation algorithm. I will also discuss what deep learning is and how it relates to machine learning and artificial intelligence. Finally, I will talk about how to handle overfitting in neural network training using methods like regularization, early stopping, and dropout.
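The activation functions listed above can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the course:

```python
import numpy as np

# Common neural-network activation functions (minimal NumPy sketch).
def sigmoid(x):
    # Squashes any real input into the interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Passes positive inputs through, clips negatives to 0.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but lets a small gradient through for negative inputs.
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))     # values in (0, 1)
print(relu(x))        # negatives become 0
print(leaky_relu(x))  # negatives scaled by alpha
```

The leaky variant addresses the "dying ReLU" problem: unlike plain ReLU, it never produces an exactly zero gradient for negative inputs.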
In the second section, I will talk about various word embedding methods. I will start with basic methods like one-hot encoding and Singular Value Decomposition (SVD). Next, I will talk about the popular word2vec model, including both the CBOW and Skip-gram methods. Further, I will cover multiple methods for making the softmax computation efficient. This will be followed by a discussion of GloVe. As a special word embedding topic, I will cover cross-lingual embeddings. Finally, I will also talk about sub-word embeddings such as BPE (Byte Pair Encoding), WordPiece, and SentencePiece, which are commonly used in Transformer-based models.
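Two of the building blocks mentioned above are easy to demonstrate on a toy corpus: one-hot encoding, and the (center, context) pair generation that Skip-gram trains on. This is a minimal sketch with a made-up vocabulary, not course material:

```python
import numpy as np

# Toy corpus and vocabulary (illustrative only).
tokens = "the cat sat on the mat".split()
vocab = sorted(set(tokens))
index = {w: i for i, w in enumerate(vocab)}

def one_hot(word):
    # One-hot encoding: a vocabulary-sized vector with a single 1.
    v = np.zeros(len(vocab))
    v[index[word]] = 1.0
    return v

def skipgram_pairs(tokens, window=1):
    # For each center word, emit (center, context) training pairs
    # for every word within `window` positions.
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

print(one_hot("cat"))
print(skipgram_pairs(tokens)[:4])
```

One-hot vectors grow with the vocabulary and carry no notion of similarity; word2vec instead learns dense vectors by predicting these context pairs, which is what makes it scale.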
In the third section, I will start with a general discussion of n-gram models. Next, I will briefly introduce the neural network language model (NNLM). Then we will spend quite some time understanding how RNNs work. We will also talk about RNN variants like BiRNNs and deep BiRNNs. Then I will discuss the vanishing and exploding gradients problem. This will be followed by details of the LSTM and GRU architectures.
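The core of a vanilla RNN is a single recurrence, h_t = tanh(W x_t + U h_{t-1} + b), applied with the same weights at every time step. A minimal NumPy sketch with made-up dimensions (not from the course):

```python
import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 3

# Shared weights, reused at every time step.
W = rng.standard_normal((hidden_dim, input_dim)) * 0.1  # input -> hidden
U = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1 # hidden -> hidden
b = np.zeros(hidden_dim)

def rnn_step(x_t, h_prev):
    # One recurrence step: h_t = tanh(W x_t + U h_{t-1} + b)
    return np.tanh(W @ x_t + U @ h_prev + b)

# Unroll over a short random sequence of 5 input vectors.
h = np.zeros(hidden_dim)
for x_t in rng.standard_normal((5, input_dim)):
    h = rnn_step(x_t, h)
print(h)
```

Because gradients flow back through repeated multiplications by U during training, they tend to shrink or blow up over long sequences; that is the vanishing/exploding gradient problem the LSTM and GRU gating mechanisms are designed to mitigate.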
Who this course is for:
Beginners in deep learning
Python developers interested in data science concepts

Homepage
https://www.udemy.com/course/ahol-dl4nlp1/



[b]Download (Uploadgig)[/b]
https://uploadgig.com/file/download/07455dFEcC354bfd/w737u.Deep.Learning.for.NLP..Part.1.part1.rar
https://uploadgig.com/file/download/cc606b3d7e54D9f8/w737u.Deep.Learning.for.NLP..Part.1.part2.rar
[b]Download (Rapidgator)[/b]
https://rapidgator.net/file/3b17514732dad1af679f2975b3c2a542/w737u.Deep.Learning.for.NLP..Part.1.part1.rar.html
https://rapidgator.net/file/e240baa3d2f348454fdf691d5d7497d4/w737u.Deep.Learning.for.NLP..Part.1.part2.rar.html
[b]Download (NitroFlare)[/b]
http://nitro.download/view/BC99E0C34294405/w737u.Deep.Learning.for.NLP..Part.1.part1.rar
http://nitro.download/view/F08E4C8E2426828/w737u.Deep.Learning.for.NLP..Part.1.part2.rar


Links are Interchangeable - No Password - Single Extraction