Text Classification with TensorFlow Keras | NLP Using Embedding and LSTM Recurrent Neural Networks

In this video I’m creating a baseline NLP model for Text Classification with the help of Embedding and LSTM layers from TensorFlow’s high-level API Keras. With Embedding, we map each word to a dense vector of fixed size with real-valued elements. In contrast to one-hot encoding, where the vector length grows with the vocabulary, a fixed-size vector of real numbers can represent an arbitrarily large vocabulary. This feature learning technique learns the most important features for representing the words in the data. LSTMs are Recurrent Neural Networks (RNNs) designed to process sequences and capture long-range dependencies in text.
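A minimal sketch of such a baseline model is shown below. The vocabulary size, embedding dimension, LSTM width, and the binary sentiment-style output head are illustrative assumptions, not necessarily the exact values used in the video.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Assumed hyperparameters for illustration only.
VOCAB_SIZE = 10000      # number of distinct tokens kept by the tokenizer
EMBEDDING_DIM = 64      # size of each learned word vector

model = tf.keras.Sequential([
    # Embedding: maps each integer token id to a dense, trainable
    # vector of EMBEDDING_DIM real-valued elements.
    layers.Embedding(VOCAB_SIZE, EMBEDDING_DIM),
    # LSTM: reads the sequence of word vectors and summarizes the
    # whole text in its final hidden state.
    layers.LSTM(64),
    # Classification head (assumed binary, e.g. positive vs. negative).
    layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()

# Training on padded integer sequences would then look like:
# model.fit(x_train, y_train, validation_data=(x_val, y_val),
#           epochs=5, batch_size=32)
```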