Recurrent Neural Networks for Natural Language Processing
Coins: 25
Experience points: 250
Organizer: Facultad de Informática
When: 12/02, 17:00 to 19:30
Location: Aula 6
Speaker
- Katarzyna Baraniak. ML Researcher. Warsaw, Mazowieckie, Poland
Description
This talk introduces recurrent neural networks, covering the most common architectures: simple RNN, LSTM, and GRU. It explains the training, variants, and applications of RNNs for NLP problems, along with suitable text representations such as word embeddings.
During the workshop, a simple text generation model will be built using an LSTM network.
Please bring your laptop for the hands-on session and make sure you have access to Google Colab or a Jupyter notebook with Python and PyTorch installed.
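To get a feel for what the hands-on session builds, here is a minimal sketch of character-level text generation with an LSTM in PyTorch. The corpus, model sizes, and training length are illustrative assumptions, not the workshop's actual material.

```python
# Minimal character-level text generation with an LSTM in PyTorch.
# Sketch only: corpus and hyperparameters are illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)

corpus = "hello world " * 4
chars = sorted(set(corpus))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}
data = torch.tensor([stoi[c] for c in corpus])

class CharLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=16, hidden_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, state=None):
        out, state = self.lstm(self.embed(x), state)
        return self.head(out), state

model = CharLSTM(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Train to predict the next character from the previous ones.
x = data[:-1].unsqueeze(0)   # (1, T) input sequence
y = data[1:].unsqueeze(0)    # (1, T) shifted targets
for _ in range(200):
    opt.zero_grad()
    logits, _ = model(x)
    loss = loss_fn(logits.reshape(-1, len(chars)), y.reshape(-1))
    loss.backward()
    opt.step()

# Greedy generation from a seed character, reusing the hidden state.
model.eval()
idx = torch.tensor([[stoi["h"]]])
state, out = None, "h"
with torch.no_grad():
    for _ in range(11):
        logits, state = model(idx, state)
        idx = logits[:, -1].argmax(-1, keepdim=True)
        out += itos[idx.item()]
print(out)
```

The key design point the workshop exercises is the recurrent state: during generation, the LSTM's hidden and cell state carry context forward one character at a time, so each prediction conditions on everything generated so far.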
This talk/workshop is in English.
You can download the workshop material from this repository.
Activity registration
If you want your participation to count toward the Copa de las Casas, use this registration form.