ML Fridays: Inside Transformers with Amazon SageMaker and HuggingFace

Event Type: Seminar

Location: Online

Time zone: AEST (GMT+10)

Who can attend: Everyone

Registration: At this link

Host Organisation: AWS

Details:
The transformer is one of the most popular state-of-the-art (SOTA) deep learning architectures, used mainly for natural language processing (NLP) tasks. Since its advent, the transformer has replaced RNNs and LSTMs for many tasks. It also marked a major breakthrough in NLP and paved the way for revolutionary new architectures such as BERT.

Join this session

  • To dive deep into transformers and understand how they use the encoder-decoder architecture for a language translation task.

  • To learn how transformers overcome a major challenge of recurrent models such as RNNs and LSTMs: capturing long-term dependencies. Later we will look at BERT, see how it differs from other embedding models, and build a Disaster Tweet Analysis system.
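As a taste of what the session covers, the mechanism that lets transformers attend to every position at once (rather than stepping through a sequence like an RNN) is scaled dot-product attention. Below is a minimal NumPy sketch of that operation; the function name, toy shapes, and random inputs are our own illustration, not material from the session itself:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core attention operation inside every transformer layer.

    Q, K, V: arrays of shape (seq_len, d_k). Each output position is a
    weighted mix of *all* value vectors, which is how transformers capture
    long-range dependencies that recurrent models struggle with.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (seq_len, seq_len) pairwise similarity
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 4 tokens with 8-dimensional query/key/value vectors
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, attn = scaled_dot_product_attention(Q, K, V)
```

Note that every token's output depends directly on every other token's value vector in a single step, with no information bottleneck across long distances; that is the property the session contrasts with RNN/LSTM recurrence.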

Previous
June 17

Zero Knowledge Proofs and Their Applications to Machine Learning

Next
June 21

Privacy-Preserving Machine Learning (PPML)