Building Transformer-Based Natural Language Processing Applications
When and where
Online. Thursday 20 October (14:00–18:00) and Friday 21 October (14:00–18:00)
About this event
Artificial intelligence (AI) is bringing significant advances to sectors like healthcare, robotics, and autonomous driving. Deep learning is a powerful method that has been instrumental in the recent success of AI. The CLAIRE office Switzerland is glad to offer its supporters, free of charge, an eight-hour workshop on Transformers for NLP.
The workshop is offered through a collaboration between CLAIRE and the NVIDIA Deep Learning Institute (DLI). Attendees will receive theoretical instruction and complete hands-on exercises on GPU-accelerated cloud computing resources provided by the organizers. Upon successful completion of the course, participants will receive an NVIDIA DLI certificate to recognize their subject matter competency and support professional career growth.
Development of talent and knowledge transfer are key elements of the CLAIRE vision. Through this collaboration we contribute to empowering society to benefit from the potential of Artificial Intelligence.
Audience
People with basic programming skills (Python) interested in artificial intelligence and deep learning methods. Participants should be affiliated with an academic institution and be registered as CLAIRE supporters. Please register using an email address from an academic institution.
Not a CLAIRE supporter yet? Please join us and sign up here to support a human-centered approach for artificial intelligence in Europe. Stronger together!
You can also learn more about CLAIRE's vision of human-centred artificial intelligence for Europe here.
Disclaimer
The workshop is developed and provided by the NVIDIA Deep Learning Institute (DLI). NVIDIA Corporation is solely responsible for its content.
Workshop Description
Applications for natural language processing (NLP) have exploded in the past decade. With the proliferation of AI assistants and organizations infusing their businesses with more interactive human-machine experiences, understanding how NLP techniques can be used to manipulate, analyze, and generate text-based data is essential. Modern techniques can capture the nuance, context, and sophistication of language, just as humans do. And when designed correctly, developers can use these techniques to build powerful NLP applications that provide natural and seamless human-computer interactions within chatbots, AI voice agents, and more.
Deep learning models have gained widespread popularity for NLP because of their ability to accurately generalize over a range of contexts and languages. Transformer-based models, such as Bidirectional Encoder Representations from Transformers (BERT), have revolutionized NLP by offering accuracy comparable to human baselines on benchmarks like SQuAD for question answering, entity recognition, intent recognition, sentiment analysis, and more.
In this workshop, you’ll learn how to use Transformer-based natural language processing models for text classification tasks, such as categorizing documents. You’ll also learn how to leverage Transformer-based models for named-entity recognition (NER) tasks and how to analyze various model features, constraints, and characteristics to determine which model is best suited for a particular use case based on metrics, domain specificity, and available resources.
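The workshop's exercises are NVIDIA DLI's own materials; purely as an illustrative sketch (not the course code), the difference between the two tasks mentioned above can be shown with toy numbers: document classification produces one prediction per text, while NER produces one prediction per token. All labels and logit values below are invented for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Sequence-level classification: one logit vector for the whole document.
# (Hypothetical classes and values, as a Transformer classifier head might emit.)
doc_classes = ["sports", "politics", "tech"]
doc_logits = np.array([2.0, 0.1, -1.0])
doc_probs = softmax(doc_logits)
doc_label = doc_classes[int(doc_probs.argmax())]

# Token-level classification (NER): one logit vector per token.
ner_labels = ["O", "B-PER", "B-ORG"]
tok_logits = np.array([
    [3.0, 0.2, 0.1],   # "the"    -> outside any entity
    [0.1, 2.5, 0.3],   # "Ada"    -> person
    [0.2, 0.1, 2.8],   # "NVIDIA" -> organisation
])
tok_tags = [ner_labels[i] for i in tok_logits.argmax(axis=-1)]

print(doc_label, tok_tags)
```

The same encoder can serve both tasks; only the head differs — a pooled per-document head versus a per-token head.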
Learning Objectives
By participating in this workshop, you’ll:
- Understand how text embeddings have rapidly evolved in NLP, from Word2Vec to recurrent neural network (RNN)-based embeddings to Transformers
- See how Transformer architecture features, especially self-attention, are used to create language models without RNNs
- Use self-supervision to improve the Transformer architecture in BERT, Megatron, and other variants for superior NLP results
- Leverage pre-trained, modern NLP models to solve multiple tasks such as text classification, NER, and question answering
- Manage inference challenges and deploy refined models for live applications
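As background for the self-attention objective above, here is a minimal numpy sketch of single-head scaled dot-product self-attention (sizes and weights are invented for illustration; real Transformer layers add multiple heads, output projections, masking, and learned positional information):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X: (seq_len, d_model) token embeddings.
    Each output row is a weighted mix of all value vectors, so every
    token attends to every other token in one step -- no recurrence.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (seq_len, seq_len)
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V, weights

# Toy sizes and random weights, purely for illustration.
rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
```

Because the attention matrix relates all token pairs directly, sequence length no longer dictates the number of sequential steps, which is what lets Transformers replace RNNs.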
Workshop Details
Workshop Outline
Organizers
About CLAIRE
The Confederation of Laboratories for Artificial Intelligence Research in Europe (CLAIRE) is an organisation created by the European AI community that seeks to strengthen European excellence in AI research and innovation, with a strong focus on human-centred AI. CLAIRE was launched in June 2018 and now has the support of more than 3,500 people, most of them scientists, technologists, and researchers in Artificial Intelligence. CLAIRE's membership network consists of over 380 research groups and research institutions, jointly covering more than 21,000 employees in 35 countries. CLAIRE has administrative offices in Brussels (BE), The Hague (NL), Oslo (NO), Paris (FR), Prague (CZ), Rome (IT), Saarbrücken (DE), Stockholm (SE), and Zürich (CH).
The CLAIRE Vision & Mission video is available here.
About NVIDIA DLI
The NVIDIA Deep Learning Institute (DLI) offers resources for diverse needs—from learning materials to self-paced and live training to educator programs—giving individuals, teams, organizations, educators, and students what they need to advance their knowledge in AI, accelerated computing, data science, graphics and simulation, networking, and more.
With access to GPU-accelerated servers in the cloud, you’ll learn how to train, optimize, and deploy neural networks using the latest deep learning tools, frameworks, and SDKs. You’ll also learn how to assess, parallelize, optimize, and deploy GPU-accelerated computing applications.