Data Science and Analytics Webinar Series 2021 - Evolution of Language Modeling Technologies

06:00pm - 06:50pm
Online

Abstract

In recent years, neural language models, particularly pre-trained language models, have revolutionized natural language processing (NLP) and related fields. Significant progress has been made in many language processing tasks such as reading comprehension and text generation. A language model is a probability distribution over word sequences, which can be used to calculate the probability of a sentence or a paragraph. Neural language models, constructed by combining language modeling and deep learning, have powerful representation and learning capabilities. In this lecture, I will first review the history of language modeling and summarize the state-of-the-art technologies of language modeling. I will then introduce our recent work on Chinese spelling error correction using the pre-trained language model BERT. Finally, I will share my view on the future development of language modeling technologies.
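To make the definition concrete: a language model factors the probability of a word sequence by the chain rule, and classical n-gram models approximate each factor by conditioning on a short history. The sketch below is a minimal bigram illustration with a purely hypothetical toy corpus; it is not part of the talk's material.

```python
from collections import defaultdict

# Toy corpus (illustrative only), with sentence boundary markers.
corpus = [
    ["<s>", "language", "models", "are", "useful", "</s>"],
    ["<s>", "language", "models", "are", "powerful", "</s>"],
]

# Count bigrams and their context (previous-word) frequencies.
bigram_counts = defaultdict(int)
context_counts = defaultdict(int)
for sentence in corpus:
    for prev, curr in zip(sentence, sentence[1:]):
        bigram_counts[(prev, curr)] += 1
        context_counts[prev] += 1

def sentence_probability(words):
    """P(w_1..w_n) ~= product of P(w_i | w_{i-1}), estimated by MLE."""
    prob = 1.0
    for prev, curr in zip(words, words[1:]):
        if context_counts[prev] == 0:
            return 0.0  # unseen context: probability is zero without smoothing
        prob *= bigram_counts[(prev, curr)] / context_counts[prev]
    return prob

p = sentence_probability(["<s>", "language", "models", "are", "useful", "</s>"])
```

Neural language models replace these count-based conditional probabilities with learned distributions (e.g., produced by a Transformer), which is what gives them their representation power.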

 

About the speaker

Hang Li is currently a Director of the AI Lab at ByteDance Technology. He is also a Fellow of ACL, a Fellow of IEEE, and a Distinguished Scientist of ACM. He graduated from Kyoto University and earned his Ph.D. from the University of Tokyo. He worked at NEC Research as a researcher, and at Microsoft Research Asia as a senior researcher and research manager. He was a director and chief scientist of Noah's Ark Lab of Huawei Technologies before joining ByteDance.

 

Register online here.

Organizer
HKUST Big Data Institute