IoT Thrust Seminar | Towards Decentralized Learning on the Edge

9:00am - 10:00am
Offline Venue: E3-202; Zoom ID: 938 6559 1861, Passcode: iott

Supporting the following United Nations Sustainable Development Goals:

With the increasing capabilities of small language models and recent advances in computer hardware, it has now become feasible to train machine learning models on edge devices. Federated learning has emerged as a prominent paradigm to allow edge devices to train a shared model collaboratively while preserving data privacy. In this talk, we present some of our recent advances, ongoing work, and future plans toward improving the performance of decentralized learning across edge devices in general, and federated learning in particular.

Our journey starts with Port, a new asynchronous federated learning mechanism that improves convergence speed in wall-clock time, and we show how it is implemented in our own open-source framework. We then consider the problem of unlearning in asynchronous federated learning and present Knot, a novel clustering algorithm that efficiently optimizes the assignment of devices to clusters. Next, to allow large language models (LLMs) to be fine-tuned in a decentralized learning setting, we propose Titanic, a new distributed training paradigm that partitions an LLM across multiple edge devices so that it can be fine-tuned with no or minimal loss in training performance. In designing Titanic, we focused on its feasibility in real-world systems and implemented a fully automated, model-agnostic partitioning mechanism. In closing, we briefly present our ongoing work on optimizing decentralized learning across geographically distributed edge nodes and outline our vision for such a decentralized learning paradigm in the future.
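For readers new to the asynchronous setting discussed above, the sketch below illustrates a generic staleness-weighted asynchronous aggregation loop in plain Python. It is a minimal illustration of the paradigm, not Port itself: the model, the client update, and the weighting function are simplified placeholders assumed for this example.

```python
# Generic illustration of asynchronous federated aggregation (not Port itself).
# The server applies each client update as soon as it arrives, down-weighting
# stale updates instead of waiting for every client in a synchronous round.
import numpy as np

def staleness_weight(staleness: int) -> float:
    """Simple polynomial decay: older updates contribute less."""
    return 1.0 / (1.0 + staleness)

def client_update(global_model: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Placeholder for local training: returns a noisy pseudo-gradient."""
    return -0.1 * global_model + rng.normal(scale=0.01, size=global_model.shape)

def run_async_server(num_clients: int = 5, steps: int = 50, lr: float = 0.5) -> np.ndarray:
    rng = np.random.default_rng(0)
    model = rng.normal(size=10)          # global model parameters
    model_version = 0                    # incremented on every aggregation
    client_versions = [0] * num_clients  # model version each client last pulled

    for _ in range(steps):
        # A random client finishes local training and pushes its update.
        cid = rng.integers(num_clients)
        update = client_update(model, rng)
        staleness = model_version - client_versions[cid]

        # Aggregate immediately, scaled by a staleness-aware weight.
        model += lr * staleness_weight(staleness) * update
        model_version += 1

        # The client pulls the fresh model before its next local round.
        client_versions[cid] = model_version
    return model

if __name__ == "__main__":
    print("final model norm:", np.linalg.norm(run_async_server()))
```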

Speakers / Performers:
Dr. Ningxin SU
University of Toronto

Ningxin is a Ph.D. candidate in the Department of Electrical and Computer Engineering at the University of Toronto, supervised by Prof. Baochun Li. Her research interests focus on distributed machine learning and federated learning, particularly the training of large language models in today’s edge computing and metaverse environments. She is passionate about continuing her work in these exciting, interdisciplinary research areas, which bridge distributed systems, edge computing, and machine learning.

Language
English
Recommended For
Faculty and staff
PG students
UG students
Organizer
Internet of Things Thrust, HKUST(GZ)