May 8, 2023 Monday
During the past decade, deep neural networks have achieved great success in many applications such as computer vision, natural language processing, and speech processing. Despite that, deep neural networks are data-hungry: they require a huge amount of (labeled) data to obtain a good model. However, such a requirement does not always hold in many applications. To alleviate this data-hungry requirement, the meta-learning paradigm is designed to extract meta-knowledge from existing tasks in order to accelerate the learning of new tasks. Even though many models have been proposed for meta-learning, most of them handle only simple tasks, for which a linear meta-regularizer, a single meta-initialization, or a single meta-learning objective is sufficient. To handle more complex tasks, in this talk I will introduce our recent works, which can learn nonlinear meta-regularizers, learn multiple meta-initializations, and learn from multiple meta-learning objectives.
Prof. Yu ZHANG
Associate Professor, Department of Computer Science and Engineering
Southern University of Science and Technology
Yu Zhang is now an associate professor in the Department of Computer Science and Engineering at Southern University of Science and Technology (SUSTech). He obtained his bachelor's and master's degrees from Nanjing University and his PhD degree from the Hong Kong University of Science and Technology (HKUST). Prior to joining SUSTech, he worked at HKUST and Hong Kong Baptist University (HKBU) for eight years. His current research interests include artificial intelligence, machine learning, pattern recognition, and data mining. He is especially interested in multi-task learning, transfer learning, meta-learning, deep learning, semi-supervised learning, dimensionality reduction, and metric learning.