Dec 11, 2023 Monday
Abstract:
Recent pretrained language models have demonstrated impressive performance on knowledge-intensive tasks across scientific domains. This talk will introduce two of our recent works on generative reasoning with pretrained language models. First, I will introduce a single-sequence prediction method over a local reasoning graph (SeqGraph) that improves the quality of the output reasoning path for multi-hop QA. Second, I will introduce a multimodal prompt retrieval method that leverages a generative model for visual question answering. Finally, I will discuss some future directions along the thread of generative reasoning with pretrained large language models.
Time:
December 11, 2023, Monday
11:00 – 11:50
Location:
Rm E1-103, GZ Campus
Zoom:
628 334 1826 (PW: 234567)
Bilibili Live:
ID: 30748067
Speaker Bio:
Assistant Prof. Junjie Hu
Affiliated with the Computer Science and Biostatistics Departments and the Data Science Institute
University of Wisconsin-Madison
Prof. Junjie Hu is an assistant professor affiliated with the Computer Science and Biostatistics Departments and the Data Science Institute at the University of Wisconsin-Madison. He obtained his Ph.D. from the Language Technologies Institute in the School of Computer Science at Carnegie Mellon University. His research lies at the intersection of natural language processing and machine learning, particularly multilingual NLP, knowledge reasoning, representation learning, and their applications in human-machine communication.