
Foundations of Transfer and Multitask Optimization and Advances with Generative AI and Large Language Models

Seminar / Talk

  • Date

    22 Jul 2024

  • Organiser

    DSAI

  • Time

    16:15 - 17:15

  • Venue

Y303

Speaker

Prof. Yew-Soon Ong

Summary

Traditional optimization typically starts from scratch, assuming zero prior knowledge about the task at hand, and classical optimization solvers generally do not improve automatically with experience. In contrast, humans routinely draw on knowledge from past experiences when faced with new tasks. This approach is often effective, as real-world problems seldom exist in isolation. Similarly, artificial systems are expected to encounter numerous problems throughout their lifetime, many of which will be repetitive or share domain-specific similarities. This perspective naturally motivates the development of advanced optimizers that replicate human cognitive capabilities, leveraging past lessons to accelerate the search for optimal solutions to novel tasks. This talk will provide an overview of the origins and foundations of Transfer and Multitask Optimization, and present some of the latest work on Generative AI and Large Language Model-based multi-factorial evolutionary algorithms for conceptual design and machine learning model distillation.

Keynote Speaker

Prof. Yew-Soon Ong

President's Chair Professor of Computer Science, School of Computer Science and Engineering, Nanyang Technological University (NTU), Singapore
