Winter School

Prof. Wan-Chi Siu (Hong Kong Polytechnic University, China)

Lecture 1
Developing Attention Mechanism from Conventional Machine Learning to Deep Learning
Abstract
Convolutional Neural Networks (CNN) in Deep Learning have dominated research over the past 8 to 10 years in object recognition, tracking, inference and image/video signal processing. In the past three years, transformer networks, which make use of the concepts of the attention mechanism and non-local means, have been applied with great success to natural language processing. Applications of transformer networks to image/video processing are just emerging. In this talk we will recall some conventional machine learning elements as hints for arriving at the encoder structure of the transformer model, subsequently relate this to the basic structures of self-attention, spatial attention and channel attention, and use them to build some standard transformer architectures. At the end of the presentation, we will highlight some applications and our research work relating to this hot topic in deep learning.
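As background for the self-attention structure mentioned above, the following is a minimal sketch of scaled dot-product self-attention, the core operation of the transformer encoder. The shapes, random weights and function names here are illustrative assumptions, not part of the lecture material.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention on a token sequence X (n_tokens x d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv           # project tokens to queries/keys/values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # pairwise token similarities, scaled
    A = softmax(scores, axis=-1)               # attention weights; each row sums to 1
    return A @ V                               # each output token is a weighted mix of values

# toy example (hypothetical sizes): 4 tokens, model width 8
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Spatial attention and channel attention reweight the spatial positions or the feature channels of a CNN feature map in an analogous way, using learned weights that sum to one over the attended dimension.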