UROP Proceeding 2023-24

School of Science
Department of Mathematics

Improvement of the Air Quality Forecast by Using Deep-learning Technique
Supervisor: FUNG Jimmy Chi Hung / MATH
Student: XIE Hangcen / MATH-FAM
Course: UROP 1100, Summer

This report presents the experiences and insights gained throughout a UROP project, arranged broadly in temporal order and divided into six key sections. The “Introduction” section outlines what the author did over the course of the project. “Matching the Datasets” details the learning curve and the practical challenges of aligning the various datasets. “Ensemble Training” describes the model’s architecture, the troubleshooting of issues encountered, and lessons learned during the training phases. “Reviewed Papers” summarizes the papers the author analyzed during the project, focusing on their key insights without delving into technical details, given the scope of this report. “Future Prospects” outlines the author’s plans for further study inspired by the project. Lastly, “Acknowledgments” expresses gratitude to the supervisors who provided support during the project.

Intelligent Tutoring Systems for University Math Foundation Subjects
Supervisor: HU Jishan / MATH
Student: CHENG Yiyang / MATH-AM
Course: UROP 1100, Summer

In recent years, machine learning has advanced significantly, marked by the emergence of innovative architectures such as the Transformer and diffusion models. These architectures have been widely adopted across many domains, reflecting substantial progress in the field. The Transformer, arguably among the most successful deep learning architectures, demonstrates remarkable efficacy and exceptional adaptability. Of particular note is its revolutionary impact on natural language processing (NLP), in applications such as conversational agents and machine translation.
This report focuses on elucidating the fundamental architecture of the Transformer model and evaluating its principal advantages.
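The fundamental architecture referred to above centers on scaled dot-product attention, the core operation of the Transformer. The sketch below is illustrative only and is not drawn from the report itself; the function names and the toy matrices are this sketch’s own assumptions. It computes, for each query token, a softmax-weighted average of the value vectors:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of row vectors (one per token). Each output row
    is a convex combination of the rows of V, weighted by how strongly
    the corresponding query matches each key.
    """
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        w = softmax(scores)
        # Weighted average of the value vectors
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out

# Toy example: 2 query tokens attending over 3 key/value tokens of dimension 2
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
print(attention(Q, K, V))
```

Because every attention weight row sums to one, each output vector stays inside the range spanned by the value vectors; stacking this operation with multiple heads and feed-forward layers yields the full Transformer block.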
