UROP Proceedings 2024-25

School of Science
Department of Mathematics

Some Aspects of High-Dimensional Probability
Supervisor: WANG Ke / MATH
Student: GAO Shuyang / MATH-STAT
Course: UROP 1100, Spring; UROP 2100, Summer

This report analyzes recent developments in dimensionality reduction via the Johnson-Lindenstrauss (JL) Lemma, focusing on three aspects. The first extends the JL transform from the standard inner product to multilinear inner products, introducing a randomized algorithm that uses Hadamard products of sign matrices to preserve higher-order inner products. The second investigates the optimal sparsity, which depends on both the projection matrix and the dataset points that need to be projected. The third improves on previous sparse JL transforms, sharpening the sparsity bounds for the projection matrix and for subspace embeddings. Together, these works advance the theoretical and practical efficiency of dimensionality reduction, addressing multilinear generalizations, computational optimization, and sparsity constraints.

Deep Learning-Based Numerical Algorithms for Solving Partial Differential Equations
Supervisor: XIANG Yang / MATH
Student: JI Wenzhao / MATH-IRE
Course: UROP 1100, Fall

In recent years, the application of neural networks (NNs) to solving partial differential equations (PDEs) has catalyzed the development of numerous innovative methodologies. Broadly, these approaches fall into two main types: function learning and operator/functional learning. A notable limitation of function learning methods is that a separate neural network must be trained for each new PDE, since the trained network does not transfer to other PDEs. Operator/functional learning methods, such as DeepONet and the Fourier Neural Operator (FNO), instead aim to learn a more general mapping between PDE parameters and their corresponding solutions. Error analysis of neural networks in approximating functionals is performed in the Barron space to overcome the curse of dimensionality.
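The sparse JL transforms discussed in the first abstract above admit a short illustrative sketch. The construction below follows the standard sparse sign-matrix recipe (in the spirit of the Kane-Nelson sparse JL transform), where each column of the projection matrix carries a fixed number of random ±1 entries; the function and parameter names are ours and do not come from the report.

```python
import numpy as np

def sparse_jl_transform(X, target_dim, sparsity, rng=None):
    """Project the rows of X into target_dim dimensions using a sparse
    sign matrix: each column of the projection matrix has exactly
    `sparsity` nonzero entries, each equal to +/- 1/sqrt(sparsity).
    Illustrative sketch only; parameter names are hypothetical."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    P = np.zeros((target_dim, d))
    for j in range(d):
        # Place `sparsity` random signs in distinct rows of column j.
        rows = rng.choice(target_dim, size=sparsity, replace=False)
        P[rows, j] = rng.choice([-1.0, 1.0], size=sparsity) / np.sqrt(sparsity)
    # The scaling makes E[||P x||^2] = ||x||^2, so norms (and hence
    # pairwise distances) are preserved in expectation.
    return X @ P.T

# Pairwise distances are approximately preserved with high probability:
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 1000))
Y = sparse_jl_transform(X, target_dim=256, sparsity=8, rng=1)
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
```

With only 8 nonzeros per column instead of 256, applying the projection to a sparse input vector costs time proportional to its number of nonzeros, which is the computational motivation for studying optimal sparsity.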
Quantum Information Theory and Error-Correcting Codes
Supervisor: XIONG Maosheng / MATH
Student: WANG Dingqi / MATH-PMA
Course: UROP 1100, Summer

For both classical and quantum codes, the MacWilliams identity provides a natural foundation, imposing linear constraints on the associated polynomials. Linear programming (LP) is a powerful tool for optimizing systems under linear constraints, making it highly effective for deriving tight bounds in coding theory, as is semidefinite programming (SDP). Both LP and SDP are widely applicable to estimating code bounds, particularly in association schemes. In this UROP project, we studied the tools of LP and SDP and investigated the impact of the complete weight distribution on LP bounds. Additionally, we explored the potential of SDP methods for classical codes, with the aim of extending these techniques to quantum codes in future work.
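The LP approach described in this abstract can be made concrete with Delsarte's classical linear program for binary codes, whose constraints come from the Krawtchouk polynomials arising in the MacWilliams identity. The following is a minimal sketch using scipy (our own function names; the project's refinements via complete weight distributions and SDP are not reproduced here).

```python
from math import comb

import numpy as np
from scipy.optimize import linprog

def krawtchouk(n, k, i):
    """Binary Krawtchouk polynomial K_k(i) for length n."""
    return sum((-1) ** j * comb(i, j) * comb(n - i, k - j)
               for j in range(k + 1))

def delsarte_lp_bound(n, d):
    """Delsarte's LP upper bound on the size of a binary code of
    length n and minimum distance d.  Variables are the distance
    distribution A_1, ..., A_n (with A_0 = 1 fixed); the MacWilliams
    transform of the distribution must be nonnegative."""
    c = -np.ones(n)                 # maximize A_1 + ... + A_n
    A_ub = np.zeros((n, n))
    b_ub = np.zeros(n)
    for k in range(1, n + 1):
        # Constraint: sum_{i=0}^n A_i K_k(i) >= 0, rewritten as
        # -sum_{i>=1} A_i K_k(i) <= K_k(0) = C(n, k).
        for i in range(1, n + 1):
            A_ub[k - 1, i - 1] = -krawtchouk(n, k, i)
        b_ub[k - 1] = krawtchouk(n, k, 0)
    # A_i = 0 for 0 < i < d (minimum distance), A_i >= 0 otherwise.
    bounds = [(0.0, 0.0) if i < d else (0.0, None)
              for i in range(1, n + 1)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return 1.0 - res.fun            # A_0 + sum of the A_i

bound = delsarte_lp_bound(7, 3)
# For n = 7, d = 3 the LP bound is tight: the [7,4] Hamming code,
# being perfect, attains it.
```

Strengthening these constraints, e.g. by tracking the complete weight distribution rather than the Hamming weight distribution alone, is one way such LP bounds can be sharpened.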
