General Information

References

Note. See my lecture notes for some more advanced monographs.

Prerequisite

  • Real and complex analysis

Homework

  • Homework 1: Return to the instructor (via email) by March 6, 2025 (Thursday) 23:59
  • Homework 2: Return to the instructor (via email) by March 13, 2025 (Thursday) 23:59
  • Homework 3: Return to the instructor (via email) by March 20, 2025 (Thursday) 23:59
  • Homework 4: Return to the instructor (via email) by March 27, 2025 (Thursday) 23:59
  • Homework 5: Return to the instructor (via email) by April 10, 2025 (Thursday) 23:59
  • Homework 6: Return to the instructor (via email) by April 17, 2025 (Thursday) 23:59

Schedule

  • The lectures are on Thursday (13:10-16:00) at 志希070116.
Date Time Room Activities
20.02.2025 13:10-16:00 志希070116 Week 1 - Lecture (Thursday): Recall some preliminaries and introducing PDE (in classical sense)
27.02.2025 13:10-16:00 志希070116 Week 2 - No class
06.03.2025 13:10-16:00 志希070116 Week 3 - Lecture (Thursday): Weak derivatives and distribution derivatives [Return Homework 1 by 23:59]
13.03.2025 13:10-16:00 志希070116 Week 4 - Lecture (Thursday): Some Sobolev spaces and Hilbert spaces [Return Homework 2 by 23:59]
20.03.2025 13:10-16:00 志希070116 Week 5 - No class [Return Homework 3 by 23:59]
27.03.2025 13:10-16:00 志希070116 Week 6 - Lecture (Thursday): Some Hilbert spaces [Return Homework 4 by 23:59]
03.04.2025 13:10-16:00 志希070116 Week 7 - No class: Children's Day and Tomb Sweeping Day
10.04.2025 13:10-16:00 志希070116 Week 8 - Lecture (Thursday): Solving elliptic PDE for a small wave number [Return Homework 5 by 23:59]
17.04.2025 13:10-16:00 志希070116 Week 9 - Lecture (Thursday): Eigenvalue problem [Return Homework 6 by 23:59]
24.04.2025 13:10-16:00 志希070116 Week 10 - No class
01.05.2025 13:10-16:00 志希070116 Week 11 - Lecture (Thursday): talks
Today's first talk (50 minutes)
Speaker. Cheng, Huan-Chang
Title. Reading Reflection on Random Matrix Methods for Machine Learning, by Romain Couillet and Zhenyu Liao
Abstract. Numerous and large dimensional data is now a default setting in modern machine learning (ML). Standard ML algorithms, starting with kernel methods such as support vector machines and graph-based methods like the PageRank algorithm, were however initially designed out of small-dimensional intuitions and tend to misbehave, if not completely collapse, when dealing with real-world large datasets. Random matrix theory has recently developed a broad spectrum of tools to help understand this new "curse of dimensionality" to help repair or completely recreate the suboptimal algorithms, and most importantly to provide new intuitions to deal with modern data mining. This book primarily aims to deliver these intuitions, by providing a digest of the recent theoretical and applied breakthroughs of random matrix theory into ML.

Today's second talk (50+50 minutes)
Speaker. Lin, Po-Yi
Title. Leveraging Language Models to Detect Greenwashing
Abstract. In recent years, climate change repercussions have increasingly captured public interest. Consequently, corporations are emphasizing their environmental efforts in sustainability reports to bolster their public image. Yet, the absence of stringent regulations in the review of such reports allows potential greenwashing. In this study, we introduce a novel preliminary methodology to train a language model on generated labels for greenwashing risk. Our primary contributions encompass: developing a preliminary mathematical formulation to quantify greenwashing risk, a fine-tuned ClimateBERT model for this problem, and a comparative analysis of results. On a test set comprising sustainability reports, our best model achieved an average accuracy score of 86.34% and F1 score of 0.67, demonstrating that our proof-of-concept methodology shows a promising direction of exploration for this task.
08.05.2025 13:10-16:00 志希070116 Week 12 - Lecture (Thursday): Solving elliptic PDE by using Fredholm theory
Today's first talk (50 minutes)
Speaker. Chang, Wen-Ling
Title. Control Theory: An Introduction to the Inverted Pendulum
Abstract. Using the inverted pendulum from control theory as an example, this talk introduces controllable systems and LQG control, presents real-life examples of inverted pendulums, and demonstrates controlling an inverted pendulum system in MATLAB.
15.05.2025 13:10-16:00 志希070116 Week 13 - Lecture (Thursday)
22.05.2025 13:10-16:00 志希070116 Week 14 - Lecture (Thursday): Fourier series and Fourier transform
Today's first talk (50 minutes)
Speaker. Fan, Min-Yuan
Title. Optimal Transport and its Applications in Machine Learning
Abstract. This report introduces the theory and applications of Optimal Transport (OT), a mathematical framework for comparing probability distributions by computing the most efficient way to move mass from one distribution to another. Starting from Monge’s formulation and Kantorovich’s convex relaxation, we explore Wasserstein distances and their geometric properties. To enable efficient computation, we introduce entropic regularization and the Sinkhorn algorithm. We then highlight key machine learning applications, including Wasserstein GANs for stable generative modeling, domain adaptation through distribution alignment, and image color transfer using OT-based style matching. This work demonstrates how OT serves as a powerful tool bridging theory and practice across multiple fields.
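The entropic regularization and Sinkhorn iterations mentioned in the abstract can be sketched in a few lines. This is a minimal illustration, not the speaker's material; the grid, cost matrix, and regularization strength `eps` are illustrative assumptions:

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=200):
    """Entropic-regularized OT between histograms a and b with cost matrix C."""
    K = np.exp(-C / eps)             # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)            # rescale columns to match marginal b
        u = a / (K @ v)              # rescale rows to match marginal a
    P = u[:, None] * K * v[None, :]  # transport plan
    return P, np.sum(P * C)          # plan and its transport cost

# two Gaussian-like histograms on a 1-D grid
x = np.linspace(0, 1, 50)
a = np.exp(-((x - 0.2) ** 2) / 0.01); a /= a.sum()
b = np.exp(-((x - 0.7) ** 2) / 0.01); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2   # squared-distance cost
P, cost = sinkhorn(a, b, C)
```

Each iteration alternately rescales the rows and columns of the Gibbs kernel so that the marginals of the plan `P` approach `a` and `b`.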
29.05.2025 13:10-16:00 志希070116 Week 15 - Lecture (Thursday)
Today's first talk (50 minutes)
Speaker. Wu, Ju-Yen
Title. A Comparative Overview of Convergence Theories in Real Analysis and Mathematical Statistics, with an Introduction to Statistical and Ideal Convergence
Abstract. Convergence theory is central to both analysis and statistics, with each field offering distinct perspectives. In this talk, we will review and compare major types of convergence in real analysis and mathematical statistics, and will briefly introduce modern extensions, namely statistical convergence and ideal convergence.
05.06.2025 13:10-16:00 志希070116 Week 16 - Lecture (Thursday): talks
Today's first talk (50 minutes)
Speaker. Cheng, Huan-Chang
Title. Foundational random matrix result
Abstract. We introduce some results from random matrix theory within the unified framework of the resolvent and deterministic-equivalent approach. Historical and foundational random matrix results are presented in this framework. The resolvent of a matrix gives access to its spectral measure, to the location of its isolated eigenvalues, and to the statistical behavior of their associated eigenvectors when random, and consequently provides an entry door to the performance analysis of numerous machine learning methods.

Today's second talk (50 minutes)
Speaker. Wu, Ping-Hsun
Title. About Riemann Zeta Function
Abstract. In this talk we will review the definition of the Riemann zeta function and discuss its extension to the whole complex plane via its functional equation. We will also discuss its zeros, which lead to the Riemann Hypothesis.

Today's third talk (50 minutes)
Speaker. Wu, Ju-Yen
Title. The Statement and Proof of the Strong Law of Large Numbers: An Exploration via Etemadi's Approach
Abstract. The Strong Law of Large Numbers (SLLN) states that the sample mean of independent, identically distributed random variables converges almost surely to the expected value. This talk introduces Etemadi’s proof from Rick Durrett’s textbook, which requires only pairwise independence and finite first moment, and briefly explains truncation and subsequence techniques.
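The almost-sure convergence stated by the SLLN can be illustrated numerically along a single sample path. This is a minimal sketch (not part of the talk), assuming i.i.d. Exponential(1) draws, whose mean is 1:

```python
import numpy as np

# Sample means of i.i.d. Exponential(1) draws approach the expectation 1
# along one realization, as the SLLN predicts (almost surely).
rng = np.random.default_rng(0)
draws = rng.exponential(scale=1.0, size=100_000)
running_mean = np.cumsum(draws) / np.arange(1, draws.size + 1)
print(running_mean[999], running_mean[-1])  # both near 1
```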
12.06.2025 13:10-16:00 志希070116 Week 17 - No class
19.06.2025 13:10-16:00 志希070116 Week 18 - Lecture (Thursday): talks
Today's first talk (50 minutes)
Speaker. Chang, Wen-Ling
Title. Graph Theory from an Algorithmic Perspective: Planar Graphs
Abstract. This talk introduces planar graphs in graph theory, exploring their geometric properties and practical applications. We will also review Euler's formula and Kuratowski's theorem, and further examine special cases such as outerplanar graphs. These concepts are not only mathematically significant but also widely applied in circuit design and network planning.
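Euler's formula for connected planar graphs states V - E + F = 2. As a quick sanity check (not part of the talk), the cube graph has 8 vertices, 12 edges, and 6 faces:

```python
# Euler's formula for a connected planar graph: V - E + F = 2.
# The cube graph: 8 vertices, 12 edges, 6 faces (including the outer face).
V, E, F = 8, 12, 6
assert V - E + F == 2
```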

Today's second talk (50 minutes)
Speaker. Fan, Min-Yuan
Title. Wasserstein Distance and KL Regularization
Abstract. This talk explores the mathematical structure and properties of the Wasserstein distance, focusing particularly on why it qualifies as a true metric and how it differs from its squared version. We then derive the closed-form expression for the Wasserstein distance in one dimension using cumulative distribution functions (CDFs), revealing its connection to quantile transport. Finally, we introduce KL regularization within optimal transport, examining its theoretical role in smoothing and computational acceleration, along with its implementation through the Sinkhorn algorithm. The presentation aims to balance theoretical depth with practical insight for an audience familiar with analysis and optimization.
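The one-dimensional closed form mentioned in the abstract has a simple empirical version: for two equal-size samples on the line, the optimal plan pairs the sorted samples (the quantile coupling). A minimal sketch, not the speaker's material; the sample `x` is an illustrative assumption:

```python
import numpy as np

def wasserstein_1d(xs, ys, p=2):
    """p-Wasserstein distance between two equal-size empirical samples in 1-D.

    In one dimension the optimal transport plan matches the sorted samples
    (the quantile coupling), which gives a closed form."""
    xs, ys = np.sort(xs), np.sort(ys)
    return (np.mean(np.abs(xs - ys) ** p)) ** (1.0 / p)

# shifting a sample by a constant c moves it exactly distance c
x = np.array([0.0, 1.0, 2.0, 3.0])
print(wasserstein_1d(x, x + 0.5))  # 0.5
```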

Today's third talk (50 minutes)
Speaker. Wu, Ping-Hsun
Title. TBA
Abstract. TBA

Completion

  • The course can be taken for credit by attending the lectures, returning written solutions in LaTeX (60%), and giving at least two presentations (20% each).