Differential Equations 1 (751873001) 微分方程式 (一)
General Information
- Duration: September 9, 2024 ~ January 10, 2025
- Instructor: Pu-Zhao Kow
- Email: pzkow [at] g.nccu.edu.tw
- Office hour: Thursday (16:10 ~ 17:00)
- Teaching Language: Chinese and English
- Lecture Note: Pu-Zhao Kow, An introduction to ordinary differential equations. Note: The lecture note may be updated during the course.
References
- W. E. Boyce and R. C. DiPrima, Elementary differential equations and boundary value problems, John Wiley and Sons, Inc., Hoboken, NJ, 12th edition, 2022. MR0179403, Zbl:1492.34001
- P.-F. Hsieh and Y. Sibuya, Basic theory of ordinary differential equations, Universitext, Springer-Verlag, New York, 1999. MR1697415, doi:10.1007/978-1-4612-1506-6
Prerequisite
- Real and complex analysis
Homeworks
- Homework 1: due September 26, 2024 (Thursday), 23:59
- Homework 2: due October 10, 2024 (Thursday), 23:59
- Homework 3: due October 31, 2024 (Thursday), 23:59
- Homework 4: due November 7, 2024 (Thursday), 23:59
- Homework 5: due December 12, 2024 (Thursday), 23:59
Schedule
- The lectures are on Thursday (13:10-16:00) at 志希070116.
Time | Room | Activities |
---|---|---|
12.09.2024 13:10-16:00 | 志希070116 | Week 1 - Lecture (Thursday): Well-posedness of ODE |
19.09.2024 13:10-16:00 | 志希070116 | Week 2 - No class |
26.09.2024 13:10-16:00 | 志希070116 | Week 3 - Lecture (Thursday): First order linear PDE [Return Homework 1 by September 26, 2024 (Thursday) 23:59] |
3.10.2024 13:10-16:00 | 志希070116 | Week 4 - No class due to Typhoon Krathon |
10.10.2024 13:10-16:00 | 志希070116 | Week 5 - No class (Public holiday) [Return Homework 2 by October 10, 2024 (Thursday) 23:59] |
17.10.2024 13:10-16:00 | 志希070116 | Week 6 - Lecture (Thursday): First order quasilinear PDE, homogeneous linear ODE with constant coefficients |
24.10.2024 13:10-16:00 | 志希070116 | Week 7 - Lecture (Thursday): Computations of the exponential, the matrix logarithm |
31.10.2024 13:10-16:00 | 志希070116 | Week 8 - No class due to Typhoon Kong-rey [Return Homework 3 by October 31, 2024 (Thursday) 23:59] |
7.11.2024 13:10-16:00 | 志希070116 | Week 9 - Lecture (Thursday): matrix Lie group and Lie algebra [Return Homework 4 by November 7, 2024 (Thursday) 23:59] |
  - First talk (50 minutes). Speaker: Lin, Po-Yi. Title: Theory of Transformer. Abstract: The dominant sequence transduction models are based on complex recurrent or convolutional neural networks that include an encoder and a decoder. The best performing models also connect the encoder and decoder through an attention mechanism. We propose a new simple network architecture, the Transformer, based solely on attention mechanisms, dispensing with recurrence and convolutions entirely. Experiments on two machine translation tasks show these models to be superior in quality while being more parallelizable and requiring significantly less time to train. Our model achieves 28.4 BLEU on the WMT 2014 English-to-German translation task, improving over the existing best results, including ensembles, by over 2 BLEU. On the WMT 2014 English-to-French translation task, our model establishes a new single-model state-of-the-art BLEU score of 41.0 after training for 3.5 days on eight GPUs, a small fraction of the training costs of the best models from the literature.
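For readers unfamiliar with the attention mechanism mentioned in the abstract above, here is a minimal sketch of scaled dot-product attention in plain Python. It is a generic textbook illustration (toy 2-dimensional vectors), not material from the talk itself.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.

    Q, K, V are lists of d-dimensional vectors (lists of floats).
    """
    d = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query against every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        weights = softmax(scores)
        # Output is the weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# One query attending over two key/value pairs.
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
result = attention(Q, K, V)
```

The query is more similar to the first key, so the output leans toward the first value vector but still mixes in the second; this soft weighting is what distinguishes attention from a hard lookup.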
14.11.2024 13:10-16:00 | 志希070116 | Week 10 - Lecture (Thursday): Homogeneous ODE with variable coefficients |
  - First talk (50 minutes). Speaker: Cheng, Huan-Chang. Title: An introduction to the Simplex Method. Abstract: Linear programming problems are applied in many fields, such as finance, transportation, management, and agriculture. In this talk, we will provide a brief overview of some linear programming theory and explain the design principles of the simplex method, an algorithm used to solve linear programming problems.
21.11.2024 13:10-16:00 | 志希070116 | Week 11 - Lecture (Thursday): nonhomogeneous equations, higher order linear ODE |
28.11.2024 13:10-16:00 | 志希070116 | Week 12 - Lecture (Thursday) |
5.12.2024 13:10-16:00 | 志希070116 | Week 13 - Lecture (Thursday) |
  - First talk (50 minutes). Speaker: Fan, Min-Yuan. Title: An introduction to divergence and curl. Abstract: We will introduce the definitions of divergence and curl and discuss their physical significance and some of their properties.
  - Second talk (50 minutes). Speaker: Chang, Wen-Ling. Title: An introduction to Natural Language Processing (NLP) and its application: using AI to continue writing the Dream of the Red Chamber (紅樓夢). Abstract: We will introduce the main concepts of NLP and its key technologies, together with an application to the novel "Dream of the Red Chamber" via Natural Language Understanding (NLU) and Natural Language Generation (NLG). We will exhibit how an AI learns the literary style of the novel and extends it using NLP technology.
  - Third talk (50 minutes). Speaker: Wu, Ju-Yen. Title: An introduction to the Monte Carlo method and its applications. Abstract: The Monte Carlo method is a numerical computation approach, grounded in probability and statistics, that solves complex problems through random number generation. In this talk, we will explore the core principles of the method, its advantages and limitations, as well as its extended algorithms and applications.
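As a concrete illustration of the random-sampling idea described in the Monte Carlo abstract above (a standard textbook example, not material from the talk), the following Python sketch estimates π by sampling points in the unit square:

```python
import random

def estimate_pi(n, seed=0):
    """Estimate pi by sampling n points uniformly in the unit square.

    The fraction of points falling inside the quarter disc x^2 + y^2 <= 1
    approximates pi/4; the error shrinks like O(1/sqrt(n)).
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n

approx = estimate_pi(100_000)
```

With 100,000 samples the estimate is typically within a few hundredths of π, which illustrates both the method's simplicity and its slow convergence.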
12.12.2024 13:10-16:00 | 志希070116 | Week 14 - No class [Return Homework 5 by December 12, 2024 (Thursday) 23:59] |
19.12.2024 13:10-16:00 | 志希070116 | Week 15 - Lecture (Thursday) |
  - First talk (50 minutes). Speaker: Chang, Wen-Ling. Title: An introduction to the Kalman Filter. Abstract: The Kalman Filter is an algorithm based on linear algebra and probability, widely used to estimate the "state" of dynamical systems. It gives the best estimate of the state (while reducing noise) based on data from sensors and mathematical models. We will introduce the mathematical principle and model of the Kalman Filter from the probability point of view and explain its applications.
  - Second talk (50 minutes). Speaker: Lin, Po-Yi. Title: Brief Introduction of Mamba. Abstract: Foundation models, now powering most of the exciting applications in deep learning, are almost universally based on the Transformer architecture and its core attention module. Many subquadratic-time architectures such as linear attention, gated convolution and recurrent models, and structured state space models (SSMs) have been developed to address Transformers’ computational inefficiency on long sequences, but they have not performed as well as attention on important modalities such as language. We identify that a key weakness of such models is their inability to perform content-based reasoning, and make several improvements. First, simply letting the SSM parameters be functions of the input addresses their weakness with discrete modalities, allowing the model to selectively propagate or forget information along the sequence length dimension depending on the current token. Second, even though this change prevents the use of efficient convolutions, we design a hardware-aware parallel algorithm in recurrent mode. We integrate these selective SSMs into a simplified end-to-end neural network architecture without attention or even MLP blocks (Mamba). Mamba enjoys fast inference (5x higher throughput than Transformers) and linear scaling in sequence length, and its performance improves on real data up to million-length sequences. As a general sequence model backbone, Mamba achieves state-of-the-art performance across several modalities such as language, audio, and genomics. On language modeling, our Mamba-3B model outperforms Transformers of the same size and matches Transformers twice its size, both in pretraining and downstream evaluation.
  - Third talk (50 minutes). Speaker: Wu, Ping-Hsun. Title: The Gamma function and its extension to complex numbers. Abstract: The Gamma function is usually known as the extension of the factorial; it is also applied to computing integrals and arises throughout probability theory and statistics, although most of its familiar uses involve only real numbers. We will introduce how the Gamma function is obtained and some of its properties, and use the method of analytic continuation to show how it can be extended to the complex numbers.
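To make the Kalman Filter abstract above more concrete, here is a minimal one-dimensional sketch in Python. It is a standard textbook formulation (constant scalar state, noisy measurements), not code from the talk; the noise variances `q` and `r` are illustrative choices.

```python
def kalman_1d(measurements, q=1e-5, r=0.01, x0=0.0, p0=1.0):
    """One-dimensional Kalman filter for a constant scalar state.

    Model: x_k = x_{k-1} + process noise (variance q),
           z_k = x_k + measurement noise (variance r).
    Returns the list of state estimates after each update.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: the state is unchanged, but uncertainty grows by q.
        p = p + q
        # Update: blend prediction and measurement via the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Noisy readings of a true value near 1.0.
zs = [0.9, 1.1, 1.0, 1.2, 0.8, 1.05, 0.95, 1.0]
est = kalman_1d(zs)
```

Because the gain shrinks as the estimate becomes more certain, later measurements move the estimate less, and the final estimate settles close to the underlying value.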
26.12.2024 13:10-16:00 | 志希070116 | Week 16 - Lecture (Thursday) |
  - First talk (50 minutes). Speaker: Wu, Ju-Yen. Title: An introduction to Bayesian Networks and Bayesian Classification: Theory and Applications. Abstract: Bayesian networks and Bayesian classification, both grounded in Bayes' theorem, are essential tools for handling uncertainty and classification tasks. Bayesian networks reveal causal relationships among data, and Bayesian classifiers predict data categories; both are widely applied in fields such as machine learning, natural language processing, and spam filtering. In this introduction, we will cover Bayes' theorem, Bayesian networks, various types of Bayesian classifiers, and practical examples shown in Python using Google Colab.
  - Second talk (50 minutes). Speaker: Cheng, Huan-Chang. Title: An introduction to quantum computing. Abstract: Quantum computing is a revolutionary field of computing that leverages the principles of quantum mechanics to process information in ways that classical computers cannot. Unlike traditional computers, which use bits to represent data as either 0 or 1, quantum computers use quantum bits or "qubits". Qubits have a unique property called superposition, which allows them to exist in multiple states simultaneously, rather than just being 0 or 1. Quantum computing holds great potential for fields such as cryptography, drug discovery, artificial intelligence, and material science. However, it is still in its early stages, with many challenges to overcome, such as error correction and maintaining the stability of qubits. Nonetheless, the field is progressing rapidly and has the potential to transform industries by solving problems that are currently intractable for classical computers. In this talk, we will introduce quantum states (qubits), quantum operators (quantum gates) and some quantum algorithms.
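As a small illustration of the superposition idea in the quantum computing abstract above (a standard toy calculation, not material from the talk), the following Python sketch applies a Hadamard gate to the basis state |0⟩ and computes the measurement probabilities:

```python
import math

def apply_gate(gate, state):
    # Multiply a 2x2 gate matrix by a 2-component state vector.
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

def probabilities(state):
    # Born rule: the probability of outcome i is |amplitude_i|^2.
    return [abs(a) ** 2 for a in state]

# Hadamard gate: maps |0> to the equal superposition (|0> + |1>) / sqrt(2).
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

ket0 = [1.0, 0.0]                  # the basis state |0>
superposed = apply_gate(H, ket0)   # equal superposition of |0> and |1>
probs = probabilities(superposed)  # both outcomes equally likely

# Applying H twice returns |0>: the Hadamard gate is its own inverse.
back = apply_gate(H, superposed)
```

After one Hadamard gate, each measurement outcome has probability 1/2; applying the gate again recovers |0⟩ exactly, a simple instance of the reversibility of quantum gates.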
2.1.2025 13:10-16:00 | 志希070116 | Week 17 - Lecture (Thursday) |
  - First talk (50 minutes). Speaker: Fan, Min-Yuan. Title: Cauchy residue theorem and the Egorychev method. Abstract: In this talk, we discuss the Cauchy residue theorem and its application to combinatorial problems by evaluating complex integrals, an approach known as the Egorychev method.
  - Second talk (50 minutes). Speaker: Wu, Ping-Hsun. Title: Introduction to Liouville's Theorem and its Application to Antiderivatives. Abstract: From Calculus we have learned that the antiderivatives of many elementary functions cannot themselves be expressed as elementary functions. Liouville's theorem describes properties that a function must have if its antiderivative can be expressed in elementary terms. In this talk, we will state Liouville's theorem and show how it can be used to check whether a function's antiderivative is expressible as an elementary function.
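As a worked illustration of the coefficient-extraction idea behind the Egorychev method mentioned above (a standard identity, not material from the talk), one can write binomial coefficients as contour integrals via Cauchy's formula and read off Vandermonde-type identities:

```latex
% Binomial coefficient as a coefficient, hence as a contour integral:
\binom{n}{k} \;=\; [z^{k}]\,(1+z)^{n}
  \;=\; \frac{1}{2\pi i}\oint_{|z|=1} \frac{(1+z)^{n}}{z^{k+1}}\,dz .

% Example: using the symmetry \binom{n}{k} = \binom{n}{n-k},
\sum_{k=0}^{n} \binom{n}{k}^{2}
  \;=\; \sum_{k=0}^{n} \binom{n}{k}\binom{n}{n-k}
  \;=\; [z^{n}]\,(1+z)^{n}(1+z)^{n}
  \;=\; [z^{n}]\,(1+z)^{2n}
  \;=\; \binom{2n}{n}.
```

The sum over k becomes the coefficient of z^n in a product of generating functions, which the residue theorem evaluates in one step.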
Completion
- The course can be taken for credit by attending the lectures, submitting written solutions (60%) in LaTeX, and giving at least 2 presentations (20% each).