• 深度學習於化工過程 (Deep Learning for Chemical Processes)

    • Artificial Intelligence (AI) has made remarkable progress over the past decade. The long-standing goal of developing intelligent systems that can think and act like humans—only faster and more accurately—is quickly becoming a reality. One of the most transformative advancements in this area is deep learning (DL), which has become a central focus in process systems engineering, attracting significant research and industry attention. DL is a rapidly evolving field, delivering outstanding results in tasks that were once dominated by human expertise.

    • In this course, we will explore the perceptron and other artificial neurons, which form the foundational building blocks of deep neural networks—the driving force behind the deep learning revolution. We will study fully connected feedforward networks and convolutional networks, applying them to solve practical industrial chemical engineering problems, such as handling high-dimensional data or diagnosing process faults. The course will cover essential deep learning components, including perceptrons, deep neural networks (DNNs), recurrent neural networks (RNNs), and popular deep learning frameworks. It will progressively build towards more advanced architectures, such as attention mechanisms, transformer models, and GPT systems.

    • The goals of this course are to:
      • Provide students with advanced knowledge of deep learning techniques and process data analytics.
      • Equip students with problem-solving skills to address complex challenges in chemical processes.
    • The course assumes basic knowledge of chemical engineering principles, numerical methods, probability, statistics, calculus, and optimization. Some computer programming experience is advantageous but not strictly required.
    • (Spring 2025) Class Meeting Times: 13:00-14:30 (or 15:00) (TUE) and 10:00-11:30  (or 12:00) (THU)
    • Office hours:  WED (10:00~12:00)
    • Course Structure: All course-related materials can be downloaded from CYCU's ilearn system.
      • (2-18-2025)
        • To schedule the class meeting
      • (2-20-2025)  
        • An Overview of Deep Learning and Its Applications
      • (2-25-2025)
        • (Mini Course) Learning Python (Part I)

      • (3-4-2025)  
        • (Mini Course) Learning Python (Part II)
        • (Mini Course) Learning Numpy
        • (Mini Course) Learning Matplotlib
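As a taste of what the NumPy mini-course covers, here is a minimal sketch of vectorized array operations (the sine curve used is just an illustrative example, not part of the course materials):

```python
import numpy as np

# Vectorized operations: element-wise math on whole arrays, no explicit loops.
x = np.linspace(0.0, 2.0 * np.pi, 100)   # 100 evenly spaced points on [0, 2*pi]
y = np.sin(x)                            # sin applied element-wise

# Basic array inspection and statistics.
print(y.shape)                           # (100,)
print(float(y.max()))                    # close to 1.0 (peak of the sine)
```

The same `x` and `y` arrays can then be passed to Matplotlib's `plt.plot(x, y)` in the Matplotlib session.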
      • (3-11-2025)
        • The Rosenblatt Perceptron
      • (3-13-2025)
        • The Rosenblatt Perceptron
        • Gradient-Based Learning
        • Pop Assignment 1
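A minimal sketch of the Rosenblatt perceptron learning rule, trained here on a hypothetical toy dataset (logical AND with labels in {-1, +1}), not an example from the course materials:

```python
import numpy as np

def perceptron_train(X, y, epochs=20):
    """Rosenblatt perceptron rule for labels y in {-1, +1}.

    Weights are updated only on misclassified samples: w <- w + y_i * x_i.
    The bias is folded in by appending a constant 1 to each input.
    """
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias input
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xb, y):
            if yi * np.dot(w, xi) <= 0:   # misclassified (or on the boundary)
                w += yi * xi              # Rosenblatt update
                errors += 1
        if errors == 0:                   # converged on linearly separable data
            break
    return w

def perceptron_predict(X, w):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.where(Xb @ w > 0, 1, -1)

# Linearly separable toy data: logical AND with {-1, +1} labels.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w = perceptron_train(X, y)
print(perceptron_predict(X, w))
```

Because the data are linearly separable, the perceptron convergence theorem guarantees that the loop terminates with zero errors.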
      • (3-18-2025)
        • Gradient-Based Learning
      • (3-20-2025)
        • Gradient-Based Learning
        • Backpropagation Learning
      • (3-25-2025)
        • Backpropagation Learning
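The gradient-based learning and backpropagation topics above can be sketched end to end on a tiny network. This is a hand-rolled example (network size, seed, and learning rate are arbitrary choices for illustration) showing the chain rule applied layer by layer to a 2-2-1 sigmoid network trained on XOR:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: the classic problem a single-layer perceptron cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# 2-2-1 network with sigmoid activations.
W1 = rng.normal(0, 1, (2, 2)); b1 = np.zeros((1, 2))
W2 = rng.normal(0, 1, (2, 1)); b2 = np.zeros((1, 1))
eta = 0.5                                    # learning rate

losses = []
for _ in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)                 # hidden layer
    out = sigmoid(h @ W2 + b2)               # output layer
    losses.append(float(np.mean((out - y) ** 2)))
    # Backward pass: chain rule, layer by layer (MSE loss, sigmoid derivative).
    d_out = (out - y) * out * (1 - out)      # error at output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)       # error propagated to hidden layer
    # Gradient-descent updates.
    W2 -= eta * h.T @ d_out; b2 -= eta * d_out.sum(0, keepdims=True)
    W1 -= eta * X.T @ d_h;  b1 -= eta * d_h.sum(0, keepdims=True)

print(losses[0], losses[-1])                 # loss should drop substantially
```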
      • (3-28-2025)
        • Fully Connected Network for Multiclass Classification
        • Frameworks of DL
          • Saturated Neurons
          • Vanishing Gradients
      • (4-1-2025)
        • Fully Connected Network for Regression
        • Regularization
        • Dropout
      • (4-8-2025)
        • Convolutional Neural Networks for Image Classification
        •  AlexNet
        • Translation Invariance
        • Feature Maps
      • (4-10-2025)
        • From Convolutional Layers to Fully Connected Layers
        • Sparse Connections and Weight Sharing
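The sparse-connection and weight-sharing ideas above can be made concrete with a naive 2-D convolution loop. The image and the edge-detecting kernel below are hypothetical toy values, not course data:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 2-D 'valid' cross-correlation (what DL frameworks call convolution).

    The same small kernel slides over the image, so its weights are reused at
    every position (weight sharing), and each output pixel depends only on a
    local patch of the input (sparse connections).
    """
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A horizontal-difference filter applied to an image with a vertical edge.
img = np.zeros((5, 5))
img[:, 3:] = 1.0                      # right half of the image is bright
kernel = np.array([[1.0, -1.0]])      # responds only at the edge
edges = conv2d_valid(img, kernel)
print(edges)                          # nonzero only in the column at the edge
```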
      • (4-22-2025)
        • Image Classification with CNN
        • Deeper CNNs (VGGNet, GoogLeNet, ResNet)
      • (4-24-2025)
        • Special Topics:
          • Model-Based Identification of Continuous-Time RNNs Using Reinforcement Learning for Nonlinear Batch Processes
          • Control-Informed Reinforcement Learning for Nonlinear Feedforward-Feedback Control
      • (4-29-2025)
        • Recurrent Neural Networks: Sequential Prediction Problem
          • Regression
          • Binary Classification
          • Multiclass Classification
        • Limitations of Feedforward Networks
        • Math Representation of a Recurrent Layer
        • Unrolled in Time for a Recurrent Layer
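The math representation of a recurrent layer, h_t = tanh(W_x x_t + W_h h_{t-1} + b), is just a loop over time steps; "unrolling in time" means writing that loop out step by step. A minimal sketch (layer sizes, sequence length, and random inputs are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# One recurrent layer: h_t = tanh(W_x @ x_t + W_h @ h_{t-1} + b).
n_in, n_hidden, T = 3, 4, 5
W_x = rng.normal(0, 0.5, (n_hidden, n_in))
W_h = rng.normal(0, 0.5, (n_hidden, n_hidden))
b = np.zeros(n_hidden)

x_seq = rng.normal(0, 1, (T, n_in))   # a length-T input sequence
h = np.zeros(n_hidden)                # initial state h_0
states = []
for t in range(T):                    # the "unrolled" computation
    h = np.tanh(W_x @ x_seq[t] + W_h @ h + b)   # same weights at every step
    states.append(h.copy())

states = np.array(states)
print(states.shape)                   # (5, 4): one hidden state per time step
```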
      • (5-1-2025)
        • Recurrent Neural Networks: Sequential Prediction Problem
        • Programming Example: Forecasting Book Sales
      • (5-6-2025)
        • Long Short-Term Memory (LSTM): Gradient Health: Vanishing and Exploding Gradient Problems
          • Weight Initialization
          • Batch Normalization
          • Nonsaturating Activation Functions
          • Gradient Clipping
          • Constant Error Carousel (CEC)
          • Skip Connections
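Of the gradient-health remedies listed above, gradient clipping is the simplest to show in code. A sketch of clipping by global norm (the gradient values below are made up to make the arithmetic easy to check):

```python
import numpy as np

def clip_by_global_norm(grads, max_norm):
    """Rescale a list of gradient arrays so their combined L2 norm does not
    exceed max_norm -- a standard remedy for exploding gradients in RNNs."""
    total_norm = np.sqrt(sum(float(np.sum(g ** 2)) for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm        # shrink all gradients uniformly
        grads = [g * scale for g in grads]
    return grads, total_norm

# Toy gradients with global norm sqrt(9 + 16 + 144) = 13.
grads = [np.array([3.0, 4.0]), np.array([12.0])]
clipped, norm = clip_by_global_norm(grads, max_norm=5.0)
print(norm)                                  # 13.0 before clipping
print(np.sqrt(sum(np.sum(g ** 2) for g in clipped)))   # 5.0 after clipping
```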
      • (5-8-2025)
        • LSTM: From CEC to Gated Units
      • (5-13-2025)
        • Presentations for Assignment 2.
      • (5-15-2025)
        • LSTM: A Network of LSTM Cells
        • Two Different Views of LSTM
      • (5-20-2025)
        • LSTM & Highway Networks & Skip Connection
        • Text Autocompletion
        • Longer-term Prediction and Autoregressive Models
      • (5-22-2025)
        • Presentations for Assignment 2. (Make-up)
        • Beam Search
        • LSTM with Beam Search
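Beam search keeps the k best partial sequences at each step, between greedy decoding (k = 1) and exhaustive search. A minimal sketch over a hypothetical next-token table (the probabilities are invented so that beam search finds a better sequence than greedy decoding would):

```python
import math

# Toy next-token model: log-probabilities of the next symbol given the last.
# (Hypothetical numbers chosen so greedy and beam search disagree.)
LOGP = {
    "<s>": {"a": math.log(0.6), "b": math.log(0.4)},
    "a":   {"a": math.log(0.5), "b": math.log(0.5)},
    "b":   {"a": math.log(0.05), "b": math.log(0.95)},
}

def beam_search(start, steps, beam_width):
    """Keep only the beam_width highest-scoring partial sequences per step."""
    beams = [([start], 0.0)]                  # (sequence, total log-prob)
    for _ in range(steps):
        candidates = []
        for seq, score in beams:
            for tok, lp in LOGP[seq[-1]].items():
                candidates.append((seq + [tok], score + lp))
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]       # prune to the beam width
    return beams

best_seq, best_score = beam_search("<s>", steps=2, beam_width=2)[0]
print(best_seq)   # ['<s>', 'b', 'b']: 0.4 * 0.95 beats greedy's 0.6 * 0.5
```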
      • (5-27-2025)
        • Presentations for Assignment 2. (Make-up)
        • Seq-2-Seq Networks
      • (5-29-2025)
        • Presentations for Assignment 2. (Make-up)
        • Neural Language Models
        • Word Embeddings
        • Seq-2-Seq Networks
      • (6-3-2025)
        • Word2Vec
        • GloVe
        • Seq-2-Seq Networks
      • (6-5-2025)
        • Presentations
  • Homework: Python will be used for the programming portions of the assignments. During the first week, a tutorial session will be hosted to jump-start your transition into working in Python.
    • (3-14-2025) Pop Assignment: Steepest Descent Optimization
    • (3-4-2025) HW#1: Practice Python, Numpy and Matplotlib (Due: 3-18-2025)
    • (4-8-2025) HW#2: Process Fault Diagnosis and Efficiency Estimation for ORC (Due: 5-1-2025)
    • (5-1-2025) HW#3: Wafer Map Defect Classification Using Convolutional Neural Networks (Due: 5-29-2025)
    • (5-?-2025) HW#4:  Evaluating the Performance of Different Operational Strategies for Start-up Processes (Due: 6-??-2025)
  • Grading Distribution
    • No quizzes or exams will be given. Homework will be assigned periodically; each assignment will consist of applying one of the techniques or design methods presented in the course to a problem chosen by the student. Analysis and simulation will be expected. We hope these applications spark your imagination and lead to new applications of these techniques in chemical engineering problems or other fields.
    • Homework (four to five assignments): 60%
    • Final Assignment with Official Presentation: 20%
    • Class activity (Q&A and presentation): 20%
    • Note that late assignments will not be accepted unless a legitimate reason (illness, religious observance, etc.) exists and is discussed with the instructor.
  • Grades (Submission Records)