Thursday, April 9, 2026

EX#7 Mathematical Optimization

In-class Exercise

Deadline: Next Saturday at 23:59 (one more week)

Email all of your share links to chang212@gmail.com with the subject line EX#7 [your id, your name].

Optimization in Hyperspace — Homework Assignments

md version

Based on the "Optimization in Hyperspace" artifact — which visualizes how algorithms like Gradient Descent, Nelder-Mead, A* Search, Simulated Annealing, and Global Optimization navigate a 3D fitness landscape full of local peaks and valleys — here are 3 homework assignments at different levels:


Homework Assignment 1 — Conceptual Understanding

Title: Reading the Fitness Landscape

Objective: Connect visual intuition to algorithmic concepts.

Instructions: Using the "Optimization in Hyperspace" visualizer, observe how each of the five algorithms (Gradient Descent, Nelder-Mead, A* Search, Simulated Annealing, and Global Optimization) moves across the 3D fitness landscape. Then write a short reflection (300–500 words) answering the following:

  1. Describe the fitness landscape in your own words. What do the peaks represent? What do the valleys represent? In a real optimization problem, what might each correspond to?
  2. Which algorithms appear to get "stuck" at a local optimum (a smaller peak that is not the tallest)? Explain why this happens based on how those algorithms work.
  3. Which algorithms eventually find the global optimum (the tallest peak)? What property of those algorithms allows them to escape local optima?
  4. Define the exploration vs. exploitation trade-off in optimization. For each algorithm in the visualizer, classify it as leaning toward exploration, exploitation, or a balance of both — and justify your answer in one sentence each.

Deliverable: Written reflection (300–500 words), submitted as a PDF.


Homework Assignment 2 — Mathematical Analysis

Title: Gradient Descent in Your Own Hands

Objective: Implement and analyze gradient descent on a multimodal function.

Instructions: Consider the 2D function:

f(x, y) = sin(x) cos(y) + 0.1 (x² + y²)

This function has multiple local maxima and minima, similar to the landscape in the visualizer.

  1. Plot the function surface over the domain x ∈ [−5, 5], y ∈ [−5, 5] using Python (matplotlib) or any tool of your choice.
  2. Implement gradient descent (minimization) from scratch using the update rule:
xₜ₊₁ = xₜ − α∇f(xₜ, yₜ)

Test at least three different starting points and three different learning rates (α = 0.01, 0.1, 0.5).

  3. Record for each run: the starting point, the final converged point, the function value at convergence, and whether it found a global or local minimum.
  4. Answer: How does the choice of starting point affect which minimum is found? Relate this back to what you observed in the 3D visualizer — why is gradient descent susceptible to the landscape's topology?
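The steps above might start from a sketch like this (the surface plot from step 1 is omitted). The gradient is the analytic one for the given f; the learning rates are the ones specified, while the starting points are arbitrary examples, not required values:

```python
import numpy as np

# f(x, y) = sin(x) cos(y) + 0.1 (x^2 + y^2), the assignment's test function
def f(p):
    x, y = p
    return np.sin(x) * np.cos(y) + 0.1 * (x**2 + y**2)

def grad_f(p):
    # Analytic gradient: [cos(x)cos(y) + 0.2x, -sin(x)sin(y) + 0.2y]
    x, y = p
    return np.array([np.cos(x) * np.cos(y) + 0.2 * x,
                     -np.sin(x) * np.sin(y) + 0.2 * y])

def gradient_descent(p0, alpha, steps=2000, tol=1e-8):
    """Plain gradient descent; stops early once the update becomes tiny."""
    p = np.asarray(p0, dtype=float)
    for _ in range(steps):
        step = alpha * grad_f(p)
        p = p - step
        if np.linalg.norm(step) < tol:
            break
    return p

# Sweep: record where each (start, learning rate) combination lands
for p0 in [(-4.0, -4.0), (0.5, 0.5), (3.0, -3.0)]:
    for alpha in (0.01, 0.1, 0.5):
        p = gradient_descent(p0, alpha)
        print(f"start={p0}, alpha={alpha}: end={np.round(p, 3)}, f={f(p):.4f}")
```

Different starting points flow into different basins, which is exactly the topology sensitivity the visualizer shows.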

Deliverable: Code + a short written report (1 page) comparing your runs and drawing conclusions.





Homework Assignment 3 — Comparative Algorithm Design

Title: When Does Simulated Annealing Beat Gradient Descent?

Objective: Empirically compare a local vs. global optimization strategy.

Instructions: Using Python (or pseudocode + written analysis), design an experiment that compares Gradient Descent and Simulated Annealing on a fitness landscape of your choice (you may use the function from HW2 or define your own multimodal function with at least 3 local optima).

  1. Implement both algorithms. For Simulated Annealing, use an exponential cooling schedule:
T(t) = T₀ · γᵗ

where T₀ is the initial temperature and γ ∈ (0, 1) is the cooling rate.

  2. Run each algorithm 50 times from random starting points uniformly sampled from the domain. Record what fraction of runs find the global optimum for each algorithm.
  3. Vary the SA temperature schedule — test at least two values each for T₀ (high vs. low initial temperature) and γ (fast vs. slow cooling). How does the temperature schedule affect the balance between exploration and exploitation? Which setting performs best?
  4. Create a summary table and at least one plot (e.g., success rate vs. cooling rate) comparing the algorithms.
  5. Conclude: In what problem scenarios would you recommend Simulated Annealing over Gradient Descent? Are there situations where the reverse is true? Use your results as evidence.
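An SA loop with the exponential cooling schedule above can be sketched as follows. The proposal width, T₀, γ, and step count are illustrative defaults (these are exactly the knobs the experiment asks you to vary), and the example objective reuses the HW2-style function:

```python
import math, random

def simulated_annealing(f, bounds, T0=2.0, gamma=0.99, steps=2000, seed=0):
    """Minimize f over a square box via SA with exponential cooling
    T(t) = T0 * gamma^t, Gaussian proposals, and Metropolis acceptance."""
    rng = random.Random(seed)
    lo, hi = bounds
    x = [rng.uniform(lo, hi), rng.uniform(lo, hi)]
    fx = f(x)
    best, fbest = list(x), fx
    T = T0
    for _ in range(steps):
        # Propose a nearby point, clamped to the box
        cand = [min(hi, max(lo, xi + rng.gauss(0.0, 0.5))) for xi in x]
        fc = f(cand)
        # Accept improvements always; accept worse moves with prob exp(-delta/T)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(T, 1e-300)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        T *= gamma  # exponential cooling
    return best, fbest

# Example run on the HW2-style multimodal function
obj = lambda p: math.sin(p[0]) * math.cos(p[1]) + 0.1 * (p[0]**2 + p[1]**2)
best, fbest = simulated_annealing(obj, (-5.0, 5.0))
print(best, fbest)
```

High T₀ and slow cooling keep the Metropolis acceptance generous for longer (more exploration); low T₀ or fast cooling makes the run behave like greedy descent almost immediately (more exploitation).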

Deliverable: Code + a 1–2 page written analysis with your table and plot included.





These three assignments build progressively — from conceptual understanding, to mathematical implementation, to comparative experimental design — directly grounded in what the "Optimization in Hyperspace" visualizer demonstrates. 

HW#7 Analog IC Design

 

In-class Exercise

Deadline: This Saturday at 23:59

Email all of your share links to chang212@gmail.com with the subject line HW#7 [your id, your name].

Choose any one problem.



1. (a) A BJT Differential Pair IC Die (share) is incorrectly designed. Fix the die.

(b) Make a schematic

(c) Do parameter optimization via SA (simulated annealing)

(d) Placement & Routing for the die

For example, you may use Quadratic Placement & ILP+PathFinder+A* Routing.
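PathFinder-style detailed routers typically use A* as the inner shortest-path search. As a minimal, self-contained illustration (the grid, unit costs, and obstacle layout are hypothetical, not from the assignment), here is A* on a routing grid with an admissible Manhattan-distance heuristic:

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 0/1 grid (1 = blocked), 4-neighbour unit-cost moves."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_heap = [(h(start), 0, start)]   # entries: (f = g + h, g, node)
    g = {start: 0}
    parent = {}
    while open_heap:
        _, cost, cur = heapq.heappop(open_heap)
        if cur == goal:
            # Reconstruct the path by walking parents back to the start
            path = [cur]
            while cur in parent:
                cur = parent[cur]
                path.append(cur)
            return path[::-1]
        if cost > g.get(cur, float("inf")):
            continue  # stale heap entry
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = cost + 1
                if ng < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = ng
                    parent[(nr, nc)] = cur
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None  # no route exists

# Example: route around a vertical blockage
grid = [[0] * 5 for _ in range(5)]
for r in range(4):
    grid[r][2] = 1
path = astar(grid, (0, 0), (4, 4))
print(path)
```

A real router would add per-net congestion costs (PathFinder's negotiated rip-up-and-reroute) on top of this inner search.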




2. Design a 2-stage diff pair (share, starting from the very simple diff pair)

(a) Make a schematic.

(b) Draw a silicon die, considering Miller Compensation.

(c) Do P&R (Placement & Routing) for the IC
See the example of P&R+opt compo.

Monday, April 6, 2026

HW#6 PLL

Basic instructions for this assignment

Advanced instructions


 1. (a) Build a 9.0 GHz 65 nm Fractional-N PLL Synthesizer with NM Optimization 


(b) Cross-check your PLL by MNA (Modified Nodal Analysis)
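Assuming MNA here means Modified Nodal Analysis, a minimal sketch of the idea on a toy circuit (a 5 V source driving a 1 kΩ / 2 kΩ divider, not your PLL): stamp the conductances and the voltage-source constraint into one linear system and solve for the node voltages plus the source branch current.

```python
import numpy as np

# MNA unknowns: node voltages v1, v2 and the voltage-source current i_v
R1, R2, V = 1e3, 2e3, 5.0
A = np.array([
    [ 1/R1, -1/R1,        1.0],  # KCL at node 1 (includes source branch current)
    [-1/R1,  1/R1 + 1/R2, 0.0],  # KCL at node 2
    [ 1.0,   0.0,         0.0],  # source constraint: v1 = V
])
b = np.array([0.0, 0.0, V])
v1, v2, i_v = np.linalg.solve(A, b)
print(v1, v2, i_v)
```

The same stamping pattern extends to capacitors and inductors (complex admittances at each frequency), which is how a loop-filter transfer function can be cross-checked numerically.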

Hint: What is the NM (Nelder-Mead) method? NM Algo
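NM here is presumably the Nelder-Mead simplex search from EX#7. A compact textbook-style sketch follows; the reflection/expansion/contraction/shrink coefficients are the standard choices, and the quadratic objective in the example is a stand-in for your actual PLL cost function:

```python
def nelder_mead(f, x0, step=0.5, max_iter=500, tol=1e-8):
    """Derivative-free minimization via the Nelder-Mead simplex method."""
    alpha, gamma, rho, sigma = 1.0, 2.0, 0.5, 0.5  # standard coefficients
    n = len(x0)
    # Initial simplex: x0 plus one point offset along each axis
    simplex = [list(x0)]
    for i in range(n):
        p = list(x0); p[i] += step
        simplex.append(p)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        # Centroid of all points except the worst
        centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [centroid[i] + alpha * (centroid[i] - worst[i]) for i in range(n)]
        if f(refl) < f(best):
            # Try to expand further in the promising direction
            exp = [centroid[i] + gamma * (refl[i] - centroid[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            # Contract toward the better of reflected/worst point
            toward = refl if f(refl) < f(worst) else worst
            contr = [centroid[i] + rho * (toward[i] - centroid[i]) for i in range(n)]
            if f(contr) < min(f(refl), f(worst)):
                simplex[-1] = contr
            else:
                # Shrink the whole simplex toward the best vertex
                simplex = [best] + [[best[i] + sigma * (p[i] - best[i])
                                     for i in range(n)] for p in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]

# Example: minimize a toy quadratic (stand-in for a PLL cost function)
best = nelder_mead(lambda p: (p[0] - 2)**2 + (p[1] + 3)**2, [0.0, 0.0])
print(best)
```

Because it needs only function evaluations (no gradients), NM suits simulator-in-the-loop parameter tuning where the cost comes from a circuit simulation.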


2.

(a) Build an Apple C1 ADPLL — 7.0 GHz (spec) with A* optimization, in TSMC N7
