
Unseen Opportunities: Upwork + GPT

Summer 2022. I quit my lucrative AI job not out of dissatisfaction, but out of exhilaration. The past couple of years had been brimming with happiness, which sparked an awakening within me: if I did not leap into the unknown now, doing so might eventually become difficult or even impossible. Within two hours of this epiphany, I sent my company my two weeks' notice.

July 8, 2023
Why Does Batch Normalization Work?

Batch Normalization is often explained as a method for reducing “Internal Covariate Shift,” but growing evidence points instead to its ability to smooth the optimization landscape and make training more robust. Through interactive demos and reproducible experiments, this overview shows how Batch Normalization enhances convergence, remains effective even under artificially introduced covariate shifts, and reduces dependence on initialization—ultimately speeding up and stabilizing neural network training.
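The normalization step itself is simple: each feature is standardized over the batch, then scaled and shifted by learnable parameters. A minimal NumPy sketch (scalar `gamma`/`beta` chosen here for brevity; in practice they are per-feature learnable vectors):

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize each feature over the batch dimension, then scale and shift.

    x: array of shape (batch, features). gamma/beta stand in for the
    learnable scale and shift parameters.
    """
    mean = x.mean(axis=0)                     # per-feature mean over the batch
    var = x.var(axis=0)                       # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)   # zero mean, unit variance
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 8))  # shifted, scaled activations
y = batch_norm(x)
print(y.mean(axis=0))  # each entry ~0
print(y.std(axis=0))   # each entry ~1
```

Whatever the distribution of the incoming activations, the output of each feature is centered and unit-scaled, which is the mechanism the post's experiments probe.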

January 15, 2019