Review of Multivariate Calculus (for Optimization & Machine Learning)
1. Functions of Several Variables
A multivariate function maps vectors to scalars:

$$f : \mathbb{R}^n \to \mathbb{R}, \qquad f(x_1, x_2, \dots, x_n)$$
Examples
- $f(x, y) = x^2 + y^2$
- $f(x, y) = e^x \sin y$
- $f(x, y, z) = xyz$
Loss functions in ML are multivariate functions of the model parameters, e.g. mean squared error:

$$L(\mathbf{w}) = \frac{1}{n} \sum_{i=1}^{n} \left( y_i - \mathbf{w}^\top \mathbf{x}_i \right)^2$$
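As a minimal sketch of this idea, the mean squared error of a linear model can be written as an ordinary Python function of the weight vector; the two-point dataset here is an illustrative assumption, not from the text:

```python
# MSE viewed as a multivariate function L : R^2 -> R of the weights.
# The dataset and linear model below are illustrative assumptions.

def mse_loss(w, data):
    """L(w) = (1/n) * sum_i (y_i - w . x_i)^2."""
    n = len(data)
    total = 0.0
    for x, y in data:
        pred = w[0] * x[0] + w[1] * x[1]
        total += (y - pred) ** 2
    return total / n

data = [((1.0, 0.0), 2.0), ((0.0, 1.0), 3.0)]
print(mse_loss((2.0, 3.0), data))  # exact weights give zero loss
print(mse_loss((0.0, 0.0), data))  # nonzero loss away from the optimum
```

Training a model amounts to minimizing exactly this kind of scalar-valued function over its vector of parameters.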
2. Limits and Continuity (Brief Review)
- The limit $\lim_{\mathbf{x} \to \mathbf{a}} f(\mathbf{x}) = L$ requires $f(\mathbf{x})$ to approach $L$ along every path to $\mathbf{a}$.

A function is continuous at $\mathbf{a}$ if:

$$\lim_{\mathbf{x} \to \mathbf{a}} f(\mathbf{x}) = f(\mathbf{a})$$
📌 In multiple dimensions, limits must be independent of path.
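The classic counterexample $f(x, y) = \dfrac{xy}{x^2 + y^2}$ makes the path-dependence point concrete; this function is a standard illustration, assumed here rather than taken from the text:

```python
# f(x, y) = x*y / (x**2 + y**2) has no limit at the origin:
# its value along the line y = m*x depends on the slope m.

def f(x, y):
    return x * y / (x ** 2 + y ** 2)

t = 1e-8  # approach the origin along two different lines
along_y_eq_x = f(t, t)     # along y = x the value is always 1/2
along_y_eq_0 = f(t, 0.0)   # along y = 0 the value is always 0
print(along_y_eq_x, along_y_eq_0)
```

Since the two paths disagree, no single limit exists at the origin, so $f$ cannot be made continuous there.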
3. Partial Derivatives
Definition

$$\frac{\partial f}{\partial x_i} = \lim_{h \to 0} \frac{f(x_1, \dots, x_i + h, \dots, x_n) - f(x_1, \dots, x_i, \dots, x_n)}{h}$$

The partial derivative measures the rate of change with respect to one variable while holding the others constant.
Example

For $f(x, y) = x^2 y$:

$$\frac{\partial f}{\partial x} = 2xy, \qquad \frac{\partial f}{\partial y} = x^2$$
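Partial derivatives can be checked numerically with central finite differences, which perturb one coordinate while holding the rest fixed; the test function $f(x, y) = x^2 y$ is an illustrative choice:

```python
# Central finite differences approximate df/dx_i while holding
# the other coordinates constant.

def partial(f, x, i, h=1e-6):
    xp = list(x); xp[i] += h
    xm = list(x); xm[i] -= h
    return (f(xp) - f(xm)) / (2 * h)

def f(v):            # illustrative function f(x, y) = x**2 * y
    x, y = v
    return x ** 2 * y

print(partial(f, [2.0, 3.0], 0))  # df/dx = 2*x*y = 12, approximately
print(partial(f, [2.0, 3.0], 1))  # df/dy = x**2  = 4,  approximately
```

This numerical check is also how analytic gradients are routinely validated in ML codebases.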
4. Gradient Vector
Definition

$$\nabla f(\mathbf{x}) = \left( \frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \dots, \frac{\partial f}{\partial x_n} \right)^\top$$
Properties
- Direction of steepest ascent
- Orthogonal to level curves
- Used in gradient descent
Example

For $f(x, y) = x^2 + y^2$:

$$\nabla f(x, y) = (2x, \; 2y)$$
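A minimal gradient descent sketch on $f(x, y) = x^2 + y^2$, whose gradient $(2x, 2y)$ can be coded directly; the step size and iteration count are illustrative assumptions:

```python
# Plain gradient descent on f(x, y) = x**2 + y**2: each step moves
# opposite the gradient (2x, 2y), converging to the minimizer (0, 0).

def grad(v):
    x, y = v
    return (2 * x, 2 * y)

x = [5.0, -3.0]   # arbitrary starting point
lr = 0.1          # learning rate (assumed)
for _ in range(100):
    g = grad(x)
    x = [x[0] - lr * g[0], x[1] - lr * g[1]]

print(x)  # very close to the minimizer (0, 0)
```

Each coordinate shrinks by the factor $(1 - 2\eta)$ per step, which is why convergence here is geometric.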
5. Directional Derivative
Definition
Rate of change of $f$ in the direction of a unit vector $\mathbf{u}$:

$$D_{\mathbf{u}} f(\mathbf{x}) = \nabla f(\mathbf{x}) \cdot \mathbf{u}$$
📌 Maximum directional derivative occurs in the gradient direction.
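The dot-product formula and the maximizing direction can be checked in a few lines; the gradient value $(3, 4)$ is an assumed example:

```python
# D_u f(x) = grad f(x) . u for a unit vector u; the maximum over all
# unit vectors is attained when u points along the gradient.
import math

def directional(gradient, u):
    return sum(g * ui for g, ui in zip(gradient, u))

g = (3.0, 4.0)               # gradient at some point (assumed example)
u = (1.0, 0.0)               # unit vector along the x-axis
print(directional(g, u))     # 3.0

norm = math.hypot(*g)                 # ||grad f|| = 5
u_star = (g[0] / norm, g[1] / norm)   # unit vector along the gradient
print(directional(g, u_star))         # equals ||grad f||, the maximum
```

The maximum value equals $\|\nabla f\|$, which is exactly the 📌 note above in numerical form.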
6. Hessian Matrix (Second Derivatives)
Definition

$$H(f)_{ij} = \frac{\partial^2 f}{\partial x_i \, \partial x_j}, \qquad H(f) \in \mathbb{R}^{n \times n}$$
Importance
- Determines curvature
- Used in Newton's method
- Classifies critical points
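Like the gradient, the Hessian can be approximated by finite differences; the quadratic test function $f(x, y) = x^2 + 3xy$, whose exact Hessian is $\begin{bmatrix}2 & 3\\ 3 & 0\end{bmatrix}$, is an illustrative assumption:

```python
# Finite-difference Hessian: each entry H[i][j] approximates the
# mixed second partial d^2 f / (dx_i dx_j).

def f(v):                      # illustrative: f(x, y) = x**2 + 3*x*y
    x, y = v
    return x ** 2 + 3 * x * y

def hessian(f, x, h=1e-4):
    n = len(x)
    H = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            xpp = list(x); xpp[i] += h; xpp[j] += h
            xpm = list(x); xpm[i] += h; xpm[j] -= h
            xmp = list(x); xmp[i] -= h; xmp[j] += h
            xmm = list(x); xmm[i] -= h; xmm[j] -= h
            H[i][j] = (f(xpp) - f(xpm) - f(xmp) + f(xmm)) / (4 * h * h)
    return H

H = hessian(f, [1.0, 2.0])
print(H)  # approximately [[2, 3], [3, 0]]
```

The symmetry $H_{ij} = H_{ji}$ visible here reflects the equality of mixed partials for smooth functions.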
7. Taylor Expansion (Multivariate)
Second-Order Approximation

$$f(\mathbf{x} + \Delta \mathbf{x}) \approx f(\mathbf{x}) + \nabla f(\mathbf{x})^\top \Delta \mathbf{x} + \frac{1}{2} \, \Delta \mathbf{x}^\top H(\mathbf{x}) \, \Delta \mathbf{x}$$
📌 Central in optimization theory.
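The quadratic model can be verified numerically: for a small step the second-order approximation should match the true function up to an $O(\|\Delta\mathbf{x}\|^3)$ error. The function $f(x, y) = e^x y$ and the expansion point are illustrative assumptions:

```python
# Second-order Taylor check for f(x, y) = exp(x) * y around p = (0, 1):
# f(p + d) ~ f(p) + grad . d + 0.5 * d^T H d.
import math

def f(x, y):
    return math.exp(x) * y

p = (0.0, 1.0)
grad = (1.0, 1.0)             # (e^x * y, e^x) evaluated at p
H = [[1.0, 1.0], [1.0, 0.0]]  # [[e^x * y, e^x], [e^x, 0]] at p

d = (0.01, -0.02)             # a small step
linear = grad[0] * d[0] + grad[1] * d[1]
quad = 0.5 * (H[0][0] * d[0] ** 2 + 2 * H[0][1] * d[0] * d[1]
              + H[1][1] * d[1] ** 2)
approx = f(*p) + linear + quad

err = abs(f(p[0] + d[0], p[1] + d[1]) - approx)
print(err)  # third-order in ||d||, hence tiny
```

Newton's method minimizes exactly this quadratic model at each iteration.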
8. Critical Points
Definition

A point $\mathbf{x}^*$ where:

$$\nabla f(\mathbf{x}^*) = \mathbf{0}$$
Types
- Local minimum
- Local maximum
- Saddle point
9. Classification Using Hessian
| Hessian Type | Nature |
|---|---|
| Positive definite | Local minimum |
| Negative definite | Local maximum |
| Indefinite | Saddle point |
| Semi-definite | Inconclusive |
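The table above translates directly into code: for two variables, the eigenvalues of the symmetric $2 \times 2$ Hessian have a closed form, and their signs decide the classification. The example Hessians correspond to $x^2 + y^2$ and $x^2 - y^2$:

```python
# Classify a 2D critical point from the eigenvalue signs of the
# symmetric Hessian [[a, b], [b, c]], computed in closed form.
import math

def classify(H):
    a, b, c = H[0][0], H[0][1], H[1][1]
    mean = (a + c) / 2
    r = math.sqrt(((a - c) / 2) ** 2 + b ** 2)
    lo, hi = mean - r, mean + r      # the two eigenvalues
    if lo > 0:
        return "local minimum"       # positive definite
    if hi < 0:
        return "local maximum"       # negative definite
    if lo < 0 < hi:
        return "saddle point"        # indefinite
    return "inconclusive"            # semi-definite

print(classify([[2, 0], [0, 2]]))    # Hessian of x^2 + y^2
print(classify([[2, 0], [0, -2]]))   # Hessian of x^2 - y^2
```

For larger matrices one would use a numerical eigenvalue routine instead of the closed form.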
10. Convexity in Multivariate Functions
Definition

$f$ is convex if for all $\mathbf{x}, \mathbf{y}$ and $\lambda \in [0, 1]$:

$$f(\lambda \mathbf{x} + (1 - \lambda) \mathbf{y}) \le \lambda f(\mathbf{x}) + (1 - \lambda) f(\mathbf{y})$$
Hessian Condition
- $f$ is convex ⇔ its Hessian is positive semidefinite everywhere
📌 Convex ⇒ every local minimum is global.
11. Constrained Optimization (Preview)
Lagrange Multipliers
Condition:

$$\nabla f(\mathbf{x}) = \lambda \, \nabla g(\mathbf{x}), \qquad g(\mathbf{x}) = 0$$
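As a quick sanity check of the Lagrange condition, consider the standard example (assumed here, not from the text) of maximizing $f(x, y) = x + y$ on the unit circle $g(x, y) = x^2 + y^2 - 1 = 0$, whose known maximizer is $x = y = 1/\sqrt{2}$:

```python
# Verify grad f = lambda * grad g and g = 0 at the known maximizer
# of f(x, y) = x + y subject to x^2 + y^2 = 1.
import math

x = y = 1 / math.sqrt(2)

grad_f = (1.0, 1.0)
grad_g = (2 * x, 2 * y)

lam = grad_f[0] / grad_g[0]   # multiplier from the first component

# Both components must give the same multiplier, and g must vanish:
res_lagrange = abs(grad_f[1] - lam * grad_g[1])
res_constraint = abs(x ** 2 + y ** 2 - 1)
print(res_lagrange, res_constraint)  # both essentially zero
```

Both residuals vanish, confirming that the gradient of $f$ is parallel to the gradient of the constraint at the optimum.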
12. Geometry of Multivariate Functions
| Concept | Interpretation |
|---|---|
| Level sets | Contours of equal value |
| Gradient | Normal to level sets |
| Hessian | Curvature of surface |
| Saddle point | Mixed curvature |
13. Connection to Machine Learning
| Concept | ML Use |
|---|---|
| Gradient | Backpropagation |
| Hessian | Second-order methods |
| Convexity | Guarantees global optimum |
| Saddle points | Optimization challenges |