- Partial Derivative and Gradient
- Differentiating vector-valued functions
- For a multivariable function, like f(x, y), computing partial derivatives looks something like this: ∂f/∂x = lim_{h→0} [f(x + h, y) − f(x, y)] / h
The symbol ∂, called "del", is used to distinguish partial derivatives from ordinary single-variable derivatives.
| Symbol | Informal understanding | Formal understanding |
| --- | --- | --- |
| ∂x | A tiny nudge in the x direction. | A limiting variable h which goes to 0, and will be added to the first component of the function's input. |
| ∂f | The resulting change in the output of f after the nudge. | The difference between f(x + h, y) and f(x, y), taken in the same limit as h → 0. |
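The limit definition above can be sketched numerically with a finite-difference quotient. This is a minimal sketch; the example function f(x, y) = x²·y and the step size h are illustrative assumptions, not taken from the notes:

```python
def f(x, y):
    # An assumed example function: f(x, y) = x^2 * y.
    return x**2 * y

def partial_x(f, x, y, h=1e-6):
    # Nudge only the first input component by h, mirroring the
    # "limiting variable added to the first component" description.
    return (f(x + h, y) - f(x, y)) / h

# Analytically, the partial derivative of x^2 * y with respect to x
# is 2*x*y, so at (3, 2) the value should be close to 12.
print(partial_x(f, 3.0, 2.0))
```

Shrinking h brings the quotient closer to the true limit, up to floating-point round-off.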
- The second partial derivatives that involve multiple distinct input variables, such as ∂²f/∂x∂y and ∂²f/∂y∂x, are called "mixed partial derivatives".
- In most cases, the two mixed partial derivatives are the same.
- This is Schwarz's theorem (also called Clairaut's theorem), which states that the symmetry of second derivatives holds at a point whenever the second partial derivatives are continuous around that point.
- In notation like ∂²f/∂x∂y, the order of differentiation reads right to left in the denominator: differentiate with respect to y first, then with respect to x.
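A quick numeric check of this symmetry, taking the central-difference derivatives in both orders; the smooth example f(x, y) = sin(x)·y² is an assumption for illustration:

```python
import math

def f(x, y):
    # An assumed smooth example: f(x, y) = sin(x) * y^2.
    return math.sin(x) * y**2

def d2f_dxdy(f, x, y, h=1e-4):
    # d^2 f / (dx dy): differentiate with respect to y first
    # (rightmost term in the denominator), then with respect to x.
    def fy(xx):
        return (f(xx, y + h) - f(xx, y - h)) / (2 * h)
    return (fy(x + h) - fy(x - h)) / (2 * h)

def d2f_dydx(f, x, y, h=1e-4):
    # The opposite order: x first, then y.
    def fx(yy):
        return (f(x + h, yy) - f(x - h, yy)) / (2 * h)
    return (fx(y + h) - fx(y - h)) / (2 * h)

# Both orders should agree; analytically each equals 2*y*cos(x),
# which is 4*cos(1) at the point (1, 2).
print(d2f_dxdy(f, 1.0, 2.0), d2f_dydx(f, 1.0, 2.0))
```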
- The gradient of a function f, denoted ∇f, collects all of its partial derivatives into a vector: ∇f = (∂f/∂x, ∂f/∂y, …).
The most important thing to remember about the gradient:
- The gradient of f, evaluated at an input point (x₀, y₀), points in the direction of steepest ascent.
- The gradient is perpendicular to contour lines.
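As a sketch, the gradient can be assembled by taking one finite-difference partial per input variable; the example function and evaluation point below are assumptions for illustration:

```python
def f(x, y):
    # An assumed example: f(x, y) = x^2 + 3*x*y.
    return x**2 + 3 * x * y

def gradient(f, x, y, h=1e-6):
    # Collect every partial derivative into one vector (here, a tuple).
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return (dfdx, dfdy)

# Analytically, grad f = (2x + 3y, 3x); at (1, 2) that is (8, 3).
print(gradient(f, 1.0, 2.0))
```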
Example differential operators
- If you have some multivariable function f and some vector v in the function's input space, the directional derivative of f along v tells you the rate at which f will change while the input moves with velocity vector v.
- The notation here is ∇_v f, and it is computed by taking the dot product between the gradient of f and the vector v, that is, ∇_v f = ∇f · v.
- Remember: if the directional derivative is used to compute slope, either v must be a unit vector or you must remember to divide by ‖v‖ at the end.
- This is because the slope of a graph in the direction of v depends only on the direction of v, not on its magnitude.
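A minimal sketch of the dot-product computation, including the division by ‖v‖ that the warning above calls for; the function and points here are assumed examples:

```python
import math

def f(x, y):
    # An assumed example: f(x, y) = x^2 + y^2.
    return x**2 + y**2

def gradient(f, x, y, h=1e-6):
    return ((f(x + h, y) - f(x - h, y)) / (2 * h),
            (f(x, y + h) - f(x, y - h)) / (2 * h))

def directional_slope(f, x, y, v):
    # grad f . v gives the rate of change; dividing by ||v|| turns it
    # into a slope that ignores the magnitude of v.
    gx, gy = gradient(f, x, y)
    return (gx * v[0] + gy * v[1]) / math.hypot(v[0], v[1])

# Scaling v must not change the slope:
print(directional_slope(f, 1.0, 2.0, (1.0, 0.0)))
print(directional_slope(f, 1.0, 2.0, (5.0, 0.0)))
```

Both calls print the same value, since (5, 0) is just a scaled copy of (1, 0).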
- Alternate definition of directional derivative: ∇_v f(x) = lim_{h→0} [f(x + h·v) − f(x)] / h
- ∇f · v is the dot product of two vectors.
- And the Cauchy–Schwarz inequality tells us: |∇f · v| ≤ ‖∇f‖ ‖v‖.
- Let v be a unit vector, so ‖v‖ = 1; then ∇f · v ≤ ‖∇f‖.
- And ∇f · v = ‖∇f‖ iff v points in the same direction as ∇f.
- So the gradient points in the direction of steepest ascent: the maximizing unit vector is ∇f / ‖∇f‖.
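This conclusion can be checked numerically: among sampled unit vectors, the one maximizing the dot product with an assumed gradient vector is, up to sampling resolution, the gradient's own direction:

```python
import math

grad = (3.0, 4.0)  # an assumed gradient vector; its norm is 5

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

# Sample unit vectors around the circle and keep the best direction.
angles = [i * 2 * math.pi / 3600 for i in range(3600)]
best = max(((math.cos(t), math.sin(t)) for t in angles),
           key=lambda u: dot(grad, u))

unit_grad = (grad[0] / 5.0, grad[1] / 5.0)
print(best)             # approximately unit_grad = (0.6, 0.8)
print(dot(grad, best))  # close to ||grad|| = 5, never above it
```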
- nudge [nʌdʒ] n. a gentle push (esp. with the elbow); a persistent nagger. vt. to push gently; to prod along; to nag at. vi. to give a gentle push; to edge forward; to nag.
- parametrization [pə,ræmitrai'zeiʃən, -tri'z-] n. (math.) parametrization; the expression of something in terms of parameters.
- parallelogram [,pærə'leləɡræm] n. a quadrilateral with two pairs of parallel sides.
- magnitude ['mæɡnitju:d] n. size; order of magnitude; (of an earthquake) magnitude; importance; (of a star) brightness.