Partial derivatives work just like ordinary derivatives, except that you treat the other variables as constants. For example,
$$\frac{\partial}{\partial x} e^{xy} = e^{xy}\,\frac{\partial}{\partial x}(xy) = y\,e^{xy}.$$
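If you want to double-check a computation like this, a computer algebra system can do it for you. Here is a minimal sketch using sympy (my choice of tool, not something these notes rely on):

```python
# Sanity check of the example above: differentiate e^(xy) with respect to x,
# treating y as a constant.
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(x * y)

print(sp.diff(f, x))   # prints y*exp(x*y)
```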
There's also a useful theorem about mixed partial derivatives:
Theorem (Clairaut's theorem)
If the mixed partial derivatives $f_{xy}$ (i.e., take the $x$-derivative first, then the $y$-derivative) and $f_{yx}$ ($y$ first, then $x$) exist and are continuous in an open set around $(a,b)$, then
$$f_{xy}(a,b) = f_{yx}(a,b).$$
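As a quick sanity check (not part of the theorem itself), you can verify symbolically that the two orders agree for a sample smooth function; the function below is my own illustrative choice.

```python
# Check Clairaut's theorem on one example function: both orders of mixed
# partial differentiation give the same result.
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(x * y) * sp.sin(y)    # an arbitrary smooth function

f_xy = sp.diff(f, x, y)    # x-derivative first, then y
f_yx = sp.diff(f, y, x)    # y-derivative first, then x

print(sp.simplify(f_xy - f_yx))   # prints 0, so f_xy = f_yx
```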
In a nutshell, this theorem tells you that for most functions you can write down, you can take the partial derivatives in any order you want. Sometimes, one order is easier than the other:
Example 1.
Calculate $f_{yyyyx}$, where $f(x,y) = e^{y^2}\sin y + xy^4$.
Solution.
As you can imagine, taking four $y$-derivatives right away would get very complicated. However, $f$ is infinitely differentiable, so you can apply Clairaut's theorem here:
$$f_{yyyyx}(x,y) = f_{xyyyy}(x,y).$$
Immediately, you get
$$f_x(x,y) = y^4,$$
since there's no $x$ in the first term. From here, you can easily take the remaining $y$-derivatives:
$$f_{xyyyy}(x,y) = 24.$$
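If you'd like to confirm this, here is a short sympy sketch (again, just a check, not part of the solution) computing both orders of differentiation for the $f$ in Example 1:

```python
# Verify Example 1: the easy order (x first) gives 24, and the hard order
# (four y-derivatives first) agrees, as Clairaut's theorem promises.
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(y**2) * sp.sin(y) + x * y**4

f_xyyyy = sp.diff(f, x, y, y, y, y)    # x first, then four y-derivatives
f_yyyyx = sp.diff(f, y, y, y, y, x)    # four y-derivatives first, then x

print(f_xyyyy)                          # prints 24
print(sp.simplify(f_yyyyx - f_xyyyy))   # prints 0
```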
Linearization
In one variable, the linearization of $f$ is just the tangent line:
$$f(x) \approx f(x_0) + f'(x_0)(x - x_0).$$
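For instance, here is a quick numerical illustration (the function $\sqrt{x}$ and the point $x_0 = 4$ are my own choices, not from the notes): the tangent line at $x_0 = 4$ approximates $\sqrt{4.1}$ quite well.

```python
# Tangent-line approximation: f(x) ~ f(x0) + f'(x0) * (x - x0).
import math

def f(t):
    return math.sqrt(t)

def fprime(t):
    return 1 / (2 * math.sqrt(t))    # derivative of sqrt(t)

x0, x = 4.0, 4.1
approx = f(x0) + fprime(x0) * (x - x0)
print(approx, f(x))    # 2.025 vs. 2.0248..., so the tangent line is close
```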
In several variables, a similar formula holds. There is no single "ordinary" derivative anymore, but instead of $f'$, we have the gradient
$$\nabla f(x_0,y_0) = \langle f_x(x_0,y_0),\, f_y(x_0,y_0)\rangle.$$
We can't multiply vectors the way we multiply numbers, but the dot product of two vectors does give a scalar, so we get
$$f(x,y) \approx f(x_0,y_0) + \nabla f(x_0,y_0) \cdot \langle x - x_0,\, y - y_0\rangle.$$
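Here is a sketch of this two-variable linearization in sympy; the function $e^{xy}$ and the base point $(1, 0)$ are illustrative choices of mine.

```python
# Two-variable linearization: f(x, y) ~ f(x0, y0) + grad f(x0, y0) . <x - x0, y - y0>.
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(x * y)
x0, y0 = 1, 0

grad = [sp.diff(f, x), sp.diff(f, y)]              # gradient of f
grad0 = [g.subs({x: x0, y: y0}) for g in grad]     # gradient evaluated at (x0, y0)

L = f.subs({x: x0, y: y0}) + grad0[0] * (x - x0) + grad0[1] * (y - y0)

print(sp.simplify(L))   # prints y + 1: the linearization of exp(x*y) at (1, 0)
print(L.subs({x: 1.1, y: 0.1}), f.subs({x: 1.1, y: 0.1}).evalf())   # 1.1 vs. 1.1163...
```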
The right-hand side of this approximation is the equation of the tangent plane for $f$ at $(x_0,y_0)$, and you can write it as