Gradient vs. Divergence
Gradient, Divergence, Curl and Related Formulae

The gradient, the divergence, and the curl are first-order differential operators acting on fields. The easiest way to describe them is via the vector operator nabla, whose components are the partial derivatives with respect to the Cartesian coordinates $x, y, z$:

$$\nabla = \hat{x}\,\frac{\partial}{\partial x} + \hat{y}\,\frac{\partial}{\partial y} + \hat{z}\,\frac{\partial}{\partial z}. \tag{1}$$
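In terms of $\nabla$, the three operators act as follows (a standard summary, stated here for reference):

$$\operatorname{grad} f = \nabla f, \qquad \operatorname{div}\mathbf{F} = \nabla\cdot\mathbf{F}, \qquad \operatorname{curl}\mathbf{F} = \nabla\times\mathbf{F},$$

so the gradient takes a scalar field to a vector field, the divergence takes a vector field to a scalar field, and the curl takes a vector field to a vector field.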
The gradient, divergence and curl satisfy many identities, most of which follow easily from their definitions. We first state a collection of them; a memory aid and proofs come later. The list is long, and many entries are included only for completeness; only a relatively small number are used often.
We begin with the formal definition of the gradient vector (grad) and a visualisation of what it represents for a multivariable function. We then look at some examples with explicit calculations and 3D plots. The divergence (div) of a vector function is then introduced, both as an equation and via the physical interpretation of what it measures.
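As a concrete illustration of such an explicit calculation, here is a minimal SymPy sketch; the scalar field $f$ is a made-up example, not one from the text:

```python
# Gradient of an example scalar field, computed symbolically.
import sympy as sp

x, y, z = sp.symbols('x y z')
f = x**2 * y + sp.sin(z)                      # illustrative scalar field
grad_f = [sp.diff(f, v) for v in (x, y, z)]   # (df/dx, df/dy, df/dz)
print(grad_f)                                 # [2*x*y, x**2, cos(z)]
```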
lines in the gas will converge, i.e. the divergence is nonzero. The symbol $\nabla$ itself is usually called 'del' or 'nabla'; combined with a dot product it gives the divergence ('div') of a vector field, while applied directly to a scalar field it gives the gradient ('grad'). The divergence operator acts on a vector field and produces a scalar. In contrast, the gradient acts on a scalar field to produce a vector field.
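A minimal SymPy sketch of that contrast; the vector field $\mathbf{F}$ is an invented example:

```python
# Divergence of an example vector field: a vector field in, a scalar out.
import sympy as sp

x, y, z = sp.symbols('x y z')
F = (x**2, x*y, z)                                         # illustrative vector field
div_F = sum(sp.diff(Fi, v) for Fi, v in zip(F, (x, y, z)))
print(div_F)                                               # 3*x + 1
```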
Interpretation of Gradient, Divergence and Curl

Gradient. The rate of change of a function $f$ per unit distance as you leave the point $(x_0, y_0, z_0)$ moving in the direction of the unit vector $\hat{\mathbf{n}}$ is given by the directional derivative

$$D_{\hat{\mathbf{n}}} f(x_0,y_0,z_0) = \nabla f(x_0,y_0,z_0)\cdot\hat{\mathbf{n}} = \big|\nabla f(x_0,y_0,z_0)\big|\cos\theta,$$

where $\theta$ is the angle between $\hat{\mathbf{n}}$ and $\nabla f(x_0,y_0,z_0)$.
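A quick worked instance (an invented example): take $f(x,y,z) = x^2 + y^2$ at the point $(1,1,0)$ with $\hat{\mathbf{n}} = \tfrac{1}{\sqrt{2}}(1,1,0)$. Then

$$\nabla f(1,1,0) = (2,2,0), \qquad D_{\hat{\mathbf{n}}} f = (2,2,0)\cdot\tfrac{1}{\sqrt{2}}(1,1,0) = \tfrac{4}{\sqrt{2}} = 2\sqrt{2},$$

which equals $|\nabla f| = 2\sqrt{2}$ because here $\hat{\mathbf{n}}$ points along the gradient, so $\cos\theta = 1$.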
Geometrically, the gradient points in the direction of fastest increase of a function, and its magnitude is the rate of change in that direction. If you repeatedly follow the gradient of a function, you will eventually approach a local maximum or head off to infinity.

Divergence. The divergence measures how quickly a vector-valued function is "spreading out".
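A minimal numerical sketch of "following the gradient"; the function, starting point, and step size are all invented for illustration:

```python
# Gradient ascent on f(x, y) = -(x - 1)**2 - (y + 2)**2, which peaks at (1, -2).
import numpy as np

def grad_f(p):
    x, y = p
    return np.array([-2.0 * (x - 1.0), -2.0 * (y + 2.0)])  # gradient of f

p = np.array([4.0, 3.0])            # arbitrary starting point
for _ in range(100):
    p = p + 0.1 * grad_f(p)         # step uphill along the gradient
print(p)                            # converges close to the maximum (1, -2)
```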
As the name implies, the divergence is a local measure of the degree to which vectors in the field diverge. The divergence of a tensor field of non-zero order $k$ is written as $\operatorname{div}\mathbf{T} = \nabla\cdot\mathbf{T}$, a contraction yielding a tensor field of order $k-1$. Specifically, the divergence of a vector is a scalar.
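In Cartesian index notation (with the summation convention), the $k = 1$ case reads

$$\nabla\cdot\mathbf{F} = \partial_i F_i = \frac{\partial F_x}{\partial x} + \frac{\partial F_y}{\partial y} + \frac{\partial F_z}{\partial z},$$

a contraction of the two indices of $\partial_i F_j$, which is why the result has one index fewer than the field it acts on.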
The gradient, divergence and Laplacian all have obvious generalizations to dimensions other than three. That is not the case for the curl. It does have a generalization, far from obvious, which uses differential forms. Differential forms are well beyond our scope, but are introduced in the optional §4.7.
Both the gradient and the divergence can be defined using the nabla operator $\nabla$, which might be what you are referring to. But note that in the case of the gradient this operator is applied directly to a scalar function, in a way that formally resembles scalar multiplication, whereas in the case of the divergence it is used in a dot product with a vector field.
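A small SymPy sketch of that distinction, using its built-in vector module; the fields are invented examples:

```python
# One operator, two uses: grad f (scalar in, vector out) vs. div F (vector in, scalar out).
from sympy.vector import CoordSys3D, gradient, divergence

N = CoordSys3D('N')
f = N.x**2 * N.y                        # scalar field -> gradient gives a vector
F = N.x*N.i + N.y*N.j + N.z*N.k         # vector field -> divergence gives a scalar
print(gradient(f))                      # 2*N.x*N.y*N.i + N.x**2*N.j
print(divergence(F))                    # 3
```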