Calculus of variations: minimising/maximising an integral, with or without constraints.
1 Unconstrained variation
Determine the extreme values (min or max) of
$$I = \int_{x_1}^{x_2} F(x, y, y') \, dx$$
such that the boundary conditions are fixed, with $y_1 = y(x_1)$ and $y_2 = y(x_2)$.
1.1 The theory
Perturb the path to $y + \epsilon\eta$ (with $\eta$ vanishing at the endpoints), Taylor-expand $I$ in $\epsilon$, and require the first variation to vanish; integrating by parts then gives the Euler-Lagrange equation. Really hope we don't need to know this.
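For the curious, a minimal sketch of that step (using the source's notation, with $\eta(x_1) = \eta(x_2) = 0$):

$$\delta I = \int_{x_1}^{x_2} \big( F_y \, \eta + F_{y'} \, \eta' \big) \, dx = \int_{x_1}^{x_2} \left( F_y - \frac{d}{dx} F_{y'} \right) \eta \, dx = 0$$

where the second equality is integration by parts (the boundary term dies because $\eta$ vanishes at the endpoints). Since $\eta$ is arbitrary, the bracket must be zero everywhere, which is the Euler-Lagrange equation.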
1.2 The solution
Use the Euler-Lagrange equation:
$$F_y - \frac{d}{dx}F_{y'} = 0$$
So if you find $F_y$ and $F_{y'}$ and substitute them into the formula above, you'll get an ODE. Solve it to get a function $y(x)$ which minimises/maximises $I$. Remember that you have the boundary conditions $y_1 = y(x_1)$ etc. (which should be given) to fix the integration constants.
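As a sanity check, here's the classic example $F = \sqrt{1 + y'^2}$ (arc length), whose Euler-Lagrange solution is a straight line. This is a numerical sketch, not from any library — `arclength` is a throwaway helper that approximates the integral with finite differences:

```python
import math

def arclength(y, x1=0.0, x2=1.0, n=1000):
    """Approximate I = integral of sqrt(1 + y'^2) dx via forward differences."""
    h = (x2 - x1) / n
    total = 0.0
    for i in range(n):
        x = x1 + i * h
        dy = (y(x + h) - y(x)) / h  # finite-difference estimate of y'
        total += math.sqrt(1.0 + dy * dy) * h
    return total

# Euler-Lagrange solution for F = sqrt(1 + y'^2) with y(0)=0, y(1)=1:
# the straight line y = x, length sqrt(2).
straight = arclength(lambda x: x)

# A perturbed path with the same endpoints should come out longer.
bent = arclength(lambda x: x + 0.1 * math.sin(math.pi * x))
```

Any perturbation keeping the endpoints fixed gives a larger `I` than the Euler-Lagrange solution, which is the whole point of the method.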
1.2.1 Sufficient conditions
To check the sufficient condition, just look at $F_{y'y'}$. For a minimum, we must have $F_{y'y'} > 0$; for a maximum, we must have $F_{y'y'} < 0$. If $F_{y'y'} = 0$, the test is inconclusive.
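Quick example of the check, again with the arc-length integrand $F = \sqrt{1 + y'^2}$:

$$F_{y'} = \frac{y'}{\sqrt{1 + y'^2}}, \qquad F_{y'y'} = \frac{1}{(1 + y'^2)^{3/2}} > 0$$

so the extremal (the straight line) is a minimum, as you'd expect.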
2 Constrained variation
Determine the extreme values (min or max) of
$$I = \int_{x_1}^{x_2} F(x, y, y') \, dx$$
such that the boundary conditions are fixed, with $y_1 = y(x_1)$ and $y_2 = y(x_2)$, and the integral constraint
$$J = \int_{x_1}^{x_2} G(x, y, y') \, dx = k$$
is satisfied where $k$ is some constant.
2.1 Fuck the theory
Using Lagrange multipliers, we have:
$$\int_{x_1}^{x_2} \bigg [ F(x, y, y') + \lambda G(x, y, y') \bigg ] \, dx$$
Applying the unconstrained Euler-Lagrange equation to the combined integrand $H = F + \lambda G$ gives:
$$\big( F_y + \lambda G_y \big) - \frac{d}{dx} \big( F_{y'} + \lambda G_{y'} \big) = 0$$
Remember the boundary conditions.
2.2 The solution
We know $F$, and we know $G$. Plug those into the Euler-Lagrange equation above and solve the resulting ODE; the solution will contain $\lambda$ as well as integration constants. Use the boundary conditions together with the constraint $J = k$ to pin them all down.
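A small worked example (my own choice of $F$ and $G$, not from the notes): minimise $I = \int_0^1 y'^2 \, dx$ subject to $J = \int_0^1 y \, dx = k$, with $y(0) = y(1) = 0$. Here $H = y'^2 + \lambda y$, so the Euler-Lagrange equation gives

$$\lambda - \frac{d}{dx}(2y') = 0 \quad\Rightarrow\quad y'' = \frac{\lambda}{2} \quad\Rightarrow\quad y = \frac{\lambda}{4}x^2 + Ax + B.$$

The boundary conditions give $B = 0$ and $A = -\lambda/4$, so $y = \frac{\lambda}{4}x(x-1)$. Then the constraint fixes $\lambda$:

$$J = \frac{\lambda}{4}\int_0^1 x(x-1)\,dx = -\frac{\lambda}{24} = k \quad\Rightarrow\quad \lambda = -24k, \qquad y = 6k\,x(1-x).$$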
Continue this later