YutianChen

If we are using the gradient descent method to iteratively find a numerical solution for $u, v$, shouldn't we apply some update rule to $u, v$ directly?

Maybe something like $$ u_{t + 1} = u_t - \eta \frac{\partial E}{\partial u} $$

Why do we calculate the extrema of $E$ here instead?
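
For concreteness, a minimal sketch of that plain gradient-descent update (assuming precomputed image derivatives `Ix`, `Iy`, `It`; the function name and the `alpha`, `eta`, `num_iters` values are placeholder choices) might look like:

```python
import numpy as np
from scipy.ndimage import laplace

def gradient_descent_flow(Ix, Iy, It, alpha=1.0, eta=0.01, num_iters=500):
    """Minimize the Horn-Schunck energy by plain gradient descent.

    A sketch of the update proposed above (note the minus sign: we step
    against the gradient to minimize E, not maximize it).
    """
    u = np.zeros_like(Ix)
    v = np.zeros_like(Ix)
    for _ in range(num_iters):
        residual = Ix * u + Iy * v + It                  # brightness-constancy error
        grad_u = Ix * residual - alpha**2 * laplace(u)   # dE/du (up to a constant factor)
        grad_v = Iy * residual - alpha**2 * laplace(v)   # dE/dv
        u = u - eta * grad_u                             # descend the gradient
        v = v - eta * grad_v
    return u, v
```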

YutianChen

Also, if we are setting the derivative to zero, is it possible for us to end up at a maximum or saddle point of $E$ instead of a minimum?

motoole2

Horn-Schunck optical flow makes use of the Gauss-Newton method. Instead of computing the gradient and performing gradient descent, Gauss-Newton approximates second-order (curvature) information, which has the advantage of converging to the solution faster.
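
For concreteness, here is a minimal sketch of the iteration you get by setting the derivatives to zero and solving for $(u, v)$ at each pixel in terms of the local flow averages (Python, assuming precomputed derivatives `Ix`, `Iy`, `It`; `alpha`, `num_iters`, and the averaging kernel are illustrative choices):

```python
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(Ix, Iy, It, alpha=1.0, num_iters=100):
    """Iterate the per-pixel closed-form solution of dE/du = dE/dv = 0,
    where the smoothness term couples each pixel to its neighborhood mean."""
    # Weights approximating the neighborhood average of the flow field.
    kernel = np.array([[1, 2, 1],
                       [2, 0, 2],
                       [1, 2, 1]], dtype=float) / 12.0
    u = np.zeros_like(Ix)
    v = np.zeros_like(Ix)
    for _ in range(num_iters):
        u_avg = convolve(u, kernel)
        v_avg = convolve(v, kernel)
        # Closed-form per-pixel solution of the zero-derivative conditions.
        t = (Ix * u_avg + Iy * v_avg + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_avg - Ix * t
        v = v_avg - Iy * t
    return u, v
```

Each pass solves the stationarity conditions exactly given the current neighborhood averages, which is why it tends to converge in far fewer iterations than a fixed-step gradient descent on the same energy.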

As with many other non-linear methods, we can absolutely get stuck in local minima. Given the formulation of the objective function, I would not expect it to get stuck at maxima or saddle points, though I could be wrong. (There are those who have studied the convergence properties of Horn-Schunck in much more detail.)