Why is the kernel not the one on the top again? [-1, 0, 1]
mpotoole
@the Good question. I somewhat skimmed over this in the lecture.
At first glance, it does seem like it should be $[-1, 0, 1]$. So what's going on here? Well, if we go back to our definition of the convolution on this slide, we recall that there's a sign flip:
$(k*f)(x) = \sum_{i} k(i) f(x-i)$
where $k(i)$ is the kernel in this case. A convolution operation would result in $(k*f)(x) = k(-1) f(x+1) + k(0) f(x) + k(1) f(x-1)$. So to do the derivative properly here, the kernel needs to be $[1, 0, -1]$ to account for this sign flip.
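To make the sign flip concrete, here's a minimal NumPy sketch. NumPy's `np.convolve` implements exactly the flipped-kernel sum above, while `np.correlate` does not flip, so the same array $[1, 0, -1]$ gives a positive derivative under convolution and a negated one under correlation (the signal and values here are just an illustrative example, not from the lecture):

```python
import numpy as np

# A signal with constant slope 1: its central difference f(x+1) - f(x-1)
# should be 2 at every interior point.
f = np.array([0.0, 1.0, 2.0, 3.0, 4.0])

# Derivative kernel written as [k(-1), k(0), k(1)] = [1, 0, -1].
# Convolution applies the sign flip from the definition above, so
# (k*f)(x) = k(-1) f(x+1) + k(0) f(x) + k(1) f(x-1) = f(x+1) - f(x-1).
k = np.array([1.0, 0.0, -1.0])

deriv = np.convolve(f, k, mode="same")
print(deriv)  # interior values: 2 = f(x+1) - f(x-1)

# Cross-correlation does NOT flip the kernel; with the same array you get
# f(x-1) - f(x+1), i.e. the derivative with the wrong sign.
corr = np.correlate(f, k, mode="same")
print(corr)   # interior values: -2
```

So if you hand the "intuitive" kernel $[-1, 0, 1]$ to a true convolution routine, the flip turns it into $[1, 0, -1]$ and you'd get the derivative negated.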
the
Thank you. I'll ponder over this. I get the arrow example, but I need to think about this one.