I have a very exciting assignment for my optimization course: calculate the Hessian matrix of a feedforward neural network. I don’t know whether there is a simple trick that simplifies things, but extending the usual backpropagation to second-order derivatives is really exhausting. I wouldn’t be surprised if someone published a paper on just this calculation years ago.
(At least I hope that Bishop’s book has a solution for it; I really don’t want to implement the code without something to check it against.)
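For sanity-checking a hand-derived second-order backprop, a finite-difference Hessian is an easy numerical reference. Here is a minimal sketch for a toy one-hidden-layer network with squared-error loss; all the names, sizes, and the network itself are illustrative assumptions, not from any particular textbook:

```python
import numpy as np

# Toy setup (illustrative): one-hidden-layer tanh network,
# squared-error loss on a single input/target pair.
rng = np.random.default_rng(0)
x, t = np.array([0.5, -1.2]), 0.3            # one training example
W1 = rng.normal(size=(3, 2))                 # hidden-layer weights
w2 = rng.normal(size=3)                      # output weights
theta0 = np.concatenate([W1.ravel(), w2])    # flattened parameter vector (size 9)

def loss(theta):
    W1 = theta[:6].reshape(3, 2)
    w2 = theta[6:]
    h = np.tanh(W1 @ x)                      # hidden activations
    y = w2 @ h                               # linear output
    return 0.5 * (y - t) ** 2

def fd_hessian(f, theta, eps=1e-5):
    """Central-difference Hessian: H[i, j] ~ d^2 f / (d theta_i d theta_j)."""
    n = theta.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = eps
            e_j = np.zeros(n); e_j[j] = eps
            H[i, j] = (f(theta + e_i + e_j) - f(theta + e_i - e_j)
                       - f(theta - e_i + e_j) + f(theta - e_i - e_j)) / (4 * eps**2)
    return H

H = fd_hessian(loss, theta0)
print(H.shape)                               # (9, 9)
print(np.allclose(H, H.T, atol=1e-6))        # the Hessian should be symmetric
```

This is O(n^2) loss evaluations, so it is only useful as a ground-truth check on tiny networks, which is exactly the role it would play next to an analytic second-order backprop implementation.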