
Different results between Zygote, ForwardDiff, and ReverseDiff


Hi, I’m quite new to Julia and trying to understand automatic differentiation.

I have a loss function for a model, and I’m trying to compute the gradient and Hessian with respect to the parameters fitted to the model. Comparing Zygote, ForwardDiff, and ReverseDiff, I noticed that the gradient and Hessian values differ substantially depending on which package I use.
Is this to be expected, or am I missing something?

using Zygote, ForwardDiff, ReverseDiff

# run_model, model, dataset, dens, and t are defined elsewhere (not shown here)
# Loss: sum of squared residuals between the model solution and the data
function loss(p, data, model, dens, cons, t)
    Σ_sol = run_model(model, p, data, dens, cons, t)
    sum(abs2, Σ_sol - data)
end

loss_in = p -> loss(p, dataset, model, dens', [], t)
loss_in(rates)  # rates are the fitted parameters

grad_zyg = Zygote.gradient(loss_in, rates)[1]  # Zygote.gradient returns a tuple
grad_for = ForwardDiff.gradient(loss_in, rates)
grad_rev = ReverseDiff.gradient(loss_in, rates)
hes_for  = ForwardDiff.hessian(loss_in, rates)
hes_rev  = ReverseDiff.hessian(loss_in, rates)
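
For reference, here is a minimal self-contained sketch (a made-up toy loss and data, not my actual model) where I would expect all three packages to agree on the gradient and Hessian up to floating-point error:

using Zygote, ForwardDiff, ReverseDiff

# Hypothetical toy loss: squared residuals of a simple linear "model" a*t + b
toy_data = [1.0, 2.0, 3.0]
toy_t    = [0.1, 0.2, 0.3]
toy_loss(p) = sum(abs2, p[1] .* toy_t .+ p[2] .- toy_data)

p0 = [0.5, 1.0]

g_zyg = Zygote.gradient(toy_loss, p0)[1]   # extract the gradient from Zygote's tuple
g_for = ForwardDiff.gradient(toy_loss, p0)
g_rev = ReverseDiff.gradient(toy_loss, p0)

h_for = ForwardDiff.hessian(toy_loss, p0)
h_rev = ReverseDiff.hessian(toy_loss, p0)

# For a smooth loss like this, the backends should match to rounding error
@show g_zyg ≈ g_for
@show g_for ≈ g_rev
@show h_for ≈ h_rev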

Thanks!
