<p>Rob Hicks: Posts about autograd (Tue, 13 Feb 2024 12:35:36 GMT)</p>
<p><a href="http://rlhick.people.wm.edu/posts/pytorch-mle.html">Using PyTorch for Maximum Likelihood Estimation</a></p>
<div><p>
This post investigates using <code>pytorch</code> for econometric research in a maximum likelihood setting. Packages like <a href="https://gsbdbi.github.io/torch-choice/">torch-choice</a> (designed to run discrete choice or multinomial logit models) show that huge speedups can be achieved compared to typical platforms, thanks both to graphics processing units (GPUs) and to <a href="https://rlhick.people.wm.edu/posts/mle-autograd.html">autograd</a>.
</p>
<p>
This post explores three issues:
</p>
<ol class="org-ol">
<li>Installing the necessary software to get Pytorch running on AMD hardware (very short discussion)</li>
<li>How to use <code>pytorch</code> for generic settings/custom log-likelihoods, where existing packages like <code>torch-choice</code> are insufficient for your needs</li>
<li>How to use <code>pytorch-minimize</code> to further streamline optimization using <code>pytorch</code></li>
</ol>
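<p>As a preview of item 2, here is a minimal sketch of estimating a custom log-likelihood with <code>pytorch</code>. To keep the sketch dependent only on <code>torch</code> itself, it uses <code>torch.optim.LBFGS</code> rather than <code>pytorch-minimize</code>; the linear model, data, and parameter names are illustrative assumptions, not code from the post.</p>

```python
import math
import torch

torch.manual_seed(0)

# Hypothetical data: linear model y = X @ beta + eps, eps ~ N(0, sigma^2)
n = 1000
X = torch.randn(n, 2)
beta_true = torch.tensor([1.0, -2.0])
y = X @ beta_true + 0.5 * torch.randn(n)

# Stack beta and log(sigma) into one parameter vector (log keeps sigma > 0)
params = torch.zeros(3, requires_grad=True)

def neg_loglike(p):
    beta, log_sigma = p[:2], p[2]
    resid = y - X @ beta
    # Normal log-density per observation, negated and summed for minimization
    ll = -0.5 * math.log(2 * math.pi) - log_sigma \
         - 0.5 * (resid / torch.exp(log_sigma)) ** 2
    return -ll.sum()

opt = torch.optim.LBFGS([params], max_iter=200, line_search_fn='strong_wolfe')

def closure():
    opt.zero_grad()
    loss = neg_loglike(params)
    loss.backward()  # gradients come from autograd, not hand-coded derivatives
    return loss

opt.step(closure)
beta_hat = params[:2].detach()
sigma_hat = torch.exp(params[2]).detach()
```

<p>Note that moving the tensors to a GPU (e.g. <code>X = X.cuda()</code>) requires no change to the likelihood code itself.</p>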
<p><a href="http://rlhick.people.wm.edu/posts/pytorch-mle.html">Read more…</a> (8 min remaining to read)</p></div>
<p>Tags: autograd, gpu, maximum likelihood, pytorch, rocm (Wed, 22 Nov 2023 17:16:50 GMT)</p>
<p><a href="http://rlhick.people.wm.edu/posts/mle-autograd.html">Using Autograd for Maximum Likelihood Estimation</a></p>
<div><p>
Thanks to an excellent series of posts on the python package <code>autograd</code> for automatic differentiation by John Kitchin (e.g. <a href="http://kitchingroup.cheme.cmu.edu/blog/2017/11/22/More-auto-differentiation-goodness-for-science-and-engineering/">More Auto-differentiation Goodness for Science and Engineering</a>), this post revisits some earlier work on <a href="http://rlhick.people.wm.edu/posts/estimating-custom-mle.html">maximum likelihood estimation in Python</a> and investigates the use of auto-differentiation. As pointed out in <a href="https://arxiv.org/pdf/1502.05767.pdf">this article</a>, auto-differentiation "can be thought of as performing a non-standard interpretation of a computer program where this interpretation involves augmenting the standard computation with the calculation of various derivatives."
</p>
<p>
Auto-differentiation is neither symbolic differentiation nor numerical approximation via finite-difference methods. Instead, it augments your code so that derivatives of your functions are provided free of charge. In this post, we will be using the <code>autograd</code> package in python after defining a function in the usual <code>numpy</code> way. Another auto-differentiation choice in python is the Theano package, which is used by PyMC3, a Bayesian probabilistic programming package that I use in my research and teaching. Other python implementations probably exist, as auto-differentiation is becoming a must-have in the machine learning field, and implementations also exist in C/C++, R, Matlab, and other languages.
</p>
<p>
The three primary reasons for incorporating auto-differentiation capabilities into your research are:
</p>
<ol class="org-ol">
<li>In nearly all cases, your code will run faster. For some problems, much faster.</li>
<li>For difficult problems, your model is likely to converge closer to the true parameter values and may be less sensitive to starting values.</li>
<li>Your model will provide more accurate calculations of quantities like gradients and Hessians, so your standard errors will be more accurately calculated.</li>
</ol>
<p>
With auto-differentiation, gone are the days of deriving analytical derivatives and programming them into your estimation routine. In this short note, we show a simple example of auto-differentiation, expand on it for maximum likelihood estimation, and show that for problems where likelihood calculations are expensive, or where many parameters are being estimated, the speed-ups can be dramatic.
</p>
<p><a href="http://rlhick.people.wm.edu/posts/mle-autograd.html">Read more…</a> (8 min remaining to read)</p></div>
<p>Tags: autograd, ipython, maximum likelihood (Tue, 06 Mar 2018 08:30:50 GMT)</p>