Introduction

The curse of dimensionality is at the heart of dynamic programming, and dynamic programming is the cornerstone of modern economic theory. But what is the curse of dimensionality, exactly? The aim of this blog post is to answer this question and to present one method that alleviates the curse: adaptive sparse grids.
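To make the curse concrete, here is a minimal sketch (in Python, purely illustrative and not from the post) of how the number of points in a regular tensor-product grid explodes with the dimension: with n points per dimension, a d-dimensional grid needs n^d points, so doubling the dimension squares the cost.

```python
def full_grid_points(n_per_dim: int, d: int) -> int:
    """Number of points in a tensor-product grid with n_per_dim points per dimension."""
    return n_per_dim ** d

# With just 10 points per dimension, the grid size grows exponentially in d:
for d in (1, 2, 5, 10):
    print(f"d = {d:2d}: {full_grid_points(10, d):,} grid points")
# d = 10 already requires 10 billion points -- this is the curse of dimensionality.
```

Sparse grids attack exactly this exponential growth: by discarding most of the tensor-product points, they keep the point count growing only mildly with the dimension, at a controlled cost in accuracy.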
TL;DR: adaptive sparse grids help limit the bite of the curse of dimensionality.
You can download the notebook for this post here.
Introduction

In a previous post, I discussed why Artificial Neural Networks (ANNs) are very popular tools: (i) they can approximate a very large class of functions; (ii) they work well in high-dimensional spaces; (iii) they can be trained efficiently using gradient descent (even faster if you have a GPU). In the application part, I showed how to use them in practice with Julia and Flux.jl on two toy examples.
Introduction

Artificial Neural Networks (ANNs) are very trendy at the moment, and rightly so.
They are used everywhere in big tech companies. For instance, when you use Google Translate, or when recommendations appear on your Netflix feed, complex artificial neural networks are at work behind the scenes. Behind AlphaGo's success against Lee Sedol at the game of Go, an ANN was used to identify the next best move.