optimization

Simple query refactor - 100x faster

TL;DR - I recently set out to speed up a slow query behind a multi-column, cursor-paginated endpoint. A simple switch to expressing the WHERE filters as a tuple comparison makes a huge difference in performance on the exact same data with the same indexes. E.g., change where (a > c or (a = c and b > d)) to where (a, b) > (c, d). In the “tuple” case Postgres can use the index more efficiently and fetch more of the needed rows via the index condition itself, rather than walking the index, reading rows in, and filtering them out.
Read more
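The semantics of that rewrite can be sketched outside of SQL: Python tuples compare lexicographically, exactly like SQL row values, so the tuple form and the expanded or/and predicate are equivalent. A minimal sketch (the names a, b, c, d mirror the example above; the value range is made up for illustration):

```python
from itertools import product

def expanded(a, b, c, d):
    # The original multi-clause cursor filter:
    # where (a > c or (a = c and b > d))
    return a > c or (a == c and b > d)

def row_value(a, b, c, d):
    # The tuple form: where (a, b) > (c, d).
    # Python tuples, like SQL row values, compare lexicographically.
    return (a, b) > (c, d)

# The two predicates agree for every combination of small test values.
values = range(-2, 3)
assert all(
    expanded(a, b, c, d) == row_value(a, b, c, d)
    for a, b, c, d in product(values, repeat=4)
)
```

The performance difference comes from the planner, not the logic: both forms select the same rows, but the tuple form can be pushed down as a single index condition.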

Notes: Gradient Descent, Newton-Raphson, Lagrange Multipliers

TL;DR: A quick “non-mathematical” introduction to the most basic forms of the gradient descent and Newton-Raphson methods for solving optimization problems involving functions of more than one variable. We also look at the Lagrange multiplier method for optimization problems subject to constraints (and at what the resulting system of nonlinear equations looks like, e.g. what we could apply Newton-Raphson to). Introduction: Optimization problems are everywhere in engineering and science. If you can model your problem in a way that lets you write down some function that should be minimized or maximized (the objective function) to get the solution you want, then even in cases where there is no analytical solution (most real cases), you can often obtain a numerical solution.
Read more
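The most basic form of gradient descent mentioned above can be sketched in a few lines: repeatedly step against the gradient until the iterate settles near a minimum. The objective, step size, and starting point below are made up for illustration:

```python
def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Plain fixed-step gradient descent on a function of several variables.

    grad: callable returning the gradient as a list of partial derivatives.
    x0:   starting point.
    lr:   learning rate (step size).
    """
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        # Step against the gradient: x <- x - lr * grad f(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Example objective: f(x, y) = (x - 1)^2 + (y + 2)^2, minimized at (1, -2).
grad_f = lambda p: [2 * (p[0] - 1), 2 * (p[1] + 2)]

minimum = gradient_descent(grad_f, [0.0, 0.0])  # converges toward (1, -2)
```

Newton-Raphson follows the same loop shape but scales the step by the inverse Hessian instead of a fixed learning rate, trading cheap steps for faster convergence.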

Memoization in the Wild

Overview: Memoization (or memoisation) is a method used to optimize programs. Usually, at least in my experience, it’s one of the first topics introduced when dynamic programming algorithms are being discussed. With a quick Google search you can find the Wikipedia entry or a trillion other blogs about it - most will show the canonical example - the “hello world” of the topic - that is, using memoization to optimize a recursive implementation of a function that generates the n-th Fibonacci number (or sometimes a function computing factorials).
Read more
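That canonical example, memoized recursive Fibonacci, looks like this in Python (here functools.lru_cache does the caching; a hand-rolled dict keyed on n works just as well):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Without the cache this recursion takes exponential time; with it,
    # each fib(k) is computed once and then served from the memo table.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(30))  # -> 832040
```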