The highlight of SODA for me was the plenary talk Computational Geometry for Non-Geometers by Timothy Chan, especially the part in which he discussed instance optimality. The basic idea is to analyze an algorithm on each input individually: on every input, it should compete with the best algorithm tailored specifically to that input (but not to the order in which the input is given). I believe this was first considered in a geometric setting (Instance-Optimal Geometric Algorithms), so you could imagine that the input is a set of points and the algorithm you're competing against knows the point set but not the order in which the points will be given to it. This wonderfully fixes a problem with worst-case analysis, in which an algorithm might perform poorly, relative to what is achievable, on most input instances, yet still be deemed optimal because on some hard instance its performance is the best possible. Under an instance-optimal analysis, by contrast, the algorithm is expected to perform well on every input instance.
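To make the competition precise, here is my own sketch of the order-oblivious definition from that paper (the notation is my paraphrase, not a quote):

```latex
% Order-oblivious instance optimality (my paraphrase of the definition).
% A is the algorithm being analyzed, S a point set, \sigma an ordering of S,
% and cost(A', \sigma(S)) the running time of A' when S arrives in order \sigma.
% A is instance-optimal if, for every point set S,
\mathrm{cost}(A, S) \;\le\; c \cdot \mathrm{OPT}(S),
\qquad
\mathrm{OPT}(S) \;=\; \min_{A'} \; \max_{\sigma} \; \mathrm{cost}\bigl(A', \sigma(S)\bigr),
% where A' ranges over all correct algorithms and c is a fixed constant.
```

The max over orderings is what keeps the comparison fair: the competing algorithm may be tuned to the point set, but it cannot hard-code the arrival order.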
In the study of data structures for moving objects, a similar idea known as motion sensitivity was introduced earlier, in a motion-specific setting. Motion sensitivity describes algorithms that perform well under orderly motion and whose performance degrades as the point motion becomes more random. In my own work, I use entropy-based analyses to capture this concept. In fact, the instance-optimal analysis Timothy described also yields an entropy-like inequality, so perhaps this relationship is inevitable for analyses of this kind.
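For the 2-D convex hull, if I remember the statement correctly, the optimal cost takes exactly this entropy-like shape (this is my reconstruction of the bound, so treat the details as approximate):

```latex
% Entropy-like bound for 2-D convex hull (my reconstruction; details hedged).
% \Pi ranges over suitable partitions of the point set S into groups
% S_1, \dots, S_t, roughly: groups that can be enclosed in regions
% hidden from the hull boundary.
\mathrm{OPT}(S) \;=\; \Theta\bigl(n\,(\mathcal{H}(S) + 1)\bigr),
\qquad
\mathcal{H}(S) \;=\; \min_{\Pi} \sum_{k} \frac{|S_k|}{n}\,\log\frac{n}{|S_k|}.
```

Since $\mathcal{H}(S)$ ranges from $0$ on highly clustered inputs to $\log n$ on spread-out ones, the bound interpolates between linear time and the worst-case $n \log n$, which is the same flavor of input-dependent degradation that motion sensitivity tries to capture.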
It's an exciting new area that I'm hoping to see a lot more of. It was certainly an excellent talk.