Nearest neighbors and vector models – epilogue – curse of dimensionality

Erik Bernhardsson · Oct. 20, 2015, 4 a.m.
This is another post based on my talk at NYC Machine Learning. The previous two parts covered most of the interesting material, but there are still a few topics left to discuss. To go back and read the meaty stuff, check out:

Part 1: What are vector models useful for?
Part 2: How to search in high dimensional spaces – algorithms and data structures

You should also check out the slides and the video if you're interested.

Anyway, let's talk about the curse of dimensionality today. This pic was o...
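Before digging in, here's a minimal sketch (not from the original post; the function name and parameters are made up for illustration) of the effect the term refers to: for random points, as the number of dimensions grows, pairwise Euclidean distances concentrate around a common value, so the contrast between the nearest and the farthest neighbor shrinks.

```python
import math
import random

def distance_spread(dim, n=200, seed=0):
    """Relative gap between the nearest and farthest neighbor of a
    random query point among n random points in [0, 1]^dim."""
    rng = random.Random(seed)
    points = [[rng.random() for _ in range(dim)] for _ in range(n)]
    query = [rng.random() for _ in range(dim)]
    dists = sorted(math.dist(query, p) for p in points)
    # (d_max - d_min) / d_min: large means neighbors are easy to tell
    # apart, near zero means everything is roughly equally far away.
    return (dists[-1] - dists[0]) / dists[0]

for d in (2, 16, 128, 1024):
    print(d, round(distance_spread(d), 3))
```

Running this, the spread drops sharply as the dimension goes up, which is exactly why naive nearest neighbor search gets harder in high dimensional spaces.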