writeup.ai

1 · Shane Mulligan · Oct. 15, 2019, 4 p.m.
Original article: https://senrigan.io/blog/how-writeupai-runs-behind-the-scenes/#h%5F6068056784021570782144062

Glossary

cross entropy loss (between two probability distributions, over the same underlying set of events)
- Higher loss is bad.
- Measures the performance of a classification model whose output is a probability value between 0 and 1.
- Measures the average number of bits needed to identify an event from the set. ...
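For reference, the textbook formula that this glossary entry paraphrases (the formula itself does not appear in the original excerpt): given a true distribution p and a predicted distribution q over the same set of events X,

  H(p, q) = -\sum_{x \in X} p(x) \log q(x)

With a base-2 logarithm, this is the average number of bits needed to identify an event drawn from p when using a code optimized for q; the better q matches p, the lower the loss.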