Python implementation of the self-attention (scaled dot-product attention) mechanism originally proposed in "Attention Is All You Need".
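
As a minimal sketch of the paper's formula, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V, the NumPy version below shows the core computation; the function name, the boolean mask convention, and the toy input are illustrative assumptions, not necessarily how this repo structures its code.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """Compute Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    q, k: arrays of shape (..., seq_len, d_k); v: (..., seq_len, d_v).
    mask: optional boolean array broadcastable to the score matrix,
          where True marks positions that may be attended to (assumed
          convention for this sketch).
    """
    d_k = q.shape[-1]
    # similarity scores, scaled by sqrt(d_k) to keep softmax gradients stable
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)   # (..., seq_q, seq_k)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)        # block masked positions
    # numerically stable softmax over the key axis
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # weighted sum of values
    return weights @ v

# Self-attention: queries, keys, and values all come from the same sequence.
x = np.random.randn(4, 8)                    # 4 tokens, model dimension 8
out = scaled_dot_product_attention(x, x, x)
print(out.shape)                             # (4, 8)
```

In a full Transformer layer, q, k, and v would first be produced by learned linear projections of the input, and several such attention functions run in parallel as heads; the sketch above covers only the attention computation itself.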