Detailed Explanation and Code Implementation of the Self-Attention Mechanism in the Transformer
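Since the title promises a code implementation, here is a minimal sketch of single-head scaled dot-product self-attention in NumPy. The function name `self_attention` and the tiny dimensions (3 tokens, `d_model = 4`, `d_k = 2`) are illustrative choices, not from the original text; a real Transformer would use multiple heads and learned projection weights.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(x)
    return e / np.sum(e, axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """Single-head scaled dot-product self-attention.

    X:             (seq_len, d_model) input embeddings
    W_q, W_k, W_v: (d_model, d_k) projection matrices
    Returns the (seq_len, d_k) context vectors and the attention weights.
    """
    Q = X @ W_q                               # queries
    K = X @ W_k                               # keys
    V = X @ W_v                               # values
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)           # (seq_len, seq_len) similarity
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V, weights

# Tiny illustrative example: 3 tokens, d_model = 4, d_k = 2.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
W_q, W_k, W_v = (rng.standard_normal((4, 2)) for _ in range(3))
out, attn = self_attention(X, W_q, W_k, W_v)
print(out.shape, attn.shape)  # (3, 2) (3, 3)
```

Each output row is a weighted average of the value vectors, with weights derived from query-key similarity; dividing by `sqrt(d_k)` keeps the dot products from saturating the softmax as the key dimension grows.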