This technical analysis traces striking structural similarities between neural networks and cryptographic ciphers across multiple design levels, from sequential state processing (RNNs vs. SHA-3) to parallel architectures (Transformers vs. Message Authentication Codes). Rather than cross-field borrowing, the author argues that both fields independently converged on the same patterns under three shared constraints: minimal correctness requirements, an emphasis on complex information mixing, and aggressive optimization for hardware performance.
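The sequential-processing parallel the analysis draws can be sketched as a toy example. This is illustrative only and not from the article: the mixing functions below are made up stand-ins (an RNN's learned transition on one side, a sponge-style XOR-then-permute step on the other). The point is the shared shape: both fold a stream of inputs into a fixed-size state via `state = f(state, input)`.

```python
def rnn_step(state, x):
    # Toy stand-in for a learned RNN transition: each state element is
    # updated by the new input (a real RNN would use weight matrices
    # and a nonlinearity such as tanh).
    return [(s * 3 + x) % 256 for s in state]

def sponge_absorb(state, block):
    # Toy stand-in for a sponge (SHA-3 style) absorb step: XOR the
    # input block into the state, then apply a fixed permutation
    # (here, a simple rotation instead of the Keccak-f permutation).
    mixed = [s ^ b for s, b in zip(state, block)]
    return mixed[1:] + mixed[:1]

# Both architectures share the same outer loop: a fixed-size state
# absorbs one input chunk at a time.
state = [0, 0, 0, 0]
for x in [1, 2, 3]:
    state = rnn_step(state, x)

digest = [0, 0, 0, 0]
for block in [[1, 0, 0, 0], [2, 0, 0, 0]]:
    digest = sponge_absorb(digest, block)
```

The mixing functions differ completely (learned vs. fixed, differentiable vs. bit-level), but the sequential state-update skeleton is identical, which is the structural similarity the article highlights.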
Research
Why are neural networks and cryptographic ciphers so similar? (2025)
RNNs structurally mirror SHA-3, while Transformers parallel MACs: evidence of independent convergent evolution in neural networks and cryptography, with both fields optimized for hardware performance and complex information mixing.
Monday, May 4, 2026, 12:00 PM UTC · 2 min read · Source: Hacker News · By sys://pipeline
Tags
research