Recent experimental studies have revealed surprising properties of biological fitness landscapes: rather than isolated peaks separated by fitness valleys, they exhibit highly connected, degenerate manifolds of high-fitness states. These findings challenge traditional theoretical frameworks and bear striking similarities to the loss landscapes encountered in deep learning, where stochastic gradient descent (SGD) effectively navigates complex, high-dimensional spaces and exhibits implicit regularization towards flat minima. Here, we investigate these parallels by comparing different optimization strategies on a simplified landscape with a flat, degenerate optimal manifold. We analyze the dynamics of SGD, Natural Evolution Strategies (NES), and population-based evolutionary algorithms with mutation and selection. Our results demonstrate that evolutionary dynamics, like SGD, display implicit regularization, preferentially exploring flat regions of degenerate high-fitness landscapes. This finding points to fundamental principles shared by optimization in artificial and biological systems, with important implications for understanding evolutionary robustness, biological degeneracy, and the role of high dimensionality in facilitating adaptive processes. Our comparative analysis
bridges the gap between machine learning optimization and evolutionary dynamics, offering insights into how different systems navigate complex fitness landscapes.
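To make the kind of comparison described above concrete, the following is a minimal illustrative sketch, not the actual model or code used in this work: a two-parameter loss (x*y - 1)^2 whose minima form a degenerate manifold of varying flatness, explored once by gradient descent with a noisy target (a crude stand-in for SGD minibatch noise) and once by a mutation-selection population. The specific loss, noise model, and all parameter values below are assumptions chosen for illustration; NES is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy loss with a degenerate manifold of optima: L(x, y) = (x*y - 1)^2 vanishes
# on the whole hyperbola x*y = 1.  The curvature normal to this manifold grows
# with x^2 + y^2, so on the positive branch the "flattest" optimum sits at (1, 1).
def loss(p, target=1.0):
    x, y = p
    return (x * y - target) ** 2

def grad(p, target=1.0):
    x, y = p
    r = x * y - target
    return np.array([2.0 * r * y, 2.0 * r * x])

def sgd_noisy_target(p0, lr=0.02, noise=0.5, steps=50000):
    """Gradient steps on a randomly perturbed target, mimicking minibatch noise."""
    p = np.array(p0, dtype=float)
    for _ in range(steps):
        p -= lr * grad(p, target=1.0 + noise * rng.standard_normal())
    return p

def mutation_selection(p0, pop_size=200, sigma=0.05, beta=10.0, gens=1000):
    """Population dynamics: Gaussian mutation, then fitness-proportional selection."""
    pop = np.tile(np.asarray(p0, dtype=float), (pop_size, 1))
    for _ in range(gens):
        pop = pop + sigma * rng.standard_normal(pop.shape)           # mutate
        fitness = np.exp(-beta * np.array([loss(q) for q in pop]))   # loss -> fitness
        pop = pop[rng.choice(pop_size, size=pop_size, p=fitness / fitness.sum())]  # select
    return pop.mean(axis=0)

if __name__ == "__main__":
    start = (3.0, 1.0 / 3.0)  # on the optimal manifold, but in a sharply curved region
    print("noisy SGD endpoint:      ", sgd_noisy_target(start))
    print("mutation-selection mean: ", mutation_selection(start))
    # Both dynamics tend to wander along the zero-loss manifold toward the
    # flatter region near (1, 1); how strongly depends on step size, noise
    # level, mutation scale, and run length.
```

In this toy setting, the noisy-target gradient dynamics and the mutating, selected population are both biased toward the flat part of the optimal manifold, which is the qualitative behavior the abstract refers to; the quantitative analysis in the thesis itself is not reproduced here.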
Advisor: Prof. Naama Brenner