torginus 3 hours ago

The biggest weakness of genetic algorithms is that they can't make use of gradients, meaning they have no sense of which direction to 'move' toward a solution. They end up guessing and refining their guesses, which makes them much slower to converge.

Their advantage is that they don't require gradients (so the fitness function doesn't need to be differentiable), but I don't think they're going to be the next big thing.
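To make the guess-and-refine point concrete, here is a minimal sketch (my own illustration, not from the article) comparing a toy genetic-style search with plain gradient descent on f(x) = sum of squares. The population size, mutation scale, and step counts are arbitrary choices:

```python
# Illustrative toy example, not a real library API.
import random

def fitness(x):
    # Lower is better; treated as a black box, no gradient needed.
    return sum(v * v for v in x)

def evolve(dim=5, pop_size=20, generations=200, sigma=0.1):
    # Genetic-style search: guess, mutate, keep the best guesses.
    pop = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]
        children = [
            [v + random.gauss(0, sigma) for v in random.choice(parents)]
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return min(pop, key=fitness)

def gradient_descent(dim=5, steps=200, lr=0.1):
    # Gradient search: each step moves directly downhill (d/dx of x^2 is 2x).
    x = [random.uniform(-1, 1) for _ in range(dim)]
    for _ in range(steps):
        x = [v - lr * 2 * v for v in x]
    return x

print(fitness(evolve()), fitness(gradient_descent()))
```

The evolutionary loop only ever compares fitness values, so it works on non-differentiable objectives, but it needs many evaluations per generation; the gradient version knows the downhill direction at every point and converges in far fewer steps.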

jokoon 3 hours ago

Probably much better than gradient descent and co

dang 3 hours ago

The submitted title was "Evolution is still a valid machine learning technique" but that sentence doesn't appear in the article, which is really about (interesting!) specific work on a specific program. I couldn't find a better representative sentence in the article so I went with the page's own title, which is what the HN guidelines default to anyway (https://news.ycombinator.com/newsguidelines.html).