I was trying to make sense of the event-stream vulnerability. My take:
Is evolution just another hill-climbing-like algorithm where the gradient is inefficiently estimated via sampling rather than direct differentiation? No. The usual formulations of evolutionary computation (EC) capture little of the algorithmically interesting aspects of evolution, which has led many researchers to wonder (perhaps correctly) whether there’s really any point to EC compared to more rationalized and efficient approaches to ML. This reaction is a bit like deeming human-powered flight a fruitless endeavor after seeing the first ineffectual ornithopters. Let’s not throw the baby out with the bathwater!
I read a fascinating book recently on the Fermi Paradox, which asks the question: if there are supposedly millions of other civilizations in the galaxy, why haven’t we detected any real evidence of their existence? One class of solutions says: what happened on Earth is a fluke. We’re alone in the galaxy. Gosh, that would be disappointing! Let’s have a closer look at one piece of the puzzle.
Haskell “enforces” typeclass coherence by asking that you define all the instances for a type in the same module (really, file) where you define that type. Whenever you import anything from that module, you bring all these instances into scope. It “works”, but it’s a kludge: when defining the type Foo a, you can’t anticipate (nor are you necessarily aware of) all the instances you might wish to give for Foo. We want an “open universe” where people can discover common structure after the fact and layer it onto existing data types. I feel this is an important property for a language ecosystem.
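To make the tension concrete, here’s a small sketch of the standard failure mode: the orphan instance. (The Semigroup-for-Int choice below is purely illustrative; base deliberately defines no such instance, since both (+) and (*) are reasonable candidates.)

```haskell
module Main where

-- This module defines neither the class (Semigroup, from base) nor
-- the type (Int, from base), so this is an *orphan instance*. GHC
-- merely warns about it (-Worphans); it compiles and runs fine.
instance Semigroup Int where
  (<>) = (+)

main :: IO ()
main = print ((2 :: Int) <> 3)  -- 5
```

The trouble comes later: if some other library makes the equally defensible choice `(<>) = (*)`, any module that transitively imports both gets a duplicate-instance error, and neither author did anything locally wrong. Keeping instances next to the type avoids the conflict, but it closes the universe: you can’t retrofit newly discovered structure onto types you don’t own.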
On this blog I’ve mused about the problems with snail mail: anyone with knowledge of your address gets a lifetime ability to cause mail and packages to show up at your house, as many times as they want. Email has the same problem with virtual message delivery. The result is that our inboxes, both physical and virtual, are filled mostly with content whose delivery we never actually authorized. They are mostly noise, and we spend lots of time just processing that noise because there is some amount of signal that we do want to be aware of and respond to with higher fidelity.