Plastic Elastic 🕸


Differentiable Plasticity looks really interesting. Basically, a “plastic” component is added to every connection in a neural network that can change over time as learning continues. You keep the static weights from training like any other neural net, but this additional plastic weight keeps adjusting as needed. 🎛
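Roughly, each connection's effective weight is a fixed part plus a Hebbian trace scaled by a learned plasticity coefficient. Here's a toy NumPy sketch of that idea; the layer size, rates, and random values are purely illustrative, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # units in this toy recurrent layer (illustrative)

# In the real method, both the static weights and the plasticity
# coefficients are learned by gradient descent; random here for demo.
w = rng.normal(0, 0.1, (n, n))      # static weight component
alpha = rng.normal(0, 0.1, (n, n))  # learned plasticity coefficient
hebb = np.zeros((n, n))             # Hebbian trace, starts at zero
eta = 0.1                           # trace update rate (illustrative)

def step(x, hebb):
    """One forward step: effective weight = static + alpha * Hebbian trace."""
    y = np.tanh((w + alpha * hebb) @ x)
    # Hebbian update: connections between co-active units strengthen,
    # so the network keeps adapting after training is done.
    hebb = (1 - eta) * hebb + eta * np.outer(y, x)
    return y, hebb

x = rng.normal(size=n)
for _ in range(5):
    x, hebb = step(x, hebb)
```

The key point is that only `hebb` changes at "inference" time; the gradient-trained parts (`w` and `alpha`) stay fixed, so the network can adapt without retraining.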

The idea is based on “synaptic plasticity”. No, that’s not a jam band album. It’s a feature of the brain that allows the connections between neurons (synapses) to strengthen or weaken over time. 🧠

This could obviously be immensely helpful for models that need to keep learning over time, and the early results look impressive. I also wonder whether this would help reduce the risk of overfitting? 🤔

Src: Towards Data Science