We evolve small continuous-time recurrent neural networks (CTRNNs) with fixed weights that nevertheless exhibit Hebbian-like learning behavior. We describe the performance of the best and smallest successful system, providing an in-depth analysis of its evolved mechanisms. Learning is shown to arise from the interaction between dynamics operating at multiple timescales. In particular, we show how the fast-time dynamics alter the slow-time dynamics, which in turn shape the local behavior around the equilibrium points of the fast components by acting as a parameter to them.
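The multi-timescale mechanism described above can be illustrated with a minimal sketch. The following is not the evolved network from the paper, but a generic CTRNN under the standard model tau_i * dy_i/dt = -y_i + sum_j w_ij * sigma(y_j + theta_j), with hypothetical weights and a deliberately large time constant on one neuron so that its slowly varying state acts as a quasi-static parameter to the fast subsystem:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ctrnn_step(y, W, theta, tau, dt):
    # Standard CTRNN update (Euler integration):
    # tau_i * dy_i/dt = -y_i + sum_j w_ij * sigma(y_j + theta_j)
    return y + dt * (-y + W @ sigmoid(y + theta)) / tau

# Illustrative 3-neuron network: neurons 0-1 are "fast", neuron 2 is "slow".
# Weights and biases here are arbitrary placeholders, not the evolved solution.
rng = np.random.default_rng(0)
W = rng.uniform(-2.0, 2.0, size=(3, 3))   # fixed weights: no synaptic plasticity
theta = rng.uniform(-1.0, 1.0, size=3)
tau = np.array([1.0, 1.0, 50.0])          # widely separated timescales
dt = 0.05

y = np.zeros(3)
trajectory = []
for _ in range(4000):
    y = ctrnn_step(y, W, theta, tau, dt)
    trajectory.append(y.copy())
trajectory = np.array(trajectory)

# Because the slow neuron barely moves during a fast-neuron transient, its
# state effectively parameterizes the fast subsystem's equilibrium structure.
print(trajectory.shape)
```

On each fast transient the slow state is nearly constant, so the fast neurons relax toward equilibria whose locations depend on the slow variable; as the slow variable drifts, those equilibria move, which is the kind of parameterization the analysis refers to.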