That's where Hyperopt shines -- it's useful not only for tuning simple hyperparameters like the learning rate, but also for tuning more structural parameters in a flexible way. Hyperopt can vary the number of layers of different types, the number of neurons in a given layer, or even the type of layer to use at a certain position in the network, chosen from an array of options -- each of which may carry its own nested, tunable hyperparameters.
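As a minimal sketch of what such a nested space can look like (the layout, layer options, and names like `n_layers` and `dense_units` below are illustrative assumptions, not from the original text), Hyperopt lets you nest `hp.choice` branches so that each candidate layer type carries its own hyperparameters, which are only sampled when that branch is chosen:

```python
import numpy as np
from hyperopt import hp, fmin, tpe, Trials, space_eval

# Hypothetical nested search space -- all names here are illustrative.
space = {
    # Learning rate sampled on a log scale
    'learning_rate': hp.loguniform('learning_rate', np.log(1e-4), np.log(1e-1)),
    # Number of hidden layers
    'n_layers': hp.choice('n_layers', [1, 2, 3]),
    # Layer type at one position; each option has its own nested parameters
    'layer': hp.choice('layer', [
        {
            'type': 'dense',
            # quniform returns floats; cast to int when building the model
            'units': hp.quniform('dense_units', 32, 512, 32),
        },
        {
            'type': 'conv',
            'filters': hp.quniform('conv_filters', 16, 128, 16),
            'kernel_size': hp.choice('conv_kernel', [3, 5, 7]),
        },
    ]),
}

def objective(params):
    # In a real run, build and train a model from `params` and return its
    # validation loss. A synthetic loss keeps this sketch runnable end to end.
    lr = params['learning_rate']
    penalty = 0.1 if params['layer']['type'] == 'conv' else 0.0
    return (lr - 0.01) ** 2 + penalty

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=50, trials=trials)

# `best` stores hp.choice results as option indices; space_eval decodes them
print(space_eval(space, best))
```

Note that `fmin` reports `hp.choice` results as indices into the option list, so `space_eval` is used at the end to map the winning indices back to the actual chosen values.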