Additional Campus Affiliations
Assistant Professor, Electrical and Computer Engineering
Assistant Professor, Siebel School of Computing and Data Science
Assistant Professor, Coordinated Science Lab
Recent Publications
Engelken, R. (2023). Gradient Flossing: Improving Gradient Descent through Dynamic Control of Jacobians. In A. Oh, T. Naumann, A. Globerson, K. Saenko, M. Hardt, & S. Levine (Eds.), Advances in Neural Information Processing Systems 36 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023 (Advances in Neural Information Processing Systems; Vol. 36). Neural Information Processing Systems Foundation.
Engelken, R., Wolf, F., & Abbott, L. F. (2023). Lyapunov spectra of chaotic recurrent neural networks. Physical Review Research, 5(4), Article 043044. https://doi.org/10.1103/PhysRevResearch.5.043044
Engelken, R. (2023). SparseProp: Efficient Event-Based Simulation and Training of Sparse Recurrent Spiking Neural Networks. In A. Oh, T. Naumann, A. Globerson, K. Saenko, M. Hardt, & S. Levine (Eds.), Advances in Neural Information Processing Systems 36 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023 (Advances in Neural Information Processing Systems; Vol. 36). Neural Information Processing Systems Foundation.
Engelken, R., & Goedeke, S. (2022). A time-resolved theory of information encoding in recurrent neural networks. In S. Koyejo, S. Mohamed, A. Agarwal, D. Belgrave, K. Cho, & A. Oh (Eds.), Advances in Neural Information Processing Systems 35 - 36th Conference on Neural Information Processing Systems, NeurIPS 2022 (Advances in Neural Information Processing Systems; Vol. 35). Neural Information Processing Systems Foundation.
Engelken, R., Ingrosso, A., Khajeh, R., Goedeke, S., & Abbott, L. F. (2022). Input correlations impede suppression of chaos and learning in balanced firing-rate networks. PLoS Computational Biology, 18(12), Article e1010590. https://doi.org/10.1371/journal.pcbi.1010590