Chris Barrick
Monday, November 6, 2017
The Vanishing Gradient Problem and Its Solutions
This talk will review the vanishing gradient problem in deep learning and several heuristics for avoiding it. Topics will include Xavier and He weight initialization, the ReLU activation and its variants, batch normalization, and self-normalizing networks.
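As a taste of the initialization schemes above, here is a minimal NumPy sketch (not from the talk itself) of Xavier and He initialization. Both scale the weight variance by the layer's fan-in (and fan-out, for Xavier) so that activation and gradient magnitudes stay roughly constant through depth; the function names and shapes here are illustrative.

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng):
    # Xavier/Glorot initialization: variance 2/(fan_in + fan_out),
    # aimed at activations that are roughly linear near zero (e.g. tanh).
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

def he_init(fan_in, fan_out, rng):
    # He initialization: variance 2/fan_in, which compensates for ReLU
    # zeroing out half of its inputs on average.
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

rng = np.random.default_rng(0)
W = he_init(512, 512, rng)
print(W.std())  # should be close to sqrt(2/512) ≈ 0.0625
```

With 512 inputs the He standard deviation is sqrt(2/512) ≈ 0.0625; the empirical standard deviation of the sampled matrix lands very near that value.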
Chris Barrick is a second-year master's student in the Institute for Artificial Intelligence at UGA, where he develops models for solar radiation forecasting. Before grad school, Chris received bachelor's degrees from UGA in both Computer Science and Cognitive Science.