Deep Learning for Biomedical and Scientific Time Series
Deep learning has been applied to many time series and sequence-to-sequence mappings, but in many areas the best way forward is not clear. Industry probably leads, with audio and ride-hailing applications. We discuss several research examples, including daily COVID-19 data, solutions of ordinary differential equations, hydrology, and earthquakes. We consider both recurrent neural networks (LSTMs) and Transformer architectures. We stress the importance of representing dependence on time and region (such as the city where COVID-19 data is measured) and propose a framework for this. We show how, working with the industry consortium MLPerf, we may be able to establish best practices and help the community discover and apply these ideas in new fields.
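One common way to represent dependence on time and region, as the abstract describes, is to augment each observation's feature vector with a Transformer-style sinusoidal encoding of the time index and an encoding of the region identity. The sketch below is only illustrative, not the framework the talk proposes; the city names, feature values, and dimensions are hypothetical, and a one-hot region vector stands in for the learned embeddings a real model would use.

```python
import numpy as np

def sinusoidal_time_encoding(t, dim=8):
    # Transformer-style positional encoding applied to a scalar time index
    # (e.g. day number in a daily COVID-19 series).
    freqs = 1.0 / (10000.0 ** (np.arange(0, dim, 2) / dim))
    enc = np.empty(dim)
    enc[0::2] = np.sin(t * freqs)
    enc[1::2] = np.cos(t * freqs)
    return enc

# Hypothetical list of regions (cities) for illustration only.
regions = ["Chicago", "Houston", "Phoenix"]
region_index = {name: i for i, name in enumerate(regions)}

def encode_sample(features, t, region, time_dim=8):
    # Concatenate raw features with time and region encodings so the
    # network can condition its predictions on when and where the
    # measurement was taken.
    one_hot = np.zeros(len(regions))
    one_hot[region_index[region]] = 1.0
    return np.concatenate([features, sinusoidal_time_encoding(t, time_dim), one_hot])

# Example: 2 raw features (e.g. daily cases, mobility) on day 45 in Houston.
x = encode_sample(np.array([120.0, 3.0]), t=45, region="Houston")
# x has 2 feature dims + 8 time dims + 3 region dims = 13 dims total.
```

In a trained LSTM or Transformer, the one-hot region vector would normally be replaced by a learned embedding layer so that similar regions can share statistical strength.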
Dr. Geoffrey Charles Fox received a Ph.D. in Theoretical Physics from Cambridge University, where he was Senior Wrangler. He is now a Distinguished Professor of Engineering, Computing, and Physics at Indiana University, where he directs the Digital Science Center. He previously held positions at Caltech, Syracuse University, and Florida State University, after postdoctoral appointments at the Institute for Advanced Study in Princeton, Lawrence Berkeley Laboratory, and Peterhouse, Cambridge. He has supervised the Ph.D. theses of 73 students and published around 1500 papers (over 540 with at least ten citations) in physics and computing, with an h-index of 82 and over 38500 citations. He received the High-Performance Parallel and Distributed Computing (HPDC) Achievement Award and, in 2019, the ACM/IEEE-CS Ken Kennedy Award for foundational contributions to parallel computing. He is a Fellow of the APS (Physics) and the ACM (Computing) and works at the interdisciplinary interface between computing and applications. He is involved in several projects to enhance the capabilities of Minority Serving Institutions. He has experience in online education and its use in MOOCs for areas such as Data and Computational Science. He is active in the industry consortium MLPerf.