
Dr. Yoon’s collaborative work with Covid Act Now and Mass General Hospital is featured on the cover of Patterns from Cell Press

The paper “Looking Back on Forward-Looking COVID Models,” co-authored by Dr. Byung-Jun Yoon, has been featured on the cover of the July 2022 issue of Patterns, a premium open access journal from Cell Press. The paper is an outcome of Dr. Yoon’s ongoing collaboration with Covid Act Now (CAN) – the COVID-focused U.S. initiative of Act Now Coalition – and Massachusetts General Hospital (MGH).

The cover art of the July 2022 issue of Patterns features Dr. Yoon’s collaborative project with Covid Act Now and Mass General Hospital.

In this work, we presented the epidemiological model developed by Covid Act Now (CAN) and evaluated its performance by back-testing it against historical data. Similar analyses were performed for several other COVID models, and the results were compared. All models generally captured the potential magnitude and directionality of the pandemic in the short term. While epidemiological models have limitations, understanding those limitations allows them to be used as tools for “data-driven decision-making” during viral outbreaks.
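For readers unfamiliar with back-testing, the sketch below illustrates the general idea under simplified assumptions: a forecaster is run as if at past dates, and its short-horizon predictions are scored against the values that were actually observed afterwards. The data and the naive_forecast function are hypothetical stand-ins, not the CAN model or the paper’s evaluation protocol.

```python
# Minimal back-testing sketch (illustrative only; not the Covid Act Now model
# or the paper's exact evaluation protocol).
import numpy as np

def naive_forecast(history, horizon):
    """Hypothetical forecaster: project the most recent weekly trend forward."""
    recent_trend = (history[-1] - history[-8]) / 7.0
    return history[-1] + recent_trend * np.arange(1, horizon + 1)

def backtest(series, horizon=14, start=60):
    """Issue a forecast at each past date and score it against what actually happened."""
    errors = []
    for t in range(start, len(series) - horizon):
        predicted = naive_forecast(series[:t], horizon)
        observed = series[t:t + horizon]
        errors.append(np.mean(np.abs(predicted - observed)))  # mean absolute error
    return float(np.mean(errors))

# Synthetic "daily case" series standing in for historical data.
rng = np.random.default_rng(0)
days = np.arange(365)
cases = 1000 + 800 * np.sin(days / 40.0) + rng.normal(0, 50, size=days.size)

print(f"Average 14-day-ahead MAE over the back-test: {backtest(cases):.1f}")
```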

“Face the Music” – the cover image on the July 2022 issue of Patterns – was created by Anna Maybach, resident artist at Act Now Coalition.

The cover image is entitled “Face the Music”. The American idiom “face the music” means to accept consequences. It is thought to originate from an exhortation to face one’s stage fright. The sound waves in this image were created by superimposing Covid Act Now trend graphs representing cases, hospitalizations, ICU hospitalizations, and deaths in the United States since March 2020. This visualization represents the path of accepting the consequences of our actions and facing our fears in order to navigate the COVID pandemic: to “face the music.”

The paper can be accessed at the link below:

Paul Chong, Byung-Jun Yoon, Debbie Lai, Michael Carlson, Jarone Lee, Shuhan He, “Looking Back on Forward-Looking COVID Models,” Patterns, volume 3, issue 7, 100492, July 08, 2022, https://doi.org/10.1016/j.patter.2022.100492.

Craig H. Neilsen Foundation will fund a new research project on reinforcement learning for optimal electrical stimulation in spinal cord injury (SCI) patients

According to the National Spinal Cord Injury Statistical Center, approximately 18,000 new spinal cord injuries (SCI) occur each year in the United States. Spinal cord injuries often cause serious constipation or incontinence, which can reduce quality of life and may even be life-threatening. After a spinal cord injury, 41% of patients rated bowel dysfunction as a severe life-limiting problem.

The Craig H. Neilsen Foundation recently announced that it will fund a new research project that aims to develop an optimal electrical stimulation method via reinforcement learning (RL) to alleviate bowel dysfunction in SCI patients. In this project, Dr. Byung-Jun Yoon will collaborate with Dr. Hangue Park (PI, Texas A&M Electrical and Computer Engineering) and Dr. Cedric Geoffroy (Texas A&M College of Medicine) to develop a closed-loop stimulation scheme, with the ultimate aim of improving the quality of life of SCI patients as well as their caregivers.
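As a rough, hypothetical illustration of what a closed-loop, RL-driven stimulation scheme can look like (the environment, reward signal, and candidate amplitudes below are invented for the sketch and do not reflect the project’s actual design), a simple epsilon-greedy agent can adjust the stimulation level based on measured feedback:

```python
# Minimal closed-loop sketch (hypothetical environment and reward; not the
# project's actual stimulation protocol): an epsilon-greedy agent selects a
# stimulation amplitude, observes a simulated physiological response, and
# updates its value estimates.
import numpy as np

rng = np.random.default_rng(1)
amplitudes = np.array([0.5, 1.0, 1.5, 2.0, 2.5])  # candidate stimulation levels (mA)

def simulated_response(amplitude):
    """Stand-in for sensor feedback: response peaks near 1.5 mA, with noise."""
    return np.exp(-((amplitude - 1.5) ** 2)) + rng.normal(0, 0.05)

q_values = np.zeros(len(amplitudes))  # estimated value of each amplitude
counts = np.zeros(len(amplitudes))
epsilon = 0.1

for trial in range(500):
    if rng.random() < epsilon:              # explore a random amplitude
        a = rng.integers(len(amplitudes))
    else:                                   # exploit the current best estimate
        a = int(np.argmax(q_values))
    reward = simulated_response(amplitudes[a])
    counts[a] += 1
    q_values[a] += (reward - q_values[a]) / counts[a]  # incremental mean update

print("Learned best amplitude:", amplitudes[int(np.argmax(q_values))], "mA")
```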

For more information, please visit:

https://engineering.tamu.edu/news/2022/06/ecen-electrical-stimulation-could-help-bowel-dysfunction-after-spinal-cord-injury.html

Our recent work on transfer learning for error estimation has been featured in ACM Tech News and on the DOE Office of Science website

This week’s ACM Tech News has featured our recent work on Transfer Learning for Bayesian Error Estimation (TL-BEE):

https://technews.acm.org/archives.cfm?fo=2022-03-mar/mar-11-2022.html

This study has also been spotlighted on the Department of Energy (DOE) Office of Science website:

https://www.energy.gov/science/office-science

The work was recently published in Patterns from Cell Press, and the full article can be accessed at the link below:

Omar Maddouri, Xiaoning Qian, Francis J. Alexander, Edward R. Dougherty, Byung-Jun Yoon, “Robust Importance Sampling for Error Estimation in the Context of Optimal Bayesian Transfer Learning,” Patterns, 2022, DOI: https://doi.org/10.1016/j.patter.2021.100428

Omar Maddouri’s recent work on transfer learning for Bayesian error estimation featured on Texas A&M College of Engineering website

Recent work by Omar Maddouri, a Ph.D. candidate in the BioMLSP lab, on transfer learning for Bayesian error estimation has been featured in an article entitled “Doctoral student offers new insight into machine-learning error estimation,” published on the Texas A&M College of Engineering website.

The article can be found at: https://engineering.tamu.edu/news/2022/03/ecen-doctoral-student-offers-new-insight-into-machine-learning-error-estimation.html

Our paper on error estimation via optimal Bayesian transfer learning has now been published in Patterns

Our recent study on Bayesian error estimation via optimal Bayesian transfer learning has been published in Patterns, a premium open access journal from Cell Press that publishes ground-breaking original research across the full breadth of data science.

Omar Maddouri, Xiaoning Qian, Francis J. Alexander, Edward R. Dougherty, Byung-Jun Yoon, “Robust Importance Sampling for Error Estimation in the Context of Optimal Bayesian Transfer Learning,” Patterns, DOI: https://doi.org/10.1016/j.patter.2021.100428.
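To give a flavor of importance sampling for error estimation (a generic sketch only; the paper’s optimal Bayesian transfer learning estimator is substantially more involved), the toy example below reweights labeled samples from a source domain by a density ratio in order to estimate a classifier’s error rate under a shifted target domain. The distributions, labels, and classifier are all hypothetical.

```python
# Generic importance-sampling error estimate (illustrative; not the paper's
# estimator). Labeled samples come from a source distribution, and the
# classifier's error under a shifted target distribution is estimated by
# reweighting with the density ratio p_target(x) / p_source(x).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# 1-D source features; the target distribution is shifted to mean 0.7.
source_x = rng.normal(0.0, 1.0, size=2000)
target_mean = 0.7

# Ground-truth labels: class 1 if x > 0.3, with 5% label noise.
labels = ((source_x > 0.3) ^ (rng.random(2000) < 0.05)).astype(int)

def classifier(x):
    """A fixed, imperfect classifier with its decision boundary at 0."""
    return (x > 0.0).astype(int)

# Self-normalized importance weights.
w = norm.pdf(source_x, loc=target_mean) / norm.pdf(source_x, loc=0.0)
errors = (classifier(source_x) != labels).astype(float)

source_error = errors.mean()
target_error_is = np.sum(w * errors) / np.sum(w)

print(f"Error on source samples:        {source_error:.3f}")
print(f"Estimated error on target (IS): {target_error_is:.3f}")
```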


Finding robust biomarkers for more accurate and reproducible disease diagnosis/prognosis

Identifying robust diagnostic/prognostic biomarkers from gene expression data that can lead to accurate and reproducible predictions is a challenging problem.

In our recent paper “Deep graph representations embed network information for robust disease marker identification,” we show that deep graph representations learned using graph convolutional networks (GCNs) can identify effective markers that significantly improve predictive performance and reproducibility across different datasets and platforms.

Omar Maddouri, Xiaoning Qian, Byung-Jun Yoon, “Deep graph representations embed network information for robust disease marker identification,” Bioinformatics, btab772, https://doi.org/10.1093/bioinformatics/btab772.
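For illustration, the sketch below shows how a single graph convolutional layer embeds network information: each node’s features (for example, a gene’s expression-derived features) are mixed with those of its neighbors in the network. The toy graph, features, and weights are hypothetical, and the paper’s actual architecture and training procedure are more elaborate.

```python
# Minimal GCN-layer sketch (illustrative only): node features are propagated
# over a toy interaction network so that each embedding blends a node's own
# features with those of its neighbors.
import numpy as np

def gcn_layer(adjacency, features, weights):
    """One GCN propagation step: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    a_hat = adjacency + np.eye(adjacency.shape[0])          # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))  # degree normalization
    propagated = d_inv_sqrt @ a_hat @ d_inv_sqrt @ features @ weights
    return np.maximum(propagated, 0.0)                      # ReLU

rng = np.random.default_rng(3)
n_nodes, n_features, hidden_dim = 6, 4, 3

# Toy symmetric adjacency matrix standing in for a gene interaction network.
adjacency = np.zeros((n_nodes, n_nodes))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (0, 5)]:
    adjacency[i, j] = adjacency[j, i] = 1.0

features = rng.normal(size=(n_nodes, n_features))    # toy per-node feature vectors
weights = rng.normal(size=(n_features, hidden_dim))  # learnable layer weights

embeddings = gcn_layer(adjacency, features, weights)
print("Network-aware node embeddings:")
print(np.round(embeddings, 2))
```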


Our NeurIPS 2021 paper on efficient active learning for Gaussian process classification is now online

We are happy to announce that our NeurIPS 2021 paper entitled “Efficient Active Learning for Gaussian Process Classification by Error Reduction” is now available on OpenReview.net: https://openreview.net/pdf?id=UK15Hj9qX6I

Guang Zhao, Edward Dougherty, Byung-Jun Yoon, Francis Alexander, Xiaoning Qian, “Efficient Active Learning for Gaussian Process Classification by Error Reduction,” Thirty-Fifth Conference on Neural Information Processing Systems (NeurIPS 2021), Dec. 6 – 14, 2021.
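As a rough illustration of the error-reduction idea in active learning (a brute-force, Roy-and-McCallum-style criterion on toy data, not the paper’s more efficient acquisition), the sketch below scores each candidate query by the expected error on the unlabeled pool after retraining a Gaussian process classifier under each imagined label:

```python
# Toy expected-error-reduction active learning with a GP classifier
# (illustrative; the paper's acquisition and approximations are more
# sophisticated and efficient than this brute-force version).
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(4)

# Toy 1-D binary classification problem.
X = rng.uniform(-3, 3, size=(40, 1))
y = (X[:, 0] > 0).astype(int)

# Small initial labeled set with both classes represented.
labeled = list(np.flatnonzero(y == 0)[:2]) + list(np.flatnonzero(y == 1)[:2])
pool = [i for i in range(len(X)) if i not in labeled]

def expected_pool_error(model, X_pool):
    """Expected 0/1 error proxy: average probability of the less likely class."""
    proba = model.predict_proba(X_pool)
    return float(np.mean(1.0 - proba.max(axis=1)))

base = GaussianProcessClassifier(kernel=RBF(1.0)).fit(X[labeled], y[labeled])

scores = {}
for i in pool:
    p = base.predict_proba(X[i:i + 1])[0]      # current belief about point i's label
    risk = 0.0
    for label in (0, 1):                       # imagine receiving each possible label
        X_aug = np.vstack([X[labeled], X[i:i + 1]])
        y_aug = np.append(y[labeled], label)
        model = GaussianProcessClassifier(kernel=RBF(1.0)).fit(X_aug, y_aug)
        risk += p[label] * expected_pool_error(model, X[[j for j in pool if j != i]])
    scores[i] = risk                           # lower is better

best = min(scores, key=scores.get)
print(f"Query x = {X[best, 0]:.2f}, expected pool error {scores[best]:.3f}")
```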


Post-doc position available in Scientific Machine Learning (SciML)

We are happy to announce the availability of a new post-doc position in areas relevant to Scientific Machine Learning (SciML).

The position resides in the Applied Mathematics Group of the Computational Science Initiative (CSI) at Brookhaven National Laboratory (BNL), and the post-doctoral researcher will work with Dr. Byung-Jun Yoon and Dr. Nathan Urban on a project focused on scientific data reduction.

We invite outstanding candidates to apply for a post-doctoral research associate position in applied mathematics, machine learning, and scientific computing. This position offers a unique opportunity to conduct research on emerging interdisciplinary problems at the intersection of applied mathematics, machine learning, and high-performance computing (HPC), with applications in diverse scientific domains of interest to BNL and the Department of Energy (DOE).

Topics of specific interest include:

  • optimal decision-guided data reduction
  • feature extraction/engineering in high-dimensional compositional workflows that involve machine learning (ML) models
  • Bayesian inference and uncertainty quantification in scientific ML models
  • learning/optimization of low-dimensional latent feature spaces for ML surrogates.

The position includes access to world-class HPC resources, such as the BNL Institutional Cluster and DOE leadership computing facilities. Access to these platforms will allow computing at scale and will ensure that the successful candidate will have the necessary resources to solve challenging DOE problems of interest.

This program provides full support for a period of two years at CSI, with a possible extension. Candidates must have received a doctorate (Ph.D.) in applied mathematics, statistics, computer science, or a related field (e.g., mathematics, engineering, operations research, physics) within the past five years. The position offers a chance to conduct interdisciplinary, collaborative research within BNL programs, with a highly competitive salary.

For further information, please visit: [post-doc position announcement]

For inquiries, contact Dr. Byung-Jun Yoon or Dr. Nathan Urban.

KBTX news on scientific data reduction: is having more data always better for achieving scientific goals?

KBTX has featured Dr. Yoon and his team’s research project on scientific data reduction in a recent news story.

“The team says people tend to think the more data someone has, the better it is for achieving their goal, which is not always the case. That’s why they’re working on a mechanism that can get rid of the unnecessary data without compromising what’s needed.”

Andy Krauss, Reporter – KBTX News

Further details can be found at the link below:

https://www.kbtx.com/2021/10/07/texas-am-researchers-awarded-24-million-grant-efficiently-reduce-size-data-sets/
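For a loose, generic illustration of objective-aware data reduction (not the project’s actual methodology), the sketch below compresses a synthetic dataset and checks how few dimensions can be kept before a downstream prediction objective starts to degrade:

```python
# Generic sketch of objective-aware data reduction (illustrative only): compress
# the data and verify that a downstream prediction objective is preserved.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)

# Synthetic 100-dimensional data driven by 3 latent factors; the "scientific
# objective" here is predicting y, which depends only on those factors.
latent = rng.normal(size=(300, 3))
X = latent @ rng.normal(size=(3, 100)) + 0.1 * rng.normal(size=(300, 100))
y = latent @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.1, size=300)

baseline = cross_val_score(Ridge(), X, y, cv=5, scoring="r2").mean()
print(f"Full 100-dimensional data: R^2 = {baseline:.3f}")

for k in (1, 2, 3, 10):
    Z = PCA(n_components=k).fit_transform(X)    # keep only k dimensions
    score = cross_val_score(Ridge(), Z, y, cv=5, scoring="r2").mean()
    print(f"Reduced to {k:>2} dimensions:  R^2 = {score:.3f}")
```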

Slate’s Future Tense features Dr. Yoon’s research on scientific data reduction

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society, exploring how those technologies will change the way we live.

Future Tense recently published an article entitled “Why the Department of Energy Is Spending Millions to Get Rid of Scientific Data,” which features Dr. Yoon and his team’s research project on objective-driven reduction of scientific data, recently funded by the U.S. Department of Energy (DOE). The article can be accessed at:

https://slate.com/technology/2021/10/big-data-information-overload-department-of-energy.html

What is Future Tense?

“A partnership of Slate, New America, and Arizona State University, Future Tense explores how emerging technologies will change the way we live. The latest consumer gadgets are intriguing, but we focus on the longer-term transformative power of robotics, information and communication technologies, synthetic biology, augmented reality, space exploration, and other technologies. Future Tense seeks to understand the latest technological and scientific breakthroughs, and what they mean for our environment, how we relate to one another, and what it means to be human. Future Tense also examines whether technology and its development can be governed democratically and ethically. Future Tense asks these questions in daily commentary published on Slate and through public events featuring conversations with leading scientists, technologists, policymakers, and journalists.”

Quoted from the Future Tense website. For further information, visit: https://slate.com/future-tense