Our paper on error estimation via optimal Bayesian transfer learning has now been published in Patterns

Our recent study on classification error estimation via optimal Bayesian transfer learning has been published in Patterns, a premium open-access journal from Cell Press that publishes ground-breaking original research across the full breadth of data science.

Omar Maddouri, Xiaoning Qian, Francis J. Alexander, Edward R. Dougherty, Byung-Jun Yoon, “Robust Importance Sampling for Error Estimation in the Context of Optimal Bayesian Transfer Learning,” Patterns, DOI: https://doi.org/10.1016/j.patter.2021.100428.

In this paper, we investigate the problem of accurately estimating classification error in small-sample settings and show that optimal Bayesian transfer learning can enhance the estimates by leveraging data from different yet relevant domains. In scientific domains with limited data availability, accurate classification error estimation is practically challenging. Although transfer learning (TL) offers a promising solution under such circumstances by learning from data available in other relevant domains, it had not previously been explored for enhancing error estimation. In this work, we place the problem of estimating classification error in a Bayesian paradigm and introduce a TL-based error estimator that significantly enhances the accuracy and robustness of error estimates under data scarcity. We demonstrate that the proposed TL-based Bayesian error estimation framework effectively models and exploits the relatedness between different domains to improve error estimation. Experimental results on both synthetic and real-world data show that the proposed error estimator clearly outperforms existing error estimators, especially in small-sample settings, by enabling us to tap into data from other relevant domains.
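To give a flavor of the importance-sampling idea underlying the estimator, the toy sketch below estimates an expected classification error under a target-domain posterior by drawing samples from a different (source-like) proposal distribution and reweighting them. All densities, parameters, and the error function here are hypothetical illustrations chosen for simplicity; they are not the paper's optimal Bayesian transfer learning posterior or its robust weighting scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def target_pdf(theta):
    # Hypothetical target-domain posterior over a 1-D classifier
    # parameter: N(0.5, 0.2^2).
    return np.exp(-0.5 * ((theta - 0.5) / 0.2) ** 2) / (0.2 * np.sqrt(2 * np.pi))

def proposal_pdf(theta):
    # Hypothetical source-domain proposal: N(0.3, 0.4^2); it is wider
    # than the target so the importance weights stay well behaved.
    return np.exp(-0.5 * ((theta - 0.3) / 0.4) ** 2) / (0.4 * np.sqrt(2 * np.pi))

def classification_error(theta):
    # Toy error of a classifier with parameter theta (illustrative only).
    return 0.1 + (theta - 0.5) ** 2

# Draw from the proposal, then reweight by target/proposal and
# self-normalize; the weighted average approximates the expected
# error under the target posterior.
theta = rng.normal(0.3, 0.4, size=20_000)
weights = target_pdf(theta) / proposal_pdf(theta)
weights /= weights.sum()
error_estimate = np.sum(weights * classification_error(theta))
```

For this toy setup the true expected error under the target posterior is 0.1 + 0.2² = 0.14, so the self-normalized estimate should land close to that value; the paper's contribution lies in constructing and robustifying such weights when the "proposal" data come from a relevant but distinct domain.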

For further details, please visit: