A tutorial on distribution-free uncertainty quantification using conformal prediction


Abstract

Statistical prediction models are ubiquitous in psychological research and practice. Increasingly, machine learning models are used. Quantifying the uncertainty of such predictions is rarely considered, partly because prediction intervals are not defined for many of the algorithms used. However, generating and reporting prediction models without information on the uncertainty of the predictions carries the risk of over-interpreting their accuracy. Conventional methods for prediction intervals (such as those defined for Ordinary Least Squares regression) are sensitive to violations of several distributional assumptions. This tutorial introduces conformal prediction, a model-agnostic, distribution-free method for generating prediction intervals with guaranteed marginal coverage, to psychological research. We start by introducing the basic rationale of prediction intervals using a motivating example. Then we proceed to conformal prediction, which is illustrated in three increasingly complex examples, using publicly available data and R code.
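The core idea behind split conformal prediction, as summarized in the abstract, can be sketched in a few lines. The tutorial itself uses R; the following is a minimal illustrative sketch in Python, with simulated data and an ordinary least squares point predictor chosen here purely for illustration. The nonconformity score is the absolute residual on a held-out calibration set, and the interval half-width is the appropriately inflated empirical quantile of those scores.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data for illustration: y = 2x + Gaussian noise
n = 1000
x = rng.uniform(0, 10, n)
y = 2 * x + rng.normal(0, 1, n)

# Split the data: one half to fit the model, one half to calibrate
x_train, y_train = x[:500], y[:500]
x_cal, y_cal = x[500:], y[500:]

# Fit any point predictor on the training half (here: simple OLS)
slope, intercept = np.polyfit(x_train, y_train, 1)

def predict(x_new):
    return slope * x_new + intercept

# Nonconformity scores on the calibration half: absolute residuals
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile for miscoverage level alpha = 0.1 (90% coverage);
# the ceil((n+1)(1-alpha))/n correction gives the finite-sample guarantee
alpha = 0.1
n_cal = len(scores)
level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
q = np.quantile(scores, level, method="higher")

# Distribution-free prediction interval for a new observation
x_new = 5.0
lo, hi = predict(x_new) - q, predict(x_new) + q
```

Because the scores only need to be exchangeable with the score of the new observation, the same recipe applies unchanged when the OLS fit is replaced by any machine learning model.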
