Poster
in
Workshop: Setting up ML Evaluation Standards to Accelerate Progress

A meta analysis of data-driven newsvendor approaches

Simone Buttler · Andreas Philippi · Nikolai Stein · Richard Pibernik


Abstract:

Recently, a number of publications in leading operations management and operations research journals have proposed new models that combine machine learning and mathematical optimization to predict inventory decisions directly from historical demand and additional feature information. This paper presents the results of a meta-analysis of recent machine-learning-based approaches for solving the most prominent problem in operations management, the newsvendor problem. We find that the reproducibility of existing studies is typically low: authors evaluate their new approaches on small, proprietary datasets, do not share data and code, and use different benchmarks. We develop a reproducible, unified evaluation procedure and apply various models to a large and heterogeneous dataset. Our results do not support the findings and claims of most of the recent papers, and in several cases we even obtain contradicting results. Overall, the robustness of the newly proposed models appears to be low. To support both researchers and practitioners in the development and evaluation of new models, we provide extensive benchmark data and a Python library that contains open-source implementations of most existing models.
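As background for the problem studied here: the classical newsvendor solution orders the critical-ratio quantile of the demand distribution, and the simplest data-driven variant (sample average approximation) replaces the true distribution with the empirical one. The sketch below illustrates this baseline; the function name and example data are illustrative and are not taken from the authors' library.

```python
import numpy as np

def saa_newsvendor(demand_history, cu, co):
    """Sample average approximation (SAA) for the newsvendor problem.

    The expected-cost-minimizing order quantity is the critical-ratio
    quantile of the demand distribution, where the critical ratio is
    cu / (cu + co) (cu = underage cost, co = overage cost). SAA simply
    takes that quantile of the empirical demand distribution.
    """
    critical_ratio = cu / (cu + co)
    return float(np.quantile(demand_history, critical_ratio))

# Illustrative demand history; underage twice as costly as overage,
# so the critical ratio is 2/3 and we order above median demand.
demand = np.array([8.0, 10.0, 12.0, 9.0, 11.0, 15.0, 7.0, 10.0])
q = saa_newsvendor(demand, cu=2.0, co=1.0)
```

Feature-driven approaches, such as those surveyed in the paper, extend this baseline by conditioning the quantile estimate on covariates (e.g., via quantile regression or tree-based learners) rather than using a single unconditional quantile.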
