Evaluating Evaluation Metrics: A Framework for Analyzing NLG Evaluation Metrics using Measurement Theory
Article Status
    Published
Authors/contributors
    - Xiao, Ziang (Author)
    - Zhang, Susu (Author)
    - Lai, Vivian (Author)
    - Liao, Q. Vera (Author)
Title
    Evaluating Evaluation Metrics: A Framework for Analyzing NLG Evaluation Metrics using Measurement Theory
Abstract
    We address a fundamental challenge in Natural Language Generation (NLG) model evaluation -- the design and evaluation of evaluation metrics. Recognizing the limitations of existing automatic metrics and the noise in how current human evaluation is conducted, we propose MetricEval, a framework informed by measurement theory, the foundation of educational test design, for conceptualizing and evaluating the reliability and validity of NLG evaluation metrics. The framework formalizes the sources of measurement error and offers statistical tools for evaluating evaluation metrics based on empirical data. With our framework, one can quantify the uncertainty of a metric to better interpret its results. To exemplify the use of our framework in practice, we analyzed a set of evaluation metrics for summarization and identified issues related to a conflated validity structure in human evaluation and to reliability in LLM-based metrics. Through MetricEval, we aim to promote the design, evaluation, and interpretation of valid and reliable metrics to advance robust and effective NLG models.
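Note
    The abstract refers to statistical tools for quantifying the reliability and uncertainty of evaluation metrics. The sketch below is an illustration only, not code from the paper: it computes Cronbach's alpha, a standard reliability estimate from classical measurement theory, over a hypothetical matrix of repeated metric scores. The variable names and example numbers are assumptions made for demonstration.

    # Illustrative sketch (not from the paper): estimating the reliability of an
    # NLG evaluation metric in the spirit of classical measurement theory.
    # Assumes a hypothetical score matrix of shape (n_texts, n_ratings),
    # e.g. repeated LLM-based metric runs or multiple human raters per output.
    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """Cronbach's alpha: internal-consistency reliability of k parallel ratings."""
        scores = np.asarray(scores, dtype=float)
        n_items = scores.shape[1]                   # k raters / repeated runs
        item_vars = scores.var(axis=0, ddof=1)      # variance of each rating column
        total_var = scores.sum(axis=1).var(ddof=1)  # variance of summed scores
        return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical example: 3 repeated metric runs over 5 generated summaries.
    scores = np.array([
        [4.0, 4.5, 4.2],
        [2.0, 2.5, 2.1],
        [3.5, 3.0, 3.4],
        [4.8, 4.6, 4.9],
        [1.5, 1.8, 1.6],
    ])
    print(f"Estimated reliability (alpha): {cronbach_alpha(scores):.3f}")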
Repository
    arXiv
Archive ID
    arXiv:2305.14889
Date
    2023-10-22
Accessed
    25/10/2023, 17:36
Short Title
    Evaluating Evaluation Metrics
Library Catalogue
    
Extra
    tex.ids= xiaoEvaluatingEvaluationMetrics2023a, xiaoEvaluatingEvaluationMetrics2023b
arXiv: 2305.14889 [cs]
<Title>: Evaluating Evaluation Metrics: A Framework for Analyzing NLG Evaluation Metrics using Measurement Theory
Citation Key: xiao2023
Citation
    Xiao, Z., Zhang, S., Lai, V., & Liao, Q. V. (2023). Evaluating Evaluation Metrics: A Framework for Analyzing NLG Evaluation Metrics using Measurement Theory (arXiv:2305.14889). arXiv. https://doi.org/10.48550/arXiv.2305.14889