dataset
September 9, 2024
AGR: Age Group fairness Reward for Bias Mitigation in LLMs
title: AGR: Age Group fairness Reward for Bias Mitigation in LLMs
publish date:
2024-09-06
authors:
Shuirong Cao et al.
paper id:
2409.04340v1
abstract:
LLMs can exhibit age biases, resulting in unequal treatment of individuals across age groups. While much research has addressed racial and gender biases, age bias remains underexplored. The scarcity of instruction-tuning and preference datasets for age bias hampers its detection and measurement, and existing fine-tuning methods seldom address age-related fairness. In this paper, we construct age bias preference datasets and instruction-tuning datasets for RLHF. We introduce AGR, an age fairness reward, to reduce differences in the response quality of LLMs across age groups. Extensive experiments demonstrate that this reward significantly improves response accuracy and reduces performance disparities across age groups. Our source code and datasets are available at the anonymous \href{https://anonymous.4open.science/r/FairRLHF-D445/readme.md}{link}.
QA:
coming soon
Edited and compiled by: wanghaisheng. Updated: September 9, 2024