Article type: Research Article
Authors: Radovanović, Sandro [a,*] | Petrović, Andrija [b] | Dodevska, Zorica [c] | Delibašić, Boris [a]
Affiliations: [a] Faculty of Organizational Sciences, University of Belgrade, Serbia | [b] Singidunum University, Serbia | [c] The Institute for Artificial Intelligence Research and Development of Serbia, Serbia
Correspondence: [*] Corresponding author: Sandro Radovanović, Faculty of Organizational Sciences, University of Belgrade, Serbia. E-mail: sandro.radovanovic@fon.bg.ac.rs.
Abstract: With growing awareness of the societal impact of decision-making, fairness has become an important issue. More specifically, in many real-world situations, decision-makers can unintentionally discriminate against a certain group of individuals based on either inherited or appropriated attributes, such as gender, age, race, or religion. In this paper, we introduce a post-processing technique, called fair additive weighting (FairAW), for achieving group and individual fairness in multi-criteria decision-making methods. The methodology is based on changing the score of an alternative by imposing fair criteria weights. This is achieved through minimization of differences in the scores of individuals subject to a fairness constraint. The proposed methodology can be successfully used in multi-criteria decision-making methods where additive weighting is used to evaluate the scores of individuals. Moreover, we tested the method on both synthetic and real-world data and compared it to the Disparate Impact Remover and FA*IR methods, which are commonly used for achieving fair scoring of individuals. The obtained results showed that FairAW manages to achieve group fairness in terms of statistical parity while also retaining individual fairness. Additionally, our approach obtained the best equality in scoring between the discriminated and privileged groups.
Keywords: Additive weighting, multi-criteria decision making, algorithmic decision making, fairness, bias mitigation
DOI: 10.3233/IDA-226898
Journal: Intelligent Data Analysis, vol. 27, no. 4, pp. 1023-1045, 2023
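For orientation, the following is a minimal, illustrative sketch of the two ingredients the abstract refers to: additive-weighting scores and a statistical-parity check between a privileged and a discriminated group. The data, weights, threshold, and group labels are assumptions made up for this example; the FairAW optimization itself is described in the full paper and is not reproduced here.

```python
import numpy as np

# Illustrative criteria matrix: one row per individual, columns are
# min-max-normalized criteria values (hypothetical data, not from the paper).
X = np.array([
    [0.9, 0.4, 0.7],
    [0.6, 0.8, 0.5],
    [0.3, 0.9, 0.6],
    [0.8, 0.2, 0.4],
])
w = np.array([0.5, 0.3, 0.2])     # criteria weights, summing to 1 (assumed)
group = np.array([1, 1, 0, 0])    # 1 = privileged, 0 = discriminated (assumed labels)

# Simple additive weighting: score of each individual is the weighted sum of criteria.
scores = X @ w

# Statistical parity: compare the rate of favourable outcomes (score above a
# threshold) between the two groups; a gap close to zero indicates group fairness.
threshold = 0.5
favourable = scores >= threshold
parity_gap = favourable[group == 1].mean() - favourable[group == 0].mean()

print("scores:", scores.round(3))
print(f"statistical parity gap: {parity_gap:.3f}")
```

A post-processing approach in the spirit described by the abstract would adjust the criteria weights (and hence the scores) to drive this parity gap toward zero while keeping the scores of similar individuals close to their original values.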