By: Waqar Hussain (@infinity4upk )
Associate Editor: Muneera Bano (@DrMuneeraBano)
Values such as fairness, sustainability and pleasure are ideals and aspirations that are important to humans. Values transcend specific situations, guide human decision making and influence how people evaluate the world around them [2]. When values are not designed for and delivered in software, there are ramifications. I call these value deficiencies or omissions in software Values Debt (VD). Unlike other forms of software debt, which can often be remedied with additional cost or effort, high VD renders software unsalvageable. It poses an existential threat to the ‘owner’ organization - something that happened to Volkswagen and Facebook [3].
VD is present in all kinds of software. Even mundane software, such as flight reservation systems, carries an inherent VD because it is built on the economic drivers of supply and demand rather than on values such as empathy and compassion. Delta Air Lines’ reservation system manifested its VD when it began price gouging people trying to escape the devastation of Hurricane Irma [4].
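To make the point concrete, here is a minimal, purely illustrative sketch of demand-driven fare pricing; the function names, the demand_ratio signal and the emergency cap are assumptions for illustration, not drawn from any real reservation system. The first function prices purely on scarcity; the second adds one explicit values check that caps fares once an emergency is declared.

def market_fare(base_fare: float, demand_ratio: float) -> float:
    # Naive supply-and-demand pricing: the scarcer the seats, the higher the fare.
    return base_fare * max(1.0, demand_ratio)

def values_aware_fare(base_fare: float, demand_ratio: float,
                      emergency_declared: bool, cap_multiplier: float = 1.5) -> float:
    # Same pricing logic, plus a single values check: during a declared
    # emergency, fares are capped to avoid price gouging.
    fare = market_fare(base_fare, demand_ratio)
    if emergency_declared:
        fare = min(fare, base_fare * cap_multiplier)
    return fare

# Toy usage: demand spikes 8x during a hurricane evacuation.
print(market_fare(200.0, 8.0))              # 1600.0 - price gouging
print(values_aware_fare(200.0, 8.0, True))  # 300.0  - capped

A check like this costs a few lines of code; the hard part, as discussed below, is recognizing that the value is missing in the first place.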
More alarmingly, VD is also present in Machine Learning (ML) and Artificial Intelligence (AI) based systems, which can easily scale and impact masses of people. These ‘intelligent’ ML/AI-based systems are often trained on biased data and consequently produce biased results, such as longer sentences for offenders of a particular skin colour, or denial of loans and jobs based on ethnicity or gender. Given the prevalence of VD in all software, the point to emphasize is: while software may be eating the world, VD is dining on software!
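As a concrete illustration of how such biased outcomes can be surfaced, the sketch below computes a simple disparate-impact ratio over a model's approval decisions, grouped by a protected attribute. The data, the group labels and the 0.8 'four-fifths' threshold are assumptions for this toy example, not a prescription for any particular system.

from collections import defaultdict

def disparate_impact(decisions, groups):
    # Ratio of the lowest group's approval rate to the highest group's.
    # Values well below 1.0 suggest the model favours one group over another.
    approved = defaultdict(int)
    total = defaultdict(int)
    for decision, group in zip(decisions, groups):
        total[group] += 1
        approved[group] += 1 if decision else 0
    rates = {g: approved[g] / total[g] for g in total}
    return min(rates.values()) / max(rates.values()), rates

# Toy loan-approval outputs from a model trained on historically biased data.
decisions = [1, 1, 1, 0, 1, 0, 0, 0, 1, 0]   # 1 = loan approved
groups    = ['a', 'a', 'a', 'a', 'a', 'b', 'b', 'b', 'b', 'b']
ratio, rates = disparate_impact(decisions, groups)
print(rates)   # {'a': 0.8, 'b': 0.2}
print(ratio)   # 0.25, far below the common 0.8 'four-fifths' rule of thumb

Metrics of this kind do not remove VD on their own, but they make a deficiency visible early enough to act on it.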
Contrary to common (practitioners’) belief, not all ills of software can be put to rest by adding ‘more’ code. For example, the tech giant Facebook had to undergo a social audit, rather than a technical overhaul of its platform, to address privacy, discrimination and other human rights issues.
But why is it so hard to embed even a small subset of human values in software? Firstly, values are subjective in nature and imprecisely defined. Software engineers, especially those with little or no background in the social sciences, can hardly relate values to their day-to-day work. Secondly, values are interconnected: implementing one value requires deliberately addressing the implementation of the values connected to it. Poor implementation of one value, e.g. privacy, might impact other connected values such as perceived autonomy and freedom. Thirdly, prioritizing stakeholder values (individual vs. corporate vs. societal) and aligning them in software delivery is extremely challenging.
But how do software engineers deal with values in practice? Our research shows a general lack of awareness among the creators of software about human values and how these relate to their work [4]. Some avoid dealing with the murky, philosophical territory of values and divorce themselves from taking up any responsibility. Others are deterred by the cost and effort required to achieve the desired level of maturity in implementing values, and are content with covering just enough regulatory requirements to be compliant. Yet others consider that the delivery of software itself takes care of human values - the attitude being ‘after all, we are solving people’s problems and keeping ourselves from doing any harm’.
What can be done to address this? Besides raising awareness of the importance of human values in SE and equipping practitioners with the right tools and techniques, more accountability is needed. Codes of professional conduct for software practitioners, and standards that support the development of ethical AI and ML such as the IEEE P7000 suite, are useful. Merely appealing to people’s good conscience, however, will not achieve the desired results. Regulations like the General Data Protection Regulation (GDPR) are also helpful [5]: they deter technological advancements from deliberately breaching societal values and hold organizations (and even individuals) accountable when software negatively impacts people or the environment.
Where to now? The Values Debt metaphor is an ice breaker to discuss, understand and thereby avoid the implications of human-values-deficient software. Imagine what the world would start to look like if we could engineer values in from the get-go. Could we avoid human rights deficiencies in software? Could we prevent failures like the Delta Air Lines ticketing system, labelled as one with “no built-in ethical values”? Could we ‘educate’ and restrain intelligent algorithms in time to stop them sending pictures of self-harm to children, and so avoid future suicides [6]? I believe we can, if we reduce software Values Debt.
References
[1] Boehm, B. Value-based software engineering. ACM SIGSOFT Software Engineering Notes, 28(2): 4, 2003.
[2] Schwartz, S. H., et al. Refining the theory of basic individual values. Journal of Personality and Social Psychology, 103(4): 663, 2012.
[3] Smith, D. Mark Zuckerberg vows to fight election meddling in marathon Senate grilling. The Guardian, 11 April 2018.
[4] Sablich, J. ‘Price Gouging’ and Hurricane Irma: What Happened and What to Do. The New York Times, 2017. https://www.nytimes.com/2017/09/17/travel/price-gouginghurricane-irma-airlines.html
[5] Voigt, P. and Von dem Bussche, A. The EU General Data Protection Regulation (GDPR): A Practical Guide, 1st ed. Cham: Springer International Publishing, 2017.
[6] Crawford, A. Instagram ‘helped kill my daughter’. BBC News, 2019. Retrieved 24 July 2019 from https://www.bbc.com/news/av/uk-46966009/instagram-helped-kill-my-daughter