疫情下的机器人护理问题:算法中的种族与性别歧视
Robot Caregiving in a Pandemic: Racial and Gender Discrimination in Algorithms

吴佳佳    海南师范大学
时间:2023-06-20 语向:中-英 类型:人工智能 字数:5447
  • 编者按:随着新冠对全世界人民的健康安全造成了巨大威胁,科技公司将目光投向了能够安全地为病人和老年人提供护理的机器人开发领域。但是,看似先进且无害的人工智能护理机器人可能蕴含着开发者本身的性别和种族偏见。护理领域非常依赖于有色人种贫困妇女的劳动,考虑到机器人护理的高昂花费,未来的工作仍然由“必不可少的”低工资劳动力组成。她们往往缺乏维持生活的工资、无法保障工作场所的安全、没有带薪病假和探亲假,以及适当的医疗保健。
    Editor's Note: As COVID-19 poses an enormous threat to the health and safety of people around the world, technology companies have turned their attention to developing robots that can safely provide care for patients and the elderly. But seemingly advanced and harmless AI care robots may carry their developers' own gender and racial biases. The care field relies heavily on the labor of poor women of color; given the high cost of robot caregiving, the work of the future will still consist of "essential" low-wage labor. These workers often lack a living wage, workplace safety, paid sick leave and family leave, and adequate medical care.
  • 然而,从使用白人女性的形象作为格蕾丝的“面孔”,到使用白人的外貌抹去EngKey机器人背后的“菲律宾英语教师”,再到被设计为“美丽女性”的护理机器人爱丽卡,这些例子揭露了设计师的认识:照顾世界的劳动责任主要应该由妇女来承担,而“白人”教师与护理人员则要优于其他种族的工作人员。
    However, from the use of a white woman's image as Grace's "face," to the white appearance that erases the "Filipino English teachers" behind the EngKey robot, to Erica, a care robot designed as a "beautiful woman," these examples reveal the designers' understanding: that the labor of caring for the world should fall mainly to women, and that "white" teachers and caregivers are superior to workers of other races.
  • 据统计,到2030年,有偿与无偿的护理人员数量都将面临巨大空缺;但是,机器人自动化是否是解决这一短缺的最佳方式仍有争议。即使假设在不久的将来,同理心、情感劳动和创造力可以随时被机械化(许多机器人专家对此表示怀疑),如果代码中的种族主义和性别偏见得不到解决,将这些类型的护理外包出去可能会给接受护理的人带来严重的后果。本文原载于《波士顿书评》,本文作者安娜罗米娜格瓦拉(Anna Romina Guevarra)是芝加哥伊利诺斯大学全球亚洲研究的创始董事和副教授,公众之声专题项目研究员。她是《营销梦想和制造英雄:菲律宾工人的跨国劳工中介》一书的作者。
    According to projections, by 2030 there will be huge shortfalls in both paid and unpaid caregivers; whether robotic automation is the best way to address this shortage, however, remains contested. Even assuming that empathy, emotional labor, and creativity could soon be readily mechanized (which many roboticists doubt), outsourcing these types of care could have serious consequences for care recipients if the racism and gender bias in the code are not addressed. This article originally appeared in Boston Review. Its author, Anna Romina Guevarra, is founding director and associate professor of Global Asian Studies at the University of Illinois at Chicago and a Public Voices fellow. She is the author of Marketing Dreams, Manufacturing Heroes: The Transnational Labor Brokering of Filipino Workers.
  • 在新冠疫情最为严重的时候,由SingularityNET(SNET)和汉森机器人公司(Hanson Robotics)两家机器人公司合资成立的觉醒健康有限公司(AHL)推出了格蕾丝(Grace)——第一个拥有逼真人类外观的医疗机器人。格蕾丝通过让病人参与治疗互动、接受认知刺激以及收集和管理病人数据,来提供急性医疗和老年护理。到2021年底,汉森机器人公司希望能够大规模生产名为索菲亚(Sophia)的机器人,将其融入其最新产品格蕾丝中,并将产品推向全球市场。
    At the height of the COVID-19 pandemic, Awakening Health Ltd. (AHL), a joint venture between SingularityNET (SNET) and Hanson Robotics, launched Grace, the first medical robot with a lifelike human appearance. Grace provides acute medical and elder care by engaging patients in therapeutic interactions, offering cognitive stimulation, and collecting and managing patient data. By the end of 2021, Hanson Robotics hopes to mass-produce its robot Sophia, integrate that platform into its latest product, Grace, and bring the product to the global market.
  • 格蕾丝
    Grace
  • 虽然格蕾丝是首位看起来如此像人的机器人,但她并不是第一个医疗机器人。同汤米(Tommy)、尤米(Yumi)、斯蒂维(Stevie)、艾娃(Ava)和莫西(Moxi)一样,她是在世界各地的医院和老年护理机构工作的、日益增多的机器人护理人员中的一员。从床边护理和监测,到储存医疗用品、欢迎客人,甚至为孤立无援的居民共同主持卡拉OK之夜,它们做了一切能做的。它们一起被誉为解决我们大流行困境的方案。
    Although Grace is the first robot to look this human, she is not the first medical robot. Like Tommy, Yumi, Stevie, Ava, and Moxi, she is one of a growing number of robot caregivers working in hospitals and elder-care facilities around the world. From bedside care and monitoring to stocking medical supplies, welcoming visitors, and even co-hosting karaoke nights for isolated residents, they do it all. Together, they have been hailed as the solution to our pandemic predicament.
  • 在过去几年中,全球专业服务机器人的销售额增长了32%(112亿美元);仅在2018年至2019年期间,老年人辅助机器人的销量就增长了17%(9100万美元)。在新冠大流行期间,安全地提供护理和服务的独特挑战只会增加它们的吸引力。全世界越来越依赖机器人系统进行消毒、执行戴口罩和保持社会距离协议、监测患者的生命体征、运送物资和杂货、进行虚拟参观,甚至为毕业典礼提供便利。
    Over the past few years, global sales of professional service robots have grown by 32 percent (to US$11.2 billion); between 2018 and 2019 alone, sales of assistive robots for the elderly grew 17 percent (US$91 million). The unique challenge of providing care and services safely during the COVID-19 pandemic has only increased their appeal. The world increasingly relies on robotic systems to disinfect, enforce mask-wearing and social-distancing protocols, monitor patients' vital signs, deliver supplies and groceries, conduct virtual visits, and even facilitate graduation ceremonies.
  • 但对机器人的兴趣和投资的增加对人类工人意味着什么?
    But what does this growing interest and investment in robots mean for human workers?
  • 毫无疑问,从短期来看,尽管机器人可能会为人类工人提供一些支持,帮助人们尽可能地减少不安全的环境,但它们无法取代人类工人。以目前的机器人技术水平上,将人类完全替换成机器人员工,需要工作环境的可预测性达到一个不可能的程度。作为社交机器人领域的先驱之一,露西萨克曼(Lucy Suchman)指出:“当世界按照机器人所需要的方式被安排时,它们工作得最好。”机器人可以在工厂和仓库中良好地工作,因为流水线工作提供了一个统一的环境;在家庭和保健设施中,这种统一性则更难实现。
    In the short term, there is no doubt that while robots may offer human workers some support and help minimize exposure to unsafe environments, they cannot replace them. At the current state of robotics, fully replacing human employees with robots would require an impossible degree of predictability in the work environment. As Lucy Suchman, a pioneer of social robotics, points out: "Robots work best when the world has been arranged the way they need it to be." Robots perform well in factories and warehouses because assembly-line work provides a uniform environment; in homes and healthcare facilities, that uniformity is much harder to achieve.
  • 不过,从长远来看,机器人并不会总是如此“功能有限”。因此,至关重要的是,我们不仅要考虑机器人是否能取代人类工人(因为总有一天,这必然会发生),而且还要考虑它们是否应该取代人类工人。事实上,格蕾丝和其他机器人所代表的自动化尝试,不仅引发了人们对工作本质的普遍质疑,更加对关于做护理工作的意义提出了质疑。照顾另一个人意味着什么?反过来说,对于算法而言,关怀又意味着什么?
    In the long run, however, robots will not always be so "limited." It is therefore vital that we consider not only whether robots can replace human workers (someday, inevitably, they will) but whether they should. Indeed, the automation efforts that Grace and other robots represent raise questions not just about the nature of work in general but about what it means to do care work in particular. What does it mean to care for another person? And conversely, what does care mean to an algorithm?
  • 护理工作领域在很大程度上依赖于有色人种贫困妇女的劳动(而且她们往往是移民)。长期以来,她们被告知文明取决于她们的工作,但她们的工作报酬并不高;因此,关于是否要采用昂贵的机器人系统的问题就显得尤为突出了。从这个意义上说,任何关于护理机器人变革潜力的讨论,都必须受到现实的制约:正如蒲爱仁(Ai-jen Poo)和帕拉克沙阿(Palak Shah)所指出的,可预见的“工作未来”不是自动化。未来的工作仍然由“必不可少的”低工资劳动力组成,其中有色人种妇女的比例非常之高,她们往往缺乏维持生活的工资、无法保障工作场所的安全、没有带薪病假和探亲假,以及适当的医疗保健。事实上,卫生保健工作者是受大流行影响最严重的人群之一。美国疾病控制中心(Centers for Disease Control)的最新数据显示,迄今为止,已有450517名卫生保健人员感染了新冠病毒,超过1497人死亡,以有色人种为主。由于存在严重漏报,而且并非所有数据都可以被直接获得,实际数字可能会更高。
    The field of care work depends to a great extent on the labor of poor women of color (who are often immigrants). They have long been told that civilization depends on their work, yet that work is poorly paid; this makes the question of whether to adopt expensive robotic systems especially pointed. In this sense, any discussion of care robots' transformative potential must be tempered by reality: as Ai-jen Poo and Palak Shah point out, the foreseeable "future of work" is not automation. The work of the future will still consist of "essential" low-wage labor, performed disproportionately by women of color who often lack a living wage, workplace safety, paid sick leave and family leave, and adequate medical care. Indeed, health care workers are among the groups hardest hit by the pandemic. The latest data from the U.S. Centers for Disease Control show that 450,517 health care workers have been infected with COVID-19 to date and more than 1,497 have died, most of them people of color. Given serious underreporting and incomplete data, the true numbers are likely higher.
  • 与许多未来学家的期望相反,自动化不会自动产生一个更公正的劳动力市场。只有当劳动公正成为采用自动化的一个条件时,更公正的劳动力市场才会出现。否则,自动化只会使问题复杂化,使得那些在社会和经济上最脆弱的人所经历更为严重的不平等。
    Contrary to the expectations of many futurists, automation will not automatically produce a fairer labor market. A fairer labor market will emerge only when labor justice becomes a condition for adopting automation. Otherwise, automation will only compound the problem, deepening the inequalities experienced by the most socially and economically vulnerable.
  • 这是因为算法倾向于复制我们世界中已经存在的偏见。近年来,人工智能的批评者已经证明了这一点,比如乔伊·博拉姆维尼、萨菲娅·乌莫亚·诺布尔、以及鲁哈·本杰明(Ruha Benjamin),他们注意到算法的偏见已经被反映在了方方面面上,从面部识别系统无法识别有色人种,到搜索引擎中与黑人相关的搜索结果遭到技术标记。综上所述,这些都是用新技术编码的系统性种族偏见的“新吉姆代码”(New Jim Code)。虽然我们对这些系统内部工作原理的了解经常遭到掩盖,因为它们的算法是专有的,但输出结果很清楚:博拉姆维尼、诺布尔、本杰明和其他人的工作毫无疑问地表明,种族主义制度成为了计算系统和机器学习的基础。这些新技术带来了这些问题,却被认为是中立的——因为机器被认为没有偏见,而这只会加剧问题的严重性。
    This is because algorithms tend to replicate the biases that already exist in our world. In recent years, critics of artificial intelligence such as Joy Buolamwini, Safiya Umoja Noble, and Ruha Benjamin have demonstrated this, noting that algorithmic bias shows up everywhere, from facial recognition systems that fail to recognize people of color to search engines that technologically flag search results related to Black people. Taken together, these constitute the "New Jim Code": systemic racial bias encoded in new technologies. Although our view into how these systems work is often obscured because their algorithms are proprietary, the outputs are clear: the work of Buolamwini, Noble, Benjamin, and others shows beyond doubt that racist systems have become the foundation of computing and machine learning. That these technologies carry these problems yet are presumed neutral, because machines are presumed unbiased, only makes matters worse.
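The dynamic described here — a model that reproduces historical discrimination without its code ever mentioning race — can be sketched in a few lines. The hiring records below are invented purely for illustration; the point is that a "neutral" statistical model trained on biased outcomes encodes the bias itself.

```python
# Toy illustration (hypothetical data): an algorithm trained on biased
# historical decisions reproduces those decisions, even though the code
# itself never references race or gender.

# Historical records as (group, qualified, hired). Equally qualified
# group "A" applicants were hired half as often as group "B" applicants.
history = (
    [("A", True, False)] * 30 + [("A", True, True)] * 20 +
    [("B", True, False)] * 10 + [("B", True, True)] * 40
)

def hire_rate(records, group):
    """Fraction of qualified applicants from `group` who were hired."""
    outcomes = [hired for g, qualified, hired in records
                if g == group and qualified]
    return sum(outcomes) / len(outcomes)

# A "neutral" model that simply predicts the historical hire rate per
# group inherits the disparity wholesale.
model = {g: hire_rate(history, g) for g in ("A", "B")}
print(model)  # {'A': 0.4, 'B': 0.8} -- the past bias, now encoded
```

Nothing in the model's code is discriminatory on its face; the bias lives entirely in the training data, which is exactly why such systems can pass as "neutral."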
  • 回到格蕾丝的例子,她将作为机器人索菲亚的更新版本,其经过调整的功能将服务于医疗保健领域。她的制作者声称,这些机器人不仅承诺提供安全,还会提供 “人类的温暖,人类的联系”,并将作为“人类专业知识的自主延伸”。然而,通过选择让格蕾丝看起来像一个白人女性,设计师们传播了一种对人类专业知识的特殊理解,这种理解既是种族主义的,也是性别主义的。
    Return to the example of Grace: she is to be an updated version of the robot Sophia, her features adapted to serve the healthcare field. Her makers claim that these robots promise not only safety but also "human warmth, human connection," and that they will serve as "an autonomous extension of human expertise." Yet by choosing to make Grace look like a white woman, the designers propagate a particular understanding of human expertise, one that is both racist and sexist.
  • 在一个名为EngKey的远程教学机器人的案例中,也可以看到类似的偏见。这款机器人的设计目的,是帮助菲律宾呼叫中心的教师向韩国的小学生提供远程英语教学。尽管授课的是一位菲律宾老师,但EngKey机器人的脸上带着白色的头像,在韩国的教室里转来转去。EngKey的开发者说,使用白色头像的理由有两个:一是尽量减少用户在与谁(人还是机器人)互动方面的困惑;二是加强英语教学的全球“权威”。然而,在这样做的过程中,机器人专家介入了劳动的地缘政治,加强了对理想的合格工人的全球幻想,抹去了不符合这个理想的实际工人。而他们自称的“机器人教师”,在这种安排下被迫通过一套强化白人的新词汇来运作,即使这种脚本剥削了他们自己的“第三世界”劳动。与我交谈的机器人教师表达出了一种深刻的分离感,这种感觉来自于一张金发碧眼的白人机器人脸,同时进行一种既脱离了身体、又依赖于身体,既不能移动、又能移动的情感劳动。值得注意的是,这样做并不是为了保证语言教学的成功(语言教学当然并不依赖于这种分裂),而是为了按照机器人专家的要求,创造人与机器之间的无缝集成。
    A similar bias appears in the case of a tele-teaching robot called EngKey, designed to let teachers in Philippine call centers deliver remote English instruction to South Korean schoolchildren. Although the instructor is Filipino, the EngKey robot wheels around South Korean classrooms wearing a white avatar as its face. EngKey's developers gave two reasons for the white avatar: to minimize users' confusion about whom (human or robot) they are interacting with, and to reinforce the global "authority" of English teaching. In doing so, however, the roboticists intervened in the geopolitics of labor, reinforcing a global fantasy of the ideal, qualified worker while erasing the actual workers who do not fit that ideal. Under this arrangement, the self-styled "robot teachers" are forced to operate through a new vocabulary that reinforces whiteness, even as the script exploits their own "Third World" labor. The robot teachers I spoke with expressed a deep sense of dissociation, produced by working behind a blond, blue-eyed white robot face while performing emotional labor that is at once disembodied and embodied, immobile and mobile. Notably, none of this is done to ensure successful language teaching (which certainly does not depend on such a split), but to create the seamless integration of human and machine that the roboticists demand.
  • 由日本机器人专家石黑浩(Hiroshi Ishiguro)发明的机器人艾丽卡(Erica),是机器人专家在追求人性的过程中重现性别规范的另一个例子。石黑浩希望目前仍处于早期原型阶段的艾丽卡有助于引领机器与人类共存的世界,机器人将通过做最乏味和不受欢迎的工作来改善人类的生活,包括照顾病人和老人。但是,通过对艾丽卡进行编程以模拟一种“温暖、温柔和关怀”的情感,石黑的团队加倍强调了这种护理工作的性别化区分。艾丽卡的设计是以这样一种看法为前提的:照顾世界的劳动责任主要是妇女来承担,她们将提供温暖和温柔,同时减轻人类的负担和痛苦。艾丽卡还被设计为“遵守传统的性别美”的标准。正如石黑所指出的,他把艾丽卡设计成“最美丽的”机器人,吸取了(他所认为的)“三十个美丽女人”的综合形象。显然,仅仅成为一个女性护理机器人是不够的;你还必须是一个“美丽”的机器人。
    Erica, a robot created by Japanese roboticist Hiroshi Ishiguro, is another example of roboticists reproducing gender norms in their pursuit of humanness. Ishiguro hopes that Erica, still an early prototype, will help usher in a world where machines coexist with humans, improving human life by doing the most tedious and unwanted jobs, including caring for the sick and the elderly. But by programming Erica to simulate an affect of "warmth, gentleness, and care," Ishiguro's team doubles down on the gendering of this care work. Erica's design is premised on the view that the labor of caring for the world falls mainly to women, who will provide warmth and tenderness while easing human burdens and pain. Erica was also designed to "conform to conventional standards of gendered beauty." As Ishiguro puts it, he designed Erica to be the "most beautiful" robot, drawing on a composite of (what he considered) "thirty beautiful women." Evidently it is not enough to be a female care robot; one must also be a "beautiful" one.
  • 日本大阪大学智能机器人研究所所长石黑浩(左)和机器人在交流。
    Hiroshi Ishiguro (left), director of the Intelligent Robotics Laboratory at Osaka University, Japan, communicating with a robot.
  • 对机器人技术中的种族问题的关注,不仅延伸到了机器人的外观,还包括它们如何与他人互动。像格蕾丝和爱丽卡这样的机器人将如何识别和解读各种各样的面孔呢?可能被植入格蕾丝的算法中的种族主义假设,是否会决定患者将接受的治疗干预类型?人工智能的面部识别系统出了名的糟糕,例如,在解释肤色较深的人的情绪反应方面。有些人根本无法感知深色皮肤的人的脸,更不用说理解他们的表情了。社会学家鲁哈本杰明(Ruha Benjamin)发现,健康保险公司在评估健康风险时最常用的工具之一,将理论上的黑人病人定义为比具有相同健康标志的白人病人的风险要小,因为该工具的算法存在种族偏见。换句话说,我们看到的是另一个压迫性系统的出现,它通过促进对护理工作的“调节理解(mediated understanding)”和提供护理来构建社会关系。借用Joy Buolamwini关于算法的“编码的凝视”概念,我将医疗保健人工智能中的种族主义失衡称为“编码的护理”。
    Concern about race in robotics extends beyond robots' appearance to how they interact with others. How will robots like Grace and Erica recognize and interpret a wide variety of faces? Might racist assumptions embedded in Grace's algorithms determine the kind of therapeutic interventions a patient receives? AI facial recognition systems are notoriously bad at, for example, interpreting the emotional responses of people with darker skin. Some cannot detect darker-skinned faces at all, let alone read their expressions. Sociologist Ruha Benjamin found that one of the tools health insurers most commonly use to assess health risk rated a hypothetical Black patient as lower-risk than a white patient with identical health indicators, because the tool's algorithm is racially biased. In other words, we are witnessing the emergence of another oppressive system, one that structures social relations by promoting a "mediated understanding" of care work and care provision. Borrowing Joy Buolamwini's concept of the algorithmic "coded gaze," I call this racist imbalance in healthcare AI "coded care."
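The risk-scoring failure described above can be made concrete with a minimal sketch. The patients, numbers, and function names here are hypothetical, but the mechanism matches the documented pattern: when predicted cost stands in for health need, a patient from a group that has historically received less care appears "lower risk" despite being equally sick.

```python
# Hedged sketch (invented data): two patients with identical health
# status but unequal historical access to care, and therefore unequal
# past spending.
patients = [
    {"name": "patient_1", "chronic_conditions": 4, "past_spending": 12000},
    {"name": "patient_2", "chronic_conditions": 4, "past_spending": 6000},
]

def risk_score_by_cost(patient):
    # The flawed proxy: future cost extrapolated from past spending.
    # Less historical access to care -> lower spending -> lower "risk".
    return patient["past_spending"]

def risk_score_by_need(patient):
    # What the tool ought to measure: actual burden of illness.
    return patient["chronic_conditions"]

# By need, the two patients are identical; by the cost proxy, patient_2
# is ranked lower-risk and may be passed over for extra care.
assert risk_score_by_need(patients[0]) == risk_score_by_need(patients[1])
assert risk_score_by_cost(patients[1]) < risk_score_by_cost(patients[0])
```

The design lesson is that the choice of proxy variable, not any explicit racial label, is where the bias enters: auditing a system's inputs and target variable matters as much as auditing its outputs.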
  • 编码的护理的想法给我们提供了一个词汇,让我们思考护理工作自动化的潜在危害。越来越多的人认为,使用机器人来实现护理工作的自动化是必要的,它可以帮助老龄化人口,最大限度地减少职业危害,减轻护理人员的负担,并解决护理人员的高离职率和职业倦怠问题。一项研究表明,到2030年,将出现15.1万名有偿直接护理人员和380万名无偿家庭护理人员的短缺。但考虑到这些对编码护理的担忧,机器人自动化是否是解决这一短缺的最佳方式仍有争议。即使假设在不久的将来,同理心、情感劳动和创造力可以随时被机械化(许多机器人专家对此表示怀疑),如果代码中的种族主义和性别偏见得不到解决,将这些类型的护理外包出去可能会给接受护理的人带来严重的后果。
    The idea of coded care gives us a vocabulary for thinking about the potential harms of automating care work. A growing chorus argues that using robots to automate care is necessary: they could help an aging population, minimize occupational hazards, lighten caregivers' loads, and address caregivers' high turnover and burnout. One study projects that by 2030 there will be a shortage of 151,000 paid direct care workers and 3.8 million unpaid family caregivers. But given these concerns about coded care, whether robotic automation is the best way to address the shortage remains contested. Even assuming that empathy, emotional labor, and creativity could soon be readily mechanized (which many roboticists doubt), outsourcing these types of care could have serious consequences for care recipients if the racism and gender bias in the code are not addressed.
  • 此外,我们必须严肃对待我们开始时提出的问题——自动化对人类工人意味着什么,以及坚持劳动公正必须是采用自动化的先决条件。因为这些决定将对妇女和有色人种社区产生巨大的影响,它们很可能会让位给金钱利益和富裕白人的福祉。但护理是一种独特的劳动,当护理人员受到虐待时,我们都会受到损失。回顾EngKey的机器人教师,他们与学生的脱节感,转化成为了情感劳动,这种机械化、种族化和性别化的劳动方式对教师和学生都造成了伤害。EngKey的工作教给它的韩国学生的,是令人羡慕的白人女性的力量,就像教给他们英语一样。同样,格蕾丝的创造者承诺,这项技术将以“自然地、以情感的方式和病人建立联系”,它将真正为那些在大流行期间被隔离的人提供安慰、同情和善意吗?还是说,这仅仅是其创造者对女性气质的模拟?
    We must also take seriously the questions posed at the outset: what automation means for human workers, and the insistence that labor justice be a precondition for adopting it. Although these decisions will profoundly affect women and communities of color, they are likely to be subordinated to monied interests and the well-being of wealthy white people. But care is a unique form of labor, and when care workers are mistreated, we all lose. Recall EngKey's robot teachers: their sense of disconnection from their students became part of the emotional labor itself, and this mechanized, racialized, and gendered mode of labor harms teachers and students alike. EngKey's operation teaches its South Korean students the enviable power of white womanhood just as surely as it teaches them English. Likewise, Grace's creators promise that the technology will "connect with patients naturally and emotionally." Will it truly offer comfort, compassion, and kindness to those isolated during the pandemic? Or merely its creators' simulation of femininity?
  • 与以往任何时候相比,大流行为扩大格蕾丝等护理机器人的使用提供了理由,而我们比以往任何时候都更应该注意这些干预措施的编码方式。我们需要推动更公平和负责任的人工智能,与算法正义联盟(AJL)等集体合作,以实现这一目标。算法正义联盟和其他人的工作提醒我们:“谁编写代码很重要,我们如何编写代码很重要,我们可以编写一个更好的未来。”如果我们真的要追求机器人劳动,那么劳动正义必须是自动化的先决条件。否则,机器人只会提供另一个借口,通过简单地取代人类劳工以及自动化他们的劳动,来忽视人类工人面临的不平等。
    More than ever, the pandemic offers a rationale for expanding the use of care robots like Grace, and more than ever we should attend to how these interventions are coded. We need to push for fairer, more accountable AI, working toward that goal with collectives such as the Algorithmic Justice League (AJL). The work of the AJL and others reminds us: "Who codes matters, how we code matters, and we can code a better future." If we are truly to pursue robot labor, labor justice must be a precondition of automation. Otherwise, robots will simply offer another excuse to ignore the inequities human workers face, by replacing those workers and automating their labor.
  • 我们需要继续探索发展护理机器人的伦理学,并且以普拉蒙德P哈戈内卡(Pramod P. Khargonekar)和米拉萨姆帕斯(Meera Sampath)等研究人员对当前自动化模式的批评,以及他们提出的“社会责任自动化” 为依据。这种自动化模式表明,企业可以在追求自动化的同时,投资于“培训和培养人类工人的技能”,以适应这种技术驱动的工作场所。因此,我们的想法不是简单地用更有效的技术来取代人类工人,而是要开发一个机器人和人类工人能够真正共存的工作场所。
    We need to continue exploring an ethics for developing care robots, grounded in researchers Pramod P. Khargonekar and Meera Sampath's critique of the current automation paradigm and their proposal for "socially responsible automation." This model suggests that firms pursuing automation can simultaneously invest in "training and upskilling human workers" to suit a technology-driven workplace. The idea, then, is not simply to replace human workers with more efficient technologies, but to develop workplaces where robots and human workers can genuinely coexist.
  • 但更重要的是,我认为开发护理机器人的伦理,必须建立在劳动公正的框架之上,这个框架将提供“对支配基本工人生活和劳动的结构性不平等”的补救措施。这可以通过支持和采纳参议员伊丽莎白沃伦和罗卡纳提出的《基本工人权利法案》来得以实现。该法案的条款将确保护理人员不仅能获得生活工资和医疗保障,还能获得儿童护理和带薪病假。
    More importantly, I believe an ethics of developing care robots must be built on a framework of labor justice, one that provides remedies for "the structural inequities that govern the lives and labor of essential workers." This could be advanced by supporting and adopting the Essential Workers Bill of Rights proposed by Senator Elizabeth Warren and Representative Ro Khanna, whose provisions would ensure that care workers receive not only a living wage and health coverage but also child care and paid sick leave.
  • 我认为,我们无法想象一个既没有人类工人又没有机器人的社会。因此,当机器人专家致力于开发护理技术时,我们需要关注种族化和性别化的认知,如何被编码到设计中去。指导原则不能只关心如何最好地模拟人性,更要关注如何在编码护理的设计中将公正和公平的原则作为中心。只有到那时,才有可能产生真正带有关怀的算法。
    I do not think we can imagine a society with neither human workers nor robots. So as roboticists work to develop care technologies, we must attend to how racialized and gendered understandings are encoded into their designs. The guiding principle cannot be only how best to simulate humanity; it must center justice and equity in the design of coded care. Only then might truly caring algorithms become possible.