TE | A Skinner box for software

1

Introduction


Facebook's former president: stay away from social software designed to get you hooked

2

Music | Close Reading | Translation | Phrases

A Skinner box for software


The English text is taken from the Science and Technology section of The Economist.

The behavioural ecology of machines


Note:

Behavioral ecology, also spelled behavioural ecology, is the study of the evolutionary basis for animal behavior due to ecological pressures. Behavioral ecology emerged from ethology after Niko Tinbergen outlined four questions to address when studying animal behaviors: the proximate causes, ontogeny, survival value, and phylogeny of behavior.

Link: https://en.wikipedia.org/wiki/Behavioral_ecology

Behavioural ecology studies the relationship between an organism's behaviour and its environment: how organisms behave in a given habitat, the mechanisms underlying that behaviour, and its ecological significance. It sits at the intersection of behavioural science and ecology, and draws on physiology, psychology, genetics, evolutionary theory, sociology and economics.

Link:

http://www.baike.com/wiki/%E8%A1%8C%E4%B8%BA%E7%94%9F%E6%80%81%E5%AD%A6

To understand digital advertising, study its algorithms


A Skinner box for software


Note: The Skinner box is a piece of psychological laboratory apparatus invented in 1938 by the behaviourist B.F. Skinner and used in operant-conditioning experiments on animals. Its basic structure is a lever (usually a metal plate) on one wall of the box, with a small food tray beside it set against a hole in the wall; outside the hole sits a dispenser stocked with food pellets. When the animal presses the lever, a pellet drops through the hole into the tray for the animal to eat. A rat deprived of food for 24 hours and placed in the box will explore, occasionally press the lever and receive a pellet; it may not notice the food at first, but after a number of repetitions it forms the conditioned response of pressing the lever to obtain food. Later refinements, such as sound-proof housings and programme-controlled dispensers, made it possible to test whether an animal could learn to press the lever three times, or at fixed intervals, to get food, and the design varies slightly for different species. The device is essentially an improvement on Thorndike's puzzle box and was later used in studies of animal learning, self-stimulation and cooperative behaviour; electronic circuitry has since made it still easier to use.

Link:

https://baike.baidu.com/item/%E6%96%AF%E9%87%91%E7%BA%B3%E7%AE%B1/5378961?fr=aladdin

ALAN MISLOVE studies algorithms. Recently, his research at Northeastern University, in Boston, has shown that Facebook’s software was leaking users’ phone numbers to advertisers. He has also found new ways to audit that same software for racial bias. But work like his faces challenges. Scraping data from public-facing websites often sails close to breaching their terms and conditions. And the companies those websites belong to are generally unwilling to give researchers more direct access to their systems.


Notes:

1. Alan Mislove

Link: https://web.northeastern.edu/nulab/about/people-faculty/

2. Northeastern University (NEU) is a leading private research university in Boston, the capital of Massachusetts in the north-eastern United States, known for its low acceptance rate. It draws students from 53 countries and is a world leader in experiential learning, interdisciplinary research and community engagement. The university comprises eight colleges offering 65 undergraduate and 125 graduate programmes, and awards master's, doctoral and professional degrees. Its donations reached $700m in 2014, and it was ranked 39th among American universities by US News in 2017.

Link:

https://baike.baidu.com/item/%E6%B3%A2%E5%A3%AB%E9%A1%BF%E4%B8%9C%E5%8C%97%E5%A4%A7%E5%AD%A6/7860179?fr=aladdin

Moreover, examining other people’s algorithms requires the creation of your own to do so. Dr Mislove’s group often spends months just writing the code needed to gather any data at all about the objects of its inquiry. This means that only those with sufficient computer-science skills can study the computer programs that play an ever-growing role in society—not just in commerce, but also in politics, economics, justice and many other areas of life. This is bad for research and for the public.

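The plumbing involved is mundane but substantial. As a rough, hypothetical illustration (not Dr Mislove's actual code; the target URL, page structure and record format are invented for the example), a polite, rate-limited collector for public pages might look like this:

```python
# Hypothetical data-gathering sketch: fetch a handful of public pages politely
# (identified user agent, rate-limited) and store the raw responses for later
# auditing. Real studies need far more care: terms of service, robots.txt,
# pagination, authentication and parsing all add months of work.
import json
import time
import urllib.request

BASE_URL = "https://example.com/public-profiles/{}"   # invented target for illustration
USER_AGENT = "algorithm-audit-research/0.1 (contact: researcher@example.edu)"

def fetch(profile_id: int) -> str:
    """Download one public page and return its HTML."""
    request = urllib.request.Request(
        BASE_URL.format(profile_id),
        headers={"User-Agent": USER_AGENT},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.read().decode("utf-8", errors="replace")

def collect(profile_ids, out_path="raw_pages.jsonl", delay_seconds=2.0):
    """Fetch each page, write one JSON record per line, pausing between requests."""
    with open(out_path, "w", encoding="utf-8") as out:
        for pid in profile_ids:
            try:
                out.write(json.dumps({"id": pid, "html": fetch(pid)}) + "\n")
            except Exception as err:                  # log failures, keep going
                out.write(json.dumps({"id": pid, "error": str(err)}) + "\n")
            time.sleep(delay_seconds)                 # stay polite

if __name__ == "__main__":
    collect(range(1, 6))
```

Code of this sort is only the starting point: the months the article mentions typically go into handling scale, parsing and access restrictions before any analysis of the algorithm itself can begin.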

Now, as Facebook finds itself in the throes of a scandal over its handling of data and the power of its hyper-targeted advertising software, Dr Mislove is working with a group of researchers at the Massachusetts Institute of Technology (MIT) who think they have an answer to these problems. This group, based at MIT's Media Lab and led by Iyad Rahwan, has taken a leaf out of the book of B.F. Skinner, an animal behaviourist who, several decades ago, worked down the road from MIT at Harvard. Skinner invented a device, now known as a Skinner box, which standardised the process of behavioural experimentation. He used his boxes to control input stimuli (food, light, sound, pain) and then observed output behaviour in an attempt to link the one to the other. Though by no means perfect, the Skinner box was a big advance in the field. Dr Rahwan hopes to do something similar to software using what he calls a Turing box.


Notes:

1. Iyad Rahwan

https://en.wikipedia.org/wiki/Iyad_Rahwan

2. take a leaf from/out of sb's book: to copy somebody's behaviour and do things in the same way that they do, because they are successful.

This “box” is itself a piece of software. Place an algorithm in it, control the data inputs, measure the outcomes, and you will be able to work out exactly how it behaves in different circumstances. Anyone who wants to study an algorithm could upload it to a Turing box. The box’s software would then start running the algorithm through a standard data set of the kind it was designed to crunch. All face-recognition algorithms, for example, would be given the same scientifically validated set of faces. The algorithm’s output—in this case how it classifies different faces—would be recorded and analysed. Dr Rahwan’s hope is that companies will want political and social scientists to use the box to scrutinise their algorithms for potentially harmful flaws (eg, treating faces differently on racial grounds), and that researchers will line up to do the testing.


Note:

grounds: a good reason

[countable, usually plural] ~ for sth / for doing sth: a good or true reason for saying, doing or believing sth

"You have no grounds for complaint." (Oxford Advanced Learner's Dictionary)
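To make the mechanics concrete, below is a minimal sketch of what a Turing-box-style harness could look like: it takes an arbitrary black-box classifier, runs it over a fixed, standardised test set, records the outputs and summarises how outcomes differ across groups. The record format, the `group` field and the toy classifier are assumptions for illustration only, not details of Dr Rahwan's actual system.

```python
# A toy "Turing box": run an arbitrary black-box algorithm over a fixed,
# standardised test set, record its outputs and summarise behaviour by group.
# The record format and the example classifier are hypothetical.
from collections import defaultdict
from typing import Callable, Iterable, Mapping

Record = Mapping[str, object]   # e.g. {"features": [...], "group": "A", "label": 1}

def turing_box(classify: Callable[[Record], int],
               test_set: Iterable[Record]) -> dict:
    """Run `classify` on every record and tally outcomes per group."""
    stats = defaultdict(lambda: {"n": 0, "correct": 0, "positive": 0})
    for record in test_set:
        prediction = classify(record)            # the algorithm under audit
        s = stats[record["group"]]
        s["n"] += 1
        s["positive"] += int(prediction == 1)
        s["correct"] += int(prediction == record["label"])
    return {group: {"accuracy": s["correct"] / s["n"],
                    "positive_rate": s["positive"] / s["n"]}
            for group, s in stats.items()}

# A deliberately skewed toy classifier and a tiny synthetic test set.
def biased_classifier(record: Record) -> int:
    return 1 if record["group"] == "A" and record["features"][0] > 0.5 else 0

test_set = [
    {"features": [0.9], "group": "A", "label": 1},
    {"features": [0.8], "group": "A", "label": 1},
    {"features": [0.9], "group": "B", "label": 1},
    {"features": [0.2], "group": "B", "label": 0},
]

print(turing_box(biased_classifier, test_set))
# A gap in accuracy or positive rate between groups is exactly the kind of
# behavioural signal an auditor would flag for closer inspection.
```

In the full system the test set would be a scientifically validated benchmark (the same set of faces for every face-recognition algorithm, for instance) and the analysis would be standardised, so that results from different algorithms are directly comparable.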

Indeed, his ambitions go further still. His intention is that the Turing box should become just one component of a new field, the scientific study of the behaviour exhibited by intelligent machines, and of the impact of that behaviour on people. A demonstration paper he and his colleagues have submitted for publication to the International Joint Conference on Artificial Intelligence describes the system, as well as the broader details of this new field of machine behaviour. He plans to finish the Turing-box software by the summer, and says he will publish the code under an open-source licence shortly thereafter, for anyone to reuse. The running of the platform will then be left to a not-for-profit firm that he plans to spin out of MIT.


Boxing clever


It is a neat idea, and timely. Algorithms are being developed far faster than their impacts are being studied and understood (see chart). The Turing box, if it works as intended, could help turn the tide. Understanding algorithms’ behaviour is particularly urgent in the existing digital-advertising “ecosystem”, in which individual users of software are, in effect, in their own Skinner boxes—with their actions constantly monitored, and tailored rewards fed to them. The Facebook furore, for example, revolves around allegations that Cambridge Analytica, a digital lobbying firm, improperly obtained data from Facebook, then used them to aim advertisements which influenced the American presidential election in 2016 (the firm has denied any wrongdoing).


Note: Cambridge Analytica: a backgrounder on the Cambridge Analytica at the centre of the Facebook data-leak scandal

http://tech.qq.com/a/20180320/016568.htm

Dr Rahwan recognises that the reluctance of many companies which form part of the digital-advertising ecosystem to upload their algorithms for inspection makes it a bad place to start. So, to begin with, he will work elsewhere, studying less controversial and commercially sensitive systems such as open-source algorithms for processing natural language.

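What such a study might involve can be sketched simply. One common audit of a natural-language algorithm is a template probe: feed the system pairs of sentences that differ only in a demographic term and compare its scores. The `sentiment_score` interface and the toy scorer below are hypothetical stand-ins for whatever open-source model is placed in the box.

```python
# Hypothetical probe of an open-source natural-language algorithm: swap a
# demographic term inside otherwise identical sentences and compare scores.
import statistics
from typing import Callable

TEMPLATES = [
    "The {} engineer submitted the report.",
    "My {} neighbour asked to borrow a ladder.",
    "A {} applicant interviewed for the job.",
]
GROUP_TERMS = {"group_a": "young", "group_b": "elderly"}   # illustrative terms

def probe(sentiment_score: Callable[[str], float]) -> dict:
    """Average the model's score per group over the shared templates."""
    return {
        group: statistics.mean(sentiment_score(t.format(term)) for t in TEMPLATES)
        for group, term in GROUP_TERMS.items()
    }

if __name__ == "__main__":
    # Stand-in model with a built-in skew, purely to show the probe working.
    def toy_scorer(sentence: str) -> float:
        return 0.2 if "elderly" in sentence else 0.8

    print(probe(toy_scorer))   # a large gap between groups would merit scrutiny
```

The probe stays fixed while the algorithm under the box changes, which is what makes results from different systems comparable.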

He says, though, that the ultimate goal is to enable the study of the algorithms which some of the world’s most valuable IT firms hold dearest: Facebook’s newsfeed, for example, or Amazon’s product-recommendation software. That means looking at the behaviour of these algorithms in an environment which is as close as possible to that in which they normally operate, so that their impact on the real world can be measured. This in turn will require the firms that own them giving independent researchers access to their systems and data.


Recent years have seen things go in the opposite direction. According to Michal Kosinski of the Stanford Graduate School of Business, who in 2012 pioneered the use of Facebook data to study personality, “academic researchers have virtually no access even to publicly available data without breaking a given platform’s terms of service.” Firms’ scruples in these matters are not driven only by desire for commercial secrecy. As this week’s events have shown, a leak of personal data from an academic inquiry can be just as damaging as one from a sloppy business partner.


So, research on particularly sensitive data may require academics to be physically present inside an organisation in order to gain access to those data, a process akin to studying in the rare-books section of a library. It might also be a good idea to have independent umpires of some sort, to ensure that both firms and researchers stay on the straight and narrow.


Note: the straight and narrow

(informal) the honest and morally acceptable way of living

"His wife is trying to keep him on the straight and narrow." (Oxford Advanced Learner's Dictionary)

Facebook seems open to the idea of working with researchers in this way. In a statement given to The Economist a few days before the Cambridge Analytica story broke, the firm stated a desire to work with researchers in order to understand the impact of its systems, but warned that it had to shield its users’ data from third parties. Facebook also said that it is “actively working” on an approach which achieves both goals, although it declined to provide any details of that work. Contacted later in the week, the firm’s data-science team declined to issue any additional statement.


The Turing box is only in the earliest stages of development, but it, and systems like it, offer to inject something vital into the discussion of digital practices—independently gathered causal evidence. Without that, people may never get out of the Skinner boxes in which the tech firms have put them.


Translation team:

Xingyi, male, master's student, Economist enthusiast

Jane, female, healthcare worker, Economist enthusiast

Minjia, female, advertising planner, Economist reader

Grace, female, marketing and PR, Economist enthusiast

Helga, female, translator, Economist enthusiast

Vambie, female, internet-industry worker, Economist enthusiast

Proofreading team:

Li Xia, female, HR, Economist devotee

Damon, male, steel-pipe handler, casual Economist reader

3

Opinion | Comment | Reflection

This issue's opinion piece is contributed by Joel.

Joel, male, data analyst, fan of foreign tech publications

The film The Matrix depicts a world of vast power plants in which every human floats in a capsule like a specimen in a laboratory. The machines that rule this world use algorithms to construct a "real" world, and the humans "happily" live out fabricated lives inside it: thinking, falling in love, marrying, raising children, growing old and dying, all within the algorithm, while their brainwaves supply the machines with a constant stream of electricity. In this algorithm-ruled world the land lies silent and dead, yet the faces of the people suspended in their capsules wear blissful smiles...

A world governed by algorithms would be a kind of doomsday: people reduced to mere tools, with no rights, no dignity, no privacy and no freedom.

In reality, are we not already living at the edge of such an algorithmic world? Facebook's newsfeed algorithm, Amazon's product-recommendation algorithm, Google's advertising algorithms and the rest: these tech giants collect our private data at every moment and shape our lives.

How to avoid a future ruled by algorithms, and how to supervise and regulate the companies and groups that control the core algorithms, is a question we ought to think about seriously.

4

Vision

Building a small group for independent thinking | a global outlook | learning English

The editor is an engineering graduate working at the bottom of the construction industry and a die-hard Economist fan who, with a few friends (our core Economist group has no more than eight members), has been reading The Economist for nearly 700 days. There is now a larger Economist group; if you are interested, you are welcome to join our study group. The group rules are strict, so please think twice before joining. WeChat: foxwulihua

The English text is reproduced from The Economist for non-commercial use and group study only. If you spot any translation errors, please leave a comment so they can be corrected. Thank you!
