AI "gaydar" can judge sexual orientation from a single photo with over 80% accuracy

According to foreign media reports, researchers at Stanford University have developed an algorithm that can judge a person's sexual orientation from a single facial photograph. The researchers built their model from an analysis of more than 35,000 facial images. The study found the algorithm was accurate up to 81% of the time when judging men's orientation, and 74% of the time for women.

Artificial intelligence can accurately guess whether people are gay or straight based on photos of their faces, according to new research suggesting that machines can have significantly better "gaydar" than humans.

The study from Stanford University – which found that a computer algorithm could correctly distinguish between gay and straight men 81% of the time, and 74% for women – has raised questions about the biological origins of sexual orientation, the ethics of facial-detection technology and the potential for this kind of software to violate people's privacy or be abused for anti-LGBT purposes.

The machine intelligence tested in the research, which was published in the Journal of Personality and Social Psychology and first reported in the Economist, was based on a sample of more than 35,000 facial images that men and women publicly posted on a US dating website. The researchers, Michal Kosinski and Yilun Wang, extracted features from the images using "deep neural networks", meaning a sophisticated mathematical system that learns to analyze visuals based on a large dataset.

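The pipeline described above — a deep network turning each photo into a feature vector, with a simple classifier on top — can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the synthetic 128-dimensional vectors stand in for face embeddings from a pretrained network, and the hand-rolled logistic regression stands in for whatever classifier the study actually used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for deep-network face embeddings: synthetic vectors whose
# class means differ slightly, mimicking features a pretrained network
# might extract from photos.
n, d = 1000, 128
y = rng.integers(0, 2, n)                        # binary labels
X = rng.normal(0.0, 1.0, (n, d)) + y[:, None] * 0.3  # small class offset

# Logistic regression trained by plain gradient descent.
w = np.zeros(d)
b = 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    w -= lr * (X.T @ (p - y)) / n
    b -= lr * (p - y).mean()

acc = (((X @ w + b) > 0).astype(int) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

The point of the sketch is the division of labour: the heavy lifting is in the learned feature extractor, after which even a linear classifier separates the classes.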
The research found that gay men and women tended to have "gender-atypical" features, expressions and "grooming styles", essentially meaning gay men appeared more feminine and vice versa. The data also identified certain trends, including that gay men had narrower jaws, longer noses and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women.

Human judges performed much worse than the algorithm, accurately identifying orientation only 61% of the time for men and 54% for women. When the software reviewed five images per person, it was even more successful – 91% of the time with men and 83% with women. Broadly, that means "faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain", the authors wrote.

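The jump from single-image to five-image accuracy is what averaging independent noisy estimates predicts. A small simulation (with made-up numbers, not the study's) shows the effect: each "image" yields a noisy probability centred on the true class, and averaging five such probabilities before thresholding beats deciding from one.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_prob(true_label, noise=0.35, size=1):
    # Simulated per-image classifier output: a probability centred on
    # 0.7 for class 1 and 0.3 for class 0, plus Gaussian noise.
    centre = 0.7 if true_label == 1 else 0.3
    return np.clip(centre + rng.normal(0.0, noise, size), 0.0, 1.0)

n_people = 5000
labels = rng.integers(0, 2, n_people)

one_correct = 0
five_correct = 0
for lab in labels:
    p1 = noisy_prob(lab, size=1)[0]       # decision from one image
    p5 = noisy_prob(lab, size=5).mean()   # average over five images
    one_correct += (p1 > 0.5) == bool(lab)
    five_correct += (p5 > 0.5) == bool(lab)

acc_one = one_correct / n_people
acc_five = five_correct / n_people
print(f"one image: {acc_one:.2f}, five images: {acc_five:.2f}")
```

Averaging five estimates shrinks the noise roughly by a factor of the square root of five, so fewer borderline cases fall on the wrong side of the 0.5 threshold — the same mechanism that plausibly lifts the algorithm's accuracy from 81% to 91% for men.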
The paper suggested that the findings provide "strong support" for the theory that sexual orientation stems from exposure to certain hormones before birth, meaning people are born gay and being queer is not a choice. The machine's lower success rate for women also could support the notion that female sexual orientation is more fluid.

While the findings have clear limits when it comes to gender and sexuality – people of color were not included in the study, and there was no consideration of transgender or bisexual people – the implications for artificial intelligence (AI) are vast and alarming. With billions of facial images of people stored on social media sites and in government databases, the researchers suggested that public data could be used to detect people's sexual orientation without their consent.

It's easy to imagine spouses using the technology on partners they suspect are closeted, or teenagers using the algorithm on themselves or their peers. More frighteningly, governments that continue to prosecute LGBT people could hypothetically use the technology to out and target populations. That means building this kind of software and publicizing it is itself controversial given concerns that it could encourage harmful applications.

But the authors argued that the technology already exists, and its capabilities are important to expose so that governments and companies can proactively consider privacy risks and the need for safeguards and regulations.

"It's certainly unsettling. Like any new tool, if it gets into the wrong hands, it can be used for ill purposes," said Nick Rule, an associate professor of psychology at the University of Toronto, who has published research on the science of gaydar. "If you can start profiling people based on their appearance, then identifying them and doing horrible things to them, that's really bad."
