{"id":47773,"date":"2017-10-09T08:50:26","date_gmt":"2017-10-09T12:50:26","guid":{"rendered":"http:\/\/isarta.com\/infos\/?p=47773"},"modified":"2017-10-06T16:50:58","modified_gmt":"2017-10-06T20:50:58","slug":"le-probleme-des-biais-dans-les-algorithmes-de-decision","status":"publish","type":"post","link":"https:\/\/isarta.com\/infos\/le-probleme-des-biais-dans-les-algorithmes-de-decision\/","title":{"rendered":"Le probl\u00e8me des biais dans les algorithmes de d\u00e9cision"},"content":{"rendered":"<p><span style=\"font-size: 20px; color: #000080;\">Un\u00a0<span style=\"color: #3366ff;\"><a style=\"color: #3366ff;\" href=\"http:\/\/www.psychomedia.qc.ca\/categorie\/1127\">biais cognitif<\/a><\/span>\u00a0est une forme de pens\u00e9e qui d\u00e9vie de la\u00a0<span style=\"color: #3366ff;\"><a style=\"color: #3366ff;\" href=\"http:\/\/www.psychomedia.qc.ca\/lexique\/definition\/pensee-analytique\">pens\u00e9e logique ou rationnelle<\/a><\/span>\u00a0et qui a tendance \u00e0 \u00eatre syst\u00e9matiquement utilis\u00e9e dans certaines situations.<\/span><\/p>\n<p><img loading=\"lazy\" class=\"aligncenter wp-image-47860 \" src=\"http:\/\/isarta.com\/infos\/wp-content\/uploads\/2017\/09\/Fotolia_117569127_S.jpg\" width=\"605\" height=\"403\" srcset=\"https:\/\/isarta.com\/infos\/wp-content\/uploads\/2017\/09\/Fotolia_117569127_S.jpg 849w, https:\/\/isarta.com\/infos\/wp-content\/uploads\/2017\/09\/Fotolia_117569127_S-300x200.jpg 300w, https:\/\/isarta.com\/infos\/wp-content\/uploads\/2017\/09\/Fotolia_117569127_S-768x512.jpg 768w\" sizes=\"(max-width: 605px) 100vw, 605px\" \/><\/p>\n<p><span style=\"font-size: 12px; color: #808080;\">9 octobre 2017<\/span><\/p>\n<p>Dans la litt\u00e9rature, il en existe une foule d\u2019<span style=\"color: #3366ff;\"><a style=\"color: #3366ff;\" href=\"http:\/\/www.businessinsider.com\/cognitive-biases-2015-10\">exemples<\/a>,<\/span> dont les causes et les cons\u00e9quences ont fait l\u2019objet de nombreuses<strong> 
\u00e9tudes.<\/strong><\/p>\n<p>Dans certaines situations, un <strong>programme informatique<\/strong> qui analyse un probl\u00e8me quelconque peut, en th\u00e9orie, \u00eatre apte \u00e0 prendre une <strong>d\u00e9cision<\/strong> <span style=\"color: #3366ff;\"><a style=\"color: #3366ff;\" href=\"http:\/\/theundercoverrecruiter.com\/uproot-unconscious-bias\/?utm_source=ReviveOldPost&amp;utm_medium=social&amp;utm_campaign=ReviveOldPost\">exempte de biais<\/a>.<\/span><\/p>\n<p>Ce n\u2019est toutefois pas toujours le cas.<\/p>\n<p>En avril dernier, le magazine <span style=\"color: #3366ff;\"><a style=\"color: #3366ff;\" href=\"http:\/\/science.sciencemag.org\/content\/356\/6334\/183\"><em>Science<\/em><\/a><\/span> a mis en lumi\u00e8re un probl\u00e8me d\u00e9j\u00e0 connu des experts: la pr\u00e9sence de <strong>biais<\/strong> dans les algorithmes.<\/p>\n<p>Les machines apprenant beaucoup en lisant et en \u00e9coutant le langage humain, elles assimilent du m\u00eame coup certains <strong>pr\u00e9jug\u00e9s<\/strong> entretenus par ces derniers. 
Concretely, these biases can have an <strong>impact<\/strong> on systems used to grant <span style=\"color: #3366ff;\"><a style=\"color: #3366ff;\" href=\"https:\/\/www.forbes.com\/forbes\/welcome\/?toURL=https:\/\/www.forbes.com\/sites\/julianmitchell\/2017\/08\/22\/this-company-uses-ai-to-help-lenders-automate-the-mortgage-loan-process\/&amp;refURL=https:\/\/www.google.ca\/&amp;referrer=https:\/\/www.google.ca\/\">financial loans<\/a>,<\/span> to screen <strong>candidates<\/strong> for interviews, or even to hand down <span style=\"color: #3366ff;\"><a style=\"color: #3366ff;\" href=\"https:\/\/www.nytimes.com\/2017\/05\/01\/us\/politics\/sent-to-prison-by-a-software-programs-secret-algorithms.html?mcubz=0\">judicial decisions<\/a>.<\/span><\/p>\n<p>In their study published in <strong><em>Science<\/em>,<\/strong> researchers Aylin Caliskan, Joanna J. Bryson and Arvind Narayanan examined a learning method called <strong>\u201cword embedding\u201d,<\/strong> in which a term is defined by the words it appears close to.<\/p>\n<p>Through their research, they found, among other things, that the word <strong><em>woman<\/em><\/strong> was more strongly associated with careers in the arts and humanities, while the word <strong><em>man<\/em><\/strong> was instead linked to positions in mathematics or engineering.<\/p>\n<p>In addition, certain <strong>ethnic origins<\/strong> were associated with a set of generally positive words. That was the case, among others, for Americans of European descent. 
In other cases, they were tied to clusters of much more <strong>negative<\/strong> words.<\/p>\n<h2><span style=\"font-size: 24px; color: #000080;\"><strong>A problem that deserves more attention?<\/strong><\/span><\/h2>\n<p>As early as 2014, a <span style=\"color: #3366ff;\"><a style=\"color: #3366ff;\" href=\"https:\/\/obamawhitehouse.archives.gov\/sites\/default\/files\/docs\/big_data_privacy_report_5.1.14_final_print.pdf\">report<\/a> <\/span>produced by the <strong>Obama<\/strong> administration raised the difficulty of detecting, quantifying and correcting biases arising from automated decision-making processes.<\/p>\n<p>Since then, a number of <strong>experts<\/strong> have taken up the issue. It is notably one of the research topics of the <span style=\"color: #3366ff;\"><a style=\"color: #3366ff;\" href=\"https:\/\/iapp.org\/news\/a\/ai-now-addresses-biased-algorithms\/\">AI Now initiative<\/a>,<\/span> led by <strong>Meredith Whittaker,<\/strong> a researcher at Google, and <strong>Kate Crawford,<\/strong> a researcher at Microsoft.<\/p>\n<p>Even so, many researchers believe the problem is still not <span style=\"color: #3366ff;\"><a style=\"color: #3366ff;\" href=\"https:\/\/www.technologyreview.com\/s\/608248\/biased-algorithms-are-everywhere-and-no-one-seems-to-care\/\">taken seriously<\/a><\/span> enough by the organizations that develop and use these systems. 
Moreover, the <strong>algorithms<\/strong> being developed are often like <span style=\"color: #3366ff;\"><a style=\"color: #3366ff;\" href=\"https:\/\/www.technologyreview.com\/s\/604122\/the-financial-world-wants-to-open-ais-black-boxes\/\">black boxes<\/a><\/span> that are difficult to probe.<\/p>\n<p>As the use of <strong>artificial intelligence<\/strong> grows in importance, it therefore seems that improving the algorithms already in place will have to be done in <strong>catch-up<\/strong> mode.<\/p>\n<div class=\"brdr2\"><\/div>\n","protected":false},"excerpt":{"rendered":"<p>A cognitive bias is a pattern of thought that deviates from logical or rational thinking and tends to be applied systematically in certain situations.<\/p>\n","protected":false},"author":2,"featured_media":47860,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[175,137,1],"tags":[314,206,705,896,1491],"_links":{"self":[{"href":"https:\/\/isarta.com\/infos\/wp-json\/wp\/v2\/posts\/47773"}],"collection":[{"href":"https:\/\/isarta.com\/infos\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/isarta.com\/infos\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/isarta.com\/infos\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/isarta.com\/infos\/wp-json\/wp\/v2\/comments?post=47773"}],"version-history":[{"count":6,"href":"https:\/\/isarta.com\/infos\/wp-json\/wp\/v2\/posts\/47773\/revisions"}],"predecessor-version":[{"id":48347,"href":"https:\/\/isarta.com\/infos\/wp-json\/wp\/v2\/posts\/47773\/revisions\/48347"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/isarta.com\/infos\/wp-json\/wp\/v2\/media\/47860"}],"wp:attachment":[{"href":"https:\/\/isarta.com\/infos\/wp-json\/wp\/v2\/media?parent=47773"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/isarta.com\/infos\/wp-json\/wp\/v2\/categories?post=47773"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/isarta.com\/infos\/wp-json\/wp\/v2\/tags?post=47773"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}