{"id":175113,"date":"2022-02-02T11:30:33","date_gmt":"2022-02-02T16:30:33","guid":{"rendered":"https:\/\/irpp-policy-options.local\/issues\/ai-accountability-crtc-oversight\/"},"modified":"2025-04-14T03:25:24","modified_gmt":"2025-04-14T07:25:24","slug":"ai-accountability-crtc-oversight","status":"publish","type":"issues","link":"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/","title":{"rendered":"AI accountability can\u2019t be left to the CRTC"},"content":{"rendered":"<p>For those wondering about the extent of Canada\u2019s commitment to artificial intelligence (AI) accountability and transparency, we now have an answer: not much. Buried in a recent decision by Canada\u2019s media regulator (the Canadian Radio-television and Telecommunications Commission, or CRTC) was a clear admission that AI accountability was not a priority. The decision all but closes the door on hopes that the CRTC would push for algorithmic accountability, raising further doubts about the commission\u2019s role in looming reforms to Canada\u2019s institutions of Internet governance, and about the future of AI governance.<\/p>\n<p><strong>Toward zero-knowledge networks<\/strong><\/p>\n<p>On December 9, 2021, the <a href=\"https:\/\/crtc.gc.ca\/eng\/archive\/2021\/2021-403.htm\">CRTC approved Bell Canada\u2019s<\/a> request to use an artificial intelligence system to block fraudulent calls. We\u2019d offer a more detailed description, but there are few details to be had. Our research team intervened to learn how AI works in the field, but our efforts devolved into a fight for basic explanations about this system. We still struggle to explain what human oversight it has, how automated it is, or how it works in any detail.<\/p>\n<p>What we know is that Bell Canada uses an AI to monitor call patterns in Canada, looking for anomalies that it reviews, verifies and then blocks. 
Having a communication provider block anything is a serious matter, because it can cut off legitimate communication between people.<\/p>\n<p>For the newly approved system, Bell is making decisions network-wide. The system is applied to all calls transmitted across Bell Canada\u2019s network, affecting millions of Canadians making calls through Canada\u2019s largest telecommunications provider. The system has blocked more than 1.12 billion calls since it started on July 15, 2020. That is a big number, but one that is difficult to evaluate: it is not clear whether the total volume of spam or scam calls Canadians receive has actually gone up or down since the introduction of the blocking system.<\/p>\n<p><strong>No assessments, no explanations, no tangible oversight<\/strong><\/p>\n<p>Bell Canada, to its credit, brought its proposed blocking system before the regulator. Unfortunately, the CRTC decided that there was not \u201c<a href=\"https:\/\/crtc.gc.ca\/eng\/archive\/2021\/2021-403.htm#:~:text=Furthermore%2C%20given%20the,at%20this%20time.\">any need for regulatory framework regarding the use of AI at this time<\/a>.\u201d That complacency weakens public policy.<\/p>\n<p>The standard of public evidence is poorer now, too. 
The commission let the case be settled under non-disclosure agreements between trusted parties and declined to use it as a chance to develop a comprehensive approach to automated content moderation \u2013 which is another way of saying spam blocking.<\/p>\n<p>In particular, the federal regulatory agency did not consider the complexity and importance of explaining decisions made by AI systems \u2013 a critical requirement for holding automated decision-making accountable for human welfare.<\/p>\n<blockquote><p><a href=\"https:\/\/potestlaunch.irpp.org\/fr\/magazines\/november-2020\/policy-makers-must-get-up-to-speed-on-ai\/\">Policy-makers must get up to speed on AI<\/a><\/p>\n<p><a href=\"https:\/\/potestlaunch.irpp.org\/fr\/magazines\/mai-2021\/is-the-government-picking-the-wrong-place-to-start-regulating-algorithms\/\">Is the government picking the wrong place to start regulating algorithms?<\/a><\/p>\n<p><a href=\"https:\/\/potestlaunch.irpp.org\/fr\/magazines\/april-2021\/as-the-digital-divide-widens-telecom-policy-is-still-an-afterthought\/\">As the digital divide widens, telecom policy is still an afterthought<\/a><\/p><\/blockquote>\n<p>The lack of technical competency in the decision is alarming. AI is learning and changing \u2013 that\u2019s the whole point of the technology. Ongoing monitoring would seem to be called for, precisely because the system will keep changing. However, this ruling has little in the way of AI oversight. Bell Canada has only to submit annual reports and update the CRTC on any major changes to the algorithm within 60 days. What happens when the CRTC receives that information is anyone\u2019s guess. It\u2019s not clear, given the ruling, that the CRTC knows what to do with AI in the first place.<\/p>\n<p>These shortcomings do not seem to matter much to the national regulatory agency. 
Before the decision was even reached, CRTC chair and CEO <a href=\"https:\/\/www.cbc.ca\/news\/business\/crtc-telecom-call-authentification-1.6250599\">Ian Scott touted Bell\u2019s new system<\/a> as a success in an interview with the CBC. We can appreciate the commission\u2019s mandate to protect Canadians from fraudulent calls, but does it have to be at the expense of good governance?<\/p>\n<p><strong>Undermining responsible AI for the world to see<\/strong><\/p>\n<p>The CRTC\u2019s decision seems out of step with the Government of Canada\u2019s stated positions at the global level, where it is pushing for AI accountability as a member of <a href=\"https:\/\/www.international.gc.ca\/global-affairs-affaires-mondiales\/news-nouvelles\/2020\/2020-11-05-internet-freedom-liberte-internet.aspx?lang=eng\">the Freedom Online Coalition (FOC)<\/a>. The group\u2019s 2020 statement recommends that the \u201cprivate sector should endeavor to promote and increase transparency, traceability, and accountability in the decision, development, procurement, and use of AI systems.\u201d<\/p>\n<p>After two years of hearings, where we raised these same concerns, we have found little interest on the part of this federal regulatory agency in implementing Canada\u2019s international commitments to AI governance.<\/p>\n<p>Canada staked a claim to be a world leader when the Treasury Board implemented <a href=\"https:\/\/www.canada.ca\/en\/government\/system\/digital-government\/digital-government-innovations\/responsible-use-ai.html\">algorithmic impact assessments (AIA) in the federal service<\/a>. International observers may therefore be surprised to learn that the CRTC declined our request to develop a comparable algorithmic impact assessment, even when dealing with such a large-scale application by Canada\u2019s largest telecom infrastructure provider. 
With the decision, the CRTC missed another opportunity to translate international commitments into meaningful AI policy.<\/p>\n<p>A common reason the CRTC gave for withholding information, both during the hearing and elsewhere, was concern that \u201cbad actors\u201d could use it. That assumption is part of a worrisome trend toward secrecy at the commission, doubly problematic as Canada develops national strategies for Canadian content promotion, Internet harm reduction and cybersecurity. \u201c<a href=\"https:\/\/en.wikipedia.org\/wiki\/Security_through_obscurity\">Obscurity is not security<\/a>\u201d is a truism in the field. Yet, in all the cases described here, deference to confidentiality has overridden good-faith efforts for transparency and public oversight.<\/p>\n<p>In the same 2020 Freedom Online Coalition commitment, Canada and other signatories warned about \u201c<a href=\"https:\/\/www.international.gc.ca\/global-affairs-affaires-mondiales\/news-nouvelles\/2020\/2020-11-05-internet-freedom-liberte-internet.aspx?lang=eng#:~:text=The%20FOC%20is,automated%20content%20moderation.\">the use of AI systems for repressive and authoritarian purposes<\/a>.\u201d We have another worry \u2013 that an approach to AI that prioritizes secrecy and defers excessively to commercial interests could lead to anti-democratic rulings and undermine the public interest. 
How can we expect good AI governance if a new AI system cannot be described to citizens, be audited effectively, or be held accountable by public regulatory agencies or the courts if, and when, something goes wrong?<\/p>\n<p><strong>A regulator that actually understands this file<\/strong><\/p>\n<p>Last fall, NDP MP Charlie Angus joined calls for a <a href=\"https:\/\/ipolitics.ca\/2021\/10\/18\/ndp-mp-proposes-new-regulator-for-social-media\/\">new public regulator for digital matters<\/a>, one that \u201cactually understand(s) this file.\u201d Angus was responding to the CRTC\u2019s incapacity to regulate Silicon Valley, but with the Bell AI file, the CRTC has demonstrated it doesn\u2019t understand the risks even in the high-tech systems it <em>does<\/em> regulate.<\/p>\n<p>Expanding the CRTC\u2019s responsibilities, as the federal government has suggested, would be fraught with risk, as the agency\u2019s institutional capacity to learn and adapt to technological change appears limited.<\/p>\n<p>The case is a beginning as much as an end. The CRTC\u2019s failure to put an appropriate, publicly accountable framework in place for AI shows that Canada\u2019s voluntary approach to AI governance is inadequate and the consultations hollow. It is time for AI accountability and explainability to be given legislative priority. Unlike the CRTC, we think the time for good regulatory frameworks for AI is now.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>For those wondering about the extent of Canada\u2019s commitment to artificial intelligence (AI) accountability and transparency, we now have an answer: not much. Buried in a recent decision by Canada\u2019s media regulator (the Canadian Radio-television and Telecommunications Commission, or CRTC) was a clear admission that AI accountability was not a priority. 
The decision all but closes [&hellip;]<\/p>\n","protected":false},"featured_media":101702,"template":"","meta":{"_acf_changed":false,"content-type":"","ep_exclude_from_search":false},"categories":[9362,9372,9383],"tags":[9291,8609,9237],"article-status":[],"irpp-category":[4226,4225,4249,4243,4365,4374,4675,4244],"section":[],"irpp-tag":[],"class_list":{"0":"post-175113","1":"issues","2":"type-issues","3":"status-publish","4":"has-post-thumbnail","5":"hentry","6":"category-economie","7":"category-recent-stories-fr","8":"category-sciences-et-technologies","9":"tag-donnees","10":"tag-geants-technologiques","11":"tag-intelligence-artificielle","12":"irpp-category-big-tech","13":"irpp-category-data","14":"irpp-category-donnees","15":"irpp-category-economy","16":"irpp-category-geants-technologiques","17":"irpp-category-innovation","19":"irpp-category-science-and-technology"},"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v25.2 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>AI accountability can\u2019t be left to the CRTC<\/title>\n<meta name=\"description\" content=\"The CRTC enables secrecy and lacks technical competence to deliver oversight for developments in AI. Algorithmic transparency is crucial.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/\" \/>\n<meta property=\"og:locale\" content=\"fr_FR\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"AI accountability can\u2019t be left to the CRTC\" \/>\n<meta property=\"og:description\" content=\"The CRTC enables secrecy and lacks technical competence to deliver oversight for developments in AI. 
Algorithmic transparency is crucial.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/\" \/>\n<meta property=\"og:site_name\" content=\"Policy Options\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/IRPP.org\" \/>\n<meta property=\"article:modified_time\" content=\"2025-04-14T07:25:24+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2022\/02\/Facebook-AI-accountability-cant-be-left-to-the-CRTC.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1200\" \/>\n\t<meta property=\"og:image:height\" content=\"628\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:image\" content=\"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2022\/02\/Twitter-AI-accountability-cant-be-left-to-the-CRTC.jpg\" \/>\n<meta name=\"twitter:site\" content=\"@irpp\" \/>\n<meta name=\"twitter:label1\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/\",\"url\":\"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/\",\"name\":\"AI accountability can\u2019t be left to the CRTC\",\"isPartOf\":{\"@id\":\"https:\/\/potestlaunch.irpp.org\/fr\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2022\/02\/Wordpress-AI-accountability-cant-be-left-to-the-CRTC.jpg\",\"datePublished\":\"2022-02-02T16:30:33+00:00\",\"dateModified\":\"2025-04-14T07:25:24+00:00\",\"description\":\"The CRTC enables secrecy and lacks technical competence to deliver oversight for developments in AI. 
Algorithmic transparency is crucial.\",\"breadcrumb\":{\"@id\":\"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/#breadcrumb\"},\"inLanguage\":\"fr-FR\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"fr-FR\",\"@id\":\"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/#primaryimage\",\"url\":\"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2022\/02\/Wordpress-AI-accountability-cant-be-left-to-the-CRTC.jpg\",\"contentUrl\":\"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2022\/02\/Wordpress-AI-accountability-cant-be-left-to-the-CRTC.jpg\",\"width\":1920,\"height\":1080},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/potestlaunch.irpp.org\/fr\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"AI accountability can\u2019t be left to the CRTC\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/potestlaunch.irpp.org\/fr\/#website\",\"url\":\"https:\/\/potestlaunch.irpp.org\/fr\/\",\"name\":\"Policy Options\",\"description\":\"Institute for Research on Public Policy\",\"publisher\":{\"@id\":\"https:\/\/potestlaunch.irpp.org\/fr\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/potestlaunch.irpp.org\/fr\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"fr-FR\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/potestlaunch.irpp.org\/fr\/#organization\",\"name\":\"Policy 
Options\",\"url\":\"https:\/\/potestlaunch.irpp.org\/fr\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"fr-FR\",\"@id\":\"https:\/\/potestlaunch.irpp.org\/fr\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2024\/11\/PolicyOptions_Logo.png\",\"contentUrl\":\"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2024\/11\/PolicyOptions_Logo.png\",\"width\":1200,\"height\":365,\"caption\":\"Policy Options\"},\"image\":{\"@id\":\"https:\/\/potestlaunch.irpp.org\/fr\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/IRPP.org\",\"https:\/\/x.com\/irpp\"]}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"AI accountability can\u2019t be left to the CRTC","description":"The CRTC enables secrecy and lacks technical competence to deliver oversight for developments in AI. Algorithmic transparency is crucial.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/","og_locale":"fr_FR","og_type":"article","og_title":"AI accountability can\u2019t be left to the CRTC","og_description":"The CRTC enables secrecy and lacks technical competence to deliver oversight for developments in AI. 
Algorithmic transparency is crucial.","og_url":"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/","og_site_name":"Policy Options","article_publisher":"https:\/\/www.facebook.com\/IRPP.org","article_modified_time":"2025-04-14T07:25:24+00:00","og_image":[{"width":1200,"height":628,"url":"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2022\/02\/Facebook-AI-accountability-cant-be-left-to-the-CRTC.jpg","type":"image\/jpeg"}],"twitter_card":"summary_large_image","twitter_image":"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2022\/02\/Twitter-AI-accountability-cant-be-left-to-the-CRTC.jpg","twitter_site":"@irpp","twitter_misc":{"Est. reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/","url":"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/","name":"AI accountability can\u2019t be left to the CRTC","isPartOf":{"@id":"https:\/\/potestlaunch.irpp.org\/fr\/#website"},"primaryImageOfPage":{"@id":"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/#primaryimage"},"image":{"@id":"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/#primaryimage"},"thumbnailUrl":"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2022\/02\/Wordpress-AI-accountability-cant-be-left-to-the-CRTC.jpg","datePublished":"2022-02-02T16:30:33+00:00","dateModified":"2025-04-14T07:25:24+00:00","description":"The CRTC enables secrecy and lacks technical competence to deliver oversight for developments in AI. 
Algorithmic transparency is crucial.","breadcrumb":{"@id":"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/#breadcrumb"},"inLanguage":"fr-FR","potentialAction":[{"@type":"ReadAction","target":["https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/"]}]},{"@type":"ImageObject","inLanguage":"fr-FR","@id":"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/#primaryimage","url":"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2022\/02\/Wordpress-AI-accountability-cant-be-left-to-the-CRTC.jpg","contentUrl":"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2022\/02\/Wordpress-AI-accountability-cant-be-left-to-the-CRTC.jpg","width":1920,"height":1080},{"@type":"BreadcrumbList","@id":"https:\/\/potestlaunch.irpp.org\/fr\/2022\/02\/ai-accountability-crtc-oversight\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/potestlaunch.irpp.org\/fr\/"},{"@type":"ListItem","position":2,"name":"AI accountability can\u2019t be left to the CRTC"}]},{"@type":"WebSite","@id":"https:\/\/potestlaunch.irpp.org\/fr\/#website","url":"https:\/\/potestlaunch.irpp.org\/fr\/","name":"Policy Options","description":"Institute for Research on Public Policy","publisher":{"@id":"https:\/\/potestlaunch.irpp.org\/fr\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/potestlaunch.irpp.org\/fr\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"fr-FR"},{"@type":"Organization","@id":"https:\/\/potestlaunch.irpp.org\/fr\/#organization","name":"Policy 
Options","url":"https:\/\/potestlaunch.irpp.org\/fr\/","logo":{"@type":"ImageObject","inLanguage":"fr-FR","@id":"https:\/\/potestlaunch.irpp.org\/fr\/#\/schema\/logo\/image\/","url":"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2024\/11\/PolicyOptions_Logo.png","contentUrl":"https:\/\/potestlaunch.irpp.org\/wp-content\/uploads\/2024\/11\/PolicyOptions_Logo.png","width":1200,"height":365,"caption":"Policy Options"},"image":{"@id":"https:\/\/potestlaunch.irpp.org\/fr\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/IRPP.org","https:\/\/x.com\/irpp"]}]}},"_links":{"self":[{"href":"https:\/\/potestlaunch.irpp.org\/fr\/wp-json\/wp\/v2\/issues\/175113","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/potestlaunch.irpp.org\/fr\/wp-json\/wp\/v2\/issues"}],"about":[{"href":"https:\/\/potestlaunch.irpp.org\/fr\/wp-json\/wp\/v2\/types\/issues"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/potestlaunch.irpp.org\/fr\/wp-json\/wp\/v2\/media\/101702"}],"wp:attachment":[{"href":"https:\/\/potestlaunch.irpp.org\/fr\/wp-json\/wp\/v2\/media?parent=175113"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/potestlaunch.irpp.org\/fr\/wp-json\/wp\/v2\/categories?post=175113"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/potestlaunch.irpp.org\/fr\/wp-json\/wp\/v2\/tags?post=175113"},{"taxonomy":"article-status","embeddable":true,"href":"https:\/\/potestlaunch.irpp.org\/fr\/wp-json\/wp\/v2\/article-status?post=175113"},{"taxonomy":"irpp-category","embeddable":true,"href":"https:\/\/potestlaunch.irpp.org\/fr\/wp-json\/wp\/v2\/irpp-category?post=175113"},{"taxonomy":"section","embeddable":true,"href":"https:\/\/potestlaunch.irpp.org\/fr\/wp-json\/wp\/v2\/section?post=175113"},{"taxonomy":"irpp-tag","embeddable":true,"href":"https:\/\/potestlaunch.irpp.org\/fr\/wp-json\/wp\/v2\/irpp-tag?post=175113"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}