
In short: Estonia and Belgium are the only two EU members to reject the Jutland Declaration, a Europe-wide commitment made in October 2025 to limit children’s access to social media. Estonian ministers argue that age-based bans are unenforceable, that children will find ways around them, and that the right approach is to enforce the GDPR against the platforms themselves and to invest in digital literacy rather than restricting young people’s participation in the information society.
The declaration was signed by the majority of EU countries
On October 10, 2025, digital ministers from 25 of the 27 member states of the European Union signed the Jutland Declaration at an informal meeting in Horsens, Denmark; Norway and Iceland also signed. The declaration is a non-binding political commitment to implement privacy-preserving age verification on social media platforms, to protect minors from addictive design features and dark patterns, and to work towards what the document describes as a “digital legal age” for access to online services. Estonia and Belgium were the two member states that refused to sign. Belgium’s refusal followed a veto by Flemish Media Minister Cieltje Van Achter, who described the declaration’s age-verification requirements as disproportionate and objected to requiring children to use national identity systems such as Itsme to access services such as YouTube or Instagram. Estonia’s refusal was fundamentally different: it was principled rather than procedural, resting on a broader argument about where Europe’s regulatory effort should be directed.

The political impulse reflected in the declaration is significant. Europe’s shift towards social media age limits accelerated through 2025 and 2026: Australia introduced the world’s first ban on under-16s in December 2025, France enacted a ban on under-15s in January 2026, Spain introduced restrictions on under-16s in February 2026, and Austria imposed restrictions on under-14s. Greece has announced that it will bar children under 15 from social media from 2027, joining a group of six EU countries that also includes Denmark, France, Austria, Portugal and Spain. On November 20, 2025, the European Parliament voted 483 to 92, with 86 abstentions, for a non-binding resolution calling for a digital minimum age of 16 in the EU and urging the European Commission to include the measure in the upcoming Digital Fairness Act.
Why did Estonia say no?
Estonia’s objections have been voiced by two ministers who approach the issue from different but complementary perspectives. Education and Research Minister Kristina Kallas has been the more outspoken critic of the ban consensus. At a Politico forum in Barcelona, Kallas argued that age restrictions place responsibility on the wrong party. “To me, the way to approach this issue is not to hold the kids responsible for this harm and start self-regulation,” she said. Her central argument is that responsibility should lie with the platforms. “Europe shows itself to be weak when it comes to big American and international corporations,” she said, urging the EU to “actually seize this power and regulate big American corporations.” She was also blunt about the practical limits of ban-based approaches: “Kids will find ways to use social media very quickly.” The argument ties into Europe’s broader effort to assert regulatory power over American technology companies, a project that has gained significant momentum since 2025 but has yet to be applied to social media with comparable force.

Minister of Justice and Digital Affairs Liisa-Ly Pakosta has emphasised the positive side of Estonia’s preferred approach. “Estonia believes in the information society and includes young people in the information society,” she said, stressing digital participation rather than exclusion. Pakosta pointed to the General Data Protection Regulation as an enforcement mechanism already in place: the GDPR prohibits platforms from processing children’s personal data without appropriate consent and allows fines of up to 4% of global annual turnover for violations. Estonia’s argument, in essence, is that Europe has not exhausted its existing tools before resorting to new and unproven ones.
Estonia points to an implementation problem
Estonia’s criticism of the prohibition model has a specific point of reference. On December 10, 2025, Australia became the first country in the world to implement a social media ban for minors, barring anyone under the age of 16 from holding accounts on platforms including Instagram, TikTok, YouTube, Snapchat, X and Facebook. Platforms face fines of up to around A$50 million for failing to take reasonable steps to prevent access by minors. Within months of the ban taking effect, the eSafety Commissioner found that Meta, TikTok and YouTube were not complying, and the regulator began legal action against the platforms. The compliance picture was bleak: seven out of ten children who had social media accounts before the ban still had active accounts after the law came into force. Workarounds such as VPNs, fake birth dates and transferring accounts to adult relatives proved simple and widely adopted. Whether the Australian experience represents a definitive verdict on the prohibition model, or an early enforcement struggle that stricter oversight will ultimately resolve, remains debatable. What is indisputable is that the world’s first and most closely watched age ban produced high rates of non-compliance in the months after its introduction, exactly the outcome predicted by critics who argued that bans would be met with creative circumvention rather than genuine restriction.
What comes next in Brussels
The practical arena in which Estonia’s platform-enforcement approach will compete with the majority position is the Digital Fairness Act, the European Commission’s forthcoming legislation targeting addictive design, dark patterns and manipulative commercial practices in digital services. The European Parliament’s November 2025 resolution explicitly called for the DFA text to include a digital minimum age of 16, along with bans on recommendation algorithms for underage users, limits on loot boxes, and a requirement that engagement mechanics such as endless scrolling and autoplay be disabled by default for young users. The Commission is expected to present the DFA proposal in the fourth quarter of 2026. That timeline gives Estonia a legislative window in which to argue for a platform-accountability framework alongside, or instead of, age-based access restrictions. The two approaches are not necessarily mutually exclusive, but they reflect genuinely different theories of where regulatory leverage is most effectively applied: against the commercial platforms that build and profit from these systems, or against the young people who treat social media as ordinary infrastructure. Artificial intelligence is shaping up to be the defining technology of the decade, and as AI-powered recommendation systems become the primary mechanism by which young people encounter online content, the question of who bears legal and regulatory responsibility for what these systems serve to a 14-year-old is one Europe will have to answer not just in declarations, but in law.




