Our daily lives are shifting ever more into the online space, yet we know little about how that space works. Our best guess is that big platforms like Meta control everything. This common view is not entirely wrong: the platforms do indeed govern the online space through the internal policies they set. However, if we leave it solely up to them, their policies will not always represent citizens’ interests vis-à-vis the platforms’ economic interests.
This means that in the absence of legally binding regulation, platforms provide little oversight and limited capacity for content management, leaving users highly exposed to illegal or harmful content such as discriminatory statements, hate speech, violent content or disinformation. Laws and institutions therefore have to play a role in ensuring that citizens are protected and enjoy their rights in the online space as they do offline. Since the Western Balkans (WB) region is no exception to the digital transformation, we should use this very momentum to take the right measures at the technical and legal level, and to bridge the digital divide with the EU.
In 2022 the European Union adopted the Digital Services Act (DSA), a key regulation to enhance accountability, transparency and safety as users navigate the online space. European institutions were committed to regulating a space that was until recently supervised only by the internal policies of the big platforms themselves (for example Meta, Google or Amazon).
The DSA benefits EU citizens, businesses and society at large, as it provides the grounds for protecting fundamental rights and for preventing harmful content and the spread of disinformation. Its adoption sent shockwaves through the big tech companies.
Given the size of the EU market, platforms could afford neither to violate the DSA’s legal requirements nor to walk away and find new users in other regions. They therefore revised their policies and introduced new tools to comply with EU legal requirements on content moderation. This move started a spillover effect in which platforms adjusted their policies globally.
Where does the WB stand?
While EU citizens are protected by the DSA and related regulations such as the Digital Markets Act or the AI Act, WB citizens have neither such a comprehensive legal framework nor empowered institutions to ensure that their fundamental rights are protected online. The legal landscape in the WB6 remains fragmented at both the country and the regional level.
For instance, Albania has no comprehensive legal framework to ensure users’ online safety: some 47% of the DSA-related framework remains uncovered by current legislation in the country. The situation in Serbia and Bosnia and Herzegovina is similar. Only Kosovo and North Macedonia show higher convergence with the DSA, while still not offering all the guarantees that the DSA does.
Inadequate legislation in the WB6 is one reason why big platforms have little incentive to commit more resources to content moderation for a safer online space. Another is the lack of institutional capacity to monitor the situation, to engage with the platforms on law enforcement and to enact policies that benefit citizens.
Exposure to harmful content online disproportionately affects groups in society that already face discrimination, such as children, women and other vulnerable communities. Just recently, a schoolboy in Albania used TikTok to post about a fight with another boy, whom he later stabbed in the schoolyard. Instead of holding the platform accountable for content moderation policies that allowed hate speech and violent content produced and consumed by teenagers, the Albanian government decided to simply ban TikTok. This reflects a lack of capacity to run a thorough investigation and to propose alternative policies that balance online safety with freedom of expression and of choice.
In the absence of proper regulation that would empower public institutions to hold platforms accountable, the space is filled by civil society. Media, non-profit organisations and think tanks in the WB have joined efforts to monitor, report and document the disinformation and violations of fundamental rights that happen online.
Despite all the great work and goodwill, this approach is insufficient, as civil society has little leverage over platforms when it comes to requesting more transparency and demanding that illegal content be taken down. A more structural approach is therefore needed, with EU-aligned legislation and public institutions that have clear mandates and capacities to act.
We also have to speak the platforms’ language: the market. The WB countries are too small a market for big platforms to have any interest in deploying content moderation resources to ensure online safety. For instance, as of 2025 Albania has 1.41 million social media users, or 50.7% of its population, while Montenegro has 410 thousand social media users, or 64.2% of its population. By comparison, 59% of the EU’s 447.6 million people were on social media in 2023.
A 2021 BIRN study of the WB found that platforms allocate very few human resources to reviewing content online, leaving most of it to AI. The study highlighted that one in two posts in the Bosnian, Serbian, Montenegrin or Macedonian language remained online after being reported as hate speech, threats of violence or harassment. The situation remains problematic: in North Macedonia in 2024, hate speech and discrimination, together with threatening content, accounted for almost half of online violations in the country.
The momentum is now for the WB to upgrade its digital regulation
Even though the current picture of online safety looks grey for the WB, the region has a unique opportunity to improve the situation thanks to the enlargement process. As part of the EU acquis approximation process, adopting the DSA and the DMA is a golden opportunity for the countries to avoid reinventing the wheel while at the same time filling the current regulatory gaps. This strategy also reduces the risk of the law being misinterpreted or misused to intensify censorship or suppress digital activism.
Moreover, with the rise of targeted disinformation and the use of AI to spread misinformation, legislation and institutions in the region need to be prepared to prevent and respond to this new wave.
One could argue that it is also in the EU’s interest to push the WB to adopt regulation along the lines of the DSA, and to support training and the improvement of institutional capacities in the region. Such a move would strengthen the fight against disinformation and misinformation in the EU’s neighbourhood, a region prone to foreign influence.
From a market perspective, it is easier for platforms to extend the same DSA-compliant risk mitigation measures than to comply with six different sets of legal requirements. It is a win-win situation: platforms get a cost-effective solution, simply extending their existing EU content moderation toolkit to the WB, while the Western Balkans secure high-standard protection of citizens’ rights online without spending additional resources on developing country-specific regulation.
Proposals for fixing the Achilles’ heel: law enforcement
When it comes to adopting legislation, the WB’s Achilles’ heel is well known: law enforcement. One can hope that, with the same regulatory framework as the EU in place, big platforms would simply commit to voluntary enforcement by extending the risk mitigation toolkit under the DSA to the WB.
What if voluntary enforcement does not work? Untrained public institutions in the WB will not be able to hold big platforms accountable for their violations. EU institutions therefore need to provide capacity building and mentoring to the region’s existing institutions, or to new ones that could be created for law enforcement purposes. Some level of cooperation between EU and WB institutions should also be foreseen as part of the ongoing enlargement process.
Moreover, recalling the EU market example, the WB would count for far more as a region than as individual countries. Regional bodies can facilitate this process by coordinating the WB6 in adopting the DSA, or by serving as a forum for exchanging information and good practices. One example is the Regional Cooperation Council (RCC), which has kickstarted its work on the digital agenda by bringing the WB6 together to discuss digital transformation from a market perspective.
The RCC can leverage its position to convince the countries of the region to work as one on DSA adoption and implementation. Another avenue is the existing Berlin Process, an initiative of cooperation between the EU and the WB. This process already covers the common regional market and investment in broadband infrastructure, and could in future extend to digital soft measures such as advocating for DSA alignment and allocating resources for the DSA’s institutional implementation.
Regulatory gaps and a lack of institutional capacity are leaving WB citizens highly exposed to illegal and harmful content. The statistics are worrying, and civil society is rightly pointing in this direction. However, a lightweight market such as the WB holds no interest for big platforms when it comes to committing resources to content moderation and online safety.
Civil society efforts in the Western Balkans, implemented under the OSF-WB supported IGNITA initiative, focus precisely on regional cooperation and EU integration as key drivers for enhancing the digital environment and strengthening information integrity in the region. Indeed, the best scenario for the WB6 is to bandwagon with the EU and align with the DSA.
The WB would benefit from having a regulation with the highest protections of digital rights, while having platforms extend their EU DSA policies of online risk mitigation to users in the region. This regulatory framework must be accompanied by a set of law enforcement pathways covering capacity building, WB regional coordination efforts and institutional cooperation with the EU.