On October 28, 2025, Scientific American published an insightful article examining the potential impact of TikTok’s upcoming U.S. spin-off on the platform’s influential algorithm and the broader cultural landscape of online content curation. As TikTok prepares to launch a U.S.-only version of its app under majority American ownership, questions arise about how this change might reshape what users see and experience on the platform. The article features a detailed conversation with Kelley Cotter, an assistant professor specializing in the social and ethical implications of digital technologies, who sheds light on the complexities behind TikTok’s algorithm and the possible consequences of its Americanization.
TikTok’s algorithm has earned a near-mythical status for its ability to predict and tailor content to individual users, influencing what over a billion people worldwide watch daily. While this power is not magical, it remains a crucial factor in shaping culture by determining which information and ideas gain visibility online. The planned U.S. spin-off, driven largely by concerns over Chinese ownership and control of the platform through its parent company, ByteDance, involves creating a new app that will be majority-owned by American companies (around 80 percent), with Chinese investors holding less than 20 percent. A key component of the deal is transferring, licensing, and retraining TikTok’s algorithm for this new U.S.-focused platform.
Kelley Cotter explains that her research centers on social media algorithms and artificial intelligence, exploring how people understand these technologies and how such understanding could inform democratic governance of digital platforms. She notes that while awareness of algorithms has grown significantly over the past decade, most users still grasp only the basic idea—that the content shown is filtered and ranked based on their interactions such as likes, shares, and watch time. However, the broader societal impacts of algorithms—how they may shape public discourse, cultural norms, or political views—remain less understood by the general public.
The TikTok sale, Cotter emphasizes, is not just about ownership but about the control and future direction of the algorithm that drives the platform’s “For You” page. The algorithm is fundamental to TikTok’s identity because it personalizes the user experience, making content engaging and relevant. It also enforces community standards, filtering out harmful or inappropriate content. This algorithmic curation influences what culture develops on the platform, which content goes viral, and which voices are amplified or diminished.
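Cotter describes this curation only in broad strokes. As a purely hypothetical sketch of the two-stage idea she outlines, the snippet below filters candidate videos by community standards before ranking them by personal relevance; the guideline labels, scoring rule, and pipeline structure are assumptions for illustration, not TikTok’s actual moderation or recommendation code.

```python
# Hypothetical two-stage curation: remove guideline-violating videos first,
# then rank whatever remains by a personalization score. Both steps are
# illustrative stand-ins, not TikTok's real moderation or ranking logic.

BLOCKED_LABELS = {"graphic_violence", "spam"}  # assumed guideline categories

def passes_guidelines(video: dict) -> bool:
    """Reject any video carrying a blocked moderation label."""
    return not (set(video["labels"]) & BLOCKED_LABELS)

def personalization_score(video: dict, user_interests: set) -> float:
    """Toy relevance: fraction of the video's topics the user already follows."""
    topics = set(video["topics"])
    return len(topics & user_interests) / len(topics) if topics else 0.0

def curate(videos: list, user_interests: set) -> list:
    """Filter by community standards, then rank by personal relevance."""
    allowed = [v for v in videos if passes_guidelines(v)]
    return sorted(allowed,
                  key=lambda v: personalization_score(v, user_interests),
                  reverse=True)

videos = [
    {"id": "a", "labels": [], "topics": ["cooking", "comedy"]},
    {"id": "b", "labels": ["spam"], "topics": ["cooking"]},
    {"id": "c", "labels": [], "topics": ["politics"]},
]
print([v["id"] for v in curate(videos, {"cooking"})])  # ['a', 'c'] -- 'b' is filtered out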
What sets TikTok’s algorithm apart, Cotter notes, is partly the short-video format itself: because videos are brief, how long a user keeps watching one is an unusually precise signal of interest, making watch time a strong indicator of engagement. Additionally, TikTok’s distinctive features, such as “Stitch” and sound-based memes, foster connections between creators and users, enhancing the algorithm’s ability to identify user preferences and promote interactive cultural phenomena. Despite these insights, the exact reasons for TikTok’s algorithmic success remain somewhat opaque, contributing to its almost mystical reputation among users, who sometimes believe it “knows them better than they know themselves.”
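The article does not spell out how such signals are combined. As a hypothetical sketch, the snippet below shows how a recommender might weight watch-time completion more heavily than explicit actions such as likes or shares when scoring candidate videos; the signal names, weights, and scoring formula are illustrative assumptions, not TikTok’s actual system.

```python
from dataclasses import dataclass

@dataclass
class EngagementSignals:
    """Hypothetical per-video interaction signals for one user."""
    watch_fraction: float  # fraction of the video actually watched (0.0-1.0)
    liked: bool
    shared: bool

def score(signals: EngagementSignals) -> float:
    """Toy relevance score in which watch completion dominates explicit actions.

    The weights are illustrative assumptions, not TikTok's real values.
    """
    return (
        0.7 * signals.watch_fraction
        + 0.2 * (1.0 if signals.liked else 0.0)
        + 0.1 * (1.0 if signals.shared else 0.0)
    )

# Rank a few candidate videos by their toy scores.
candidates = {
    "dance_clip": EngagementSignals(watch_fraction=0.95, liked=False, shared=False),
    "news_clip": EngagementSignals(watch_fraction=0.30, liked=True, shared=True),
}
ranked = sorted(candidates, key=lambda name: score(candidates[name]), reverse=True)
print(ranked)  # ['dance_clip', 'news_clip'] -- high completion outranks likes plus shares
```

Under these toy weights, a video watched nearly to completion outranks one that earned a like and a share but was quickly skipped, which is the intuition behind treating watch time as the dominant signal.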
The proposed buyers of the U.S. TikTok app include well-known companies such as Oracle, which has already managed TikTok’s U.S. user data. However, Cotter points out that many of the investors involved have ties to the Trump administration and lean conservative politically, raising concerns about potential ideological influence on the platform’s content curation. For example, Republican lawmakers previously claimed that TikTok’s algorithm favored Palestinian hashtags over Israeli ones, a complaint rooted in perceptions of political bias in content visibility. With a new ownership group holding significant sway, the algorithm and community guidelines could be adjusted to reflect different ideological priorities, altering what is considered acceptable speech and shaping the platform’s overall content ecosystem.
Another important factor is that the new TikTok app will be limited to American users, although it might still feature some global content. The retraining of the algorithm on a U.S.-only user base means that American cultural values and user behaviors will heavily influence the content recommendations, likely producing subtle shifts from the current global TikTok experience. Furthermore, if users perceive the app as politically biased or dominated by a particular ideological faction, some may choose to leave, potentially resulting in a user base that skews more ideologically homogeneous. This could further reinforce particular viewpoints and limit the diversity of content seen on the platform.
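The article does not describe the retraining process itself. The hypothetical sketch below only illustrates the underlying point that restricting training data to one region changes which behaviors a model learns from; the log format, topics, and helper function are invented for illustration and are not drawn from TikTok.

```python
from collections import Counter

# Hypothetical interaction log of (user_region, video_topic) pairs.
# Illustrative data only; not real TikTok logs.
interactions = [
    ("US", "college_football"), ("US", "college_football"), ("US", "bbq"),
    ("BR", "futebol"), ("BR", "funk_carioca"), ("JP", "anime"),
    ("US", "anime"), ("IN", "cricket"), ("IN", "cricket"),
]

def topic_popularity(logs, region=None):
    """Count topic frequency, optionally restricted to one region's users."""
    if region is not None:
        logs = [(r, t) for r, t in logs if r == region]
    return Counter(t for _, t in logs)

print(topic_popularity(interactions))        # global training signal
print(topic_popularity(interactions, "US"))  # U.S.-only signal after the spin-off
# Restricting the training data to U.S. users changes which topics look popular
# to the recommender, which is the kind of subtle shift Cotter describes.
```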
Ultimately, Cotter suggests that the new American TikTok app could look drastically different from the current global version, depending on the decisions its new owners make about the algorithm, the community guidelines, and the content they choose to amplify.
