When Algorithms Cross the Line: Ethical and Technical Challenges for Computer Scientists in Targeted Advertising



Main Ideas



  • Social media uses personal data to target ads.
  • Tracking tools collect what users do online.
  • This creates privacy risks for everyday people.
  • Most users don’t know how much data is taken.
  • Computer scientists must protect user privacy while building these systems.

By Yenia Vasquez

ENC2135 · Fall 2025 · Florida State University

Social media is one of the biggest technological changes of the 21st century, and it has captured the world's attention. With platforms this popular, it was only a matter of time before corporations found a way to profit from the masses. Many business models began using the vast reach of social media to deliver personalized ads and increase profits. This practice became what we now know as targeted advertising, and to work, it needs data. The amount of data required to deliver personalized advertising to each user is massive. For software engineers and computer scientists, these systems are fascinating because they combine data structures, algorithms, machine learning, and system design. However, data collection and targeted advertising algorithms also expose a serious problem: they compromise privacy and leave everyday users vulnerable. The main challenge for computer scientists lies in designing the algorithms and data-collection systems that power targeted advertising while also protecting users' privacy from misuse. Overall, data-collection algorithms create a personalized environment, but at the same time they put users' privacy at risk.

Advertising has a long history of adapting to new technologies and social changes. It has always reflected its era, changing frequently to match the “ethos, trends, and values” of each new century (Brand Vision Media). Every decade shows the social, technological, and cultural trends of its time (Brand Vision Media). In the 1920s, bold Art Deco posters promoted modern luxury (Brand Vision Media). During the 1940s, patriotic propaganda supported war efforts (Brand Vision Media). By the 1990s, ads were ironic and self-aware, matching the style of grunge culture (Brand Vision Media). Over time, advertising has shifted from a simple one-way message to more interactive exchanges between brands and audiences (Kumar and Gupta 302). Today’s ads can track what people watch, like, and click. This happens because of the new focus on “profitable customer engagement,” where the goal is not just selling but building loyal customer relationships (Kumar and Gupta 304). Customer information is used to create personalized, effective ads, which is why modern advertising relies on “data-driven” strategies (Kumar and Gupta 305).

From a technical point of view, social media platforms use many methods to track people. These techniques usually work by gathering and comparing user information across several platforms. Developers are constantly refining common tracking technologies like cookies, web beacons, JavaScript, and fingerprinting, making them more difficult to detect and block (Sim, Heo, and Cho). Cookies are a classic form of data tracking, specifically third-party cookies (Ullah, Boreli, and Kanhere 649). Third-party cookies allow trackers to follow a user’s browsing activity across multiple, unrelated websites to build a detailed profile (Ullah, Boreli, and Kanhere 649). Web beacons, on the other hand, are tiny, often invisible images embedded in emails or webpages that register a user’s interaction (Sim, Heo, and Cho). For example, when a user receives an email from a brand that includes a logo or an image inviting them to follow the brand on Instagram, that image can act as a web beacon: the moment the email client loads it, the tracker learns that the message was opened, when, and from what device.
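
To make this concrete, here is a minimal sketch of the server side of a web beacon, written in TypeScript for Node.js. The endpoint name, query parameters, and domain are hypothetical, not taken from any real advertising platform; the point is that simply loading an invisible image sends identifying information to the tracker.

```typescript
// Minimal sketch of a hypothetical tracking-pixel server (assumed names).
import * as http from "http";

// A transparent 1x1 GIF, base64-decoded; this is the "invisible image."
const PIXEL = Buffer.from(
  "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7",
  "base64"
);

http.createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://tracker.example");
  if (url.pathname === "/pixel") {
    // The mere act of fetching this image reveals that the email was
    // opened, plus when, from what IP, and with what client.
    console.log({
      campaign: url.searchParams.get("campaign"), // identifies the email blast
      user: url.searchParams.get("uid"),          // identifies the recipient
      openedAt: new Date().toISOString(),
      ip: req.socket.remoteAddress,
      userAgent: req.headers["user-agent"],
    });
    res.writeHead(200, { "Content-Type": "image/gif" });
    res.end(PIXEL);
  } else {
    res.writeHead(404).end();
  }
}).listen(8080);
```

An email would embed this as something like `<img src="https://tracker.example/pixel?campaign=fall&uid=123" width="1" height="1">`; the moment the mail client renders that tag, the request above fires.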

On mobile phones, advertising software development kits (SDKs) collect information like the device ID, which apps are used, and location data to create detailed user profiles (Ullah, Boreli, and Kanhere 649). In contrast to cookies or web beacons, fingerprinting, or stateless tracking, operates without storing any data on the device (Ullah, Boreli, and Kanhere 650). In other words, fingerprinting is almost invisible. This technique works by asking the user’s browser and device for attributes such as screen resolution, installed fonts, browser version, operating system, and even hardware configuration (Sim, Heo, and Cho). The combination of these attributes creates a unique identifier we call a “fingerprint.” Even if the user clears cookies, cache, and browser history, the system still recognizes them (Ullah, Boreli, and Kanhere 650). These SDKs, already integrated into many apps, also collect the Advertising ID on Android and the Identifier for Advertisers (IDFA) on iOS (Ullah, Boreli, and Kanhere 650-651). These identifiers exist precisely so that SDKs can gather information across the phone, and for many users they cannot be removed because manufacturers enable them by default. All of this data is collected for one purpose: increasing profits through targeted advertising.
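
The following is a minimal fingerprinting sketch, assuming it runs in a browser page. Real trackers probe far more signals (canvas rendering, installed fonts, audio processing), but the principle is the same: stable attributes are combined and hashed into one identifier.

```typescript
// Minimal browser-fingerprinting sketch (illustrative, not a real tracker).
async function computeFingerprint(): Promise<string> {
  const signals = [
    navigator.userAgent,                                      // browser + OS
    navigator.language,
    `${screen.width}x${screen.height}x${screen.colorDepth}`,  // display
    String(new Date().getTimezoneOffset()),                   // timezone
    String(navigator.hardwareConcurrency),                    // CPU core count
  ].join("|");

  // Hash the combined signals so the "fingerprint" is one opaque token.
  const bytes = new TextEncoder().encode(signals);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// Nothing is stored on the device: clearing cookies or history changes none
// of these attributes, so the same hash identifies the user again.
computeFingerprint().then((fp) => console.log("fingerprint:", fp));
```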

This data is then used by machine learning systems that analyze vast datasets to train predictive models that guess what users will do next, and by real-time bidding (RTB) systems that decide in just milliseconds which ad to show (Nayyar 1747). These algorithms apply predictive analytics to key performance measures, like click-through rates, to help ads reach the right people (Nayyar 1749). Automating ad creation and personalization is necessary to generate ads dynamically at this scale. The ad industry is huge, processing terabytes of user data every day and generating hundreds of billions of dollars in ad revenue (Nayyar 1746). For computer science students, this is a major technical achievement: large systems manage millions of requests at once, and the prediction tools are designed to be fast and accurate. But while this represents a win for data-driven marketing, it raises ethical problems. The same features that make the system efficient also make it opaque, so users cannot see or control how their personal data is being used (New America).
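
As a rough illustration of the decision an RTB system makes on every page load, here is a simplified TypeScript sketch of a second-price auction driven by predicted click-through rates. The interface and numbers are invented for illustration; production exchanges are vastly more complex and must answer within roughly 100 milliseconds.

```typescript
// Simplified RTB decision: rank bids by expected revenue per impression.
interface Bid {
  advertiser: string;
  pricePerClick: number; // what the advertiser pays if the ad is clicked
  predictedCtr: number;  // model output: probability this user clicks
}

// Second-price auction: the winner pays just enough to beat the runner-up.
function selectAd(bids: Bid[]): { winner: Bid; charge: number } | null {
  if (bids.length === 0) return null;
  const ranked = [...bids].sort(
    (a, b) => b.pricePerClick * b.predictedCtr - a.pricePerClick * a.predictedCtr
  );
  const winner = ranked[0];
  const runnerUpValue =
    ranked.length > 1 ? ranked[1].pricePerClick * ranked[1].predictedCtr : 0;
  // Convert the runner-up's expected value back into a per-click charge.
  const charge = runnerUpValue / winner.predictedCtr;
  return { winner, charge };
}

const result = selectAd([
  { advertiser: "shoes", pricePerClick: 2.0, predictedCtr: 0.01 }, // EV 0.020
  { advertiser: "games", pricePerClick: 0.5, predictedCtr: 0.05 }, // EV 0.025
]);
console.log(result?.winner.advertiser, result?.charge); // "games", 0.4
```

Notice that the user's predicted behavior, not the raw bid, decides the winner; this is why the models need so much personal data to begin with.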

Facebook is a clear example of targeted advertising on social media. Its economic model relies on collecting large amounts of user data and using algorithms to predict interests, which has raised concerns about privacy, discrimination, and even emotional manipulation (Smith et al. 1). Facebook’s advertising system works by tracking what people do both on and off the platform using cookies and social plug-ins, building a detailed profile of each person’s interests (New America). With these profiles, ads can reach the intended audience because the platform holds a vast array of demographics, attributes, personal information, and similar users in its databases (New America). However, this process can lead to biased or discriminatory ad delivery, even when the goal is to reach a broad audience (New America). This happens because the algorithms learn most from the users who interact online the most, and some of the data involves sensitive details such as gender or race (New America). Many users “lack the knowledge needed to take control of their privacy” (Smith et al. 3). In response to these unclear and often concerning practices, Smith et al. found that educational videos can help users understand targeted advertising and encourage them to protect their privacy (Smith et al. 2). Their experiment, which involved 127 participants, found that watching these videos increased privacy awareness and protective actions (Smith et al. 2). The study tested two types of videos: one that used fear-based warnings about privacy risks and another that added reflective learning by showing people their own ad profiles to raise self-awareness (Smith et al. 12). Both videos improved understanding, but the reflective one led people to engage more with ad settings and feel more in control of their data (Smith et al. 13).

This lack of transparency is a big issue for computer science students. As future developers and engineers, they are taught how simple it is to add tracking scripts, train models on user data, or build recommendation systems. These powerful tools raise ethical concerns, and it is easy to forget that the ability to build them also carries the responsibility to use them carefully. New data collection techniques, such as fingerprinting and cross-site tracking, often bypass traditional privacy tools that rely on easily detectable mechanisms like cookies (Sim, Heo, and Cho). This raises an important career question: is it better to focus on building these algorithms or on creating solutions that protect everyday users? To answer it, it helps to look at the ethical rules that guide computer scientists. The Association for Computing Machinery (ACM) sets standards that remind professionals that ethical computing goes beyond innovation; it is also about protecting the people who use it.

Protecting user data is not optional for computer scientists and software engineers; it is a core part of their ethical responsibility. The ACM Code of Ethics states that computing professionals should contribute to society and to human well-being, which includes the responsibility to minimize negative consequences of computing, including threats to privacy (ACM Code of Ethics 1.1). This connects to the rule to avoid harm, which includes the unjustified destruction or disclosure of information (ACM Code of Ethics 1.2). To meet this duty, professionals must do more than just limit data collection; they must actively respect privacy by taking steps such as preventing re-identification of anonymized data, keeping data accurate, and protecting it from unauthorized access and accidental disclosure (ACM Code of Ethics 1.6). This is vital because research shows that even anonymized data can sometimes be traced back to individuals when combined with other demographic details, putting users’ privacy at risk (Morehouse et al. 929). In response, the tech community is creating and adopting Privacy Enhancing Technologies (PETs) and system designs that build data protection directly into software, helping meet legal requirements and maintain ethical standards (“Topical Issue” 250). In the end, for software engineers, writing secure code that protects user data is just as important as writing code that works.
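
One concrete example of a PET is differential privacy, which adds calibrated noise to aggregate statistics so that no individual user can be re-identified from the output. The sketch below shows the basic idea in TypeScript; the epsilon value and the toy dataset are illustrative assumptions, not drawn from any of the cited systems.

```typescript
// Sketch of the Laplace mechanism from differential privacy (illustrative).
function laplaceNoise(scale: number): number {
  // Sample from Laplace(0, scale) via inverse transform sampling.
  const u = Math.random() - 0.5;
  return -scale * Math.sign(u) * Math.log(1 - 2 * Math.abs(u));
}

// Report how many users match a predicate, with noise calibrated so that
// adding or removing any single user barely changes the published output.
function privateCount<T>(
  rows: T[],
  matches: (row: T) => boolean,
  epsilon: number // smaller epsilon = more noise = stronger privacy
): number {
  const trueCount = rows.filter(matches).length;
  // A count changes by at most 1 per user, so the noise scale is 1 / epsilon.
  return Math.round(trueCount + laplaceNoise(1 / epsilon));
}

const users = [
  { age: 34, clickedAd: true },
  { age: 21, clickedAd: false },
  { age: 45, clickedAd: true },
];
// Analysts learn roughly how many users clicked, but not which ones.
console.log(privateCount(users, (u) => u.clickedAd, 0.5));
```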

Many software engineers instead focus on building methods that protect everyday users. Users with technical knowledge often rely on ad blockers, VPNs, or encryption to protect their privacy (Lenhart et al.). But most users lack this knowledge, which means they have no defense against tracking; privacy protection remains difficult for everyday users. According to Sadique, Rahmani, and Johannesson, only certain individuals should have permission to access a user’s personal data (201). This vulnerability leaves users with the sensation of being observed by an unknown entity, known as the “Big Brother effect,” and research shows it is one reason consumers resist smart devices (Mani and Chouk 1461). Moreover, human-computer interaction studies show that clear interfaces help users feel more in control of their data while still keeping the benefits of personalization (Smith et al.). Although some protections exist, both software engineers and regular users still face the challenge of applying them.
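
As a simple illustration of how one such protection works, here is a sketch of the core decision inside an ad blocker: match each outgoing request against a blocklist of known tracking domains. The domains here are hypothetical; real blockers use large community-maintained filter lists and much richer matching rules.

```typescript
// Minimal ad-blocker decision sketch with an assumed, hypothetical blocklist.
const BLOCKLIST = ["tracker.example", "ads.example", "beacon.example"];

function shouldBlock(requestUrl: string): boolean {
  const host = new URL(requestUrl).hostname;
  // Block if the host matches, or is a subdomain of, any blocklist entry.
  return BLOCKLIST.some((d) => host === d || host.endsWith("." + d));
}

console.log(shouldBlock("https://tracker.example/pixel?uid=123")); // true
console.log(shouldBlock("https://example.org/article"));           // false
```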

Social media’s use of data collection algorithms creates a major challenge for computer scientists and software engineers. These systems show how remarkable modern computing can be: they handle enormous amounts of data, make predictions, and work fast. For computer science students, learning how technologies like cookies, web beacons, and fingerprinting work is a challenge in itself, but a rewarding one. Building dynamic and efficient algorithms is important for social media platforms. Yet as impressive as these algorithms are, they put everyday users at risk. Users’ information is constantly tracked whenever they use social media, browse websites, or interact with apps. This data can include browsing history, location, device information, and personal preferences. So for professionals like software engineers and computer scientists, the problem is not only making these platforms personalized and dynamic but also figuring out how to protect users’ data from misuse or breach.

Works Cited

ACM Code of Ethics and Professional Conduct. ACM, 2025, https://www.acm.org/code-of-ethics.

Brand Vision Media. “Evolution of Advertising: A Journey Through the Decades.” Brand Vision Media, 4 Sept. 2023, www.brandvm.com/post/evolution-of-advertising.

Kumar, Vijay, and Shaphali Gupta. “Conceptualizing the Evolution and Future of Advertising.” Journal of Advertising, vol. 45, no. 3, 2016, pp. 302–17.

Morehouse, Kirsten N., et al. “Responsible Data Sharing: Identifying and Remedying Possible Re-Identification of Human Participants.” The American Psychologist, vol. 80, no. 6, 2025, pp. 928–41, https://doi.org/10.1037/amp0001346.

Nayyar, Gaurav. “Optimizing Ad Campaigns with Machine Learning: Data-Driven Approaches in Modern Media.” The International Journal of Multiphysics, vol. 18, no. 3, 2024, pp. 1746–54.

New America, Open Technology Institute. Special Delivery: The Role of Data in the Targeted Advertising Industry. New America, 2024, https://www.newamerica.org/oti/reports/special-delivery/the-role-of-data-in-the-targeted-advertising-industry/.

Sim, Kyungmin, et al. “Combating Web Tracking: Analyzing Web Tracking Technologies for User Privacy.” Future Internet, vol. 16, no. 10, article 363, 2024, https://doi.org/10.3390/fi16100363.

Smith, Garrett, et al. “‘I Know I’m Being Observed:’ Video Interventions to Educate Users about Targeted Advertising on Facebook.” CHI 2024: Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, edited by Corina Sas et al., article 112, ACM, 2024, https://doi.org/10.1145/3613904.3642885.

“Topical Issue on Privacy, Data Protection, and Digital Identity.” SN Computer Science, vol. 1, no. 5, article 250, 2020, https://doi.org/10.1007/s42979-020-00261-5.

Ullah, Imdad, et al. “Privacy in Targeted Advertising on Mobile Devices: A Survey.” International Journal of Information Security, vol. 22, no. 3, 2023, pp. 647–78, https://doi.org/10.1007/s10207-022-00655-x.



About me

My name is Yenia Vasquez. I am currently a computer science student at FSU, and I made this webpage for my class ENC2135. Today is December 5, 2025, and I am 21 years old. I am also an artist; I love painting and drawing. I have been into technology since I was a kid. When I was about 8, I remember jailbreaking my iPad because I wanted custom features, and I was always inventing stuff on my mum’s computer. I got interested in programming in middle school, studying algorithm flowcharts, Visual Basic, and Excel macros. Later I discovered SpaceHey, which led me to Neocities, and I started getting into website development. I built this whole page myself, mostly with HTML and CSS.

I hope my webpage can reach many Gen Z students who aspire to get into technology. With the rise of AI and companies exploiting us, it is all too easy to lose control. Ever since websites stopped being static and we entered the dynamic internet of things, it feels like the internet is not interesting anymore. And we have normalized having our data used however companies feel like using it.

Privacy should be a right. I think it is crazy that, knowing Windows, Android, macOS, and iOS collect our data all the time without our permission to target us with ads, people still see it as normal. The operating systems are made to include an ID that can track all our clicks. But at the end of the day, we are not treated like humans; we are treated like a product. Our data gets sold for money, and in a way, we end up buying each other.



What is data tracking?

Data tracking is when websites, apps, or online platforms collect information about what you do. They track things like your clicks, searches, and what you watch so they can learn your habits and show you things they think you might like.

What is Targeted Advertising?

Targeted advertising means showing people ads that are chosen based on their online behavior and interests.