Tag: Mental Health and Social Media

  • The Case for Scope 3 Emissions


    The digital economy treats human attention as a raw material to be mined without limit. This extraction produces a specific, harmful byproduct: it breaks our focus, fractures shared reality, and wears down the patience required for moral reasoning. Unlike physical pollution, which damages the air or water, this cognitive waste damages our ability to think. While corporations are increasingly held accountable for their physical footprint through ESG criteria, we ignore the mental damage caused by the product itself. What we are witnessing is the deliberate fragmentation and manipulation of the citizenry to the point where it can no longer contribute functionally to democracy. In this paper, I argue that this damage is a hidden cost that must be proactively regulated through purpose-built metrics and cognitive impact protocols, specifically by expanding Scope 3 reporting to include “Cognitive Emissions.”

    To understand why this regulation is necessary, we must look at what is being lost. While the market views attention as a commodity to be sold, ethical philosophy defines it as the foundation of morality. French philosopher Simone Weil argued that “attention is the rarest and purest form of generosity” (Weil 1951, 105); it is the ability to pause one’s own ego to recognize the reality of another person. When digital platforms take over this “Attentional Commons” (Crawford 2015, 12), they do not merely distract us; they dismantle the mental tools, specifically the capacities for self-regulation and deliberation, required to govern ourselves (Crawford 2015, 23). Without the capacity for sustained, coherent thought, the self becomes fragmented, losing the sense of stability required to be a responsible citizen (Giddens 1991, 92).

    This fragmentation is not accidental. It is the product of a design philosophy that makes apps as frictionless and personalized as possible. Using Daniel Kahneman’s distinction, modern algorithms keep users trapped in “System 1” (fast, instinctive thinking) while bypassing “System 2” (slow, logical thinking) (Kahneman 2011, 20). System 2 governs executive control, a metabolically expensive function that depends on regular cognitive effort to maintain neural integrity. Neurobiologically, this follows the principle of use-dependent plasticity: neural pathways responsible for complex reasoning strengthen through challenge and degrade through disuse (Mattson 2008, 1). When an algorithm molds users toward passive consumption, it removes the friction required to sustain these pathways, leading to a functional degradation of critical thought.

    This process is also invasive. By predicting our desires, algorithms bypass our will, seizing our attention before we can decide where to place it. While Shoshana Zuboff calls this “surveillance capitalism” (Zuboff 2019, 8), the mechanism is closer to a slot machine. As Natasha Dow Schüll observes, these interfaces use design loops to induce a trance-like state that overrides conscious choice (Schüll 2012, 166). This is manipulation built into the product. Neuroscience research on “Facebook addiction” confirms that these platforms activate the brain’s impulse systems while suppressing the prefrontal cortex, the region responsible for planning and control (Turel et al. 2014, 685). The result is a depletion of critical thought, creating a population that reacts rather than reflects.

    Regulating this harm means looking to the precedent of carbon reporting. The Greenhouse Gas Protocol’s “Scope 3” covers indirect emissions, including those generated by the use of sold products (WRI/WBCSD 2004, 25). I propose applying this exact reasoning to the digital economy: a tech company must be held responsible for the mental effects of product use. These effects are destructive because they erode the basic cognitive foundations required for a functioning society (Habermas 1989, 27). If a platform’s design creates measurable addiction, radicalization, or loss of attention span, these are “emissions” that impose a cost on humanity.

    To develop sustainable and reliable policies, we need new auditing metrics. We must calculate the speed at which content triggers emotional responses and establish a fragmentation index that measures how often an app interrupts deep work. The latter is a necessary (albeit complicated) metric, given that regaining focus after an interruption takes significant time and energy (Mark 2008, 108). Furthermore, we must assess the deficit of serendipity, determining whether an algorithm narrows a user’s worldview or introduces the necessary friction of new ideas.
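    To make the proposal concrete, a fragmentation index could be as simple as interruptions per hour of intended deep work. The Python sketch below is purely illustrative; the function name, inputs, and example figures are my assumptions, not part of any existing auditing standard:

```python
from datetime import datetime, timedelta

def fragmentation_index(session_start, session_end, interruptions):
    """Hypothetical audit metric: notifications or other interruptions
    per hour of a user's intended deep-work session."""
    hours = (session_end - session_start).total_seconds() / 3600
    if hours <= 0:
        raise ValueError("session must have positive duration")
    return len(interruptions) / hours

# Example: a 2-hour work session interrupted 14 times
start = datetime(2024, 1, 1, 9, 0)
end = start + timedelta(hours=2)
pings = [start + timedelta(minutes=8 * i) for i in range(1, 15)]
print(fragmentation_index(start, end, pings))  # 7.0 interruptions per hour
```

    A real audit would of course need a defensible definition of an “interruption” and of “deep work,” which is exactly the complication noted above.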

    Once measured, these emissions can be mitigated through cognitive impact protocols. This includes mandating friction by design to force users to pause and think. For example, when Twitter introduced prompts asking users to read articles before retweeting, “blind” sharing dropped significantly (Twitter Inc. 2020). This proves that simple friction can measurably reduce cognitive waste. Beyond individual features, firms must submit to independent audits to ensure their code promotes agency rather than addiction. Finally, just as carbon sinks absorb physical waste, digital firms should be mandated to fund zones (libraries, parks, and phone-free spaces) where our attention can recover.
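    As an illustration of what “friction by design” might look like in code, the sketch below gates sharing behind a confirmation prompt whenever the user has not opened the article. All names here are hypothetical, a simplified model of the kind of prompt Twitter tested, not its actual implementation:

```python
def share_with_friction(opened_article: bool, confirm) -> bool:
    """Friction-by-design sketch: 'blind' shares must pass a prompt.
    `confirm` stands in for the UI dialog and returns the user's choice."""
    if opened_article:
        return True  # the user read the article; share immediately
    # Hypothetical pause-and-think prompt before an unread share
    return confirm("Want to read the article before sharing?")

# One user who read the article, one who dismisses the prompt
print(share_with_friction(True, lambda msg: True))    # True
print(share_with_friction(False, lambda msg: False))  # False
```

    The point of the design is not to forbid the share but to interrupt the System 1 reflex with a single moment of deliberation.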

    It can be argued that introducing friction impedes innovation and destroys value. This view, however, fails to account for the long-term liability of cognitive degradation. A market that incentivizes the depletion of user agency creates a systemic risk to the public. By implementing “Scope 3 Cognitive Emissions,” we operationalize the cost of this damage, forcing platforms to account for the mental impact of their design choices. We are currently operating with a dangerous separation between our technological power and our institutional controls (Wilson 2012, 7). Closing this gap requires a shift, moving away from design that exploits immediate impulse. We must engineer digital environments that protect, rather than degrade, the cognitive autonomy required for a free society.

    References
    Crawford, Matthew B. 2015. The World Beyond Your Head: On Becoming an Individual in an
    Age of Distraction. New York: Farrar, Straus and Giroux.

    Giddens, Anthony. 1991. Modernity and Self-Identity: Self and Society in the Late Modern Age.
    Stanford: Stanford University Press.

    Habermas, Jürgen. 1989. The Structural Transformation of the Public Sphere: An Inquiry into a
    Category of Bourgeois Society. Cambridge: MIT Press.

    Kahneman, Daniel. 2011. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

    Mark, Gloria, Daniela Gudith, and Ulrich Klocke. 2008. “The Cost of Interrupted Work: More
    Speed and Stress.” Proceedings of the SIGCHI Conference on Human Factors in Computing
    Systems, 107–110.


    Mattson, Mark P. 2008. “Hormesis Defined.” Ageing Research Reviews 7 (1): 1–7.

    Pigou, Arthur C. 1920. The Economics of Welfare. London: Macmillan.

    Schüll, Natasha Dow. 2012. Addiction by Design: Machine Gambling in Las Vegas. Princeton:
    Princeton University Press.

    Sunstein, Cass R. 2017. #Republic: Divided Democracy in the Age of Social Media. Princeton:
    Princeton University Press.

    Turel, Ofir, Qinghua He, Gui Xue, Lin Xiao, and Antoine Bechara. 2014. “Examination of Neural
    Systems Sub-Serving Facebook ‘Addiction’.” Psychological Reports 115 (3): 675–695.

    Twitter Inc. 2020. “Read before you Retweet.” Twitter Blog, September 24.

    United Nations Global Compact. 2004. Who Cares Wins: Connecting Financial Markets to a
    Changing World. New York: United Nations.

    Weil, Simone. 1951. “Reflections on the Right Use of School Studies with a View to the Love of
    God.” In Waiting for God, translated by Emma Craufurd, 105–116. New York: Harper & Row.

    Wilson, Edward O. 2012. The Social Conquest of Earth. New York: Liveright.

    World Resources Institute and World Business Council for Sustainable Development
    (WRI/WBCSD). 2004. The Greenhouse Gas Protocol: A Corporate Accounting and Reporting
    Standard, Revised Edition. Washington, DC: WRI/WBCSD.

    Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at
    the New Frontier of Power. New York: PublicAffairs.

  • Digital Citizenship in the Age of Misinformation


    Education for the 21st Century

    In an era where the digital realm has become our second home, the concept of citizenship has undergone a fundamental change. We have become denizens of a vast digital ecosystem. This shift demands a reimagining of education that goes beyond traditional paradigms, equipping future generations with the critical thinking skills needed to navigate the labyrinth of online information.

    This moment demands a shift in our systems as a society. We have access to the largest bank of information and recorded perspectives in history, yet most of us are not using it effectively. Imagine the possibilities of a world where students can untangle the often overwhelming complexities of online information and use it as a tool.

    This is a necessary evolution of our education systems. The digital age has gifted us with unprecedented access to information, but it has also presented us with a double-edged sword. As MIT Sloan researchers discovered, digital literacy alone isn’t enough to stem the tide of misinformation [3]. We need to cultivate a new breed of digital citizens who not only consume information responsibly but also create and share it ethically.

    To achieve this, we must weave digital citizenship into the fabric of our education system. Imagine history classes where students don’t just robotically memorize dates but dissect the anatomy of fake news, tracing its origins and understanding its viral spread; or science courses that not only teach the scientific method but also explore how misinformation can distort public understanding of crucial issues like climate change or vaccination [1].

    I’ll take it a step further. What if we created immersive digital simulations where students could experience the real-world consequences of spreading misinformation? Imagine a virtual reality scenario where a student’s decision to share an unverified piece of news triggers a chain reaction, allowing them to witness firsthand the ripple effects of their digital actions [5]. This approach could potentially transform abstract concepts into tangible experiences, making the lessons of digital citizenship both memorable and impactful.

    Moreover, I believe we need to shift our focus from mere technical proficiency to ethical mastery. In a time when a single tweet can spark a global movement or tarnish a reputation irreparably, understanding the ethical implications of our online actions is paramount. We should be fostering empathy, teaching students to see beyond the screen and recognize the human impact of their digital footprint [2]. Users must recognize that they are interacting with people at the other end, not ones and zeros.

    The challenge of online misinformation is, at its core, a human one. The solution lies not in algorithms or AI, but in nurturing discerning, ethical, and adaptable users.

    As we stand at a crossroads, we have an opportunity to redefine education for the 21st century. By cultivating critical thinking skills, ethical awareness, and adaptability, we can empower the next generation to become not just consumers of technology but masters of the digital world. We need an education system that doesn’t just keep pace with technological advancements but anticipates and shapes them, empowering students to shape the world of tomorrow.

    The future of society depends on it.

    Works Cited:

    AACSB. “Assessing Critical Thinking in the Digital Era.” AACSB, 6 June 2023, www.aacsb.edu/insights/articles/2023/06/assessing-critical-thinking-in-the-digital-era.

    Learning.com Team. “Digital Citizenship in Education: What It Is & Why it Matters.” Learning.com, 6 Feb. 2024, www.learning.com/blog/digital-citizenship-in-education-what-it-is-why-it-matters/.

    MIT Sloan. “Study: Digital Literacy Doesn’t Stop the Spread of Misinformation.” MIT Sloan, 5 Jan. 2022, mitsloan.mit.edu/ideas-made-to-matter/study-digital-literacy-doesnt-stop-spread-misinformation.

    Columbia University. “Information Overload: Combating Misinformation with Critical Thinking.” CPET, 27 Apr. 2021, cpet.tc.columbia.edu/news-press/information-overload-combating-misinformation-with-critical-thinking.

    Colorado Christian University. “The Importance of Critical Thinking & Hands-on Learning in Information Technology.” CCU, www.ccu.edu/blogs/cags/category/business/the-importance-of-critical-thinking-hands-on-learning-in-information-technology/.

    University of Iowa. “Digital Literacy: Preparing Students for a Tech-Savvy Future.” University of Iowa Online Programs, 19 Aug. 2024, onlineprograms.education.uiowa.edu/blog/digital-literacy-preparing-students-for-a-tech-savvy-future.

  • The Extinction of Serendipity: An Evolution of Social Media and Its Value


    In the halcyon days of the early 2010s, a revolution was brewing. Platforms like Vine and Instagram burst onto the scene, capturing our collective imagination with their bite-sized content and visual storytelling. These digital windows opened up worlds beyond our own, allowing us to share our art, laughter, and those serendipitous moments that make life charming.

    Do you remember when scrolling through your feed felt like cozying up in a living room with family and friends? We’d gather around our screens, sharing snippets of our lives, finding solace in the universality of human experiences. Those were the days when a six-second Vine could leave you in stitches, and an unfiltered Instagram post could make you feel less alone in your quirks and struggles.

    But oh, how the tables have turned. Our quest for connection took an unexpected detour, and the charm of spontaneity has been usurped by a carefully curated façade of perfection. The era of candid captures has given way to a barrage of heavily edited photos that set unrealistic standards. The shift in social media behavior has been seismic.

    We used to pour our hearts out in status updates and tweets, but those days are rapidly fading. The percentage of users citing “sharing opinions” as a key motivation for using social media has taken a nosedive, dropping from about 40% in 2014 to a mere 30% in 2017 (Auxier, 2019). It seems we’ve transformed from eager sharers into voracious consumers.

    Influencers who once charmed us with genuine humor have traded authenticity for scripts, chasing laughs with contrived scenarios. Sure, some still hit the mark, but the sparkle of spontaneity has dimmed considerably. And let’s talk about TikTok, shall we? This new kid on the block has become the belle of the marketing ball, continuing to grow in popularity and even evolving into a “super app” that combines social media, messaging, services, and payments. It’s so influential that 40% of Gen Z users now turn to TikTok or Instagram instead of Google for search queries, often with little regard for the validity of the content they are consuming (Ewing, 2023).

    What began as a tool for bringing humanity together has morphed into a breeding ground for comparison. It’s no longer about shared experiences; it’s a relentless competition. Who has the most followers? The flawless face? The most enviable lifestyle? Social media has become a digital arms race of personal branding.

    Enter artificial intelligence, promising to enhance our online experiences but inadvertently contributing to the dehumanization of social interactions. AI algorithms now curate our feeds, deciding what content we see based on our past behavior and preferences. While this personalization can be convenient, it often creates echo chambers, limiting our exposure to diverse perspectives and reinforcing existing beliefs.
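    To see how this narrowing happens mechanically, consider a toy ranker that scores each candidate post by how often the user has engaged with its topic before. The data shapes and function below are invented for illustration and do not represent any platform’s actual algorithm:

```python
from collections import Counter

def rank_feed(candidates, engagement_history):
    """Toy personalization: score each candidate post by how often the
    user engaged with its topic before, so the feed drifts toward past
    tastes and away from unfamiliar material."""
    taste = Counter(post["topic"] for post in engagement_history)
    return sorted(candidates, key=lambda p: taste[p["topic"]], reverse=True)

history = [{"topic": "politics"}] * 5 + [{"topic": "sports"}]
candidates = [
    {"id": 1, "topic": "science"},   # never engaged with: pushed to the bottom
    {"id": 2, "topic": "politics"},  # heavily engaged with: surfaces first
    {"id": 3, "topic": "sports"},
]
print([p["topic"] for p in rank_feed(candidates, history)])
# ['politics', 'sports', 'science']
```

    Each scroll through such a feed generates more engagement with familiar topics, which in turn raises their scores: the echo chamber is a feedback loop, not a one-time filter.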

    These algorithms are designed to maximize engagement, but at a cost: they appear to exacerbate negative feelings like loneliness, depression, and anxiety (Craine, 2023), a concern we hear about with increasing frequency. It’s like we’re trapped in a digital funhouse mirror, constantly reflecting our own biases back at us.

    AI-powered chatbots and voice assistants have become ubiquitous, offering instant responses and interactions. However, this efficiency comes at a price: the loss of human connection. As we increasingly interact with algorithms rather than real people, our online relationships risk becoming shallow and transactional.

    Content creation has evolved into a science, with videos engineered to trigger psychological responses and boost engagement. It’s a new form of creativity, sure, but at what cost? Information flows in torrents, but it’s muddied by bias and judgment, making the search for unvarnished truth a Herculean task.

    The all-encompassing nature of social media has transformed it from a platform of community and entertainment into an arena of constant comparison. Users are feeling the strain, with many stepping back from these platforms to protect their mental well-being. The interactions that once brought joy and connection now leave many feeling drained and disconnected. This is particularly concerning for younger generations: youth who spend more than three hours a day on social media are at higher risk for mental health problems, and Gen Z is more likely than older generations to report negative mental health effects from social media use (Minamitani, 2024). We’re talking about sleep loss, cyberbullying, body image issues, and depressive symptoms. It’s as though we’ve traded our online living room for a high-stakes colosseum.

    As we navigate this evolving digital landscape, it’s worth asking: Can we reclaim the magic of those early days? Can we strike a balance between technological advancement and authentic human connection? The answer isn’t simple, but there are ways to foster genuine connection in this new world. For brands and individuals alike, that means showcasing the human side by sharing behind-the-scenes content and personal anecdotes, maintaining consistency in voice and messaging across platforms, and using user-generated content to build authenticity and community. Most importantly, it means creating valuable content that educates, entertains, or inspires, rather than just promoting products or personas.

    In our quest to be seen and understood, we shouldn’t lose sight of what matters. Perhaps the answer lies in remembering why we fell in love with social media in the first place: the joy of connection, the thrill of discovering new perspectives, and the comfort of shared experiences. Now is the time to reclaim our digital spaces, to use AI and technology as tools for enhancing rather than replacing human interaction, and to remember that behind every profile is a real person with real emotions. Let’s be mindful of the time we spend scrolling, choosing intention over passive consumption, and seek out diverse perspectives to break free from our echo chambers.

    After all, social media is what we make of it. It’s a tool, not a tyrant. So let’s use it to build bridges, not walls. To encourage understanding, not division. To celebrate our shared humanity, not our curated personas. To consume knowledge that will grow us as a society, rather than trap us in the mud that is misinformation. The power to transform our digital landscape lies in our hands, or rather, at our fingertips. It’s time, yet again, to change how we consume content.

    ~ Jess

    Works Cited

    Auxier, Brooke, and Monica Anderson. “10 Tech-Related Trends That Shaped the Decade.” Pew Research Center, 20 Dec. 2019, www.pewresearch.org/short-read/2019/12/20/10-tech-related-trends-that-shaped-the-decade/.

    Craine, Kelly. “Baylor Researchers Explore Effect of Instagram, TikTok on Psychological Well-Being.” Baylor University Media and Public Relations, 2 May 2023.

    Ewing, Carson R., et al. “Social Media Use Motives: An Influential Factor in User Behavior and User Health Profiles.” Psi Chi Journal of Psychological Research, vol. 28, no. 4, Winter 2023, pp. 275-278.

    Minamitani, Kenta. “Social Media Addiction and Mental Health: The Growing Concern for Youth Well-Being.” Stanford Law, 20 May 2024.