Tag: Social Media

  • The Case for Scope 3 Emissions


The digital economy treats human attention as a raw material to be mined without limit. This extraction produces a specific, harmful byproduct: it breaks our focus, fractures shared reality, and wears down the patience required for moral reasoning. Unlike physical pollution, which damages the air or water, this cognitive waste damages our ability to think. While corporations are increasingly held accountable for their physical footprint through ESG criteria, we ignore the mental damage caused by the product itself. What we are witnessing is the deliberate fragmentation and manipulation of the citizenry to the point that citizens can no longer contribute functionally to democracy. In this paper, I argue that this damage is a hidden cost that must be proactively regulated through purpose-built metrics and cognitive impact protocols, specifically by expanding Scope 3 reporting to include “Cognitive Emissions.”

    To understand why this regulation is necessary, we must look at what is being lost. While the market views attention as a commodity to be sold, ethical philosophy defines it as the foundation of morality. French philosopher Simone Weil argued that “attention is the rarest and purest form of generosity” (Weil 1951, 105); it is the ability to pause one’s own ego to recognize the reality of another person. When digital platforms take over this “Attentional Commons” (Crawford 2015, 12), they do not merely distract us; they dismantle the mental tools, specifically the capacities for self-regulation and deliberation, required to govern ourselves (Crawford 2015, 23). Without the capacity for sustained, coherent thought, the self becomes fragmented, losing the sense of stability required to be a responsible citizen (Giddens 1991, 92).

This fragmentation is not accidental. It is the result of a design philosophy that makes apps as frictionless and habit-forming as possible. Using Daniel Kahneman’s distinction, modern algorithms keep users trapped in “System 1” (fast, instinctive thinking) while bypassing “System 2” (slow, logical thinking) (Kahneman 2011, 20). System 2 governs executive control, a metabolically expensive function that must be exercised against resistance to stay intact. Neurobiologically, this follows the principle of use-dependent plasticity: neural pathways responsible for complex reasoning strengthen through challenge and degrade through disuse (Mattson 2008, 1). When an algorithm molds users toward passive consumption, it removes the friction required to sustain these pathways, leading to a functional degradation of critical thought.

This process is not merely persuasive; it is invasive. By predicting our desires, algorithms bypass our will, seizing our attention before we can decide where to place it. While Shoshana Zuboff calls this “surveillance capitalism” (Zuboff 2019, 8), the mechanism is closer to a slot machine. As Natasha Dow Schüll observes, these interfaces use design loops to induce a trance-like state that overrides conscious choice (Schüll 2012, 166). This is manipulation built into the interface itself. Neuroscience research on “Facebook addiction” confirms that these platforms activate the brain’s impulse systems while suppressing the prefrontal cortex, the region responsible for planning and control (Turel et al. 2014, 685). The result is a depletion of critical thought, creating a population that reacts rather than reflects.

Regulating this harm means looking to the precedent of carbon reporting. The Greenhouse Gas Protocol’s “Scope 3” covers indirect emissions, including those generated by the use of sold products (WRI/WBCSD 2004, 25). I propose applying the same reasoning to the digital economy: a tech company must be held responsible for the cognitive effects of its products in use. These effects are destructive because they erode the basic cognitive foundations required for a functioning public sphere (Habermas 1989, 27). If a platform’s design creates measurable addiction, radicalization, or loss of attention span, these are “emissions” that impose an externalized cost on society (Pigou 1920).

To develop sustainable and reliable policies, we require new auditing metrics. We must calculate how quickly content triggers emotional responses and establish a fragmentation index that measures how often an app interrupts deep work. This is a necessary (albeit complicated) metric, given that regaining focus after an interruption takes significant time and energy (Mark 2008, 108). Furthermore, we must assess the deficit of serendipity, determining whether an algorithm narrows a user’s worldview or introduces the necessary friction of new ideas.
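To make these audit metrics concrete, here is a minimal sketch of how a fragmentation index might be computed from interruption logs. Everything in it is assumed for illustration: the `Interruption` record, its field names, and the idea that a platform (or an independent auditor) would log refocus costs are my constructions, not part of any existing reporting standard.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Interruption:
    """One attention break: when it occurred and how long the user
    took to re-engage with the interrupted task."""
    timestamp_s: float     # seconds since the work session began
    refocus_cost_s: float  # time spent regaining focus afterward

def fragmentation_index(interruptions: list[Interruption],
                        session_length_s: float) -> dict:
    """Summarize how badly a session was fragmented: interruption
    rate per hour plus the mean cost of regaining focus."""
    hours = session_length_s / 3600
    return {
        "interruptions_per_hour": len(interruptions) / hours,
        "mean_refocus_cost_s": (mean(i.refocus_cost_s for i in interruptions)
                                if interruptions else 0.0),
    }

# Example: a two-hour session broken three times by notifications.
session = [
    Interruption(timestamp_s=600, refocus_cost_s=180),
    Interruption(timestamp_s=2400, refocus_cost_s=420),
    Interruption(timestamp_s=5400, refocus_cost_s=300),
]
print(fragmentation_index(session, session_length_s=7200))
```

An audit threshold could then be set on either number, much as emissions caps are set on measured tonnage.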

Once measured, these emissions can be mitigated through cognitive impact protocols. This includes mandating friction by design to force users to pause and think. For example, when Twitter introduced prompts asking users to read articles before retweeting, “blind” sharing dropped significantly (Twitter Inc. 2020). This demonstrates that simple friction can measurably reduce cognitive waste. Beyond individual features, firms must submit to independent audits to ensure their code promotes agency rather than addiction. Finally, just as carbon sinks absorb physical waste, digital firms should be required to fund attentional sinks (libraries, parks, and phone-free spaces) where our capacity for focus can recover.
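As a sketch of what friction by design might look like at the code level, consider a read-before-reshare gate loosely inspired by Twitter’s 2020 prompt. The function, the ten-second dwell threshold, and the prompt copy are all hypothetical; Twitter’s actual implementation is not public.

```python
import time
from typing import Optional

# Assumed threshold: treat anything shorter as "unread." Illustrative only.
MIN_DWELL_S = 10

def reshare_requires_prompt(opened_at: Optional[float]) -> bool:
    """Return True if the reshare action should be interrupted with a
    'read before you share' prompt: the user never opened the article,
    or closed it almost immediately."""
    if opened_at is None:
        return True  # link was never opened
    return (time.time() - opened_at) < MIN_DWELL_S

# Example: a user taps "reshare" three seconds after opening the article.
opened = time.time() - 3
if reshare_requires_prompt(opened):
    print("Prompt: You've only just opened this article. Share anyway?")
```

The design choice is the point: the gate does not forbid sharing, it merely re-inserts a moment of System 2 deliberation before the System 1 impulse completes.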

It can be argued that introducing friction impedes innovation and destroys value. This view, however, fails to account for the long-term liability of cognitive degradation. A market that incentivizes the depletion of user agency creates a systemic risk to the public. By implementing “Scope 3 Cognitive Emissions,” we operationalize the cost of this damage, forcing platforms to account for the mental impact of their design choices. We are currently operating with a dangerous gap between our technological power and our institutional controls (Wilson 2012, 7). Closing this gap requires a shift away from design that exploits immediate impulse and toward digital environments that protect, rather than degrade, the cognitive autonomy required for a free society.

    References
    Crawford, Matthew B. 2015. The World Beyond Your Head: On Becoming an Individual in an
    Age of Distraction. New York: Farrar, Straus and Giroux.

    Giddens, Anthony. 1991. Modernity and Self-Identity: Self and Society in the Late Modern Age.
    Stanford: Stanford University Press.

    Habermas, Jürgen. 1989. The Structural Transformation of the Public Sphere: An Inquiry into a
    Category of Bourgeois Society. Cambridge: MIT Press.

    Kahneman, Daniel. 2011. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

Mark, Gloria, Daniela Gudith, and Ulrich Klocke. 2008. “The Cost of Interrupted Work: More Speed and Stress.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 107–110. New York: ACM.


Mattson, Mark P. 2008. “Hormesis Defined.” Ageing Research Reviews 7 (1): 1–7.

Pigou, Arthur C. 1920. The Economics of Welfare. London: Macmillan.

    Schüll, Natasha Dow. 2012. Addiction by Design: Machine Gambling in Las Vegas. Princeton:
    Princeton University Press.

    Sunstein, Cass R. 2017. #Republic: Divided Democracy in the Age of Social Media. Princeton:
    Princeton University Press.

    Turel, Ofir, Qinghua He, Gui Xue, Lin Xiao, and Antoine Bechara. 2014. “Examination of Neural
    Systems Sub-Serving Facebook ‘Addiction’.” Psychological Reports 115 (3): 675–695.

Twitter Inc. 2020. “Read Before You Retweet.” Twitter Blog, September 24.

    United Nations Global Compact. 2004. Who Cares Wins: Connecting Financial Markets to a
    Changing World. New York: United Nations.

    Weil, Simone. 1951. “Reflections on the Right Use of School Studies with a View to the Love of
    God.” In Waiting for God, translated by Emma Craufurd, 105–116. New York: Harper & Row.

Wilson, Edward O. 2012. The Social Conquest of Earth. New York: Liveright.

World Resources Institute and World Business Council for Sustainable Development (WRI/WBCSD). 2004. The Greenhouse Gas Protocol: A Corporate Accounting and Reporting Standard, Revised Edition. Washington, DC: WRI/WBCSD.

    Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at
    the New Frontier of Power. New York: PublicAffairs.

  • The Socratic Paradox


Today I write from the ever-bustling airport, the ultimate people-watching spot.

I found myself considering the thousands of separate realities that others exist in. As I am sure you have also noticed, most people tend to be absorbed by their devices constantly, regardless of whether they’re walking, sitting, standing, or running. They are sucked into screens. It’s almost their second reality, their never-ending dopamine fix. So, what is this constant use doing to us?

    Socrates once famously claimed that wisdom begins with acknowledging one’s own ignorance. In 2025, this principle takes on new significance. We’ve mastered the knack of recognizing what we don’t know; our questions are sharper and more frequent than ever. Yet, as recent neuroscientific research reveals, our capacity for deep understanding may be eroding.

We’re armed with AI-powered tools that can respond to complex queries, yet studies report that 80% of workers suffer from ‘information overload’. Our brains, built to juggle only three to four items of information at once, are bombarded with the equivalent of up to 74 GB of data daily. This cognitive overload is reshaping our neural pathways, potentially at the cost of our ability to engage in sustained, deep thinking.

    The prefrontal cortex, particularly the dorsolateral and ventrolateral regions (DLPFC and VLPFC), plays a crucial role in controlling learning processes. These areas are responsible for selecting and manipulating goal-relevant information. However, when faced with an overwhelming amount of data, these regions can become overtaxed, leading to decreased efficiency in information processing and decision-making.

    This cognitive strain extends to the workplace, where the cost of information overload is staggering. Research indicates that cognitive overload costs the US economy about $900 billion annually. The implications are clear: our ability to ask sophisticated questions has outpaced our capacity to absorb and integrate the answers.

    To address this imbalance, we must cultivate a practice of “mindful inquiry” that combines the Socratic method with modern cognitive science:

    1. Pause to consider the depth of your question and your readiness to engage with the answer, aligning with the Socratic tradition of self-examination.
2. Implement spaced repetition and active recall to reinforce learning and enhance long-term memory formation (a minimal scheduler sketch follows this list).
    3. Design learning experiences that reduce extraneous cognitive load, allowing for deeper processing and comprehension.
    4. Incorporate periods of ‘digital fasting’ to allow for reflection and knowledge consolidation. Give yourself a mental spa treatment.
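To make point 2 concrete, here is a minimal sketch of a Leitner-style spaced-repetition scheduler. The five-box structure and review intervals are common defaults I’ve assumed for illustration, not values taken from the research cited below.

```python
from datetime import date, timedelta

# Leitner-style spaced repetition: correctly recalled cards move up a
# box and are reviewed less often; missed cards drop back to box 1.
# The intervals below are conventional defaults, not research-mandated.
INTERVALS_DAYS = {1: 1, 2: 3, 3: 7, 4: 14, 5: 30}

def review(box: int, recalled: bool) -> tuple[int, date]:
    """Apply one active-recall attempt and return the card's new box
    and its next scheduled review date."""
    box = min(box + 1, 5) if recalled else 1
    return box, date.today() + timedelta(days=INTERVALS_DAYS[box])

# Example: a fact recalled correctly twice climbs from box 1 to box 3,
# pushing its next review a week out.
box = 1
for _ in range(2):
    box, due = review(box, recalled=True)
print(box, due)
```

The widening intervals embody the point above: recall is strengthened most when it is effortful, so each successful retrieval earns a longer gap before the next one.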

    Moving forward, the integration of AI in learning presents both challenges and opportunities. AI-powered tutors could engage learners in adaptive Socratic dialogues, potentially revolutionizing the way we balance inquiry and absorption. However, we must remain vigilant against the risk of intellectual complacency that easy access to information might foster.

In navigating this new landscape, our goal should be to harness both technology and wisdom from the past. By combining Socratic inquiry with neuroscience-backed learning strategies, we can become deep absorbers of knowledge, capable not just of posing insightful questions, but of understanding and applying the answers we receive.

    The future of learning lies not in the volume of information we can access or the complexity of questions we can ask, but in our ability to transform that information into wisdom through thoughtful inquiry and absorption. Our next steps will shape not just our individual minds, but the collective intelligence of our species for generations to come.

    Perhaps the greatest wisdom lies in knowing not just how to ask, but how to listen, absorb, and integrate. The Socratic paradox of our age challenges us to be both humble in our questioning and diligent in our understanding, fostering a new kind of intellectual virtue that balances curiosity with contemplation.

This is a follow-up thought chain to Asking Better Questions.

    Works Cited:

    Friedlander, M. J., et al. (2013). Neuroscience and Learning: Implications for Teaching Practice. PMC.

    Rensselaer Polytechnic Institute. (2024). Information Overload Is a Personal and Societal Danger. RPI News.

    Structural Learning. (2024). How Neuroscience Informs Effective Learning Strategies.

    Ji, X. (2023). The Negative Psychological Effects of Information Overload. ERHSS, 9, 250-256.

    Moore, C. (2025). Is Cognitive Overload Ruining Your Employee Training? Cathy Moore’s Blog.

  • Digital Citizenship in the Age of Misinformation


    Education for the 21st Century

In an era where the digital realm has become our second home, the concept of citizenship has undergone a fundamental change. We have become denizens of a vast digital ecosystem. This shift demands a reimagining of education that goes beyond traditional paradigms, equipping future generations with the critical thinking skills needed to navigate the labyrinth of online information.

This moment demands a society-wide shift in our systems. We currently have access to the largest bank of information and recorded perspectives in history, yet most of us are not using it effectively. Imagine the possibilities of a world where students can untangle the often overwhelming complexities of online information and wield it as a tool.

    This is a necessary evolution of our education systems. The digital age has gifted us with unprecedented access to information, but it has also presented us with a double-edged sword. As MIT Sloan researchers discovered, digital literacy alone isn’t enough to stem the tide of misinformation [3]. We need to cultivate a new breed of digital citizens who not only consume information responsibly but also create and share it ethically.

To achieve this, we must weave digital citizenship into the fabric of our education system. Picture history classes where students don’t just robotically memorize dates, but dissect the anatomy of fake news, tracing its origins and understanding its viral spread. Picture science courses that not only teach the scientific method, but also explore how misinformation can distort public understanding of crucial issues like climate change or vaccination [1].

    I’ll take it a step further. What if we created immersive digital simulations where students could experience the real-world consequences of spreading misinformation? Imagine a virtual reality scenario where a student’s decision to share an unverified piece of news triggers a chain reaction, allowing them to witness firsthand the ripple effects of their digital actions [5]. This approach could potentially transform abstract concepts into tangible experiences, making the lessons of digital citizenship both memorable and impactful.

Moreover, I believe we need to shift our focus from mere technical proficiency to ethical mastery. In a time when a single tweet can spark a global movement or irreparably tarnish a reputation, understanding the ethical implications of our online actions is paramount. We should be fostering empathy, teaching students to see beyond the screen and recognize the human impact of their digital footprint [2]. Users have to recognize they are interacting with people at the other end, not ones and zeros.

The challenge of online misinformation is, at its core, a human one. The solution lies not in algorithms or AI, but in nurturing discerning, ethical, and adaptable users.

As we stand at this crossroads, we have an opportunity to redefine education for the 21st century. By cultivating critical thinking skills, ethical awareness, and adaptability, we can empower the next generation to become not just consumers of technology, but masters of the digital world. We need an education system that doesn’t just keep pace with technological advancements but anticipates and shapes them, and that empowers students to shape the world of tomorrow.

    The future of society depends on it.

    Works Cited:

[1] AACSB. “Assessing Critical Thinking in the Digital Era.” AACSB, 6 June 2023, www.aacsb.edu/insights/articles/2023/06/assessing-critical-thinking-in-the-digital-era.

[2] Learning.com Team. “Digital Citizenship in Education: What It Is & Why It Matters.” Learning.com, 6 Feb. 2024, www.learning.com/blog/digital-citizenship-in-education-what-it-is-why-it-matters/.

[3] MIT Sloan. “Study: Digital Literacy Doesn’t Stop the Spread of Misinformation.” MIT Sloan, 5 Jan. 2022, mitsloan.mit.edu/ideas-made-to-matter/study-digital-literacy-doesnt-stop-spread-misinformation.

[4] Columbia University. “Information Overload: Combating Misinformation with Critical Thinking.” CPET, 27 Apr. 2021, cpet.tc.columbia.edu/news-press/information-overload-combating-misinformation-with-critical-thinking.

[5] Colorado Christian University. “The Importance of Critical Thinking & Hands-on Learning in Information Technology.” CCU, www.ccu.edu/blogs/cags/category/business/the-importance-of-critical-thinking-hands-on-learning-in-information-technology/.

[6] University of Iowa. “Digital Literacy: Preparing Students for a Tech-Savvy Future.” University of Iowa Online Programs, 19 Aug. 2024, onlineprograms.education.uiowa.edu/blog/digital-literacy-preparing-students-for-a-tech-savvy-future.