Anthony J. Pennings, PhD

WRITINGS ON AI POLICY, DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL E-COMMERCE

LSEG Workspace as the Central Spreadsheet Grid for Eurodollars

Posted on | August 23, 2025 | No Comments

In spreadsheet capitalism terms, LSEG’s Workspace is a major platform that continues to turn fragmented eurodollar dealing into a coordinated, cell-based, time-stamped matrix of quotes and trades. In grid infrastructure terms, Workspace is fast becoming the organizing interface where eurodollar rates, FX swap points, and Treasury collateral pricing are aggregated and computed. Surprisingly, for people who do not understand how the US dollar operates worldwide, Workspace functions as a key node in the daily digital grid for offshore USD liquidity.[1]

Workspace is spreadsheet logic made infrastructure, and this post outlines the major functions and formulas that make it the newest gridmatic trading infosystem for eurodollar markets. It is part of my “formulating power” project, which looks at how spreadsheet formulas have not only made spreadsheets like Excel important, but how the grid-like structure and symbolic computing have expanded into a wide variety of information-based services.

Reuters Dealing (1980s–1990s) was the first electronic dealer-to-dealer chat and price system, effectively the “proto-grid” for eurodollar spot/forward trading. Thomson Reuters’ Eikon (2000s–2010s) evolved into a spreadsheet-like terminal with live market data, analytics, and FX integration. LSEG Workspace (2020s–) emerged as the rebranded, cloud-based successor after Refinitiv’s integration into the London Stock Exchange Group (LSEG). Workspace is not entirely new, but the next-generation layer in the same “gridmatic” tradition.

But it is not exclusive; it coexists with the Bloomberg Terminal and BlackRock’s Aladdin. The Bloomberg “Box” is still dominant on many trading desks, and Aladdin structures the portfolio-level balance sheet exposures. Workspace is strongest where FX and money markets plus data grids converge. Thus, Workspace is becoming the eurodollar spreadsheet of record for traders and banks.

The eurodollar market relies on offshore USD deposits, lending, and collateralized borrowing, and its infrastructure requires:

Real-time USD funding rates (OIS, SOFR-linked, FX swap basis).

Collateral pricing (USTs, agencies).

Cross-border FX settlement data (via SWIFT/CLS).

Workspace pulls all of these streams together:

FX Dealer Chat and Matching is the modern version of Reuters Dealing and is still where traders message each other and eurodollar liquidity is posted and negotiated.

Data Grids: yield curves, swap points, repo rates, and collateral haircuts displayed in table form, live-linked to Excel or APIs.

Analytics: symbolic computing overlays for stress-tests, correlations, and risk across currencies, but USD is always the pivot cell.

The Gridmatic Infrastructure of Workspace

Workspace is a giant spreadsheet-as-interface consisting of data tables, formula-driven analytics, and live-linked cells. Application programming interfaces (APIs) that connect to Excel, MATLAB, and Python turn Workspace into a grid feeder for downstream symbolic computing in popular spreadsheets and analytical tools. The collaboration grid replaced the “chat window + quote” of Reuters Dealing (a real-time messaging tool for financial professionals), where the trader or analyst now views the same USD-centered tables.

So, Workspace turns fragmented eurodollar dealing into a coordinated, networked interface, which is a cell-based and time-stamped matrix of quotes and trades. Workspace is becoming the organizing gridspace where eurodollar rates, FX swap points, and Treasury collateral pricing are aggregated and computed in a networked spreadsheet for offshore USD liquidity.

LSEG Workspace and Its Implications for USD Network Power

By owning Workspace, the Refinitiv FX platforms, and popular benchmark indices (FTSE Russell), LSEG positions itself as the non-US hub for eurodollar telematics. Yet because everything inside Workspace still references US Treasuries and Fed policy rates, the USD remains the anchoring cell of the eurodollar spreadsheet, even on a “foreign” platform. Workspace, Bloomberg, and Aladdin overlap and compete in anchoring the USD/eurodollar system, but Treasuries remain in the central reference cell.

LSEG Workspace is spreadsheet logic made infrastructure. Its major functions and formulas make it the newest gridmatic spreadsheet system for organizing eurodollar markets, working as data feeds and grid cells. Workspace ingests real-time price streams into cells that look and behave like spreadsheet ranges.

=PRICE(UST_10Y, NOW()) – live U.S. Treasury yield.

=FXSPOT(EURUSD) – spot euro-dollar exchange rate.

=OIS(USD, 3M) – overnight indexed swap rate.

=BASIS_SWAP(EURUSD, TENOR=3M) – FX swap basis points.

These are the “input cells” that anchor all calculations.
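The “input cell” idea can be sketched in plain Python: a dictionary of live quotes acts as the anchor cells, and derived cells are formulas over them. This is a minimal illustration of the grid logic, not the real Workspace API; the tickers and values are illustrative assumptions.

```python
# Input cells: in Workspace, live price streams would update these values.
# All tickers and figures here are illustrative placeholders.
inputs = {
    "UST_10Y": 0.0425,        # 10-year Treasury yield
    "FXSPOT_EURUSD": 1.0850,  # spot euro-dollar exchange rate
    "OIS_USD_3M": 0.0530,     # 3-month USD OIS rate
}

# Derived cells are formulas over the input cells, like spreadsheet ranges.
formulas = {
    # Illustrative spread: gap between USD OIS and the Treasury yield
    "BASIS_PROXY": lambda c: c["OIS_USD_3M"] - c["UST_10Y"],
}

def recalc(inputs, formulas):
    """Evaluate every derived cell against the current input cells."""
    return {name: f(inputs) for name, f in formulas.items()}

derived = recalc(inputs, formulas)
print(round(derived["BASIS_PROXY"], 4))  # 0.0105
```

Update one input cell and re-run `recalc`, and every derived cell reflects the new quote: the spreadsheet pattern in miniature.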

Curve Construction & Interpolation

To model funding markets, Workspace builds continuous yield curves from discrete data.

=YIELD_CURVE(INSTRUMENT=UST, METHOD=SPLINE) – fits Treasury yields across maturities.

=BOOTSTRAP(SOFR_SWAP, TENORS) – constructs discount factors from SOFR swap rates.

=INTERP(RATE, MATURITY=7D) – interpolates between tenor points (e.g., repo rate curve).

These are crucial for eurodollar term lending, where pricing depends on forward curves.
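The interpolation step behind a function like =INTERP(RATE, MATURITY=7D) can be sketched as straight-line interpolation between known tenor points. The curve points below are illustrative assumptions, not market data.

```python
def interp_rate(curve, maturity_days):
    """Linearly interpolate a rate for a maturity between known tenor points.

    `curve` maps tenor (in days) to rate; flat extrapolation at the ends.
    """
    tenors = sorted(curve)
    if maturity_days <= tenors[0]:
        return curve[tenors[0]]
    if maturity_days >= tenors[-1]:
        return curve[tenors[-1]]
    for lo, hi in zip(tenors, tenors[1:]):
        if lo <= maturity_days <= hi:
            w = (maturity_days - lo) / (hi - lo)
            return curve[lo] + w * (curve[hi] - curve[lo])

# Illustrative repo-curve points: overnight (1d), 2-week (14d), 1-month (30d)
curve = {1: 0.0530, 14: 0.0534, 30: 0.0540}
print(round(interp_rate(curve, 7), 6))  # 7-day rate between the 1d and 14d points
```

Real curve construction would use splines or bootstrapped discount factors, as the =YIELD_CURVE and =BOOTSTRAP formulas suggest, but the grid logic is the same: discrete cells in, a continuous curve out.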

Collateral & Repo Analytics

Eurodollar mechanics are collateral-driven, so Workspace handles haircut and repo pricing with the following.

=REPO_RATE(COLLATERAL=UST_2Y, TERM=ON, COUNTERPARTY=PD) – overnight repo rate for a 2Y Treasury.

=HAIRCUT(ASSET=UST_30Y, CCP=LCH) – margin requirement for long bond collateral.

=RRP_RATE(TERM=ON) – Fed’s overnight reverse repo program rate, the floor of dollar funding.

These cells define the “funding grid.”
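The arithmetic behind these funding-grid cells is simple: a haircut reduces how much cash a Treasury position can raise, and the repo rate prices the overnight loan. A hedged Python sketch, with all figures as illustrative assumptions:

```python
def repo_borrowing(collateral_value, haircut, repo_rate, days=1):
    """Cash raised against collateral, and interest owed (ACT/360 convention)."""
    cash = collateral_value * (1 - haircut)        # haircut trims borrowable cash
    interest = cash * repo_rate * days / 360       # money-market day count
    return cash, interest

# Illustrative trade: $100mm of 30Y Treasuries, 2% haircut, 5.30% overnight repo
cash, interest = repo_borrowing(100_000_000, 0.02, 0.0530)
print(f"cash raised: {cash:,.0f}  overnight interest: {interest:,.2f}")
```

In grid terms, =HAIRCUT and =REPO_RATE populate the input cells, and a formula like this one turns them into a funding cost.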

FX & Cross-Currency Funding

Workspace integrates FX swaps, which are the offshore synthetic dollar creation mechanism.

=FXSWAP(EURUSD, TENOR=1W) – implied cost of borrowing USD via FX.

=COVERED_IRP(USD, EUR, TENOR=3M) – checks covered interest parity (CIP).

=BASIS(EURUSD, 3M) – eurodollar “basis spread,” measuring stress in offshore dollar lending.

This function set is where the eurodollar meets FX.
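The covered-interest-parity check behind =COVERED_IRP and the basis behind =BASIS can be sketched in a few lines. With simple compounding, the CIP-consistent forward is spot × (1 + r_usd·t) / (1 + r_eur·t); when the market forward deviates, the implied cost of synthetic dollars diverges from the direct USD rate, and that gap is the basis. Rates and the forward deviation below are illustrative assumptions.

```python
def cip_forward(spot, r_usd, r_eur, t):
    """CIP-consistent EURUSD forward for year-fraction t (simple compounding)."""
    return spot * (1 + r_usd * t) / (1 + r_eur * t)

def implied_usd_rate(spot, forward, r_eur, t):
    """USD rate implied by borrowing EUR and swapping into dollars via FX."""
    return ((forward / spot) * (1 + r_eur * t) - 1) / t

spot, r_usd, r_eur, t = 1.0850, 0.0530, 0.0390, 0.25
fair_fwd = cip_forward(spot, r_usd, r_eur, t)

# If the market forward is richer than fair value, synthetic USD funding costs
# more than direct USD funding: that gap is the (illustrative) basis spread.
market_fwd = fair_fwd * 1.0005
basis = implied_usd_rate(spot, market_fwd, r_eur, t) - r_usd
print(f"fair forward: {fair_fwd:.4f}  basis: {basis * 10000:.1f} bp")
```

A widening basis in these cells is exactly the stress signal in offshore dollar lending that the post describes.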

Portfolio & Risk Functions

For banks and dealers, Workspace provides grid-like portfolio functions for balance sheets and capital allocation.

=VaR(PORTFOLIO, CONF=99%) – Value-at-Risk of USD positions.

=LCR(LIQUID_ASSETS, NET_OUTFLOWS) – liquidity coverage ratio.

=BALANCE_SHEET(CURRENCY=USD, ASSETS-COLLATERAL, LIABILITIES-FUNDING) – structured balance sheet grid.

This mirrors how eurodollar institutions literally tabulate their offshore books in grid form.
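Two of these portfolio cells are easy to sketch: the liquidity coverage ratio is a simple quotient, and a basic historical Value-at-Risk reads a loss quantile off past P&L. This is a hedged illustration of the concepts behind =LCR and =VaR, not any vendor's methodology; all figures are illustrative.

```python
def lcr(liquid_assets, net_outflows):
    """Liquidity coverage ratio: high-quality liquid assets over 30-day net outflows."""
    return liquid_assets / net_outflows

def historical_var(pnl, conf=0.99):
    """Loss threshold exceeded on roughly (1 - conf) of historical P&L days."""
    losses = sorted(-x for x in pnl)      # express losses as positive numbers
    k = int(conf * len(losses)) - 1       # index of the conf-quantile loss
    return losses[min(k, len(losses) - 1)]

print(lcr(120_000_000, 100_000_000))      # 1.2 -> above the 100% regulatory floor
pnl = [-5.0, 2.0, -1.5, 3.2, -8.0, 0.4, -2.1, 1.1, -0.3, 4.0]
print(historical_var(pnl, conf=0.9))      # 5.0 -> exceeded on 1 of 10 days
```

In the balance-sheet grid, these outputs would themselves become cells that downstream allocation formulas reference.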

Policy & News Integration

Financial news is also tied right into the spreadsheet logic.

=FOMC_TARGET_RATE() – current Fed funds target.

=ECON_RELEASE(CPI_US, DATE) – inflation print automatically populates.

=NEWS_SENTIMENT(FED, WINDOW=1D) – text mining of Fed statements, outputting a score.

This allows policy shocks to instantly cascade through formulas across FX, repo, and swaps.

Optimization & Allocation

Workspace incorporates optimizers, which are like Solver in Excel:

=ALLOCATE(FUNDING, CONSTRAINTS={DURATION<2Y, COST<…}) – allocation solver for cheapest-to-deliver funding under duration and cost constraints.

Eurodollar-Specific Grid Structure

If you were to visualize Workspace as a spreadsheet, its eurodollar tabs would include:

Tab 1 – Market Inputs: Treasuries, SOFR, repo, FX spot.
Tab 2 – Funding Curves: OIS, cross-currency basis, term structure.
Tab 3 – Collateral: Haircuts, repo spreads, RRP anchors.
Tab 4 – Balance Sheet: Offshore USD assets/liabilities in grid form.
Tab 5 – Optimizer: Allocation solver for cheapest-to-deliver funding.
Tab 6 – News/Policy: FOMC targets, Fedwire flows, central bank interventions.

Conclusion

LSEG’s Workspace is the newest gridmatic infrastructure for the eurodollar system. It transforms fragmented offshore USD trading into a cell-based, time-stamped spreadsheet interface, where Treasury yields, FX swaps, and repo collateral are continuously aggregated and recalculated.

LSEG Workspace is essentially a giant live spreadsheet that organizes the eurodollar system into cells and formulas. Treasuries and Fed rates are the anchor cells. Repo, FX swaps, and offshore deposits are formula-driven references. Optimizers and risk tools act like Solver, wiring symbolic computing into the market grid.

The “gridmatic infrastructure” of Workspace is what allows offshore USD lending to stay synchronized with Fed policy and Treasury collateral — keeping the U.S. dollar anchored as the central cell of global finance.

Not intended to be investment advice of any kind.

Notes

[1] I started my research on eurodollars for my MA thesis in 1986, which investigated how deregulation in finance and telecommunications was allowing new financial services like SWIFT, CHIPS, and Reuters Monitor and Dealing to emerge. I continued to write about electronic money using the term digital monetarism, but drawing on my PhD, what we call digital monetarism is better understood as spreadsheet capitalism: a system where gridmatic infrastructures (Bloomberg, Aladdin, Workspace) continuously synchronize global liquidity, collateral, and pricing to keep the U.S. dollar at the core of world money.
[2] ChatGPT was an important research aide in this inquiry. In accordance with the determinations set by the US Copyright Office, I claim the results of prompts as my property. I guide the inquiry and carefully edit any LLM returns. Prompt 1: Does the USD operate globally with network effects?
Prompt 2: The US dollar’s global power is not just about US GDP or military strength. It’s about network effects built into financial infrastructure, where the dollar sits at the center cell of the global spreadsheet, anchoring liquidity, collateral, and pricing. Continuing the analysis of spreadsheet capitalism, elaborate on how technological systems like BlackRock’s Aladdin, Bloomberg’s “Box,” and Thomson Reuters’ Eikon work with treasury markets, foreign exchange trading, financial news, and central banks to keep the USD anchored as the world’s primary reserve and payment system.
Prompt 3: Is LSEG Workspace the new central gridmatic organizing spreadsheet system for the eurodollar market?

Citation APA (7th Edition)

Pennings, A.J. (2025, Aug 23) LSEG Workspace as the Central Spreadsheet Grid for Eurodollars. apennings.com https://apennings.com/technologies-of-meaning/lseg-workspace-as-the-new-central-spreadsheet-for-eurodollars/

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea and a Research Professor for Stony Brook University. He teaches AI and broadband policy. From 2002-2012 he taught digital economics and information systems management at New York University. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.

Excel vs. Java: Two Languages, Two Worlds of Thought: Symbolic Grid vs. Algorithmic Script

Posted on | August 12, 2025 | No Comments

Spreadsheets are often dismissed as “just” tables with lots of numbers. Still, their symbolic computing functions operate in a very different register than traditional programming, and that difference is why they’ve been so powerful in business, finance, and administration.

In this post, I examine the differences between Excel spreadsheets and traditional programming, primarily Java, but with some references to Python. This inquiry draws on my long history of studying both areas of computing, mostly spreadsheets, but also on my more recent experience in a program that stresses that students learn Java and Python. It ends with some speculation about AI and its influence on both.[1]

Excel vs. Java/Python

When we talk about programming, it’s easy to imagine a single tradition — algorithms, variables, loops — the kind of logic you find in computer science textbooks. But spreadsheets operate in a parallel tradition. They use symbolic computing in a grid environment. And when you set Excel side-by-side with Java and Python, the contrasts reveal not just different tools, but two very different ways of thinking about computation, knowledge, and work.

Excel and Java represent two cultural histories of computing: the ledger-turned-simulator, serving the micro-decisions of commerce and finance, and the algorithm-turned-platform, serving the macro-architectures of global software development. Both shape our world, but through different epistemologies and in service to different forms of power.

Excel is declarative and reactive. You don’t tell Excel how to compute something step-by-step; you state the relationship between values, such as =SUM(A1:A10), and let the spreadsheet engine figure out the rest. Users declare relationships between cells; the engine maintains them automatically. The primary metaphor for the spreadsheet is the ledger. Spatial placement and references produce meaning. The “program” is a living grid where changes ripple through automatically with updated information.

Java’s procedural lineage grew from Cold War software engineering. It is procedural and object-oriented. You write explicit instructions: declare variables, loop over arrays, branch with conditions. The “program” is a designed machine — it does nothing until you start it, and it runs in the sequence you prescribe. Likewise, Python is a multi-paradigm programming language that is imperative, procedural, and mostly object-oriented (with functional elements). Users specify step-by-step instructions in human-readable syntax. The primary metaphor is the script, a narrative sequence of commands to be executed.

Python was developed in the early 1990s in the tradition of general-purpose scripting languages like Unix shells and C. It was designed for readability, minimal syntax noise, and broad applicability — from system scripting to AI/ML. Python code runs from top to bottom, unless redirected by flow control. Scripts can be interactive (REPL/Jupyter) or batch-run, but they start from a blank state each run unless persisted.

Excel’s grid is both interface and execution environment. A cell like “B2” is a symbol in a two-dimensional space, not a memory address. Ranges (A1:C3) are symbolic collections, defined by their position in the grid. The layout is the program. In Java, memory is abstract and invisible. You construct arrays, lists, and objects, but they live behind the scenes. The UI, if you build one, is separate from the computation. The layout of your code is not the same as the data it processes.

In Excel, execution is continuous and event-driven. Change one value, and the dependency graph kicks in: only cells that rely on that value are recalculated. There’s no “Run” button for formulas — the sheet is always “on.” In Java, execution is discrete. You compile and run the program, it executes a defined sequence, and it stops. Event-driven behavior (like GUIs) is possible, but it must be explicitly coded with listeners and handlers.
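The contrast can be made concrete with a toy dependency graph in Python: each "cell" stores a formula and the cells it depends on, and changing one input recomputes only the cells downstream of it. This is a minimal sketch of the recalculation model, not Excel's actual engine; the cell names and formulas are illustrative.

```python
class Sheet:
    """A toy spreadsheet: values, formulas, and event-driven recalculation."""

    def __init__(self):
        self.values = {}
        self.formulas = {}   # cell -> (function, list of dependency cells)
        self.recalcs = 0     # counts recalculations, showing only dependents fire

    def set_value(self, cell, value):
        self.values[cell] = value
        self._propagate(cell)

    def set_formula(self, cell, deps, func):
        self.formulas[cell] = (func, deps)
        self._recalc(cell)

    def _recalc(self, cell):
        func, deps = self.formulas[cell]
        self.values[cell] = func(*(self.values[d] for d in deps))
        self.recalcs += 1
        self._propagate(cell)

    def _propagate(self, changed):
        # Recalculate only cells that depend on the changed cell.
        for cell, (_, deps) in self.formulas.items():
            if changed in deps:
                self._recalc(cell)

s = Sheet()
s.set_value("A1", 10)
s.set_value("A2", 20)
s.set_formula("B1", ["A1", "A2"], lambda a, b: a + b)   # =A1+A2
s.set_formula("C1", ["B1"], lambda b: b * 2)            # =B1*2
s.set_value("A1", 15)   # ripples through B1 and C1 automatically
print(s.values["B1"], s.values["C1"])   # 35 70
```

Note the inversion: the Java-style program above (the `Sheet` class) had to be explicitly written to get the always-on behavior that a spreadsheet user gets for free.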

In Excel a cell can hold a number, text, a boolean, or an error without predeclaring its type. This flexibility makes rapid modeling possible, but it also hides data-type errors until they matter. In Java, every variable must have a type before it can be used, and the compiler enforces it. This prevents certain categories of bugs but slows the fluid improvisation that spreadsheets enable.

The spreadsheet user thinks spatially — “This cell depends on that one; those three cells add up to this total.” Programming in Excel feels like maintaining a ledger or building a map — computation embedded in geography.

The Java programmer thinks algorithmically — “First I do this, then I do that.” Programming in Java feels like designing a machine — computation as a process unfolding in time, not space.

Economic and Cultural Lineages

Excel descends from accounting, tabulation, and ledgers. It carries the epistemology of the bookkeeper — numbers aligned in rows and columns, meaning attached to placement, relationships visible at a glance. Java descends from structured programming and software engineering. It carries the epistemology of the engineer — modularity, encapsulation, and abstraction layers designed to scale from small programs to global systems.

Why does it matter? Spreadsheets turned computation into a cultural form accessible to non-programmers, embedding symbolic computing into the everyday life of business, government, and finance. Java, and languages like it, built the backbone of global software infrastructure. Both changed the world — but they changed different worlds.

Excel’s “gridmatic” logic empowers rapid prototyping and live simulation, perfect for domains like finance where decisions are model-driven and iterative. Java’s “narrative” logic excels in building robust, reusable systems where execution order and data integrity are paramount.

The Role of AI

Excel’s symbolic, reactive grid and Python/Java’s procedural, text-based scripts represent two mature modes of computation. AI — especially large language models (LLMs) and generative systems — introduces a third mode: probabilistic, pattern-driven inference.

Instead of Excel’s symbolic mapping (“Cell B2 = sum of A1:A10”) and traditional programming’s procedural instruction (“Loop through this array and sum the numbers”), we now get pattern completion (“Given these values and my training data, here’s the sum, the trend, and a prediction — without you explicitly defining the steps”). AI doesn’t just run code — it generates it, interprets messy inputs, and adapts behavior without explicit reprogramming.

Excel is already being transformed into a natural-language modeling surface with Copilot for Excel, which lets you type “Show me sales trends for Q3” and get formulas, pivot tables, or charts without knowing the syntax. AI can generate whole spreadsheets from plain text or PDFs, eliminating manual cell-by-cell building. Predictive modeling (e.g., forecasting, anomaly detection) is moving inside Excel’s grid without the user writing formulas.

AI is turning Excel into “spreadsheet as conversation.” The grid remains, but much of the symbolic formula-writing is offloaded to an AI intermediary. Excel’s tabular form will survive because the grid is a powerful cognitive interface. But its internal programming layer will be increasingly AI-written rather than human-written.

For procedural languages, AI is already functioning as a code generator, with LLMs writing Python and Java from natural-language prompts. It also works in refactoring and as a debugging assistant, turning vague goals into structured, tested, deployable code. AI can be seen as an integration orchestrator, pulling APIs, databases, and machine-learning models together without the developer having to remember all syntax and library calls.

Python in particular is symbiotic with AI — it’s the dominant language for training and deploying models, and AI frameworks (PyTorch, TensorFlow) are Python-native. Java will continue in large-scale systems and enterprise software, with AI increasingly writing parts of it (e.g., data-access layers, boilerplate). Procedural coding won’t vanish — but the human role will shift from “writing code” to “specifying requirements, constraints, and reviewing AI output.”

Excel democratized computing for business users without replacing procedural programming. It just absorbed a huge chunk of modeling work. AI will do the same. It will democratize and accelerate both symbolic and procedural programming, not erase them.

AI will continue to replace the rote coding of standard programming patterns, but humans will still be needed to define problems precisely, decide trade-offs between performance, accuracy, and interpretability, and ensure compliance, security, and business alignment.

Conclusion

Spreadsheets like Excel embody a distinct symbolic computing tradition, separate from conventional programming languages such as Java. Excel is declarative, reactive, spatial, and loosely typed, with computation embedded directly in its two-dimensional grid. Java is procedural, algorithmic, temporal, and strongly typed, with computation separated from its interface.

These differences reflect their cultural and economic lineages. Excel emerges from accounting and tabulation, enabling rapid, accessible modeling for business and finance, while Java descends from structured programming, supporting large-scale, robust software systems. Both have profoundly shaped modern life, but in different domains and through different epistemologies.

Understanding these perspectives side-by-side isn’t just academic. It’s a reminder that “programming” isn’t one thing — it’s a family of practices, each with its own history, cultural lineage, and vision of what computers are for. Digital spreadsheets will continue to be used because they provide a human-readable canvas for understanding and auditing results.

Excel and Python/Java will increasingly become AI-mediated layers in a bigger computational infosystem. Excel will be the visual decision surface while Python/Java are the execution engines and integration layers. AI will be the translator and generator between human intent and machine implementation.

Notes

[1] I wrote my MA thesis on eurodollars and telecommunications deregulation but didn’t appreciate the importance of the spreadsheet until my PhD studies and my personal experience running a small Center for Modern Media at the University of Hawaii in the summers during graduate school. My overriding purpose is to develop a more sophisticated understanding of the digital spreadsheet and its impact on finance and other social areas.
[2] I used a very general ChatGPT prompt to get this conversation started. Describe the symbolic computing functions in spreadsheets and how they differ from traditional computer science concepts. Then I followed up with requests to include Java and Python.

Citation APA (7th Edition)

Pennings, A.J. (2025, Aug 12) Excel vs. Java: Two Languages, Two Worlds of Thought: Symbolic Grid vs. the Algorithmic Script. apennings.com https://apennings.com/meaning-makers/digital-spreadsheets-power-over-people-and-resources/





Reprogramming Spreadsheet Capitalism for Climate Resilience

Posted on | July 28, 2025 | No Comments

Digital spreadsheets such as VisiCalc, Lotus 1-2-3, Microsoft Excel, and Google Sheets transformed how corporations, states, and investors conceived of value and how they operationalized strategies based on spreadsheet functions, formulas, and their results called “returns.” Capital became not merely a tangible asset or productive relation but a symbolic register and logic of economic and organizational transformation. This logic enabled large-scale financial engineering through leveraged buyouts (LBOs), the privatization of government agencies and state-owned enterprises (SOEs), the rise of private equity giants, and the computation of national investment value by sovereign wealth funds (SWF).

This post argues that the framework of “spreadsheet capitalism,” where capital transcended traditional definitions to become the symbolic logic of transformation and organization of the global political economy, offers a potent framework for addressing the complex financing and organizational demands of climate resilience. The intensifying impacts of climate change necessitate a fundamental re-evaluation of how capital is conceived, managed, and deployed.

Defining Spreadsheet Capitalism

Spreadsheet capitalism refers to the emergence and persistence of a distinctive phase in economic governance and financial management, where the digital spreadsheet serves as both the interface and the organizational logic of economic decision-making. It is characterized by the abstraction of capital into programmable units — cells, tables, formulas, and dashboards — and its animation through symbolic action.

In the framework of spreadsheet capitalism, capital no longer exists solely as a tangible asset, productive relation, or factor of industrial production — it is reconstituted as a programmable logic embedded in and animated by digital spreadsheets. Drawing on the semiotic-computational framework, this reconceptualization reveals how capital is not simply represented in spreadsheets but reborn through symbolic substitution, abstract modeling, and procedural calculation.[1]

In this framework, capital is:

Abstracted – It is separated from material reality and modeled in symbolic placeholders (e.g., cells representing a company, project, or debt).

Structured – It is organized gridmatically and hierarchically through linked sheets, conditional logic ( =IF() ), and financial formulas ( =NPV(), =IRR() ).

Animated – Through =GOALSEEK() and =SCENARIOS(), capital is simulated to produce optimal pathways for investment, divestment, or transformation.

These tools turned balance sheets into strategic instruments, enabling governments and firms to model everything from cost centers to pension reforms. The public sector adopted these logics to restructure budgets, model privatizations of public assets, and simulate fiscal outcomes in macroeconomic planning [2].
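The workhorse formulas =NPV() and =IRR() named above are easy to express outside the grid. The sketch below discounts a cash-flow stream and solves for the rate that zeroes its NPV by bisection; the project cash flows are illustrative assumptions, not a real investment.

```python
def npv(rate, cashflows):
    """Net present value of cash flows, one per period, starting at t=0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return via bisection (assumes NPV crosses zero once)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid    # NPV still positive: the zero lies at a higher rate
        else:
            hi = mid
    return (lo + hi) / 2

# Illustrative project: -100 today, +30 per year for five years
flows = [-100, 30, 30, 30, 30, 30]
print(round(npv(0.08, flows), 2))            # NPV at an 8% discount rate
print(round(irr(flows) * 100, 2), "% IRR")   # break-even discount rate
```

In spreadsheet terms, `flows` is a cell range and `npv`/`irr` are the formulas animating it, which is exactly the abstraction-structure-animation sequence described above.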

Financing Climate Resilience

Spreadsheet capitalism offers a framework to finance climate resilience by modeling future scenarios and allocating symbolic capital to present interventions:

=NPV() and =IRR() simulate long-term returns from climate investments (e.g., sea wall construction, renewable grids).

=IF() can be used for performance-linked disbursements (e.g., climate bonds with conditional triggers).

=FORECAST() estimates climate-related losses avoided through mitigation strategies [3].

Digital spreadsheets are not just record-keeping tools; they are simulation engines. They make climate modeling more accessible by translating complex scientific and financial models into programmable logic through formulas, conditional logic, lookup tables, and scenario planning tools.

For climate risk forecasting and statistical projections, spreadsheets can translate climate data into predictive economic and infrastructure outcomes using some of the following.

=FORECAST.LINEAR() – Predict flood frequency or temperature increases

=TREND() – Project emissions or water stress over time

=LINEST() – Regression-based risk modeling from historical climate or financial data

=RANDBETWEEN() + =NORM.INV() – Generate probabilistic stress-testing values
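The first of these, =FORECAST.LINEAR(), is ordinary least squares followed by a point prediction, and can be sketched directly. The flood-frequency series below is an illustrative placeholder, not real climate data.

```python
def forecast_linear(x_new, xs, ys):
    """Predict y at x_new from a least-squares line fitted to (xs, ys),
    mirroring the behavior of Excel's FORECAST.LINEAR."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept + slope * x_new

# Illustrative series: flood events per decade
decades = [1980, 1990, 2000, 2010, 2020]
floods = [3, 4, 6, 9, 11]
print(round(forecast_linear(2030, decades, floods), 1))  # 12.9 projected events
```

=TREND() and =LINEST() generalize the same regression machinery to whole ranges and multiple predictors.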

Spreadsheets support the organization of complex, multi-stakeholder projects via:

– Resource allocation: =SUMIFS() aggregates costs by category, region, or priority.

– Risk modeling: heatmaps embedded in spreadsheets visualize geospatial threats.

– Scenario analysis: functions such as =SCENARIOS() test how variables like temperature or funding affect outcomes.

– KPI tracking: dashboards visualize real-time emissions, adaptation metrics, and budget flows.[4]

Simulation Capabilities for Impact Modeling

Simulation is central to spreadsheet capitalism and can be applied to climate finance. Governments and firms use this to:

– Compare costs of inaction vs. action under 2°C and 4°C warming scenarios.
– Estimate the payback period of battery storage and solar projects using =PMT().
– Run stochastic simulations ( =RANDBETWEEN() + Monte Carlo macros) to evaluate risk in volatile geographies [5].
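The stochastic pattern above ( =RANDBETWEEN() plus =NORM.INV() with Monte Carlo macros) can be sketched in Python: draw uncertain annual avoided losses from a normal distribution and tally how often a resilience project's NPV clears zero. Every parameter here is an illustrative assumption, not an engineering estimate.

```python
import random

def npv(rate, cashflows):
    """Net present value of cash flows, one per period, starting at t=0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def monte_carlo_success(cost, mean_saving, sd_saving, years, rate,
                        trials=10_000, seed=42):
    """Share of simulated paths where discounted savings exceed upfront cost."""
    rng = random.Random(seed)   # seeded for reproducible stress tests
    wins = 0
    for _ in range(trials):
        savings = [rng.gauss(mean_saving, sd_saving) for _ in range(years)]
        if npv(rate, [-cost] + savings) > 0:
            wins += 1
    return wins / trials

# Illustrative sea-wall sketch: $50mm cost, ~$9mm/yr avoided losses, 10 years, 6%
p = monte_carlo_success(cost=50, mean_saving=9, sd_saving=4, years=10, rate=0.06)
print(f"P(NPV > 0) ~ {p:.0%}")
```

The resulting probability is the kind of single summary cell that a budget dashboard would surface to decision-makers.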

Spreadsheet-Based Simulation Tools

Purpose: Flexible, accessible modeling of adaptation/mitigation economics
Capabilities:

=NPV() and =IRR() to simulate investment viability

=GOALSEEK() to reverse-engineer cost targets

=SCENARIOS() and Excel Data Tables for multi-path forecasting

Used in: World Bank, UNDP, Green Climate Fund projects

The strengths of spreadsheet modeling are that it is transparent, auditable, and low-barrier for public-sector use. Its limitations are that it is restricted to known variables and lacks stochastic complexity.

While notorious as tools of corporate raiding, the privatization of public assets, and the short-term profit maximization of private equity, spreadsheet logic also supports climate investment: models for evaluating carbon pricing impacts, climate information dashboards fed by open data APIs, infrastructure models for resilience financing, and Excel-based ESG scoring formulas to guide portfolio shifts. Examples of important formulas include:

– Simulation of future risks ( =FORECAST.LINEAR() for sea level rise)
– Financing models ( =NPV() for water infrastructure)
– Impact assessment ( =IF(ROI>=Threshold, “Fund”, “Defer”) to determine climate-related investment decisions)

Those are some of the formulas that can inform long-term investment modeling for infrastructure projects, climate scenario analysis for forecasting adaptation vs. inaction costs, and modular energy project building that breaks significant interventions into spreadsheet-modeled stages for easier governance. Originating from the performative capabilities of digital spreadsheets, this emerging framework abstracts, structures, and animates capital through substitution and computational functions, formulas, and simulations.

Real-world applications include:

– The World Bank’s climate investment spreadsheets for evaluating carbon pricing impacts.
– UNDP’s Climate Information Platforms integrating Excel dashboards with open data APIs.
– The US Army Corps of Engineers’ infrastructure models for resilience financing.
– Global sovereign funds, like Norway’s GPFG, using Excel-based ESG scoring formulas to guide portfolio shifts [6].

This project attempts to “deconstruct” the digital spreadsheet and its formulas, highlighting its genesis from a mere tool to a pervasive framework that democratized/centralized financial power through substitution and computational abstraction. It elaborates on how capital is abstracted into manipulable data, structured via gridmatic interfaces, and animated through temporal financial models, creating a powerful feedback loop between symbolic logic and material reality.

Examining the multifaceted landscape of climate resilience, this post starts to highlight the vulnerabilities of critical infrastructure, including energy grids, transportation networks, public utilities, coastal areas, and landslide-prone regions, underscoring their deep interdependencies and the potential for cascading failures. It then outlines current technological and policy innovations, from advanced materials and intelligent monitoring systems to nature-based solutions and comprehensive frameworks like the Sendai Framework for Disaster Risk Reduction, noting a crucial convergence of “hard” and “soft” solutions.

The core argument unfolds in the application of spreadsheet capitalism to climate resilience. Financial modeling, leveraging temporal finance, is presented as a catalyst for climate investment, translating physical risks into quantifiable financial values to drive strategic decisions. Strategies for bridging the climate finance gap, particularly for local and SME-led initiatives, are explored through the lens of granular data aggregation and micro-modeling, demonstrating how localized impact can attract broader capital. Organizationally, spreadsheet capitalism facilitates data-driven governance, transforming static planning into adaptive, real-time management. The spreadsheet emerges as a vital “boundary object,” fostering interdisciplinary collaboration by providing a common, quantifiable platform for diverse stakeholders to work together.

Conclusion and Recommendations

Spreadsheet capitalism offers both an epistemology and a toolset to support the transition to climate resilience. It enables not only the modeling of risk and cost but also the simulation of future scenarios and the reallocation of capital toward sustainability.

Benefits of spreadsheets include making complexity computationally manageable, enabling anticipatory governance via symbolic simulation, and enhancing coordination across jurisdictions with linked models.

Challenges include the over-simplification of local nuances into numeric cells, the risk of model opacity and technocratic control, and data assumptions that may embed biases or historical inequalities into calculations.

Recommendations for using spreadsheets in the transition should include:

– Mandating open-source spreadsheet templates for climate finance.
– Requiring scenario-based planning for national resilience budgets.
– Embedding transparency mechanisms into formula logic for public accountability.

Summary

As climate change intensifies, it demands a radical reconsideration of how capital is conceptualized and mobilized. This paper introduces the concept of spreadsheet capitalism, a mode of economic rationality in which capital transcends its material form to become a symbolic, programmable logic mediated by digital spreadsheets. Born from tools like Excel, VisiCalc, and Google Sheets, the spreadsheet now functions not just as a medium for accounting but as an epistemic and organizational infrastructure. It abstracts, structures, and animates capital through functions, formulas, and simulations.

Drawing on a semiotic-computational framework, spreadsheet capitalism is described as transforming capital into an algorithmic object that can be substituted symbolically (e.g., through cell values and formula references) and manipulated computationally and formulaically (e.g., through functions such as =NPV(), =IF(), and =GOALSEEK()). These tools allow capital to be modeled as a temporally dynamic, strategic input, organizing everything from leveraged buyouts (LBOs) to private equity and sovereign wealth funds. Public sector applications have included fiscal reform, privatization modeling, and macroeconomic planning simulations.

The post then applies this framework to climate resilience, arguing that spreadsheet capitalism offers powerful instruments for financial and operational governance. By transforming physical risks into computable values, climate investments can be simulated, justified, tested, and scaled. Spreadsheet tools facilitate resource allocation (=SUMIFS()), risk visualization, scenario testing, and adaptive project governance through dashboards and key performance indicators.
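The resource-allocation move attributed to =SUMIFS() can be sketched outside the spreadsheet as well. The following minimal Python analogue uses hypothetical budget lines and category labels (all figures are assumptions for illustration):

```python
def sumifs(sum_range, criteria_range, criterion):
    """Minimal analogue of =SUMIFS(): add up the values whose paired
    label matches the criterion."""
    return sum(v for v, c in zip(sum_range, criteria_range) if c == criterion)

# Hypothetical resilience budget lines (in millions), tagged by category.
amounts = [120, 45, 80, 30]
categories = ["flood-defense", "early-warning", "flood-defense", "early-warning"]

flood_total = sumifs(amounts, categories, "flood-defense")  # 120 + 80 = 200
```

The same filter-then-aggregate pattern underlies the dashboards and KPIs described here: change a tag or an amount and every dependent total recomputes.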

Real-world cases illustrate this logic in action: from the World Bank’s carbon pricing tools to UNDP’s open-data dashboards, to ESG evaluation in sovereign portfolios. The ability to run long-range simulations (=RANDBETWEEN() + Monte Carlo macros) positions spreadsheet capitalism as a critical ally in projecting the impacts of mitigation and adaptation efforts.
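The =RANDBETWEEN()-plus-macro pattern can be illustrated with a short Monte Carlo sketch. Every number here (capital cost, benefit range, discount rate, horizon) is a hypothetical assumption, not a real project figure:

```python
import random

def simulate_adaptation_npv(n_trials=10_000, horizon=30, discount=0.04,
                            capex=5_000_000, seed=42):
    """Monte Carlo sketch: draw RANDBETWEEN-style annual avoided-loss
    benefits for an adaptation project and discount them to the present.
    All figures are hypothetical."""
    rng = random.Random(seed)
    npvs = []
    for _ in range(n_trials):
        npv = -capex  # up-front investment
        for year in range(1, horizon + 1):
            benefit = rng.randint(200_000, 600_000)  # like =RANDBETWEEN(200000, 600000)
            npv += benefit / (1 + discount) ** year
        npvs.append(npv)
    mean_npv = sum(npvs) / n_trials
    share_positive = sum(v > 0 for v in npvs) / n_trials
    return mean_npv, share_positive
```

Re-running with a different benefit range or discount rate regenerates the whole scenario, which is precisely the "what-if" capacity that makes long-range simulation a governance instrument.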

Yet, this symbolic infrastructure presents risks. Overreliance on abstraction can obscure social and environmental realities, embed bias, and centralize epistemic control in technocratic institutions. Thus, while spreadsheet capitalism offers anticipatory and computational power, it also necessitates new governance principles, including open-source modeling, transparent logic, and scenario-based budgeting, for enhanced public accountability.

In conclusion, the spreadsheet, once a corporate ledger, has become a symbolic engine of global capital. Its logic can — and must — be redirected toward financing and organizing a resilient, sustainable future.

Citation APA (7th Edition)

Pennings, A.J. (2025, July 27) Reprogramming Spreadsheet Capitalism for Climate Resilience. apennings.com https://apennings.com/how-it-came-to-rule-the-world/digital-monetarism/reprogramming-spreadsheet-capitalism-for-climate-resilience/

Notes

[1] The semiotic-computational framework is best understood through the integration of semiotic theory (Saussure, Barthes, Chandler, Derrida) and computational performativity (Austin, Peirce, Foucault), where signs, formulas, and simulations act not merely as reflections of reality but as instruments that organize and enact it. I have been teaching a course in Visual Rhetoric and Information Technologies that uses Chandler, D. (2007). Semiotics: The Basics. Routledge. It has given me a useful tool to crack the code of spreadsheets, which are mainly constructed in utilitarian terms. It also presented a choice about methodology – to conduct an ethnographic analysis of how spreadsheets were used in workplaces or to conduct a media analysis to “deconstruct” the functions and formulas used. It also helped me distinguish between symbolic computing and generative AI, an important distinction in spreadsheet capitalism.
[2] I first wrote about spreadsheets and their impact in my PhD dissertation Symbolic Economies and the Politics of Global Cyberspaces that used Goux, J.-J. (1990). Symbolic Economies. Cornell University Press. It was a followup to my Masters thesis about how deregulation of finance and banking was creating new forms of electronic money and putting pressure on countries to privatize their telecommunications structure.
Working with the UNDRR in Songdo, Korea helped shape the focus of our graduate program at SUNY Korea and helped me develop a stronger understanding of climate risk and financing issues.
[3] At New York University I developed a digital media management program that placed a strong emphasis on media metrics such as KPIs.
[4] When I moved to Austin, TX to set up a similar media management program I taught a Social Media Metrics and Analytics course at St. Edward’s University that explored KPIs and other key metrics.
[5] World Bank. (2021). Climate Change Overview. https://www.worldbank.org/en/topic/climatechange/overview
[6] UNDP. (2020). Climate Information Platforms. https://www.undp.org/publications/climate-information-platforms
[7] U.S. Army Corps of Engineers. (2022). Climate Preparedness and Resilience Program. https://www.usace.army.mil/
[8] Norway Government Pension Fund Global (GPFG). (2022). ESG Integration and Responsible Investment. https://www.nbim.no/en/

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea and a Research Professor for Stony Brook University. He teaches AI and broadband policy. From 2002-2012 he taught digital economics and information systems management at New York University. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.

Stablecoins, Blockchains, and the Semiotic-Telecom-Computational Stack of Spreadsheet Capitalism

Posted on | July 24, 2025 | No Comments

This post continues the spreadsheet capitalism framework, focusing on crypto stablecoins as a further manifestation of the semiotic-symbolic gridmatic technologies that synthesize writing-as-substitution with computation-as-symbolic-action.

Yet, unlike Excel and other spreadsheets, crypto blockchains add irreversibility, temporal structure, and global accessibility. It is a semiotic-computational system with historical memory and real-time synchronization, providing the international grid of capitalism with archival gravity and live action. In the transition to digital money, crypto stablecoins such as USDT (Tether) embody this same logic, now embedded into blockchain infrastructure, running on telecommunications networks, and maintained through distributed computational cloud systems.

Spreadsheet capitalism originated as a human-machine interactive symbolic technology: an architecture of gridded substitution and computed action. Spreadsheets substituted real-world entities with signs and executed symbolic operations (such as NPV() or IF()) to shape corporate futures and privatize public value. Just as the spreadsheet reshaped corporate governance and public finance, stablecoins rewire the conditions of global liquidity, remittance, savings, and speculation. They are spreadsheets transformed into sovereign, programmable currencies residing on a semiotic-telecom-computational stack.[1]

From Cell to Chain: Writing as Substitution in Spreadsheet and Blockchain and the Symbolic Governance of Value

In the digital age, spreadsheet capitalism has evolved from a grid-based organizational logic into a fully programmable architecture of value, culminating in the rise of crypto stablecoins. As symbolic instruments of financial rationality, spreadsheets once enabled the privatization of public assets and the modeling of abstract futures. Now, with stablecoins — cryptocurrencies pegged to the value of fiat money — the same logic of writing-as-substitution and computation-as-symbolic-action reemerges in more abstract, yet more powerful, forms.

Through a synthesis of Saussure’s structuralism, Derrida’s différance, Barthes’ mythologies, and Peirce’s pragmatism, we can trace the lineage of the stablecoin to the spreadsheet grid and contribute to our understanding of Human-Machine Knowledge (HMK) production.

In a spreadsheet, numbers and words substitute for real entities, such as assets, employees, and liabilities. This substitution is more than representational; it is operational. When a balance sheet contains a cell like =SUM(A1:A10), it writes a truth into being through substitution and calculation. Stablecoins extend this practice. A stablecoin, such as USDC or USDT, is not a dollar; it is a token that substitutes for the dollar, whose value is governed by a combination of contractual references (e.g., “backed 1:1 by USD”) and automated verifications, as well as smart contracts.

The value of a stablecoin exists not in itself, but in its relational difference from other coins (BTC, ETH) and fiat assets. The meaning of “1 USDC” (one USD Coin, a tokenized US dollar pegged as closely as possible to the value of one US dollar) is derived from its system of oppositions, what it is not, and maintained by a symbolic tether, hence the name of the leading company.

USDT is a stablecoin issued by Tether Ltd., which claims to be backed 1:1 by US dollars (or near equivalents, such as commercial paper, T-bills, or cash). It is issued on multiple blockchain networks, such as Ethereum (ERC-20), Tron (TRC-20), Solana, Avalanche, and Polygon.

Each USDT token serves as a digital representation, substituting for the dollar in transactions, lending, remittances, and reserves. This substitution mirrors the spreadsheet cell substituting for a balance, cost, or asset — except now it’s instantiated on a public ledger.

Smart Contracts as Semiotic Computational Operators

Stablecoins operationalize the spreadsheet’s foundational logic on a planetary scale through writing-as-substitution (token for dollar) and computation-as-action (smart contracts executing transfers or collateralization). They are semiotic instruments of programmable value, bound up in both networked infrastructure and crypto ideology.

Where the spreadsheet formula authorized action (=IF(balance<0, “Reject”, “Approve”)), the smart contract in stablecoin systems automates enforcement. These computational rules, embedded in code, are Peircean interpretants. They are not merely signs, but actants. A user interacting with a decentralized exchange does not simply view a price; their interaction triggers computational outcomes, transfers, collateralization, or liquidation.

Blockchain smart contracts perform the symbolic functions of spreadsheet formulas — autonomously, repeatably, and auditably. The value of the token exists only to the extent that the network of users, validators, and protocols believes in and acts upon it. It depends on its networked presence and computational power.

Though USDT itself is centralized, its transfer, creation, and redemption processes are implemented via smart contracts and automated computational rules. These include:

mint(amount) – Create new USDT tokens when fiat funds are received. Separately, at the blockchain consensus layer, new blocks are created, data is authenticated, and validators are rewarded with newly minted tokens or coins for verifying transactions and securing the network.

burn(amount) – Destroy tokens when redeemed for fiat. Burning cryptocurrency tokens refers to the process of intentionally removing a specified amount of tokens from circulation permanently by sending the tokens to a special wallet address, known as a “burn address,” “eater address,” or “null address,” that has no private key associated with it. This renders the tokens irretrievable and unspendable, effectively destroying them and reducing the overall supply.

transfer(to, amount) – Symbolically moves value via code. It is a request to move a specified amount of money to a recipient.
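The three operations can be condensed into a toy ledger. This is an illustrative sketch only, not Tether’s actual contract code; the class name, the balances dictionary, and the Approve/Reject guard are assumptions for demonstration:

```python
class StablecoinLedger:
    """Toy model of the mint/burn/transfer cycle described above.
    Illustrative only; names and checks are assumed, not Tether's code."""

    def __init__(self):
        self.balances = {}    # address -> token balance
        self.total_supply = 0

    def mint(self, to, amount):
        # Create new tokens when fiat funds are received off-chain.
        self.balances[to] = self.balances.get(to, 0) + amount
        self.total_supply += amount

    def burn(self, holder, amount):
        # Destroy tokens on redemption, permanently reducing supply.
        if self.balances.get(holder, 0) < amount:
            raise ValueError("cannot burn more than the holder owns")
        self.balances[holder] -= amount
        self.total_supply -= amount

    def transfer(self, sender, to, amount):
        # Move value via code, with a spreadsheet-style balance guard.
        if self.balances.get(sender, 0) < amount:
            return "Reject"
        self.balances[sender] -= amount
        self.balances[to] = self.balances.get(to, 0) + amount
        return "Approve"
```

Even in miniature, the ledger shows the semiotic point: balances are signs, and the guard condition is the interpretant that authorizes or refuses action.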

Each of these functions performs Peircean interpretant logic: signs triggering real-world responses such as cash movements, accounting entries, or liquidity adjustments. Blockchain is a semiotic infrastructure where code is the law, and symbolic signs act upon the world.

Here, computation becomes symbolic power. Derrida’s logic of the trace is evident: the stablecoin system governs through what is absent — the off-chain fiat, the implied future audit, the deferred collateral. The token is the mark of absence, yet it performs as if its presence were guaranteed.

While blockchain secures the symbolic value of USDT, telecom networks deliver its functionality. Internet backbones, including global fiber-optic and satellite systems, support the transmission of blockchain data. Wireless networks and 5G support mobile DeFi (decentralized finance) via apps like MetaMask, a web browser extension and mobile app, or exchanges like Binance. Distributed networks of computers (nodes) validate and relay transactions across geographies.

Each USDT transaction is a symbolic event encoded, transmitted, verified, and stored using telecommunications systems. Just as the spreadsheet depends on the screen and processor, USDT depends on networks of hardware, routing protocols, and cross-chain bridges. In this way, value is not only symbolically constructed — it is telecommunicated into being.

The Myth of Stability

Stablecoins are designed to naturalize stability in a world of volatile assets. Their names (Tether, TrueUSD, Circle, USD Coin) encode Barthesian myths — promises of permanence, neutrality, and reliability. Yet they are neither inherently stable nor universally redeemable. The spreadsheet-like dashboards that report “fully backed,” “attestation complete,” or “on-chain proof” perform a mythologization of technical trust. This techno-narrative transforms symbolic abstraction into economic myth. The user believes in the peg not because they hold cash in hand, but because a table somewhere says the numbers match.

Stablecoin systems rely on temporal deferral. They promise future redemption, the eventual audit, and the belief that the collateral exists. Just as =NPV() calculates the present value of future cash flows based on speculative rates, stablecoins calculate present equivalence based on hypothetical future convertibility. The system is structured by what Derrida referred to as différance — not only meaning deferred, but value deferred, stability deferred. This is not a flaw; it is a feature of the system.

It is the very condition that allows the symbolic system to function. The value of the token exists only insofar as the network of users, validators, and protocols believe and act on it. This is Derrida’s trace: the token’s meaning is built on absence (you don’t hold dollars, but a token that refers to them). Its power lies in deferral (future redeemability) and reference (belief in the peg, the audit, the issuer).

Stablecoins also act as regulatory technologies in decentralized finance (DeFi). They govern behavior, constrain actions, and provide legibility to otherwise opaque systems. Their price stability enables leverage, denominates loans, and structures risk modeling. This is the spreadsheet logic deployed at scale: visibility, discipline, quantifiability.

Grids Without Borders: The Logic of Transnational Computation and the Governance of Liquidity

Where spreadsheet capitalism created managerial governance inside firms, stablecoin economies extend that logic into macroeconomic governance. Every stablecoin token is a symbolic cell in a transnational financial spreadsheet. Its value is enforced through automated minting, redemption, and auditing protocols. It is governed not by central banks, but by contractual and cryptographic mechanisms.

This logic scales globally: through mobile phones, crypto wallets, and stablecoin APIs. In a time of chronic USD shortage, users in Brazil, India, or Nigeria can access, store, and transmit synthetic dollars. This expansion creates a shadow dollar system that operates alongside or beneath state monetary policy.

Some stablecoins (like Circle’s USDC) are backed in part by short-term US Treasuries. This connection has two symbolic effects. It expands the reach of US sovereign debt as BRIC-country users indirectly hold US treasury-backed assets. It also normalizes US debt as a global unit of monetary reserve. Treasuries become not just central bank holdings; they extend into new algorithmic monetary infrastructures. The grid of stablecoins imports the sovereign architecture of the US fiscal state into the crypto economy without negotiation, treaty, or IMF oversight.

BRIC nations have sought to resist US financial hegemony by creating new interbank clearing systems, establishing bilateral swap lines, and proposing new reserve currencies. Yet paradoxically, Russian users turn to USDT amidst sanctions and ruble volatility; Indian crypto traders rely on USDC for dollar-denominated returns; Brazilian fintechs integrate stablecoins for efficient cross-border finance; and Chinese crypto users use Tether extensively for OTC (Over-the-Counter) settlements.

This tension reveals a semiotic-financial asymmetry: even as states pursue de-dollarization, populations and platforms move toward re-dollarization. The spreadsheet logic of programmable substitution precedes and exceeds state control.

Stablecoins represent not merely a technical innovation, but a new semiotic and monetary regime. This regime reproduces spreadsheet capitalism on a planetary scale, replacing gridmatic sovereignty with networked symbolic authority. Stablecoins transform the dollar from a fiat currency into a programmable token, turn global value systems into grids without borders, and convert sovereign financial instruments (treasuries) into collateral for algorithmic money (like it was for Eurodollars).[2] They expand US monetary influence into BRIC and Global South regions without military or institutional intervention.

In this emerging landscape, the spreadsheet logic has become sovereign, and the dollar, through its stablecoin avatars, writes itself into being across the symbolic infrastructures of global computation. The stablecoin’s value is sustained not just by reserves, but by the architecture of surveillance, reporting, and symbolic coherence — all of which originate in the spreadsheet’s managerial gaze.

Conclusion: From Grids to Chains

The stablecoin is not a digital coin in the traditional sense — it is:

A sign (substitution),

A command (smart contract),

A node in a network of symbolic governance,

A performative utterance (Austin),

A myth of stability (Barthes),

A trace of future value (Derrida),

A structural element in economic semiotics (Saussure),

A trigger of financial behavior (Peirce),

A disciplinary technology of monetary order (Foucault).

Just as the spreadsheet reshaped corporate governance and public finance, stablecoins rewire the conditions of global liquidity, remittance, savings, and speculation. They are spreadsheets turned sovereign — programmable currencies living on a semiotic-telecom-computational stack.

In conclusion, stablecoins are the natural heirs to the semiotic-computational legacy of the digital spreadsheet. Like spreadsheets, they substitute signs for things; like spreadsheet functions, they act through symbols. They are not merely representations of value—they construct, govern, and animate economic behavior in digital space.

Where spreadsheet capitalism rationalized public finance and privatized state assets, stablecoins extend this logic into programmable, transnational value systems, governed not by legal fiat but by symbolic chains of reference and procedural authority. They are grids without borders, ledgers without guarantees, and spreadsheets that write the economy into being.

Notes

[1] This is a reference to the Internet stack, a technical framework that conceptualizes the network in layers: Application, Transport, Network, Datalink, and Physical.
[2] US treasuries became collateral for transnational Eurodollar lending because they were trusted by banks and allowed financial institutions with no formal or informal relationship to engage in transactions.

Prompt: Spreadsheet capitalism has evolved from a grid-based organizational logic into a fully programmable global architecture of value, culminating in the rise of crypto stablecoins. As symbolic instruments of financial rationality, spreadsheets once enabled the privatization of public assets and the modeling of abstract futures. Now, with stablecoins—cryptocurrencies pegged to the value of fiat money—the same logic of writing-as-substitution and computation-as-symbolic-action reemerges in more abstract, yet more powerful, forms. Describe how crypto stablecoins like USDT use blockchain to operate and how do they intersect with networking technologies and telecommunications systems. Continue the analysis of spreadsheet capitalism as resulting from semiotic-symbolic technology writing-as-substitution with computation-as-symbolic-action by applying it to crypto stable coins.

Citation APA (7th Edition)

Pennings, A.J. (2025, July 24) Stablecoins, Blockchains, and the Semiotic-Telecom-Computational Stack of Spreadsheet Capitalism. apennings.com https://apennings.com/artificial-intelligence/stablecoins-blockchains-and-the-semiotic-telecom-computational-stack-of-spreadsheet-capitalism/


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a professor at the Department of Technology and Society, State University of New York, Korea and a Research Professor for Stony Brook University. He teaches AI and broadband policy and researches digital spreadsheets and sustainability. From 2002-2012 he taught digital economics and information systems management at New York University. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.

Spreadsheet Formulas and the Temporal Imperatives of the Leveraged Buyout Economy

Posted on | July 11, 2025 | No Comments

The IBM PC and its compatibles (Compaq, Dell, DEC Rainbow 100), powered by Intel’s microprocessors, provided the hardware platform for the Lotus 1-2-3 spreadsheet to thrive in the 1980s. The spreadsheet’s features, combined with Intel’s 8088’s processing power, made it a versatile tool for financial professionals.

This post analyzes the digital spreadsheet’s capability to use formulas to discipline time, meaning, and performance – specifically the logic of Time Value of Money (TVM) formulas to collapse temporality into symbols that motivate financial action. The spreadsheet’s power comes from its ability to transcend passive representation to actively construct and simulate financial realities to produce justifications for leveraged buyouts and other economic actions. This performative capacity implies that financial models, rather than passively describing an existing reality, actively shape that reality.

By providing the computational power needed for sophisticated spreadsheet software, the Intel 8088 chip enabled Lotus 1-2-3 to become a powerful tool for financial analysis, transforming how businesses managed and analyzed financial data in the 1980s. The 8088’s arithmetic capabilities enabled Lotus 1-2-3 to execute complex financial formulas and algorithms quickly, making it suitable for forecasting, budgeting, and economic modeling tasks.

Spreadsheets like Lotus 1-2-3 and Excel allowed financiers to analyze target companies, model LBO financing, and present compelling cases to investors. This facilitated the wave of LBOs in the 1980s, exemplified by deals like RJR Nabisco and Beatrice Foods. Spreadsheets enabled sophisticated financial modeling across time periods, incorporating factors like cash flows, interest rates, and investment returns. This research highlights specific Lotus 1-2-3 functions and formulas, such as @SUM, @ROUND, @PV, @FV, @NPV, and @IF, that were crucial for LBO modeling and financial analysis. “Temporal finance” became crucial for LBOs and other financial instruments.

Time is a crucial factor in capitalism and its financial investments, but it was only after the West’s social transformation of time and sacrifice that investment took its current priority. Religious discipline, which structured earthly time for heavenly reward, met with the Reformation in 16th-century Europe to produce a new calculative rationality – financial investment.[1] Also, by incorporating Indo-Arabic numerals from the Middle East and solidifying time periods in base-12 and base-60 measures (60 minutes, 24-hour days, 360 days a year), a new correlation, investment over time, gained prominence. Sacrificing spending in the present for payoffs in the future was the cultural precondition for spreadsheet capitalism.[4]

The analysis of the time value of money (TVM) was crucial for modern finance, and digital spreadsheets provided the calculative “steroids” to quickly crunch the numbers. They were quickly adopted for LBOs, particularly in valuing a target company, determining debt service scenarios, and calculating return on investment. They helped financial raiders understand and manage the risks associated with the LBO.

Previously, TVM calculations were time-consuming and tedious, often requiring financial tables or manual calculations using formulas. Martín de Azpilcueta, a Spanish theologian and economist, is often credited with the first explanation of the concept, which developed in practice alongside financial markets during the 16th and 17th centuries.

Digital spreadsheets significantly accelerated and improved the analysis of TVM by automating calculations, enabling “what-if” analysis, increasing accessibility, and enhancing visualization with charts. Lotus 1-2-3 introduced built-in financial functions, such as PV (Present Value), FV (Future Value), PMT (Payment), RATE, and NPV (Net Present Value). These functions simplified TVM calculations that would otherwise require extensive manual work or financial calculators.

Instead of manually solving the compound interest formula to find future value, users could simply input values (e.g., interest rate, number of periods, and payment amount) into the FV function. Spreadsheets allowed users to quickly change input variables (interest rates, cash flows, and time periods) and instantly see the impact on the TVM calculations.

TVM is based on the notion that a dollar today is worth more than a dollar in the future due to its earning potential. This formula (FV = PV x (1 + i/f) ^ (n x f), where i is the annual interest rate, f the compounding frequency, and n the number of years) has empowered individuals and businesses to make more informed financial decisions. Lotus 1-2-3 allowed users to create charts and graphs to visualize TVM concepts, such as the impact of compounding interest over time and the relationship between present value and future value.
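The compound-interest formula translates directly into code. A minimal sketch with hypothetical inputs:

```python
def future_value(pv, annual_rate, years, compounds_per_year=1):
    """FV = PV x (1 + i/f) ^ (n x f): compound growth of a present value."""
    i, f, n = annual_rate, compounds_per_year, years
    return pv * (1 + i / f) ** (n * f)

# $1,000 at 5% compounded annually for 10 years.
fv = future_value(1_000, 0.05, 10)  # ≈ 1628.89
```

Changing any input and recomputing instantly is exactly the “what-if” acceleration the spreadsheet brought to TVM analysis.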

A vital formula that was converted to the Lotus spreadsheet was Present Value @PV(), a crucial tool for analyzing companies. It provided a foundation for evaluating the worth of future cash flows from raided companies or their parts in present terms. Companies generate cash flows over time, and analyzing them with PV ensures that delayed returns are appropriately considered and valued. PV helps distinguish between high-growth opportunities that justify higher valuations and overvalued prospects with limited potential.

PV quantifies this by discounting future cash flows to reflect their value today. This equation is critical in decision-making, whether assessing investments, valuing a company, or comparing financial alternatives. Present Value determines the internal rate of return (IRR) or net present value (NPV), the difference between the present value of cash inflows and the present value of cash outflows over a period of time. NPV is used in capital budgeting and investment planning to analyze a project’s projected profitability.

A related temporal technique is Future Value @FV(), which was developed to project future cash or investment values. It calculates what money is expected to be worth at a future date based on current growth trends, and it is particularly useful for debt paydown schedules and residual equity valuation. A companion function, @IRR(), computes the Internal Rate of Return. These calculations were crucial for evaluating the return on investment for equity holders, a core metric in LBOs.

Net Present Value @NPV() helped assess the profitability of an investment by calculating the value of projected cash flows discounted at the required rate of return. @NPV was crucial as it allowed users to input a discount rate (representing the cost of capital) and a series of future cash flows, and the @NPV function would calculate the present value of those cash flows.

@IF() could determine whether a debt covenant had been breached or whether excess cash should be used for debt repayment. Payment @PMT() was useful for calculating the periodic payment required for a loan, considering principal, interest, and term.
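A hedged sketch of how these Lotus-style functions behave, written in Python with hypothetical figures. It follows the @NPV convention that the first cash flow arrives one period from now; the function and variable names here are illustrative, not Lotus syntax:

```python
def npv(rate, cashflows):
    """Analogue of @NPV: present value of a series of future cash flows,
    with the first flow assumed one period out."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

def pmt(rate, nper, pv):
    """Analogue of @PMT: level periodic payment amortizing a loan of pv."""
    return pv * rate / (1 - (1 + rate) ** -nper)

def covenant_flag(cash_balance):
    """Analogue of @IF(balance<0, "Breach", "OK") in a debt schedule."""
    return "Breach" if cash_balance < 0 else "OK"

# Hypothetical LBO checks: three $100 cash flows discounted at 10%,
# and the monthly payment on a $1,000 loan at 1% per month for 12 months.
deal_value = npv(0.10, [100, 100, 100])   # ≈ 248.69
loan_payment = pmt(0.01, 12, 1_000)       # ≈ 88.85
```

Chained together on a grid, these three operations (discount, amortize, test) are the skeleton of an LBO model: projected cash flows are discounted, debt service is scheduled, and conditional logic flags covenant breaches.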

Capitalism acts not in time, but through time as abstraction. In this framework, the past is irrelevant, the present is discounted, and the future is instrumentalized. The Time Value of Money, when enacted through spreadsheets in the context of LBOs, becomes more than a financial concept — it becomes a signification regime in which absent futures govern the present and economic decisions are derived from models, not experience. The future becomes a site of calculable justification, and the present is overwritten by what is forecasted, not what is known.

TVM does not just reflect time; rather, it restructures it into discrete periods (monthly, quarterly, annually), predictable flows (=FORECAST.LINEAR() or =PMT(rate, nper, pv, [fv], [type])), and discountable intervals (where each future year is “worth less”). This restructuring flattens uncertainty into modelable variables, renders futures governable through current action, and makes possible “rational” decisions to restructure, downsize, or liquidate. The spreadsheet presents these acts as paths to financial optimization.

This is the logic of spreadsheet capitalism. It does not manage time, it disciplines it, using the logic of TVM to collapse temporality into symbolic instructions of command. Under this regime, the spreadsheet becomes the temporal sovereign of capital, writing the future not as uncertainty but as discounted destiny, a destiny that rationalizes corporate takeovers, and justifies extraction.

Conclusion

Lotus 1-2-3’s capabilities on IBM and “IBM-compatible” personal computers allowed private equity firms to confidently pursue larger and more complex deals by providing a reliable platform for financial forecasting and decision-making. The tool’s role in shaping LBO strategies contributed to the emergence of private equity as a dominant force in corporate finance. Many fundamental modeling practices in these landmark deals continue to underpin private equity and LBO analyses today, albeit with more advanced tools like Microsoft Excel.

The digital spreadsheet, understood as a hybrid semiotic-computational technology within the Human-Machine Knowledge framework, has profoundly reshaped the modern economy and accelerated its financialization. This transformation is driven by the spreadsheet’s capacity for computation-as-symbolic-action, where formulas and symbolic inscriptions do not merely represent but performatively construct and simulate financial outcomes.

Notes

[1] A. J. Pennings (1993) Symbolic Economies and the Politics of Global Cyberspaces. Dissertation for a PhD in Political Science, University of Hawaii.
[2] ChatGPT was used for parts of this research.

Citation APA (7th Edition)

Pennings, A.J. (2025, July 11) Spreadsheet Formulas and the Temporal Imperatives of the Leveraged Buyout Economy. apennings.com https://apennings.com/technologies-of-meaning/spreadsheet-formulas-and-the-temporal-imperatives-of-the-modern-economy/


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a professor at the Department of Technology and Society, State University of New York, Korea and a Research Professor for Stony Brook University. He teaches AI and broadband policy as well as visual rhetoric. From 2002-2012 he taught digital economics and information systems management at New York University. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.

Satellites for Global E-Finance and News

Posted on | May 17, 2025 | No Comments

“We choose to go to the moon in this decade and do the other things — not because they are easy, but because they are hard.”
– President John F. Kennedy, September, 1962 [1]

I found a chapter, “Satellites for Global E-Finance and News,” in my documents folder from work I was doing on technology and the internationalization of money, particularly in the realms of finance and news. It traces the development and impact of satellite technology on international communications, primarily from the 1960s to the late 1980s. So I ran it through Google’s Gemini 2.5 AI and asked it to summarize. I edited the result and posted it below.

Satellites


The chapter demonstrates how satellite technology, driven by the demands of globalizing finance and the need for rapid news dissemination, navigated regulatory hurdles and technological advancements to become a crucial component of international communication infrastructure by the late 1980s. The chapter highlights the interplay between political decisions (deregulation), technological innovation (more powerful satellites, VSATs), and economic forces (the growth of global finance and media) in shaping this transformative period.

Inspired by John F. Kennedy’s vision of landing a man on the Moon and enabled by the space program and the Communications Satellite Act of 1962, Intelsat was formed, leading to the first global satellite communication system. Live broadcasts, especially of the 1964 Tokyo Olympics, demonstrated its potential to the US and the world.

At the time, satellites had some significant advantages over undersea cables. They offered geography-insensitive connectivity, bypassing national terrestrial infrastructures and enabling point-to-point, point-to-multipoint, and broadcast transmissions. Fiber optic cables now provide much more capacity and speed, but satellites have become more sophisticated, especially new designs like SpaceX’s Starlink.

Satellites had a significant impact on the globalization of electronic finance. The end of the Bretton Woods agreement in the 1970s spurred the growth of electronic currency exchanges, which relied heavily on satellite data communication for financial news and transactions. The liberalization of the global financial and telecommunications regulatory environment allowed satellites to facilitate borderless commerce and e-money. This digital globalization led to the resurgence of the US dollar as the world’s reserve and trading currency.

Deregulation in the US helped propel new services. The 1972 Domestic Satellite Decision (“Open Skies”) was a crucial step in deregulating the US telecommunications system, allowing new companies to enter the satellite service market and setting a precedent for further deregulation. [2]

The Apollo space project spurred the development of stronger rockets and miniaturization of electronics, leading to larger and more sophisticated satellites like the Intelsat IVA with over 6,500 voice circuits and 2 TV channels by the mid-1970s (up from 240 voice circuits in Intelsat I). Satellites also began supporting packet-switched data communications thanks to the Alohanet and other networking projects.

Satellite communications became vital for electronic money and news flow. Cheaper long-distance telephone service, live broadcasting of major sports events, and faster news footage transmission from remote locations became possible. Newspapers like the Wall Street Journal saw significant benefits, reducing transmission costs from $3,500 per day to $90 and increasing plant productivity from 25,000 to 70,000 copies. Dow Jones’ revenues increased from $156 million in 1971 to $531 million ten years later.

Satellite technology facilitated the rise of global news networks like PBS (reaching 179 earth stations by 1984) and CNN (launched in 1980), enabling live, continuous news coverage from anywhere in the world. ABC’s Nightline exemplified this trend with its reporting on the Iranian hostage crisis and later the Tiananmen Square uprising in China. Hosted by Ted Koppel, the program conducted remote interviews with figures such as UN Secretary-General Boutros Boutros-Ghali. It combined long-distance service with new studio technology and in the process developed a form of news coverage that caught on quickly around the world.[3]

Europe also recognized the importance of space-based communications, forming the European Space Agency (ESA) in 1973 and developing the Ariane rocket, which became a primary launch vehicle, especially after the 1986 Challenger space shuttle disaster. I had a chance to visit the ESA’s European Space Research and Technology Centre (ESTEC) in Noordwijk, Netherlands in 1990 while visiting my uncle. It was while writing my PhD on electronic money and helped shape my thinking.

Satellite Business Systems (SBS) was formed as a joint venture between Aetna, Comsat, and IBM and aimed to provide high-speed data communications (45 megabits per second) for corporations. Despite its technical capabilities (SBS-1 had a capacity for 1,250 two-way telephone conversations per channel and 10 simultaneous color television transmissions, with an average data rate of 480 megabits per second), SBS faced challenges due to the emergence of the X.25 protocol and the high cost of earth stations. It was eventually sold off.

Intelsat expanded rapidly, despite facing emerging competition. It launched its Intelsat V series in the mid-1980s, supporting both C-band and Ku-band frequencies and offering a wide range of services to 174 countries by 1987. Its International Business Services (IBS) catered to major international corporations.

In the early 1980s, the Reagan administration pushed for the deregulation of international satellite communications, challenging Intelsat’s monopoly. The FCC allowed new US International Service Carriers (USISCs) and open ownership of earth stations.

Deregulation also led to competition in launch vehicles. The NASA Space Shuttle program initially boosted the satellite industry by providing a reusable launch vehicle. However, the 1986 Challenger disaster led to a temporary halt and a shift towards privatization of launch services, allowing European (Ariane) and Chinese (Long March) rockets to gain market share.

Concerns arose regarding the limited resource of the geosynchronous orbit, conceptualized by Arthur C. Clarke (see below). This scarcity led to monitoring and regulation by the International Telecommunication Union (ITU), primarily using a first-come, first-served principle, although political pressure for a registration system emerged.

CLARKE BELT

Overall, Chapter 12, “Satellites for Global E-Finance and News,” explores how satellite technology evolved from the 1960s to the late 1980s to become a foundational infrastructure for global finance and news media. It illustrates the interaction between technological innovation, regulatory reform, and globalization, focusing on how satellites enabled real-time, transnational communication crucial for electronic money systems and financial journalism.

Citation APA (7th Edition)

Pennings, A.J. (2025, May 17) Satellites for Global E-Finance and News. apennings.com https://apennings.com/global-e-commerce/satellites-for-global-e-finance-and-news/

Notes

[1] Quote from a JFK speech given at Rice University in Houston, Texas, to dedicate NASA’s new Manned Spacecraft Center, 22 miles away.
[2] I always found Rob Frieden’s work on the FCC and international communications technology to be particularly useful. Parsons, P.R. and Frieden, R.M. (1998) The Cable and Satellite Television Industries. Needham, MA: Allyn and Bacon. p. 52. Tim Logue, as well. Logue, Timothy. “Recent Major U.S. Facilities-Related Policy Decisions: Letting a Million Circuits Bloom,” in Wedemeyer, D. and Pennings, A. (eds.) Telecommunications-Asia, Americas, Pacific: PTC ’86 Proceedings. p. 81.
[3] Hanson, Jarice. (1994) Connections: Technologies of Communication. New York: HarperCollins College Publishers. p. 162.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a professor at the Department of Technology and Society, State University of New York, Korea and a Research Professor for Stony Brook University. He teaches AI and broadband policy. From 2002-2012, he taught digital economics and information systems management at New York University. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.

Determining Competitive Advantages for Tech Firms, Part 1

Posted on | May 14, 2025 | No Comments

Is the world more competitive for tech companies? Globalization has expanded market reach and access to talent, while the rapid pace of technological innovation constantly reshapes the competitive landscape. Lower barriers to entry have fostered a vibrant startup environment, challenging established players. The fierce competition for skilled technology talent further fuels this dynamic environment. Rising consumer expectations demand continuous improvements and innovations, and increasing regulatory scrutiny adds another layer of complexity. Finally, geopolitical factors significantly influence the global technology market and its supply chains. Working in concert, these forces have created a highly competitive arena where technology companies must constantly adapt and innovate to survive and thrive.

This post reworks one of two previous blogs that analyzed The Curse of the Mogul: What’s Wrong with the World’s Leading Media Companies by Jonathan A. Knee, Bruce C. Greenwald, and Ava Seave. I used it as part of the Digital Media Management curriculum at New York University and the Digital Media MBA at St. Edward’s University in Austin, Texas. It stresses the importance of substantial barriers to entry, or conversely, competitive advantages, for success in what economists call the “market structure” of a particular product.

The Curse of the Mogul emerged at a time when digital media firms were first starting to wrestle with the Internet. I draw on this book and other sources to continue to stress the importance of firms building strong barriers to entry in competitive economic environments. I changed the focus from digital media to tech companies in line with the changing vernacular and a shift in power to edge computing companies using AI and e-commerce.

The authors of The Curse of the Mogul point out that companies should focus on developing and reinforcing more serious competitive advantages and/or operational efficiencies (disciplined management of resources, costs, and capital allocation).[1] They were critical of media moguls’ preoccupation with topics like brands, deep pockets, talent (creative, managerial), a global footprint, and first-mover benefits. These points are relevant but obscure other business factors that would likely facilitate better results.[2] Successful tech companies must define and protect more structural barriers to entry or adopt strict cost-control procedures and operational efficiencies to enhance productivity and profitability.

Market structure has become a key focus of strategic thinking in modern firms. It refers to the environment for selling or buying a product or product series and influences key decisions about investments in production, people, and promotion. It is primarily about the state of competition for a product and how many rivals a company will have to deal with when introducing it. How easy is it to enter that market? Will the product be successful based on current designs and plans for it, or will the product need to be changed? How will the product be priced? Market structure is impacted by technological innovations, government regulations, network effects, customer behaviors, and costs.

Key factors include the number of firms supplying a product, the level of differentiation between the products offered, and, the main focus of this post, the competitive advantages or barriers to entry that a company can erect to bolster its position or stave off competition. Pricing strategy can also be a factor, but that is largely dependent on the level of competition.

In light of the rapid development and convergence of these tech and digitally-based industries, it is worth exploring the areas of key focus for the authors. In this post and the next, I review some of the major sources of competitive advantages according to The Curse of the Mogul and reference how they might apply to digital media firms. The book refers primarily to traditional big media firms. How do these categories of competitive advantage apply to a wider group of digital firms?[3] The authors distinguish four categories of competitive advantage, which I cover across this post and the next.

Due to space constraints, I will cover economies of scale and customer captivity in this post and cost, innovation, and government protection in a future one.

Economies of Scale

This is a central concept in economics and refers to the benefits that come to a firm when it becomes more efficient. It may involve fixed costs or network effects. Fixed costs refer both to the traditional sense of decreasing costs per unit produced and to the barriers created by a company like Google with the ability to spend lavishly on equipment, knowledge attainment, and other factors that make it prohibitive for other firms to match. Steven Levy’s “Secret of Googlenomics: Data-Fueled Recipe Brews Profitability” on the Wired website provides an excellent introduction to the search behemoth’s business model, primarily built around its AdWords and AdSense advertising business.

Google has a number of advantages, perhaps foremost being the massive investments in its built infrastructure. Google’s mission requires more than the most sophisticated “big data” software; it necessitates huge investments in physical plant, particularly data centers, power systems, cooling technologies, and high-speed fiber optic networks. Google has built up a significant global infrastructure of data centers (increasingly located close to cheap, green energy), and connecting its storage systems, servers, and routers is a network of fiber optic switches. For example, the Eemshaven data center facility in the Groningen region of the Netherlands is the landing point for a transatlantic fiber optic cable.

Large firms like Google can spread their fixed costs over greater volumes of production and operate more profitably than their competitors. For the most part, the details on fixed costs are not readily available as they are proprietary and represent trade secrets. However, aggregate numbers of Google’s fixed costs are informative.

Near Zero Marginal Costs

One of the characteristics of digital media is that although initial production costs may run high, the costs for additional viewers to experience the resulting digital content (movie, television show, software, song, or video game) are negligible. Most digital goods can experience near-zero marginal costs. This advantage has been challenged in the age of Artificial Intelligence (AI), however, as the “compute” needed to produce results requires significantly more energy than traditional search.

Economies of scale for book publishers have always meant they needed to cover their fixed costs such as editors and author royalties before they can achieve profits. However, if they have a bestseller, it can be quite profitable as they spread their costs over a larger production run. Digital distribution through Amazon’s Kindle or Apple’s iBooks not only reduces the costs of production, but as no ink or paper is involved, it significantly reduces the costs of delivery as well. This happens for software as well. Microsoft Office, for example, which contains Access, Word, Excel, and PowerPoint, can be distributed over the Internet with little expense. But that is not necessarily a competitive advantage. Digital assets also need to be protected from copying and other forms of theft, and they need to utilize network effects and viral marketing.
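The fixed-cost spreading described above can be made concrete with a short sketch. The figures are illustrative, not drawn from any publisher’s accounts: average cost per copy falls toward the near-zero marginal cost of a digital copy as volume grows.

```python
# Average cost per copy = (fixed cost / volume) + marginal cost.
# All figures are hypothetical; for digital goods the marginal cost is near zero.
def average_cost(fixed, marginal, units):
    return fixed / units + marginal

FIXED = 1_000_000.0   # up-front production cost of the title (illustrative)
MARGINAL = 0.01       # per-copy digital delivery cost (illustrative)

for units in (1_000, 100_000, 10_000_000):
    print(units, round(average_cost(FIXED, MARGINAL, units), 2))
# → 1000 1000.01, 100000 10.01, 10000000 0.11
```

The bestseller logic is visible in the numbers: the same fixed cost that makes a 1,000-copy run untenable nearly disappears per unit at scale, which is why digital distribution alone is not a durable advantage once rivals reach similar volumes.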

Network Effects

Network effects refer to the increasing value of a product or service that occurs when additional customers or users start to use them. Many communications technologies such as telephones, fax machines, and text applications exhibit direct network effects. The telephone system became more valuable to each individual telephone subscriber as more people connected to the phone system. When more mobile phone users started to take advantage of Short Message Service (SMS) or “texting,” it attracted even more users. When I got my first text from my sister, for example, who was not known at the time for her technological prowess, I knew that texting had arrived.
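A common back-of-the-envelope way to quantify direct network effects (Metcalfe’s law, an approximation not taken from The Curse of the Mogul) is to count the possible pairwise connections in a network of n users, which grows roughly with the square of n:

```python
# Metcalfe-style approximation: a network of n users has n*(n-1)/2 possible
# pairwise connections, so potential value grows much faster than the user count.
def possible_connections(n):
    return n * (n - 1) // 2

for users in (10, 100, 1000):
    print(users, possible_connections(users))
# → 10 45, 100 4950, 1000 499500
```

A hundredfold increase in users yields roughly a ten-thousandfold increase in possible connections, which is why each new telephone or texting subscriber made the whole system more valuable to everyone already on it.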

Network effects are complicated; they may not be sufficient, and they are not always positive, as MySpace discovered after 2008 when members abandoned it for Facebook. MySpace was a social media site that allowed users to create their own “spaces” with pictures, blogs, music, and videos. The darling of early “social networking,” it was sold to Rupert Murdoch’s News Corporation for US$580 million in 2005. Two years later, with 185 million registered users, it had a valuation of $65 billion. By early 2011, MySpace was down to about 63 million users, while Facebook had jumped ahead with over 500 million members. Tired of pumping money into the sinking ship, News Corp. sold MySpace to Specific Media, an advertising network, for $35 million, just 6% of its purchase price.[5] By 2025, Facebook had over 3 billion monthly active users (MAU).

Digital firms need to consider multiple repercussions such as cross-network and indirect network effects. The authors use the example of eBay, an online auction company that benefits from cross-network effects. eBay, Uber, Airbnb, and many other “platforms” such as dating or recruiting sites are also known as two-sided networks because they bring two distinct groups together. As the number of eBay’s customers increased, it became increasingly attractive for others to sell their wares on the site. Conversely, as more products were displayed, it attracted more customers. A major success factor for Microsoft Office is that files produced in Word or Excel often need to be shared and read by others.

Network effects make a site or product more valuable as it includes more people, and those additional people make it more attractive for another group. Credit cards are another good example of cross-network effects. They rely on a large base of individual cardholders for profitability, and this large customer base then attracts merchants who want their business and are willing to pay the extra costs to the credit card company. This raises questions about whom to charge and whether a proprietary platform is needed.

Over-the-top (OTT) services that use the Internet as a distribution system, like Amazon Prime, Netflix, and YouTube, connect consumers with content makers. While Prime and Netflix produce considerable content, they draw on outside content producers to keep their viewers engaged. YouTube has drawn heavily on user-generated content (UGC), as do Instagram and TikTok. In each case, the platform’s success depends on its cross-network effects – its ability to connect a large number of viewers with a large number of producers.

Another phenomenon is indirect network effects. This occurs when the increasing use of one product or service increases the demand for complementary goods. The standardization of the Windows platform in the 1990s, for example, and its nearly ubiquitous installed user base among PC users allowed many other software producers to thrive as they built their applications to run on the Microsoft operating system. Both Apple and Android-based smartphones have allowed thousands of apps to be added to their functionality. So the network effects attributed to the popularity of these PCs and smartphones carry over to applications that run on them.

Viral Marketing

Viral marketing is a promotional strategy that relies on the audience to organically spread a marketing message to others, much like a biological virus. The goal is to create content that is so compelling, entertaining, or valuable that people will naturally want to share it with their friends, family, and colleagues. The key characteristics of viral marketing include rapid spread, user-driven growth, shareable content, and short-term growth.

Viral marketing’s primary aim is for the message to spread quickly and widely through social networks and word-of-mouth. Viral marketing relies on individuals to share the content, rather than the company paying for extensive distribution. Successful viral marketing campaigns typically involve content that evokes strong emotions, provides utility, or has a novelty factor that encourages sharing. While highly effective at generating initial awareness and a rapid influx of users, the effects of viral marketing can be short-lived if not coupled with other strategies. Examples of viral marketing campaigns include engaging videos, social media challenges like the ALS Ice Bucket Challenge, and creative contests or giveaways.  

Key Differences Between Network Effects and Viral Marketing

The main differences between network effects and viral marketing lie in their focus and the source of their power. The primary goal of network effects is to increase product/service value for users, while for viral marketing it is to achieve rapid user acquisition and brand awareness. The main mechanism for network effects is increasing value through the number of users, while viral marketing works through the spread of shareable content created by the brand. The drivers of network effects are user connections and interactions. These also power the sharing of content by individual users, which can feed more users for increased network effects. Network effects support long-term sustainability and create competitive advantage based on value production. Viral marketing creates rapid, short-term growth based on content appeal that can provide a temporary boost but doesn’t inherently build firm defensibility unless it results in more captured customers.[7]

Customer Captivity

Maintaining the attention and fealty of customers is often vital to a product’s success and is reinforced through habit, switching costs, and search costs. Successfully introducing customer practices and reinforcing habitual use is a crucial strategy for retaining customers. Mobile apps lock users into a much narrower range of options than surfing the Web on their PCs. Also, Amazon’s One-Click purchase option makes it quick and easy to complete the deal without dragging out the credit card and inputting all the numbers and other information.

Speaking of credit cards, though, they remain a consistent vehicle for holding on to customers through subscriptions and reward programs. Subscriptions use the automatic payments of credit cards to maintain a continuous service or supply of product. Switching costs and reward forfeitures discourage giving up a credit card. Loyalty programs foster perceived value, not always real value. Switching may mean losing accumulated points, changing autopay for multiple bills, and potentially hurting one’s credit score due to new inquiries or a shorter account history.

One new digital tool that is proving effective is the recommendation engine. Netflix uses a recommendation engine to keep customers engaged. It constantly suggests titles the viewer might be interested in watching based on their previous viewing. Amazon destroyed the Borders bookstore with its recommendation engine and an effective email system that targeted customers with what they wanted. Borders could only offer pictures of loosely associated books with dubious links to the customer’s interests. I, for example, was not interested in their fine collection of Harlequin-like romance novels. Borders did not recommend the books I wanted, so I bought them from Amazon, despite the enjoyment of going to the Borders bookstore.

It is also important to keep customers from switching to competitors. Switching barriers can involve exit fees, learning effort, equipment costs, emotional stress, start-up costs, as well as various types of risk: financial, psychological, and social. Cable and home security companies are notorious for trying to keep customers in long-term contracts to keep them from switching.

Making it easy to learn new products is helpful as is reducing any stresses associated with understanding new features or upgrading. One way to keep customers is to make the payment system easy. Automatic payments work for subscription-based services like Netflix and other deliverers of online content that tie in customers through credit cards and other continuous payment systems.

Search costs encourage consumers to stay with a particular product or entice them to go with your brand if the information provided is convincing enough to cause them to give up their search. Rational consumers will tend to search until the costs of further searching outweigh the perceived benefits. Testimonials and good reviews will help alleviate their concerns. Big-ticket items like cars, homes, or major appliances tend to require more search time than smaller items. But any search requires a calculation of the opportunity costs involved. What are they giving up to spend this time searching?

In the passages above, I reviewed competitive advantages as specified by the authors of The Curse of the Mogul and applied them to digital media firms. Their focus on moguls doesn’t hold as much interest for me as their discussion of competitive advantages for smaller companies.[4] Being technologically dynamic, the digital media field is still exploring its ability to create competitive advantages and erect barriers to entry.

It is also important to understand that two or more competitive advantages may be operating at the same time. Recognizing the potential of reinforcing multiple barriers to entry and planning strategies that involve several competitive advantages will increase the odds for success. In “Determining Competitive Advantages for Tech Firms, Part 2,” I will discuss competitive advantages related to costs and government protection.

Review

This blog post summarizes key competitive advantages for firms, drawing from The Curse of the Mogul. It emphasizes that success in a market’s structure depends on establishing strong barriers to entry or achieving operational efficiencies, rather than relying solely on brands, deep pockets, talent, global reach, or first-mover status. The post defines market structure and its influencing factors (technology, regulation, network effects, behavior, costs) and focuses on competitive advantages as barriers to entry. It then delves into several categories of competitive advantages.

Economies of scale and network effects are key barriers to entry. Firms can benefit from increased efficiency, including spreading fixed costs over larger production volumes (relevant for digital media with near-zero marginal costs, though AI compute challenges this with its high energy costs). Network effects are the increasing product/service value with more users (direct effects, as in communication technologies, and indirect/cross-network effects, as seen in platforms like eBay, Uber, and Airbnb and in the complementarity of products like Microsoft Office). The post notes that network effects aren’t always sustainable. For example, MySpace vs. Facebook showed that network effects can tip one way or the other quite fast. Once a platform reaches a critical mass of users, the value it offers becomes hard to replicate.

Customer captivity is reinforced through habitual use, which is crucial for retention, as seen in mobile app usage. Switching costs also present barriers preventing customers from moving to competitors and include fees, learning effort, equipment costs, and various risks. Also, search costs encourage sticking with an acquired product if the costs of searching start to outweigh the benefits of a new product. Recommendation engines and online reviews can play a role in reducing the costs of searching for a replacement product.

The post concludes by stating that the tech field is still exploring the creation of competitive advantages and barriers to entry. It highlights that multiple competitive advantages can operate simultaneously, increasing the likelihood of success.

Conclusion

This post outlines the critical importance of tech firms establishing powerful competitive advantages, particularly economies of scale, network effects, and customer captivity. These apply to firms operating in any market, including the dynamic digital media landscape. By dissecting these concepts and providing relevant examples from both traditional and digital companies, it underscores that sustainable success hinges on creating structural barriers to entry or achieving significant operational efficiencies, rather than relying on more superficial advantages often touted by industry leaders. The follow-up post on cost and government protection suggests a comprehensive exploration of the strategic levers available to companies seeking to thrive in competitive markets. Ultimately, the post serves as a framework for understanding how businesses can build lasting advantages in an ever-changing economic environment.

Citation APA (7th Edition)

Pennings, A.J. (2025, May 14). Determining Competitive Advantages for Tech Companies, Part I. apennings.com https://apennings.com/media-strategies/determining-competitive-advantages-for-digital-media-firms-part-1/

Notes

[1] Jonathan A. Knee, Bruce C. Greenwald, and Ava Seave, The Curse of the Mogul: What’s Wrong with the World’s Leading Media Companies. 2014.
[2] Knee, Greenwald, and Seave argue that the poor financial performance of major media conglomerates isn’t primarily due to external factors like the rise of the Internet. Instead, they contend that it stems from internal operational inefficiencies and misguided strategies driven by the egos and “megalomania” of media moguls. Lack of Focus on Cost Control: The moguls often prioritize growth, acquisitions, and maintaining a powerful image over rigorous cost management. They tend to downplay the importance of “number crunchers” and “pencil pushers,” leading to bloated budgets and unnecessary expenses. Driven by a desire for scale and market dominance, media companies frequently overpay for acquisitions and strategic investments that don’t yield commensurate returns. This misallocation of capital hinders profitability and shareholder value. Even when acquisitions have strategic rationale, poor integration processes often lead to duplicated efforts, loss of synergies, and ultimately, underperformance. The book challenges the notion that simply having the best content guarantees financial success. It argues that efficient distribution, marketing, and monetization strategies are equally, if not more, crucial. Moguls who fixate solely on content creation often neglect these operational aspects. Finally, the authors argued that moguls often believe their creative nature exempts them from standard financial scrutiny. This allows operational inefficiencies to persist without being adequately addressed. Unlike operationally efficient businesses that concentrate on core competencies and streamline processes, media conglomerates often lack focus, dabbling in diverse and sometimes unrelated ventures without achieving deep efficiencies in any one area.
[3] "Reviews: The Curse of the Mogul." Quantum Media: Links & Reviews. N.p., n.d. Web. 30 Mar. 2014.
[4] Greenwald, Bruce C. “The Moguls’ New Clothes.” The Atlantic. Atlantic Media Company, 01 Oct. 2009. Web. 30 Mar. 2014.
[5] Jackson, Nicholas. “As MySpace Sells for $35 Million, a History of the Network’s Valuation.” The Atlantic, Atlantic Media Company, 29 June 2011.
[6] I finally found the hard copy of this book at a Borders near Wall Street in New York City.
[7] This post is a rewrite of the version I wrote in 2014. I used Gemini to add more information on the distinction between network effects and viral marketing.


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea, holding a joint research appointment at Stony Brook University. Before joining SUNY, he taught at St. Edwards University in Austin, Texas. He was on the faculty of New York University. Previously, he taught at Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Visual Rhetoric Analysis of Social Media: YouTube Channels and Memes

Posted on | May 1, 2025 | No Comments

What makes a successful YouTube channel? What meaning-making practices are used to make a channel interesting, informative, or enjoyable? What story is being told, who is telling it, and how is it being told? How are people making money from it? These are some of the main questions we address in the final project of my EST 240 Visual Rhetoric and IT class. It closely examines the details of imagery or moving images for a rhetorical and denotative/connotative analysis of the persuasive techniques and meanings involved.

This post is about using a semiotic or visual rhetoric analysis to understand why some YouTube videos are successful, and others are not. Both rhetoric and semiotics offer valuable, yet distinct, frameworks for analyzing the complexities of visual media. While rhetoric, with its focus on the art of persuasion, examines the strategic use of appeals to ethos, pathos, and logos to influence audiences, semiotics delves into the science of signs, seeking to decode the underlying systems of meaning embedded within visual and auditory elements. Despite their different origins and primary objectives, these two disciplines share a fundamental concern with communication and meaning-making, particularly in our increasingly visually driven world.

The assignment is to interrogate a web channel, looking at its details, from its hosts to its thumbnails, to identify the signifying practices that make it a success. It is not too different from an assignment to analyze a movie or a novel, as the meaning-making practices are examined much as they would be in a media paper. But YouTube is like a film on steroids, or a psychedelic drug. Its commitment to realism is lacking. A lot more is happening, and standard rules of organizing perception are being broken. Analyzing a YouTube channel takes a good eye for identifying details and a strong vocabulary to put what you see into words. It also requires an analytical framework to put the signifying practices into a theoretical perspective that helps create additional understanding of the meanings created and the myths supported.

The class starts out with an intensive look at the vocabulary for techniques used in film, television, music videos, and more recently social media tools like Instagram, Rumble, Vine, and TikTok. We move on to YouTube channels (with hosts, for this assignment). Future versions of this course will also delve into the use of artificial intelligence (AI) to synthesize images and video.

Initially, we work on vocabulary and the “grammar” of visual creation – how moving images are shot and edited/structured to create meaning and narrative. We analyze films, television, music videos, and move on to YouTube videos. Terms like closeup, pan, tilt, parallel editing, and voice-overs provide key conceptual understanding for both technical and analytical purposes. Moving images are shot with a general grammar in mind – establishing shots for creating context, medium shots for introducing subjects and perspective, and closeups for detail and emotion.

I recommend analyzing the channel’s host first, drawing on the analysis of a newscaster or news “anchor.” The anchor secures the narrative of the news story. He or she (or, increasingly, an AI) literally anchors the meaning of the newscast or story. Stuart Hall talks about imagery needing a “fixed” meaning, which I find useful as well. The anchor or host tells a story, fixing the meaning but also moving the story along. What drives the “story” or myth-making? How is the story being told? Narration? Voiceover (VO)? Who is the author? Are they part of the story?

What is the rhetoric of the YouTube channel? What is the purpose of the site? What meanings does it produce? How does it engage the audience? What audience is it producing (i.e., how can it be sold to an advertiser)?

Two French terms have guided televisual analysis over the years. Mise-en-scène refers to what is in the scene, or the shot: a combination of composition, costuming, hair and makeup, lighting, and set design. The other is montage, from the French monter, “to assemble,” which refers to the editing process. This involves the pace of editing, wipes, continuity, and cross-cutting or parallel editing.

Recommended Outline

Introductions are drafted early but are the last to be completed. Why is the channel a success? What metrics can we access to measure that success? How many subscribers does it have? How many videos has it produced? How many viewers does it attract? How many comments does it usually get? Can you find out how much money it is making? Dude Perfect has over 60 million subscribers and regularly makes over $20 million a year; it emphasizes male competition and sport. Genre is often an interesting exercise in the process of categorization that determines distinctions as well as similarities. A popular newer genre on YouTube is the video blog or “vlog.”

Meaning-Creating Techniques 1 Denotation and Connotation: Host
Meaning-Creating Techniques 2 Denotation and Connotation: Shots (Mise en scène)
Meaning-Creating Techniques 3 Denotation and Connotation: Editing (Montage)

Additional areas of analysis:

Meaning-Creating Techniques 4 Denotation and Connotation: Logo
Meaning-Creating Techniques 5 Denotation and Connotation: Thumbnails

Rhetoric or Semiotics?

Despite their shared interests, rhetoric and semiotics exhibit fundamental differences in their historical development, primary objectives, and the specific analytical tools they employ. Rhetoric has its roots in the classical art of oratory, initially focusing on the principles of effective public speaking and argumentation. Semiotics, on the other hand, emerged later as a broader scientific and philosophical inquiry into the nature of signs and the processes by which meaning is generated and interpreted across all systems of communication, including film and YouTube. Ryan’s World is a particularly rich channel to analyze with almost 40 million subscribers and over 3000 videos.

Rhetoric’s primary concern lies with the persuasive intent behind communication and its impact on the audience’s beliefs or actions. Semiotics, however, has a more encompassing aim to understand the underlying structures and processes of signification, regardless of the communicator’s specific intentions or the message’s persuasive efficacy. Rhetoric traditionally emphasizes the appeals of logos, ethos, and pathos as its core analytical framework for examining persuasive strategies. In contrast, semiotics focuses on dissecting the structure of signs through concepts such as the signifier and signified, and the categorization of signs into icons, indices, and symbols. Furthermore, while rhetoric is primarily centered on human communication, semiotics has a broader scope. It extends its analysis to various phenomena that function as signs, including cultural rituals, fashion systems, and even biological communication among organisms (becoming more relevant with AI and big data’s capacity to capture and decipher animal sounds).

Analyzing Political Memes with Rhetorical Theory

Social media circulates many images with text called “memes.” These constructions can be both effective and harmful, as they can be made quickly with modern apps and spread virally through social media. Memes are often anonymous, with little or no indication of authorship, yet they are often shared by trusted friends.

Rhetoric can be used to analyze the techniques that are used to influence the reader. The ancient Greek philosopher Aristotle identified three fundamental modes of persuasion: ethos, pathos, and logos. These appeals form the cornerstone of rhetorical theory and provide a framework for analyzing how persuasion functions in communication, including visual media and memes.  

Ethos is the appeal to credibility and centers on the character and trustworthiness of the communicator. In rhetoric, ethos is established by demonstrating expertise in the subject matter, conveying honesty and goodwill towards the audience, and presenting oneself with appropriate authority and character. For instance, in visual memes, the use of celebrity political endorsements leverages the perceived credibility and admiration associated with the celebrity to build trust in the policy. The audience is more likely to be persuaded by someone they view as knowledgeable, reputable, or possessing good political character. A political meme might showcase a political candidate in professional attire with a party logo.

Pathos is the appeal to emotion and involves persuading the audience by evoking certain feelings. These emotions can range from positive feelings like joy, hope, and excitement to negative ones such as sadness, fear, and anger. Visual media is particularly adept at employing pathos through the use of powerful imagery, evocative slogans, and compelling narratives designed to resonate with the audience’s values, beliefs, and cultural background. For example, a public service announcement might use distressing images to evoke empathy and encourage viewers to take action. A political meme might use patriotic imagery like the national flag to evoke pride.

Logos is the appeal to logic and relies on reason and evidence to persuade the audience. This involves using facts, statistics, logical arguments, and clear reasoning to support a particular claim or viewpoint. In visual rhetoric, logos can be conveyed through the presentation of data in infographics, demonstrations of a policy’s effectiveness, or a logical visual narrative that leads the viewer to a specific conclusion. A clear and specific thesis or claim, supported by well-reasoned arguments, is crucial for a strong appeal to logos. A political meme might include concise policy statements or slogans implying logical benefits.

The effective use of these three appeals, often in combination, is central to the art of rhetoric and to memes’ ability to influence audiences quickly. Understanding how ethos, pathos, and logos are employed in visual media provides a valuable framework for analyzing the persuasive strategies at play in our visually saturated world.

Summary

This post underscores that success on YouTube is not just technical or algorithmic but rhetorical and semiotic. Content creators—whether kids unboxing toys or athletes competing—construct complex layers of meaning using visual tools and persuasive strategies. Understanding these tools equips students to critically analyze, produce, or even monetize content more effectively.

What makes a YouTube channel successful? The answer lies not only in metrics but in how content is constructed, delivered, and interpreted. The post encourages a deep investigation of hosts, thumbnails, editing styles, and narrative strategies, using concepts like denotation/connotation, mise-en-scène, and montage — terms borrowed initially from film theory but now applicable to YouTube videos.

Citation APA (7th Edition)

Pennings, A.J. (2025, May 01). Visual Rhetoric Analysis of a YouTube Channel. apennings.com https://apennings.com/technologies-of-meaning/visual-rhetoric-analysis-of-a-youtube-channel/


Notes

[1] Rhetoric has an older lineage and appears to have started with a focus on persuasion and public speaking in ancient Greece. Semiotics, on the other hand, is a more recent and broader field that comes from linguistics and philosophy, looking at all kinds of signs, not just language. While rhetoric often has a goal of influencing people, semiotics is more about understanding how meaning works within different cultures.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea, and Research Professor at Stony Brook University. He teaches AI and broadband policy as well as visual rhetoric. Previously, he was on the faculty of New York University, teaching digital economics and media management. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.

  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from https://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information. View the Writing Criteria link at the top of this page to link to an online APA reference manual.

  • About Me

    Professor at State University of New York (SUNY) Korea since 2016. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, digital economics, and strategic communications.

    You can reach me at:

    apennings70@gmail.com
    anthony.pennings@sunykorea.ac.kr

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.