Anthony J. Pennings, PhD

WRITINGS ON AI POLICY, DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL E-COMMERCE

Situating Gold in the Substitution-Computational-Telecom Stack of Global Finance

Posted on | September 20, 2025 | No Comments

Gold prices have risen substantially in the last few years, from hovering around $2,000/oz to $3,647/oz on September 19, 2025. As an appreciating asset, gold has become popular with many individual and institutional investors. It has also become a substantial reserve holding for many central banks worldwide. When a central bank or an investor buys gold on the modern grid, they are rarely taking physical delivery. Instead, they are buying the symbol XAUUSD.

Crucially, the symbol is priced in US dollars, which means gold now functions as a special class of asset within the dollar-anchored system, not as a competitor outside of it. Gold is reduced from a material commodity to a symbolic inscription in a Bloomberg terminal or a central bank spreadsheet.

This post examines how gold currently operates within the semiotic-computational-telecom (SCT) stack of spreadsheet capitalism. Gold currently functions not as a direct currency, but as a potent semiotic anchor for ‘safe-haven’ value.[1] Its symbolic representation is priced and managed through sophisticated computational models that are synchronized across the global telecom grid. Gold operates as an alternative anchor that represents historical memory, geopolitical hedging, and systemic counterweight.

Gold in spreadsheet capitalism is an inscribed number (semiotic substitution), a variable in financial formulas (symbolic computing), and a globally synchronized price stream (telecom infrastructure). Gold today is no longer the official “cell anchor” of the global financial spreadsheet (as it was under the Bretton Woods system, when it was priced at $35/oz and other countries were required to peg their currencies to the dollar within a 1% band).[1] However, it still operates within the semiotic–computational–telecom stack of spreadsheet capitalism as a special “actant,” carrying symbolic weight, computational functions, and networked circulation across terminals and inducing trading actions.

Gold as Symbolic Substitution Token

The foundational step is the semiotic substitution of the physical gold bar with a universally recognized ticker symbol: XAUUSD. This symbol no longer represents a medium of exchange, but rather a set of powerful, abstract concepts. These include safety, history, an inflation hedge, and a non-sovereign store of value.

When a central bank or an investor buys the gold symbol, it is usually a purchase of “financial insurance” against geopolitical risk, currency debasement, or systemic instability. However, recent price increases have also made it a profitable trade.

Gold substitutes for a range of meanings. It is a hedge against inflation (e.g., “XAU/USD” formula cell), a geopolitical “insurance” asset that substitutes for trust in the USD system, and a cultural signifier of permanence and stability across millennia. These substitutions enable gold to circulate in financial systems as numbers and formulas, rather than as physical bars. Its semiotic power rests in its ability to represent non-dollarized value.

Formulating Power with Gold

XAUUSD is given life and integrated into the global financial system through formulaic, symbolic computation. This is where its price is determined and its risk characteristics are modeled. Gold is deeply embedded in the functions and formulas of spreadsheet capitalism. It is an important component of risk formulas that utilize gold prices as inputs to calculate portfolio volatility. It is also used in hedging strategies such as option pricing formulas and swap functions. Correlation matrices also compute gold’s correlation to USD, oil, or Treasury yields.

The vast majority of gold trading occurs in the derivatives market, particularly futures on the COMEX (Commodity Exchange). The price of a gold futures contract is a purely computational product, derived from formulas that factor in the spot price, the risk-free interest rate (SOFR), storage costs (“cost of carry”), and time to expiration.
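The cost-of-carry relation described above can be sketched in a few lines. The numbers below are illustrative inputs, not market data, and the model assumes continuous compounding with no convenience yield:

```python
import math

def futures_price(spot, rate, storage_cost, years):
    """Cost-of-carry model: spot grown at the risk-free rate plus
    storage cost, continuously compounded, no convenience yield."""
    return spot * math.exp((rate + storage_cost) * years)

# Illustrative inputs only (not market data): $3,647 spot, a 4.3%
# SOFR-like rate, 0.5% annual storage cost, six months to expiration.
fair_value = futures_price(3647.0, 0.043, 0.005, 0.5)
```

A traded futures price drifting above this fair value invites cash-and-carry arbitrage, which is what keeps the market price tied to the formula.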

The most popular way for investors to hold gold is through Exchange-Traded Funds (ETFs) like GLD and GOLY. The price of a GLD share is a formula based on the net asset value (NAV) of the physical gold held in trust, minus fees.[2] This formulation allows the symbol of gold to be traded with the liquidity of a stock.
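The NAV arithmetic behind such an ETF can be sketched as follows, using entirely hypothetical fund figures (GLD's actual holdings, fee accruals, and share count differ):

```python
def nav_per_share(ounces_held, spot_price, accrued_fees, shares_outstanding):
    """Net asset value per share: market value of gold held in trust,
    minus accrued expenses, divided by shares outstanding."""
    return (ounces_held * spot_price - accrued_fees) / shares_outstanding

# Hypothetical fund: 1,000,000 oz in trust, $500,000 of accrued fees,
# 10,000,000 shares outstanding, spot at $3,647/oz.
nav = nav_per_share(1_000_000, 3647.0, 500_000.0, 10_000_000)
```

Because fees accrue continuously while the trust's gold is fixed, the gold backing each share slowly shrinks over time, which is visible in this formula as a growing deduction from the numerator.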

Gold’s symbol is a critical input for global risk models. Formulas constantly calculate its correlation to other assets (stocks, bonds, currencies), and its Value at Risk (VaR) formula is used by institutions to manage portfolio-wide exposure.
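A common textbook form of the VaR calculation mentioned above is the parametric (variance-covariance) version. This sketch assumes normally distributed returns and a hypothetical $10M position; institutional engines are far more elaborate:

```python
import math

def parametric_var(position_value, daily_vol, z_score=1.645, horizon_days=1):
    """Parametric (variance-covariance) VaR: the loss not expected to be
    exceeded at the given confidence level (z=1.645 is one-sided 95%),
    scaled by the square root of the holding period."""
    return position_value * daily_vol * z_score * math.sqrt(horizon_days)

# Hypothetical $10M gold position, 1% daily volatility, 10-day horizon.
var_10d = parametric_var(10_000_000, 0.01, horizon_days=10)
```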

Gold plays a crucial role in ‘computing’ systemic anxieties. When confidence in fiat currencies (especially the USD) wavers, models shift weight toward gold as an input into hedging formulas. This underscores gold’s importance in managing systemic anxieties and helps explain why buyers value its function in the financial system.

Central banks also use computational inscriptions that include gold. Their reserve composition spreadsheets include gold holdings, with formulas that track percentage shares of gold versus USD/Euro reserves.

Gold in the Gridmatic Telecom Stack

The telecom grid of interlinked terminals and exchanges synchronizes all of this symbolic and computational activity into a single, globally recognized, real-time price for gold. Bloomberg, LSEG Workspace (formerly Reuters), and Wind terminals log second-by-second spot and futures prices. Exchanges and market bodies (CME, LBMA) record gold futures, ETFs, and swaps. Gold-backed ETFs (GLD, etc.) synchronize millions of spreadsheet cells across retail and institutional portfolios.

The key nodes in this grid are the London Bullion Market Association (LBMA), which sets the benchmark spot price, the COMEX in New York, which drives futures pricing, and the thousands of financial terminals (Bloomberg, LSEG) that distribute this data.

Digital infrastructures (COMEX servers, LBMA vaulting systems, Shanghai Gold Exchange) operate as networked actants stabilizing the semiotic and computational layers. This telecom synchronization enforces a single global price of gold, regardless of whether it is traded in London, Shanghai, or Chicago.

The telecom stack ensures that a price change in a COMEX futures contract is reflected in the price of the GLD ETF and the XAUUSD spot price on every trader’s screen from Seoul to Frankfurt in microseconds. This creates one unified and constantly updated “fact” — the global price of gold — making it a perfectly liquid, globally fungible asset within the digital architecture of spreadsheet capitalism.

Gold’s Role in Time-Space Power

Gold creates time-space power within the stack of spreadsheet capitalism by being transformed from a heavy, physical object into a weightless, placeless (XAUUSD) digital symbol.[2] This symbol’s future value can be computationally priced and traded, with all actions now synchronized globally by the telecom grid, allowing actors to exert influence across vast distances and into the future.

Semiotic substitution is the foundation of gold’s space-compressing power. A physical gold bar sitting in a vault in London or New York is replaced by the universal ticker symbol XAUUSD on a terminal screen. This act disembeds the value of gold from its physical location. A central banker in Seoul doesn’t need to arrange for the costly and slow physical transport of gold bars to rebalance reserves. Instead, she can trade the symbol XAUUSD instantly. This substitution makes gold’s value placeless and perfectly mobile, allowing it to be controlled and reallocated anywhere on the grid at the speed of light. This substitution is the SCT’s power to collapse space.

The computational stack also gives gold its power over time. The XAUUSD symbol is fed into complex formulas, primarily for pricing derivatives like futures and options. A gold futures contract is a formulaic promise to buy or sell gold at a predetermined price on a future date. By trading these contracts on an exchange like the COMEX, a mining company or a jeweler can lock in their costs or revenues months in advance, protecting themselves from price volatility. An option on gold provides the right, but not the obligation, to trade at a future date.

These computational instruments allow actors to stretch their influence and decision-making forward in time. They are using formulas to manage future risk and make binding commitments about the future, today. This symbolic computation is the power to colonize and control future economic possibilities.

The telecom stack is the grid that synchronizes these activities globally, creating a single, unified arena where time-space power is exercised. When a major fund in New York executes a large trade in gold futures, the price change is not a local event. The telecom grid of networked exchanges, servers, and terminals ensures the new price is reflected instantaneously on the screens of every other participant, from a bank in London to a sovereign wealth fund in Tokyo.

This instant synchronization means that power is transmitted across the globe without delay. An action taken in one financial center has an immediate and unavoidable consequence in all others, forcing real-time adjustments and reactions. The grid creates a single, global “present” for the gold market, enabling instantaneous coordination and control of capital across planetary distances.

Gold currently anchors emerging-market attempts at monetary independence (e.g., BRICS discussions of gold-linked settlement). In this sense, gold acts as a shadow spreadsheet cell — not the central one, but one always available to be “activated” when the dollar grid weakens.

Gold no longer synchronizes the world’s money clock (as it did under gold convertibility) but now operates as a counter-clock. It ticks against the USD and eurodollar grid, rising when trust in dollar formulas falters.

Summary and Conclusion

Sophisticated computational formulas price gold derivatives and ETFs, while modeling their risk characteristics. The global telecom grid synchronizes this data across financial terminals, creating a single, real-time price. This process transforms gold into a weightless, placeless asset, granting actors who trade its symbol the “time-space power” to exert financial influence instantly across the globe and into the future.

Gold in spreadsheet capitalism is an inscribed number (semiotic substitution), a variable in financial formulas (symbolic computing), and a globally synchronized price stream (telecom infrastructure). It does not displace the USD as the dominant central cell, but its persistence ensures that the global financial spreadsheet always has a non-dollar column available for gold. It serves as an alternative anchor, representing historical memory, geopolitical hedging, and a systemic counterweight.

In the modern financial system, gold no longer functions as a currency but as a powerful symbolic asset within the dollar-anchored framework of “spreadsheet capitalism.” Its physical form is replaced by the digital symbol XAUUSD, which represents abstract concepts like “safe-haven,” “inflation hedge,” and “geopolitical insurance.” This symbol is integrated into the global economy through a Semiotic-Computational-Telecom (SCT) stack.

The analysis reveals that gold has become a paradoxical actor in the global financial grid. While it is traded and priced entirely within the US dollar’s computational system, its symbolic meaning is rooted in being the ultimate alternative to that system. It functions as a dormant “shadow cell” in the global spreadsheet, a counter-clock ticking against the main dollar-based mechanism.

Gold’s modern power, therefore, comes not from its potential to replace the dollar, but from the ever-present threat, inscribed in every financial formula, that it could be activated if trust in the current system fails. With substantial price increases over the last two years, is it reasonable to conclude that global finance is facing deep-seated systemic issues, or merely an unwarranted lack of faith in the USD-centered global grid?

Citation APA (7th Edition)

Pennings, A. J. (2025, September 20). Situating Gold in the Substitution-Computational-Telecom Stack of Global Finance. apennings.com. https://apennings.com/how-it-came-to-rule-the-world/digital-monetarism/situating-gold-in-the-substitution-computational-telecom-stack-of-global-finance/

Notes

[1] I wrote my MA thesis on the Bretton Woods system and how deregulation and technology created a new system in the 1970s.
[2] Gold is a chemical element with the chemical symbol Au. It is a dense, heavy, yet soft and malleable orange-yellow metal. Historically, this made it an excellent store of wealth and medium of exchange. Gold does not rust or chip. It can be melted down and reformed into alternative shapes that maintain their form over long periods of time. Historically, this made it the preferred medium for bullion to be stored, or for coinage that could circulate widely, facilitating economic exchange. But that has changed over time.

Disclaimer: LLMs were used in researching this topic and the content of this post should not be considered investment advice.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea and a Research Professor for Stony Brook University. He teaches AI/broadband policy and engineering/financial economics. From 2002-2012 he taught comparative political economy, digital economics, macroeconomics, and information systems management at New York University. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.

USD Centrality and Network Effects in the Global Economy

Posted on | September 15, 2025 | No Comments

Discussions about replacing the US dollar as the world’s dominant reserve and transacting currency have diminished recently, as the BRICS rhetoric has weakened. However, the issue remains on the global agenda, as the world often suffers from significant dollar shortages, and fintech continues to innovate. Although the dollar’s share of global reserves has diminished to roughly 60%, some 80–90% of global trade invoices are still priced in USD, even when neither party is American.

This post addresses the USD in the context of spreadsheet capitalism and its representational, formulaic, and networking techniques. It uses its central logic (substitution – symbolic computing – telecom infrastructure) to explain the USD/eurodollar’s centrality and network effects. It shows that the USD and its eurodollar shadow retain centrality because they are not just monetary units but cells at the center of the global spreadsheet, reproduced and reinforced daily by formulas running on Bloomberg, Aladdin, LSEG Workspace, and Wind terminals operating worldwide.

The standard definition of network effects is that the more people who use a medium, the more valuable it becomes. In this case, the more actors (banks, corporations, central banks, exporters/importers, and remitters) use the USD, the stronger the demand for dollar-denominated assets, such as US Treasuries, repos, and eurodollars. The SWIFT (Society for Worldwide Interbank Financial Telecommunication) network is particularly valuable because it is used globally to carry payment messages between banks.
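One stylized way to quantify this effect is Metcalfe's law, which counts the potential pairwise links among participants. It is a rough proxy for illustration, not a valuation of SWIFT or the dollar network:

```python
def metcalfe_value(n):
    """Potential pairwise links among n participants, a stylized
    proxy for network value under Metcalfe's law."""
    return n * (n - 1) // 2

# Doubling the number of participants roughly quadruples the number
# of possible connections, so value grows much faster than headcount.
small, large = metcalfe_value(100), metcalfe_value(200)
```

This super-linear growth is why the marginal participant keeps choosing the dollar grid: each new joiner raises the value of membership for everyone already inside.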

This means that the USD/eurodollar retains dominance not just because of US economic size, but because of its spreadsheet centrality as a reference cell, formulaic operator, and synchronized network node. These combine to create powerful network effects. These effects continuously reproduce the dollar’s role as the world’s reserve currency, invoicing unit, and liquidity solution, creating unique time–space power over globalization and trade.

Currently, the USD/eurodollar is the absolute reference cell ($USD$1) in global spreadsheet finance. Formulas in financial terminals (NPV, IRR, discounting, FX, VaR, repo haircuts, and swap resets) are constantly calculated relative to this cell, forcing alignment to the USD. In the global balance sheet “spreadsheet,” central banks, sovereign wealth funds, and corporates hold reserves in cells denominated in USD and eurodollars.[1]

This process is daily, recursive, and performative. The centrality of the USD is not given; it is produced and reproduced every time a formula is run. The USD/eurodollar isn’t just the dominant currency; it is the spreadsheet cell through which the world makes its financial calculations, continuously recomputed by formulas that lock global finance into its orbit. This makes the USD global finance’s universal semiotic anchor, reinforced daily.

Network Effects of the Spreadsheet Stack

The USD’s global power is not just about US GDP or military strength; it’s also about network effects built into the financial infrastructure, where the dollar sits at the center cell of the global spreadsheet, anchoring liquidity, collateral, and pricing. Strong network effects are one of the primary reasons the USD remains the dominant reserve, funding, and transaction currency. These calculations are global, synchronized by terminals and telecom infrastructure.

In the semiotic–computational–telecom stack of spreadsheet capitalism, three major factors converge to generate USD network effects. These entrench dollar dominance and project financial time-space power to shape global liquidity rhythms and constrain the monetary policy space of other states. These are semiotic substitution, symbolic computing, and gridmatic network structure at a global scale.

Semiotic substitution makes the USD the symbolic unit of global equivalence. In the reigning spreadsheet logic, USD is the universal cell of value. Every financial instrument — from Treasury bonds to repos, FX swaps, and commodities — is denominated, quoted, or can be translated into US dollars. The USD substitutes for local currencies, creating a standard unit of account that translates heterogeneous values into a single unit of equivalence. The more instruments priced in USD, the more necessary it becomes for others to use it. Liquidity consolidates in the dollar. Actors anywhere in the world can instantly benchmark value in USD, binding local markets into a shared temporal rhythm (Fed policy cycles, Treasury auctions) and producing time-space power.

Symbolic computing in spreadsheet capitalism ties global instruments into formulas that recalculate from a USD central cell. Through the mechanism of spreadsheet logic, formulas for discounting, swaps, and collateral haircuts are built around dollar benchmarks (SOFR, Fed Funds, Treasury yields).

The semiosis (meaning-making) effect is that USD benchmarks function as symbolic operators. Change one cell, like the Fed rate, and the recalculations ripple globally. This symbolic centrality means that dollar instruments are self-reinforcing: hedging, risk modeling, and valuation all “speak the language” of dollar pricing, reinforcing network effects. The USD wields time-space power because symbolic recalculations of global liquidity are temporally synchronized to its own monetary calendar.
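A toy example of this ripple effect, with hypothetical cashflows and rates (real terminal engines use full discount curves, not a single rate):

```python
def npv(rate, cashflows):
    """Present value of annual cashflows discounted at a single
    benchmark rate (real engines use full discount curves)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

cashflows = [100.0] * 5  # five annual payments of 100

# Change one "cell" (the benchmark rate) and every valuation
# that references it recomputes.
value_at_4pct = npv(0.04, cashflows)
value_at_5pct = npv(0.05, cashflows)
```

A one-point move in the benchmark cell marks down every stream of USD cashflows that references it, which is the recalculation ripple described above in miniature.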

Gridmatic infrastructure provides a networked visual interface that digitizes and transmits those substitutions and recalculations at scale. It provides the digital plumbing of dollar circulation through SWIFT, CHIPS, CLS, and Fedwire. These form the computational-telecom backbone that routes and clears dollar transactions. Eurodollar markets add offshore balance sheets, while Bloomberg terminals, Aladdin, and LSEG Workspace supply real-time spreadsheet overlays. The infrastructure itself becomes a representational grid where financial messages (such as MT103/202 in SWIFT, repo trade tickets, and swap confirmations) serve as symbolic substitutions, which are processed computationally.

Network effects are produced as each new participant increases the value of the system by extending the reach of USD liquidity, locking in path dependence, and raising switching costs. The telecom infrastructure makes dollar liquidity instantaneous across geographies while anchoring it in US jurisdictional time (New York business hours, Fed settlement deadlines). The gridmatic infrastructure enforces the wiring of payments in USD. Even offshore markets (eurodollars, petrodollars, Hong Kong’s dim sum bonds) need this grid to clear settlements.

Together, these factors combine to generate network effects that entrench dollar dominance and project temporal and spatial power. One factor is the collateral gravity of US Treasuries that concentrates balance-sheet activity around itself. When stress spreads globally, the network effects flip into liquidity gravity. Capital seeks the safety of USD, regardless of where it originates. The Dollar Milkshake Theory is an especially clear illustration of the network effects of the USD within spreadsheet capitalism.

Brent Johnson’s “Milkshake Theory” highlights the USD as a liquidity magnet. Because dollars are the most liquid, they remain the cheapest to hedge and easiest to transact. The Milkshake Theory is a vivid way of explaining why the USD attracts liquidity, and it aligns closely with our framework of spreadsheet capitalism and USD network effects.

Conclusion: Formulaic Reinforcement

The USD and eurodollar retain centrality because they are not just monetary units but cells at the center of the global spreadsheet, reproduced and reinforced daily by formulas running on Bloomberg, Aladdin, LSEG Workspace, and Wind terminals. A “global spreadsheet” grid, with USD as the absolute $A$1 reference cell and formulas showing how FX trades, repos, and swaps reference it, would visualize this argument.

When we say the USD/eurodollar is a cell at the center of the global spreadsheet, we mean every balance sheet in the world, whether in New York, London, Shanghai, or São Paulo, has at least one column denominated in USD. A loan in yen, a bond in euros, or a derivative in pesos ultimately references the USD cell through conversions, swaps, or collateral rules.

The formulas running on Bloomberg, Aladdin, Wind, and similar terminal systems constantly put this cell into action. FX conversions such as =Amount * FXRate(USD/JPY) ensure that any yen trade is mapped back to USD. Discounting and valuation formulas like =NPV(SOFR, Cashflows) price collateral and swaps off USD benchmarks.[2] Risk models use Aladdin’s VaR and stress-test engines to run simulations with USD Treasuries as the safest anchor cell. In eurodollar transactions, Bloomberg’s repo haircuts and collateral flows formulaically reference US Treasuries, reinforcing USD liquidity. Every recalculation pulls other currencies, assets, and risks back into alignment with the USD reference cell.
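The spreadsheet-style FX conversion above can be sketched in code. The exchange rates here are hypothetical placeholders; real systems pull live quotes:

```python
# Hypothetical quotes: units of local currency per one US dollar.
fx_per_usd = {"JPY": 147.0, "EUR": 0.92, "KRW": 1390.0}

def to_usd(amount, currency):
    """Re-express a local-currency amount in the USD reference cell."""
    if currency == "USD":
        return amount
    return amount / fx_per_usd[currency]

# A 1,000,000 yen trade mapped back to dollar terms.
usd_value = to_usd(1_000_000, "JPY")
```

Every position, whatever its denomination, passes through a function like this before it can be aggregated, risk-weighted, or margined, which is how the USD cell becomes the universal denominator.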

Because markets operate like spreadsheets set to “auto-recalculate,” they continually reproduce the dollar’s centrality. Overnight repos reset collateral values daily. SOFR fixes are also computed and published daily, re-anchoring the time value of USD-denominated money. Swap resets adjust every three or six months, recalculating obligations in USD terms. Central bank operations (e.g., Fed swap lines) enter the sheet as “inputs” that cascade across the grid. Each recalculation doesn’t just measure — it re-performs the dollar’s centrality, turning semiotic substitution into lived liquidity.

This process makes the USD/eurodollar cell the universal denominator — the reference column against which all other rows and columns reconcile. It’s the pivot cell. If Excel had one “absolute reference” ($USD$1), that’s what the dollar is. Future USD challenges will face the prospects of “tippy” network effects that could shift centrality quickly if the right conditions occur.[3]

Citation APA (7th Edition)

Pennings, A. J. (2025, September 15). USD Centrality and Network Effects in the Global Economy. apennings.com. https://apennings.com/dystopian-economies/electric-money/usd-centrality-and-network-effects-in-the-global-economy/

Notes

[1] An absolute reference in a spreadsheet, such as ($USD$1), is a formula element that uses dollar signs ($) to lock a cell’s reference so it doesn’t change when a formula is copied to other cells. In ($USD$1), the “USD” is not a standard column reference; a true absolute reference would be $A$1, where the dollar signs lock both the column (A) and the row (1).
[2] Examples like “=Amount * FXRate(USD/JPY)” and “=NPV(SOFR, Cashflows)” are illustrative but may oversimplify complex terminal calculations. They are simplified proxies in this case.
[3] In discussing the weaknesses of USD network effects, Amy Shuen in Web 2.0: A Strategy Guide suggested that network effects can be “tippy” and cause disruption such as the rapid user transition from MySpace to Facebook. Can the USD tip quickly to another currency?

© ALL RIGHTS RESERVED

Not to be considered financial advice.




APEC Presentation: Healing the COP (Common Operating Picture) with AI and APIs for Disaster Management

Posted on | September 13, 2025 | No Comments

Edited remarks from an invited presentation at the Asia-Pacific Economic Cooperation (APEC) meeting on July 31, 2025. Due to a family responsibility in New York at the same time, I invited a former PhD student to work with me on the slides and give the presentation. My deepest gratitude goes out to Dr. Saebom Jin for her collaboration, insights, and excellent presentation.

Good afternoon, distinguished participants at APEC’s SDMOF, the 18th Senior Disaster Management Officials’ Forum. My name is Saebom Jin, from the National AI Research Lab at KAIST. It is my true privilege to be here today at this important dialogue on strengthening disaster leadership in the Asia-Pacific region.

Together with Prof. Anthony Pennings of SUNY Korea, we prepared this presentation, studying how emerging technologies like AI can enhance existing disaster information platforms. In particular, drawing on the Remediation Theory from media studies, we attempted to examine how AI might be integrated into traditional information systems used in disaster management. Our particular concern was the Common Operating Picture (COP) used in disaster management control rooms. Our aim is to better understand the deeper implications of such integration in supporting effective and trustworthy leadership in times of crisis.

APEC Title slide

As you might have heard many times, we live in a VUCA (Volatility, Uncertainty, Complexity, and Ambiguity) world. It is increasingly difficult to predict the future, even the near future, as historical data is incapable of predicting what comes next, especially when it comes to the changing climate and its disaster implications.

I would say “the odds of odd things happening are increasing.” We have witnessed very tragic disasters in Texas: three years ago, a winter storm caused widespread electricity blackouts, and just a few weeks ago, flash floods washed away the Camp Mystic summer camp, taking the lives of more than 27 young girls.

It is not a matter of data scarcity. In fact, we have encountered a proliferation of data from a variety of sources – from satellite imagery to IoT sensors and social media. In this situation, the challenge we face is not the lack of data but the overwhelming complexity of making sense of it.

Not data scarcity

In such environments, leadership must innovate — not only in decision-making but also in navigating these multi-layered environments of crucial information. In this presentation, I would like to suggest that effective leadership must embrace three qualities:

Data-driven decision making using insights from complex systems to guide action.

Anticipatory response by identifying patterns and taking proactive measures (e.g., typhoon monitoring and warnings).

Adaptive systems moving beyond static command structures to dynamic and interface-enabled leadership.

This shift requires more than technology. It demands a rethinking of how we design information platforms to support human judgment.

Here is where Remediation theory offers us insight. According to the famous media theorist Marshall McLuhan, the content of any medium is always another medium. Following up on this observation, Jay David Bolter and Richard Grusin described remediation as the process by which new media refashion and incorporate older media forms. In this remediation theory, new technologies, like spreadsheets, are designed to improve upon the prior technologies to mediate a more authentic sense of reality.[2]

In other words, this isn’t about simply replacing old technological systems with new ones, but adopting and adapting the functions of the existing systems to help leaders better perceive, interpret, and act through media. Drawing on the Latin root of the word, remediari, it is about “healing” the media’s access to an authentic reality.

Here are the key concepts of this theory:

The logic of Immediacy is the idea that technology should closely reflect the real world in order to create a sense of presence and realism, as seen in movies, television news, day-time soap operas, and live video streaming. Relevant examples are real-time video-conferencing and live data feeds. First-person video from affected zones often provides the transparent immediacy of authentic experience. Television reporters, for example, often go out into the water and wind to dramatize weather events.

Hypermediacy indicates multiple representations within a heterogeneous space. It is a layered, often windowed interface with GIS informatics and multichannel communications to combine images, sounds, text, and video. This approach offers an augmented, quantitative view of the world, drawing on the power of numeracy and remediating tools like graphics and maps.

The most radical concept, remediation itself, refers to the transformation of prior media into a new framework, creating a more authentic, interactive, and actionable experience. In this concept, we do not simply replace the legacy systems that leaders and practitioners have long relied upon, but transition from one to another by integrating new media forms into the COP.

The Common Operating Picture

In practice, we have witnessed how media have been transformed or remediated to better convey valid information out of a vast amount of data – from printed material to radio and television, from TV news to social media updates, and so on. In such an environment, crisis communication and disaster leadership involves navigating various types of digital interfaces rather than issuing directives alone.

Among the various mechanisms and tools, we focused on the Common Operating Picture (COP) as a mediation tool that can incorporate novel technologies such as AI and APIs, creating value for disaster leadership in response to climate variance and data variance.

A COP is a shared, real-time view of operational data for decision-makers, teams, and agencies involved in multi-agency disaster risk reduction operations. By providing up-to-date information through dashboards and alerts, the COP supports situational awareness and coordinated action.

Let me present the idea of how COPs can be remediated with AI and APIs. First, APIs act as bridges, integrating and normalizing diverse data streams into a hypermediated, yet unified operational view. APIs incorporate various data streams and make them available for the COP.

Next, AI processes and analyzes this massive and disparate data in real-time dynamic media environments. While APIs serve as the infrastructure for remediation, AI acts as the operating system for the remediated COP. By utilizing NLP, computer vision, and predictive analytics with machine learning algorithms, AI-remediated COP simplifies the data and facilitates interpretation and prediction.

[Figure: AI, OS, APIs, and COPs]

This remediation helps build transparency and enhances leaders’ ability to act confidently under pressure. In summary, AI enables the COP to transition from a static display of information to a dynamic intelligence platform that provides crucial mediated information on demand.

Another notable feature of the remediated COP is the development of Personalized Operational Pictures (POPs). While COPs used to be confined to control rooms, they are now evolving into POPs that provide individualized pictures based on roles within the command room and out in the field. With APIs and AI, these personalized mobile platforms can put role-specific, actionable intelligence into the hands of leaders and practitioners during times of crisis. In complex disaster environments, this means leaders see only what matters, empowering faster, more focused decisions without being overwhelmed by irrelevant data.
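A minimal sketch of this role-based filtering, with hypothetical roles, districts, and alert fields, might look like:

```python
# The same COP alert stream, filtered by role and area so each user's POP
# shows only what is actionable for them. All names here are illustrative.
alerts = [
    {"kind": "evacuation_order", "area": "district-3", "audience": {"field", "command"}},
    {"kind": "supply_request",   "area": "district-1", "audience": {"logistics"}},
    {"kind": "river_warning",    "area": "district-3", "audience": {"command"}},
]

def personalized_view(role: str, area: str) -> list:
    """Return only the alerts relevant to this role in this area."""
    return [a for a in alerts if role in a["audience"] and a["area"] == area]

# A field responder in district-3 sees the evacuation order,
# but not the command-only river warning.
print(personalized_view("field", "district-3"))
```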

This chart illustrates the data flow of the suggested model.

Data flows

Let me now move on to the design principles for achieving effective leadership with this remediated system in practice. Technology alone does not guarantee the success of this system; the key is applying the theory in ways that enhance trust and leadership.

The first principle is about data curation and visualization. In a crisis, more data does not necessarily lead to better decisions. However, this does not mean the complexity of a disaster should be ignored. The key feature of the suggested platforms is, therefore, providing remediated visuals that help leaders grasp the full picture of a disaster with contextualized and curated data. AI-assisted COPs can summarize trends across multiple data streams.

This remediated platform must be designed to build trust through a transparent yet effective explanation of complexity. Rather than simply being exposed to the increasing volume and variety of data, users need to be able to make sense of it through effective platforms.

The next principle concerns communication. Using official websites or applications is recommended, but the hypermediated platforms should also support users with targeted messages based, for example, on their roles and locations.

A technical feature for two-way interaction through the platform and advanced dashboards is also recommended for clear and credible communication within teams and with the public.

Effective COPs balance the two logics of remediation: 1) hypermediacy reveals data-richness and layered complexity and enables tailored views for different roles; 2) transparent immediacy delivers real-time clarity and minimizes cognitive load in urgent situations. Together, they ensure that leaders are not overwhelmed by data, but empowered by insight.

This design philosophy extends beyond control rooms. Public-facing dashboards showing real-time rainfall, river levels, or evacuation orders foster institutional trust and transparency — key components of effective disaster leadership.

As these systems evolve, leaders must also evolve. Effective disaster leadership today means:

– Championing user-centered design
– Ensuring interoperability across media systems
– Training not just on using technology but on interpreting complex information environments

In short, leadership is becoming interface-native.

In this presentation, AI is not just a processor, but a media theorist in action. AI is the operating core of the modern COP, serving as an intelligent media system that delivers clarity for responders, in-depth insights for analysts, and trust for the public.

Guided by Remediation Theory, AI-enabled COPs become more than tools or information repositories — they are strategic tools for decision-making, trust-building, and narrative construction during crises.
AI-assisted COPs empower responders, inform the public, and bring more order to chaotic disaster scenarios. Still, we need to ensure that the balance between immediacy and accuracy, as well as between simplicity and integrity, remains in the AI-assisted COP. That is why remediated COPs are more than technology.

Lastly, let me leave you with this quote by Bolter and Grusin:

“Our culture wants both to multiply its media and to erase all traces of mediation.”

As leaders, we must balance these tensions — embracing complexity without losing clarity, and ensuring that our technologies remain tools for human judgment, not barriers to it.

Thank you for your attention. I look forward to discussing how we can co-create smarter, more trusted disaster information systems that empower leadership across the Asia-Pacific region.

Citation APA (7th Edition)

Pennings, A.J., & Jin, S. (2025, Sep 13) APEC Presentation: Healing the COP (Common Operating Picture) with AI and APIs for Disaster Management. apennings.com https://apennings.com/crisis-communications/apec-presentation-healing-the-cop-common-operating-picture-with-ai-and-apis-for-disaster-management/

Notes

[1] APEC stands for Asia-Pacific Economic Cooperation. It is a regional economic forum that promotes trade, investment, economic growth, and cooperation among its 21 member economies around the Pacific Rim. The group aims to foster prosperity, sustainable economic development, and more resilient economies in the Asia-Pacific region.
[2] Bolter, Jay David, and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: MIT Press, 2000.

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea and holds a joint Research Professor position for Stony Brook University. He teaches AI and broadband policy. From 2002-2012 he taught digital economics and information systems management at New York University. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.

Equilibrium and the Turn from Political Economy to Economics

Posted on | September 7, 2025 | No Comments

This post looks at the historical turn from political economy to “economics,” when the concepts of equilibrium and marginal analysis transformed economics from logical reasoning, historical analysis, and philosophical arguments into a neutral science. This shift moved economics away from political economy, focusing instead on market mechanisms and consumer behaviors that could be represented in graphs, equations, and supply-demand modeling with mathematical connections.[1]

The Enlightenment (1685-1815) in Europe brought a new emphasis on reason, individual rights, natural law, and social progress. It began to replace the worldview dominated by absolute monarchy, mercantilism, rigid social hierarchies, and religious dogma. This shift in thinking had a profound impact on European societies, laying the groundwork for the development of modern democracies and our understanding of political economies.

Thinkers like Adam Smith, often considered the father of modern economics, and other classical political economists, such as David Ricardo, John Stuart Mill, and François Quesnay, developed their theories on political economy based on logical reasoning, historical analysis, observation, and philosophical arguments. They used illustrative examples rather than mathematical precision to argue their positions.

In an age before political arithmetic and the collection of national “state-istics” began to create a numerical representation of the state, political economists sought to develop arguments that would influence merchants, monarchs, and other politicians, not just other scholars. They were “policy” oriented and sought to make society better.

However, economics would take a turn away from “normative economics” (what should be) to what is called “positive economics” (describing what is), due primarily to the work of the “marginalists” in the 1870s. This group included Carl Menger, William Stanley Jevons, and Léon Walras. Alfred Marshall gave visual legitimacy to the marginalists’ ideas by developing the graphical representation of supply and demand that is still taught in economics courses.

While they worked independently, these four economists converged on a revolutionary idea. They argued that the economic value of a good is not determined by the labor required to produce it (the classical political economy view) but by the “marginal utility” it provides to a consumer. What is the usefulness of one additional unit of that good? And how would that influence price? This “marginalist” perspective was born from a new philosophical and corporate environment that prized scientific positivism and individualism over the historical and social concerns of the Enlightenment’s early political economists.

There was a powerful drive to make the social sciences as rigorous and objective as the natural sciences, like physics. The goal was to discover universal, mathematical laws that governed human society, just as Newton had discovered laws that governed the planets. The language of calculus and equilibrium was seen as more scientific than the historical narratives of the political economists.

They were also under pressure to develop narratives that countered the growing interest in critical perspectives. The classical labor theory of value had been used by Karl Marx to argue that capitalism was inherently exploitative. The marginal utility theory provided a powerful counterargument, suggesting that if value originates from a consumer’s subjective desire, rather than from labor, then there is no inherent conflict between capital and labor. Instead, the market is a harmonious system where rational individuals all pursue their own self-interest, leading to a stable and efficient equilibrium.

Carl Menger (Austrian School) argued that value is not an inherent property of a good but exists in the individual who needs it. His focus was on human action and causality, explaining how individuals make choices based on their personal, ranked preferences. The subjective theory of value challenged the labor theory of value proffered by Adam Smith and Karl Marx.

William Stanley Jevons (British School) developed the theory of marginal utility and one price, arguing that the value of a good is determined by the satisfaction gained from consuming one more unit of it, not by the cost of production. Jevons sought to make economics a true science by applying mathematics. He framed economics as a “calculus of pleasure and pain,” arguing that a rational person would consume a good up to the point where the pleasure from the last unit (its marginal utility) equals the pain or cost of acquiring it.

His law of one price, which held that prices would converge to a single market-clearing price, was important in the development of supply and demand charts. Jevons gave one of the first and most explicit formulations of the Law of One Price in his 1871 book, The Theory of Political Economy. It was a foundational component of his effort to build a scientific and mathematical theory of economics.

Jevons’ core argument was that a “perfect market” would naturally lead to a single prevailing price for any identical good. His famous statement on the matter is: “In the same open market, at any one moment, there cannot be two prices for the same kind of article.” His reasoning was based on the flow of information and the actions of rational traders. If two different prices for the same item existed, a trader could instantly profit by buying the good at the lower price and selling it at the higher price. This act of arbitrage, when performed by many traders, would increase demand for the lower-priced good (raising its price) and increase the supply of the higher-priced good (lowering its price) until the two prices converged into one.
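Jevons' convergence argument can be illustrated with a toy simulation (my own illustrative model, not from Jevons): each round of arbitrage moves a fraction of the gap between the two prices until they meet at a single price.

```python
def arbitrage_step(p_low: float, p_high: float, rate: float = 0.25):
    """One round of arbitrage: buying pressure raises the low price,
    selling pressure lowers the high one, each by a fraction of the gap."""
    gap = p_high - p_low
    return p_low + rate * gap, p_high - rate * gap

p_a, p_b = 95.0, 105.0       # two prices for the same good in the same market
for _ in range(20):          # repeated rounds of arbitrage
    p_a, p_b = arbitrage_step(p_a, p_b)

print(round(p_a, 4), round(p_b, 4))  # both converge toward 100
```

Because each round removes a fixed fraction of the gap from both sides, the spread shrinks geometrically, which is the mechanical core of the arbitrage story.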

Léon Walras (Lausanne School) argued markets must be “cleared” of any excess supply and demand to be in the state of equilibrium. The existence of excess supply in one market must be matched by excess demand in another market so that both factors are balanced out. Walras was the most mathematically ambitious of the group. He developed general equilibrium theory, creating a complex system of simultaneous equations to show how supply and demand across all markets in an economy could, in theory, reach a state of equilibrium. His work became the foundation for much of modern microeconomic modeling.

Alfred Marshall’s (Neoclassical Synthesis) landmark 1890 book, Principles of Economics, blended the new marginalist ideas about consumer demand and utility with the classical economists’ focus on the costs of production. He created the famous supply and demand “scissors” diagram, which remains the most recognizable tool in economics. It was Marshall who popularized the term “economics” to replace “political economy,” cementing the discipline’s shift towards a socially neutral science.[2]

All four economists were crucial in shifting economic thought towards marginalism and developing the theory of equilibrium.

Chronology of Major Works on Marginal Analysis

1862 – William Stanley Jevons presents his paper, “A General Mathematical Theory of Political Economy,” which first outlined his theory of marginal utility. It received little attention at the time but predated the major publications of the 1870s.

1871 – Carl Menger publishes Grundsätze der Volkswirtschaftslehre (Principles of Economics). This work established the Austrian School’s subjective theory of value, arguing that the value of a good is determined by the importance an individual places on its least important use. Also that year, William Stanley Jevons publishes The Theory of Political Economy. In this book, he fully developed his marginal utility theory using calculus, arguing that rational individuals consume until the marginal utility of a good equals its price.

1874 – Léon Walras publishes the first part of his Éléments d’économie politique pure (Elements of Pure Economics). Walras independently formulated marginal utility theory but, most importantly, integrated it into a comprehensive mathematical system of general equilibrium, showing how all markets could theoretically clear simultaneously.

1890 – Alfred Marshall publishes his highly influential Principles of Economics. This work did not introduce marginalism but synthesized it with classical economic thought. Marshall combined the marginalist theory of demand (utility) with the classical theory of supply (cost of production) to create the famous supply-and-demand analysis that became the foundation of neoclassical economics.

Walras provided the most complete framework for general equilibrium, while Jevons and Menger offered key insights into individual behavior and the subjective nature of value that underpin equilibrium conditions. As the supply of a good increases, the marginal utility of each additional unit eventually falls, and with it the price consumers will pay. Jevons analyzed how individuals would purchase goods until their marginal utilities were proportional to the exchange ratios (prices).

This would lead to equilibrium in exchanges between consumers and suppliers of goods and services. Walras is known for developing the concept of general equilibrium and envisioned the economy as a system of interconnected markets, where the prices and quantities of all goods and services are simultaneously determined by supply and demand. Walras sought to express this theory mathematically, using equations to represent the equilibrium conditions across all markets. His work laid the groundwork for formal mathematical modeling in economics.

Alfred Marshall then popularized the charting of supply and demand in graphs that are still taught in economics courses. Walras invented the initial equations, and Marshall expanded on them to develop the concepts of equilibrium price and marginal analysis that could be visualized through supply and demand curves. Marshall developed a stunning representation of prices adjusting until the quantity of goods supplied equals the quantity of goods demanded. This visualization revolutionized how economists understood markets and their participants.
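The equilibrium at the crossing of Marshall's "scissors" can be expressed as a small calculation; with linear curves (the coefficients below are illustrative assumptions), the market-clearing price solves Qd = Qs:

```python
# Toy Marshallian equilibrium: linear demand Qd = a - b*P and supply
# Qs = c + d*P cross where Qd = Qs. Coefficients are illustrative only.

def equilibrium(a: float, b: float, c: float, d: float):
    """Solve a - b*P = c + d*P for the market-clearing price and quantity."""
    p_star = (a - c) / (b + d)
    q_star = a - b * p_star
    return p_star, q_star

# Example: demand Qd = 100 - 2P, supply Qs = 10 + 4P
p, q = equilibrium(100, 2, 10, 4)
print(p, q)  # P* = 15.0, Q* = 70.0
```

The same adjustment the diagram shows visually (price rising under excess demand, falling under excess supply) is here collapsed into solving the two equations simultaneously.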

Subsequently, economics became more of a technical field, relying on graphs, equations, and models rather than moral and political philosophy. Jevons and Marshall showed that prices are not just determined by historical social forces but by individual consumer preferences and firm behavior in a market system. This shift would mark the decline of classical political economy (which focused on broad social and political factors) and the rise of modern economics. The new economics centered on mathematical modeling and allowed economists to study market mechanisms in isolation, ignoring broader social forces, including the centrality of labor.

Citation APA (7th Edition)

Pennings, A.J. (2025, Sep 7) Equilibrium and the Turn from Political Economy to Economics. apennings.com https://apennings.com/dystopian-economies/equilibrium-and-the-turn-from-political-economy-to-economics/

Notes

[1] This was an intriguing question for me in graduate school. When I taught macroeconomics, comparative political economy, and media economics at NYU, I had a chance to do additional research but lacked a good publishing opportunity.
[2] Sharma, S. (2020). The Death of Political Economy: A Retrospective Overview of Economic Thought. Economic Research-Ekonomska Istraživanja, 33(1), 1750–1766. https://doi.org/10.1080/1331677X.2020.1761854

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea and a Research Professor for Stony Brook University. He teaches AI and broadband policy. From 2002-2012 he taught digital economics and information systems management at New York University. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.

Peirce and Derrida on the Logic and Power of Spreadsheets and Dashboards

Posted on | September 3, 2025 | No Comments

The growing ubiquity of spreadsheets, dashboards, and AI models across organizational life marks a shift in how meaning, decision-making, and authority are structured. These systems do not simply reflect reality, nor do they signify merely an increase in computational capability. These techniques construct reality symbolically, substituting formulas, predictions, and metrics for presence, memory, and deliberation.

This post continues my analysis of the digital spreadsheet as a meaning-making technology operating globally. It addresses two issues that have hindered my previous research progress – substitution and action. How do “signs” of things and people substitute for the real world? Also, how do spreadsheets lead to actions and decisions?

Consequently, I have reached further into semiotic and post-structural theories to introduce and examine two conjoined terms, “writing-as-substitution” and “computation-as-symbolic-action.” They will be used as frameworks to analyze the epistemological workings of spreadsheets, and even to extrapolate to other spreadsheet-like technologies, such as financial terminals like the Bloomberg “Box.”[1]

The post develops a comparative semiotic framework for understanding how spreadsheet-based and algorithmic systems govern human and institutional behavior through substitution and computation-as-symbolic-action. Drawing from Charles Sanders Peirce’s triadic semiotics and Jacques Derrida’s notion of différance and trace, it argues that media systems such as spreadsheets, dashboards, and AI models are not neutral tools but semiotic infrastructures that produce and enforce meaning through symbolic substitution, deferral, and interpretive feedback, enabling a form of technocapitalist governance grounded in semiotic abstraction.[2]

By situating formulas, metrics, and predictions within the dynamics of Peircean interpretants and Derridean traces, this post offers a critical perspective on the growing power of algorithmic systems to structure visibility, automate judgment, and inscribe control in modern information and communications technologies.

Constructed Knowledge Through Substitutional Systems

Semiotic scholars point out that language and writing remove the visible from sight, replacing it with a conceptual or textual substitute (or trace). For Charles Sanders Peirce, every sign exists within a triadic relationship consisting of a Representamen (the sign itself), the Object (what it refers to), and an Interpretant (the understanding or effect produced in a mind or subsequent sign). The interpretant is not a static meaning but a mediating process. Meaning unfolds through a recursive chain of interpretations rather than being fixed in any single instance.

In the context of spreadsheets, each numerical or textual cell can be understood as a representamen that stands for an external reality, such as a mortgage, a barrel of oil, or a share price. The interpretant occurs when an analyst, trader, or algorithm reads this value, interprets its significance, and transforms it into a new action or representation (e.g., an updated forecast, a buy/sell signal, or a risk model).

Writing recovers the world as Derrida’s trace. Where Peirce’s interpretant emphasizes continuity, Derrida’s trace emphasizes absence. For Derrida, every sign carries within it the trace of other signs that it both recalls and excludes. Meaning is not generated by correspondence but by difference and deferral (différance). The trace marks what is not present. It marks the remainders of prior substitutions and the anticipation of future ones.

Spreadsheets work through entering alphanumerical symbols that make something visible on the spreadsheet that is materially, but not entirely, absent.[3] This inherently means that words and numbers substitute for presence — they represent something absent, creating a system where presence is always deferred through representation.[4] Language becomes a ghosting of presence. For Derrida, the thing represented is hidden, but its representation remains.

In semiotic parlance, the spreadsheet is a gridwork of signs where each cell is an empty container waiting to receive inputs that substitute for real-world quantities, relationships, or categories. As Ferdinand de Saussure argued, language operates not through a natural bond between a word (signifier) and its object (signified), but through arbitrary, differential relationships within a structured system.[5] Spreadsheet cells gain meaning not in isolation, but by their position within a grid (row/column), the labels assigned in headers, and their formulaic relations.

In Derrida’s spreadsheet, each number or letter also bears the logic of the trace. A cell labeled “Revenue” or “SOFR” seems to present a concrete fact. Still, in Derridean terms, it is a deferred presence, a placeholder for a complex, deferred chain of other inscriptions. These could include contracts, transactions, data feeds, and institutional protocols. The number “5.25%” in a Bloomberg cell, for instance, is not an empirical reality but a symbolic residue of an entire infrastructural network of rate submissions, calculations, and policy expectations. Where Peirce’s interpretant focuses on how signs generate meaning through interpretation, Derrida’s trace reveals how meaning is haunted by what cannot be fully represented. The messy, material world that must be abstracted away to make computation possible.

In the spreadsheet, writing becomes substitution. Jacques Derrida would identify this substitution as the “trace.” The number “125” in a cell is not an object itself (e.g., 125 cows or units of revenue) but a symbolic stand-in: a visible mark that references an absent event, transaction, or condition. As Derrida noted in Of Grammatology, writing is “a substitute which inscribes absence.”[6] The spreadsheet thus removes the visible (the real-world referent) from sight and replaces it with a structured trace. The spreadsheet governs through symbols that don’t mean directly, but produce meaning through absence. Words, in this framework, do not point directly to what they mean, but to a network of traces — each word carries the echo of another. This absence is not a deficiency but a condition that allows for interpretation and meaning to emerge. Writing is not secondary to speech; rather, it demonstrates the displacement of the referent.

In spreadsheets, this process of knowledge construction unfolds through several processes.

– List labeling (column headers like “Net Profit,” “Customer Churn,” or “Risk Tier”)

– Formulaic logic (statistical functions like =STDEV for standard deviation, financial models using =NPV, etc.)

– Conditional computation (e.g., =IF, =VLOOKUP, =INDEX/MATCH)

These processes substitute conceptual abstractions for real-world activities such as sales, behaviors, failures, and strategies. In an organizational or personal productivity context, these elements are translated into a symbolic, calculable form. In this system, meaning is never fixed, and what appears “present” is only ever a trace. The written word functions not by presenting, but by replacing presence with representation—leaving behind traces of what is no longer visible.

How Functions and Formulas Operate Semiologically

Formulas and functions are types of writing. A formula such as =SUM(B2:B5) is a syntactic construction written in a symbolic language. It substitutes for material reality, standing in for an action: the act of summing replaces an observable process. It replaces manual calculation with a rule-based trace. The value the formula returns is not immediately visible in the formula itself. Rather, it is calculated in absence, echoing Derrida’s idea that signification is always deferred and built upon an invisible referent.

Spreadsheets act relationally. Much like Saussure’s linguistic signs, formulas derive their value by referring to other cells (e.g., B2:B5), which in turn may themselves contain signs or substitutions. Derrida’s différance plays out materially in spreadsheet logic as each function defers meaning by referencing other parts of the sheet. The result is a trace of invisible logic, embedded in visual structure.

The formula result references absent and mutable data. It is a deferred, procedural statement whose outcome is dependent on other absent or invisible data. Spreadsheet grid logic organizes and constrains sign use. A formula like =IF(D3>100,”High”,”Low”) is not a result, but a script for producing conditional presence. Meaning is always deferred; Derrida coined the term différance for precisely this production of meaning through deferral, which the spreadsheet enacts through dependencies, ranges, and conditionalities.
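This deferral can be sketched as a toy cell graph (illustrative Python, not a real spreadsheet engine): a cell's value is never stored "in" the cell but produced by chasing references to other cells at evaluation time.

```python
# Toy spreadsheet: literal cells hold values; formula cells hold callables
# that defer to other cells, mirroring =SUM(B2:B5) and =IF(D3>100,...).
sheet = {
    "B2": 40, "B3": 35, "B4": 30, "B5": 20,
    "D3": lambda get: get("B2") + get("B3") + get("B4") + get("B5"),  # like =SUM(B2:B5)
    "E3": lambda get: "High" if get("D3") > 100 else "Low",           # like =IF(D3>100,"High","Low")
}

def get(ref: str):
    """Resolve a cell: literals return directly; formulas defer to other cells."""
    cell = sheet[ref]
    return cell(get) if callable(cell) else cell

print(get("E3"))  # the label depends on D3, which in turn depends on B2:B5
```

The value displayed in E3 exists only as the outcome of this substitution chain; change B2 and the "presence" in E3 silently changes with it.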

The digital spreadsheet is a semiotic and epistemological technology, combining writing-as-substitution with computation-as-action within a structured symbolic field. It does not merely reflect the world; it replaces, structures, and governs it. Through formulas and functions, it mediates between symbol and action, data and decision, absence and presence. It constructs reality in grids, calculates from traces, and produces futures through procedural substitutions. In this way, the spreadsheet is not just where information is stored; it is where information becomes action, knowledge becomes structure, and representation becomes rule.

Jacques Derrida reoriented Saussure by emphasizing that writing is not a mirror of speech or presence. Writing is a trace, a mark of absence, and an act of deferred meaning. In spreadsheet logic, a formula like =VLOOKUP(D5,Table,2,FALSE) is a grammatological trace. It invokes a value not immediately present, referencing an array of values elsewhere. The value that appears in the cell is not the thing itself, but the outcome of a substitution chain.

Thus, the spreadsheet calculates from traces, not from essences. As Derrida would argue, every function is a gesture of absence, producing presence through reference to the elsewhere. The spreadsheet's returned values are acts of symbolic dislocation made manifest through computation. The spreadsheet is not a passive record but a machine of deferrals, where value is always calculated in relation, never in isolation, and never fully present in its textual inscription.

Daniel Chandler’s insight exposes the spreadsheet as a reality filter. It reifies the calculable, delegitimizes the unquantified, and normalizes visibility within a structured symbolic grammar. In a spreadsheet, data becomes visible only if it can be quantified, categorized, or calculated. What cannot be structured — emotion, ambiguity, contradiction — is rendered invisible in the abstraction process. A spreadsheet template enforces schema over substance: even subjective data must be shaped to fit predefined cells (e.g., “Employee Satisfaction: [1–10]”). Sign systems do not passively reflect reality. They construct it through symbolic choices and structures.

Lucian Bagiu’s analysis of structuralism vs. deconstruction illuminates how symbolic systems determine what can be made visible.[7] The spreadsheet, in this light, does more than represent data. It functions as an epistemological technology that enforces a regime of visibility. What appears in the cell is not “truth” but truth-as-structured-by-symbols. Conditional formatting, pivot tables, and data validation create layers of symbolic control, filtering what is emphasized, hidden, flagged, or acted upon. Bagiu helps us see that this is not a neutral process. It is an act of symbolic governance. The spreadsheet decides what counts as knowledge, what is ignored, and what is actionable.

Summary

This blog post argues that spreadsheets, dashboards, and AI models are not neutral tools but powerful “semiotic infrastructures” that actively construct and govern reality. It introduces two key concepts to explain this process: “writing-as-substitution” and “computation-as-symbolic-action.”

Drawing on the semiotic theories of Jacques Derrida and Charles Sanders Peirce, the post explains that a spreadsheet cell doesn’t just hold data; it substitutes a simple symbol for a complex, absent reality. This number or text is a “trace” that makes the world calculable by stripping away its messy, unquantifiable details.

The post then describes how formulas and functions act as a form of “computation-as-symbolic-action.” A formula like =SUM or =IF is not just a calculation but a script that automates a decision or judgment based on these symbolic traces. This process creates a chain of meaning where value is determined by its relationship to other cells, always deferring to absent data.

Ultimately, it concludes that these technologies function as systems of governance. By defining what can be measured and calculated, the spreadsheet filters reality, rendering things like emotion or ambiguity invisible. It doesn’t just reflect the world; it replaces, structures, and enforces a specific, calculable version of it.

Citation APA (7th Edition)

Pennings, A.J. (2025, Sep 03) Peirce and Derrida on the Logic and Power of Spreadsheets and Dashboards. apennings.com https://apennings.com/technologies-of-meaning/peirce-and-derrida-on-the-logic-and-power-of-spreadsheets-and-dashboards/

Notes

[1] This was a real baseline post for me although I only finished it recently. While reviewing a paper on spreadsheets in mid-July 2025 that was written with Patrick Rose, I ran across a mention of “substitution.” This started a whole new trajectory of inquiry that resulted in the conceptualization of the SACT stack.
[2] Peirce, C.S. (1955). Philosophical Writings of Peirce. Dover. Derrida, Jacques. Positions. Translated and annotated by Alan Bass. University of Chicago Press, 1982.
[3] Peirce, C.S. (1955). Philosophical Writings of Peirce. Dover.
[4] Chandler, Daniel (1995). The Act of Writing: A Media Theory Approach. Aberystwyth: University of Wales. Also see Chandler, D. (2002). Semiotics: The Basics. London: Routledge. Chandler built one of my favorite websites on semiotics. I use it in my Visual Rhetoric and IT class.
[5] F. de Saussure. 2nd trans.: Roy Harris, trans. (1983) Course in General Linguistics. La Salle, Ill.: Open Court.
[6] Derrida, J. (1976). Of Grammatology. Johns Hopkins University Press. Available at https://bit.ly/3L0yAKu
[7] Bagiu, L. (2009). Writing in Deconstruction vs Speech in Structuralism (Jacques Derrida vs Ferdinand de Saussure). Transilvania, XXXVII (CXIII)(8), 79-87. http://www.revistatransilvania.ro/nou/ro/anul-editorial-2009/doc_download/26-numrul-82009.html


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a professor at the Department of Technology and Society, State University of New York, Korea and a Research Professor for Stony Brook University. He teaches AI and broadband policy, visual rhetoric, and engineering economics. From 2002-2012, he taught digital economics and information systems management at New York University. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.

Analyzing the Economics of Asymmetric Robotic Drone Warfare

Posted on | September 1, 2025 | No Comments

I’ve been intrigued by the asymmetric conflicts in Ukraine and Yemen these last few years. The introduction of low-cost drones and missiles has threatened established powers and the protection of global maritime trade. They are increasingly important topics in my classes, especially Engineering Economics, ICT for Sustainable Development, and Sensing Technologies for Disaster Risk Reduction.

Asymmetric warfare is conflict between two opposing sides with significantly different military capabilities, power, and tactics. It often involves weaker forces employing unconventional methods, such as guerrilla tactics, terrorism, or hit-and-run attacks, to exploit the vulnerabilities of a more powerful adversary, thereby prolonging the conflict and challenging traditional warfare norms.

While this type of conflict typically involves a state actor versus a non-state actor, such as insurgents or resistance militias operating within controlled territory, the development of robotic drone-based warfare suggests scenarios that will likely continue to pit smaller state actors against larger ones (asymmetric) with established legacy military capabilities.

The major categories of robotic drones in warfare are best understood by their operational domains, such as air, land, sea surface, and underwater, and then by their primary mission function. These systems range from small, hand-launched vehicles to massive autonomous ships and submarines, fundamentally reshaping the modern battlefield. They are organized in the following categories:

Uncrewed Aerial Vehicles (UAVs) / Air Drones
Uncrewed Surface Vessels (USVs) / Sea Drones
Uncrewed Underwater Vessels (UUVs) / Submarine Drones
Uncrewed Ground Vehicles (UGVs) / Land Drones
Swarms and AI-Driven Systems

The best way to research the economics of asymmetric warfare using robotic drones is to employ a Cost-Imposition Framework. This approach analyzes the conflict not just as a military engagement, but as a form of economic warfare where the primary goal of the weaker actor is to force the stronger actor to spend a disproportionate amount of resources on defense.

The research should be structured into at least three parts: one analyzing the inferior actor’s costs, a second addressing the superior actor’s costs, and a third modeling the interaction between the two actors.

Implementing a Cost-Imposition Framework

At its core, the economic asymmetry in drone warfare is about imposing unsustainable costs on the adversary. A weaker actor uses extremely low-cost, often commercially available drones to create damage and threats that extremely high-cost, sophisticated defense systems must counter.

The key metric is the cost-exchange ratio. This ratio compares the attacker’s cost to achieve an effect with the defender’s cost to prevent it, measuring the incremental cost to the aggressor of obtaining one additional offensive device that bypasses a defense shield. A relevant example is a $1,000 UAV modified to carry a grenade, forcing a state military to fire a $2 million air defense missile to intercept it. Even a successful interception leaves the defender with a ruinous 2,000-to-1 economic exchange.
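The arithmetic is stark enough to sketch in a few lines of Python, using the illustrative figures above rather than real procurement data:

```python
# Toy cost-exchange ratio calculation; the dollar figures are the
# illustrative values from the text, not real procurement data.
def cost_exchange_ratio(attacker_cost, defender_cost):
    """Defender dollars spent per attacker dollar spent."""
    return defender_cost / attacker_cost

# A $1,000 modified UAV met by a $2 million interceptor:
ratio = cost_exchange_ratio(attacker_cost=1_000, defender_cost=2_000_000)
print(ratio)  # 2000.0 -- the defender pays 2,000x per engagement
```

Any ratio far above 1 means the engagement is economically favorable to the attacker even when the interception succeeds.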

Part 1 – Economics of the Inferior Actor (The Attacker)

The research should quantify the full “attack stack” of coordinated vehicles of the weaker side, which is defined by its overall low cost, adaptability, and strategic impact. Accounting should encompass the full range of unconventional weapons available, including guerrilla warfare, terrorism, cyberattacks, and other methods that exploit a superior opponent’s weaknesses.

This research would expand into human capital, procurement and supply chains, and research and development. Human capital involves quantifying the low cost of training operators, who often only require skills comparable to those of a video gamer, and the minimal organizational overhead of deploying decentralized cells. Procurement and supply chain dynamics encompass analyzing the costs of sourcing commercial-off-the-shelf (COTS) drones from platforms like AliExpress, 3D printing components, and adapting simple munitions. Research the illicit supply chains used to acquire these parts.

Research and development is another crucial area to examine. The “R&D” is often just rapid, low-cost tinkering and experimentation. Failure is relatively inexpensive, allowing for a significantly faster military innovation cycle where new technologies, tactics, and organizational methods are developed, tested, and adopted, progressing from conceptualization to widespread implementation and eventual obsolescence.

Part 2 – Economics of the Superior Actor (The Defender)

The research must then analyze the defender’s cost structure, which is often characterized by high expense and slow adaptation. This examination would encompass procurement and operational costs, research and development, as well as indirect costs.

Procurement and operational costs include the multi-billion dollar price tags for dedicated air defense systems (radars, interceptors like the Patriot or Iron Dome) and the high cost of each interceptor fired. It also includes the personnel and maintenance costs required to operate these systems 24/7.

“R&D” in this case involves the massive, multi-year, multi-billion-dollar state investment in developing specialized counter-UAV technologies like directed-energy weapons (including lasers) and AI-driven detection systems.

Indirect economic costs can involve critical variables. The model must include the massive secondary costs of a successful or even an attempted attack, such as the economic impact of shutting down an airport, damaging critical infrastructure like a bridge or oil refinery, or the cost of deploying security measures to protect civilian areas.

Part 3 – Modeling and Synthesis

The final step is to synthesize the data into a comparative model to answer key economic questions. This synthesis involves data collection and financial modeling. Gather real-world cost data from recent conflicts in Ukraine, Syria, and Yemen for specific drone technologies and the interceptors used against them. Build a model that allows comparison of the cost-to-attack versus the cost-to-defend under various scenarios.

Summary and Recommendations

This post describes how asymmetric warfare, a conflict between actors with vastly different capabilities, is being reshaped by robotic drones. It proposes a Cost-Imposition Framework to analyze this conflict as a form of economic warfare. The core idea is that a weaker actor can use cheap, adaptable, and commercially sourced drones (across air, land, and sea) to force a militarily superior actor to spend disproportionately large sums on high-tech defenses.

The key metric for this analysis is the “cost-exchange ratio,” which compares the low cost of an attack (e.g., a $1,000 drone) to the high cost of defense (e.g., a $2 million interceptor missile). The research involves a three-part process: analyzing the low-cost structure of the attacker, the high-cost structure of the defender, and finally, synthesizing the data in a comparative model.

Suggested Model for the Final Step

An effective model to fulfill the final step of synthesizing the data and answering the key research questions is an Agent-Based Model (ABM). An ABM is a computational simulation that models the actions and interactions of autonomous agents (both individuals and collective entities) to see what effects they have on the system as a whole. It is suited for this problem because it can directly simulate the economic and tactical interplay between numerous, distinct assets.
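A minimal ABM-style sketch of this interplay might look like the following Python, where every cost, probability, and fleet size is a hypothetical parameter to be replaced with real conflict data:

```python
import random

# Minimal agent-based sketch of drone attrition economics.
# All costs, probabilities, and counts are hypothetical parameters,
# not estimates drawn from any real conflict.
def simulate(n_drones=100, drone_cost=1_000,
             interceptor_cost=2_000_000, p_intercept=0.9, seed=42):
    rng = random.Random(seed)
    attacker_spend = n_drones * drone_cost
    defender_spend = 0
    leakers = 0  # drones that get through the defense
    for _ in range(n_drones):
        defender_spend += interceptor_cost   # one interceptor fired per drone
        if rng.random() > p_intercept:
            leakers += 1
    return attacker_spend, defender_spend, leakers

attacker, defender, leakers = simulate()
print(defender / attacker)  # cost-exchange ratio for the whole raid
```

A fuller ABM would give each drone and interceptor battery its own state and decision rules, but even this toy version shows how the defender’s spending scales with the attacker’s numbers rather than with the attacker’s budget.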

Citation APA (7th Edition)

Pennings, A.J. (2025, Sep 1) Analyzing the Economics of Asymmetric Robotic Drone Warfare. apennings.com https://apennings.com/digital-geography/analyzing-the-economics-of-asymetric-robotic-drone-warfare/

Note: Chat GPT and Gemini were used for parts of this post.

© ALL RIGHTS RESERVED

Not to be considered financial advice.



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea and a Research Professor for Stony Brook University. He teaches AI and engineering economics. From 2002-2012 he taught comparative political economy, digital economics, and information systems management at New York University. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.

LSEG Workspace as the Central Spreadsheet Grid for Eurodollars

Posted on | August 23, 2025 | No Comments

In spreadsheet capitalism terms, LSEG’s Workspace is a major platform that continues to turn fragmented eurodollar dealing into a coordinated, cell-based, time-stamped matrix of quotes and trades.

In grid infrastructure terms, Workspace is fast becoming the organizing interface where eurodollar rates, FX swap points, and Treasury collateral pricing are aggregated and computed. Surprisingly for people who do not understand how the US dollar operates worldwide, it functions as a key node in the daily digital grid for offshore USD liquidity.[1]

Workspace is spreadsheet logic made infrastructure, and this post outlines the major functions and formulas that make it the newest gridmatic trading infosystem for eurodollar markets. It is part of my “formulating power” project, which looks at how formulas have not only made spreadsheets like Excel important, but also how their grid-like structure and symbolic computing have expanded into a wide variety of information-based services.

Reuters Dealing (1980s–1990s) was the first electronic dealer-to-dealer chat and price system, effectively the “proto-grid” for eurodollar spot/forward trading. Thomson Reuters’ Eikon (2000s–2010s) evolved into a spreadsheet-like terminal with live market data, analytics, and FX integration. LSEG Workspace (2020s–) emerged as the rebranded, cloud-based successor after Refinitiv’s integration into the London Stock Exchange Group (LSEG). Workspace is not entirely new, but the next-generation layer in the same “gridmatic” tradition.

But it is not exclusive; it coexists with the Bloomberg Terminal and BlackRock’s Aladdin. The Bloomberg “Box” is still dominant on many trading desks, and Aladdin structures the portfolio-level balance sheet exposures. Workspace is strongest where FX and money markets plus data grids converge. Thus, Workspace is becoming the eurodollar spreadsheet of record for traders and banks.

The eurodollar market relies on offshore USD deposits, lending, and collateralized borrowing, and its infrastructure requires:

Real-time USD funding rates (OIS, SOFR-linked, FX swap basis).

Collateral pricing (USTs, agencies).

Cross-border FX settlement data (via SWIFT/CLS).

Workspace pulls all of these streams together:

FX Dealer Chat and Matching is the modern version of Reuters Dealing and still where traders text and eurodollar liquidity is posted/negotiated.

Data Grids: yield curves, swap points, repo rates, and collateral haircuts displayed in table form, live-linked to Excel or APIs.

Analytics: symbolic computing overlays for stress-tests, correlations, and risk across currencies, but USD is always the pivot cell.

The Gridmatic Infrastructure of Workspace

Workspace is a giant spreadsheet-as-interface consisting of data tables, formula-driven analytics, and live-linked cells. Application programming interfaces (APIs) connect it to Excel, MATLAB, and Python, turning Workspace into a grid feeder for downstream symbolic computing in popular spreadsheets and scripting environments. Its collaboration grid replaces the “chat window + quote” of Reuters Dealing (a real-time messaging tool for financial professionals), with traders and analysts viewing the same USD-centered tables.

So, Workspace turns fragmented eurodollar dealing into a coordinated, networked interface, which is a cell-based and time-stamped matrix of quotes and trades. Workspace is becoming the organizing gridspace where eurodollar rates, FX swap points, and Treasury collateral pricing are aggregated and computed in a networked spreadsheet for offshore USD liquidity.

LSEG Workspace and Its Implications for USD Network Power

By owning Workspace and the Refinitiv FX platforms, plus popular benchmark indices (FTSE Russell), LSEG positions itself as the non-US hub for eurodollar telematics. Yet because everything inside Workspace still references US Treasuries and Fed policy rates, the USD remains the anchoring cell of the eurodollar spreadsheet, even on a “foreign” platform. Workspace, Bloomberg, and Aladdin overlap and compete in anchoring the USD/eurodollar system, but with Treasuries still in the central reference cell.

LSEG Workspace is spreadsheet logic made infrastructure. Its major functions and formulas make it the newest gridmatic spreadsheet system for organizing eurodollar markets, working as data feeds and grid cells: Workspace ingests real-time price streams into cells that look and behave like spreadsheet ranges. The formula names below are stylized illustrations of this logic rather than literal Workspace syntax.

=PRICE(UST_10Y, NOW()) – live U.S. Treasury yield.

=FXSPOT(EURUSD) – spot euro-dollar exchange rate.

=OIS(USD, 3M) – overnight indexed swap rate.

=BASIS_SWAP(EURUSD, TENOR=3M) – FX swap basis points.

These are the “input cells” that anchor all calculations.

Curve Construction & Interpolation

To model funding markets, Workspace builds continuous yield curves from discrete data.

=YIELD_CURVE(INSTRUMENT=UST, METHOD=SPLINE) – fits Treasury yields across maturities.

=BOOTSTRAP(SOFR_SWAP, TENORS) – constructs discount factors from SOFR swap rates.

=INTERP(RATE, MATURITY=7D) – interpolates between tenor points (e.g., repo rate curve).

These are crucial for eurodollar term lending, where pricing depends on forward curves.
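The tenor interpolation behind these functions can be sketched in a few lines of Python; the repo-curve knot points below are hypothetical, and a production system would use spline fitting rather than linear segments:

```python
# Linear interpolation across tenor points -- a toy stand-in for the
# spline fits and SOFR bootstraps described above. The sample tenors
# (in days) and rates are hypothetical.
def interp_rate(curve, maturity_days):
    """Linearly interpolate a rate from (days, rate) knot points."""
    pts = sorted(curve)
    if maturity_days <= pts[0][0]:
        return pts[0][1]
    if maturity_days >= pts[-1][0]:
        return pts[-1][1]
    for (d0, r0), (d1, r1) in zip(pts, pts[1:]):
        if d0 <= maturity_days <= d1:
            w = (maturity_days - d0) / (d1 - d0)
            return r0 + w * (r1 - r0)

repo_curve = [(1, 0.0530), (7, 0.0532), (30, 0.0535), (90, 0.0541)]
print(interp_rate(repo_curve, 14))  # a rate between the 7D and 30D knots
```

The same mechanism, run over discount factors instead of rates, is the core of the bootstrapping step.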

Collateral & Repo Analytics

Eurodollar mechanics are collateral-driven, so Workspace handles haircut and repo pricing with the following:

=REPO_RATE(COLLATERAL=UST_2Y, TERM=ON, COUNTERPARTY=PD) – overnight repo rate for a 2Y Treasury.

=HAIRCUT(ASSET=UST_30Y, CCP=LCH) – margin requirement for long bond collateral.

=RRP_RATE(TERM=ON) – Fed’s overnight reverse repo program rate, the floor of dollar funding.

These cells define the “funding grid.”
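The haircut arithmetic behind these funding-grid cells is simple enough to sketch; the collateral value, haircut, and repo rate below are hypothetical round numbers:

```python
# Haircut arithmetic behind the "funding grid": how much cash a
# dealer can raise by repoing out Treasury collateral. All values
# are hypothetical.
def repo_proceeds(market_value, haircut):
    """Cash raised against collateral after the haircut."""
    return market_value * (1 - haircut)

def repo_interest(proceeds, repo_rate, days, day_count=360):
    """Money-market interest on an overnight or term repo."""
    return proceeds * repo_rate * days / day_count

cash = repo_proceeds(market_value=10_000_000, haircut=0.02)
print(cash)                                   # cash borrowed overnight
print(repo_interest(cash, repo_rate=0.053, days=1))
```

A wider haircut cell directly shrinks the cash a balance sheet can raise, which is why haircuts act as a quiet policy lever in dollar funding.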

FX & Cross-Currency Funding

Workspace integrates FX swaps, which are the offshore synthetic dollar creation mechanism.

=FXSWAP(EURUSD, TENOR=1W) – implied cost of borrowing USD via FX.

=COVERED_IRP(USD, EUR, TENOR=3M) – checks covered interest parity (CIP).

=BASIS(EURUSD, 3M) – eurodollar “basis spread,” measuring stress in offshore dollar lending.

This function set is where the eurodollar meets FX.
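The covered-interest-parity check behind =COVERED_IRP and =BASIS can be sketched as follows; the spot, forward, and rate inputs are hypothetical round numbers, and real-world conventions (day counts, swap points) are more involved:

```python
# Covered interest parity check -- the logic behind the stylized
# =COVERED_IRP and =BASIS functions. Spot, forward, and rates are
# hypothetical round numbers.
def implied_usd_rate(spot, forward, eur_rate, year_frac):
    """USD rate implied by borrowing EUR and swapping into dollars."""
    return ((forward / spot) * (1 + eur_rate * year_frac) - 1) / year_frac

def basis_spread(spot, forward, eur_rate, usd_rate, year_frac):
    """Implied-minus-direct USD rate; nonzero means a CIP deviation."""
    return implied_usd_rate(spot, forward, eur_rate, year_frac) - usd_rate

# 3M tenor: EURUSD spot 1.1000, forward 1.1050, EUR at 3.5%, USD at 5.3%
b = basis_spread(spot=1.1000, forward=1.1050, eur_rate=0.035,
                 usd_rate=0.053, year_frac=0.25)
print(round(b * 10_000, 1))  # basis expressed in basis points
```

When this spread widens, synthetic dollars via FX swaps are more expensive than direct dollar borrowing, the classic signature of offshore dollar stress.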

For banks and dealers, Workspace provides grid-like portfolio functions for balance sheets and capital allocation.

=VaR(PORTFOLIO, CONF=99%) – Value-at-Risk of USD positions.

=LCR(LIQUID_ASSETS, NET_OUTFLOWS) – liquidity coverage ratio.

=BALANCE_SHEET(CURRENCY=USD, ASSETS-COLLATERAL, LIABILITIES-FUNDING) – structured balance sheet grid.

This mirrors how eurodollar institutions literally tabulate their offshore books in grid form.
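Toy versions of these risk cells can be sketched in Python; the P&L series and cash-flow figures are invented for illustration, and production VaR engines are far more elaborate:

```python
# Historical VaR and an LCR-style ratio -- toy versions of the
# stylized =VaR and =LCR grid functions. The P&L series and
# cash-flow figures are invented for illustration.
def historical_var(pnl, confidence=0.99):
    """Loss threshold exceeded on roughly (1 - confidence) of past days."""
    losses = sorted(-x for x in pnl)          # losses as positive numbers
    idx = int(confidence * len(losses)) - 1
    return losses[max(idx, 0)]

def lcr(liquid_assets, net_outflows_30d):
    """Basel III liquidity coverage ratio (regulatory floor is 1.0)."""
    return liquid_assets / net_outflows_30d

pnl = [-1.2, 0.4, 2.1, -3.5, 0.9, -0.6, 1.4, -2.2, 0.3, -0.1]
print(historical_var(pnl, confidence=0.9))
print(lcr(liquid_assets=120, net_outflows_30d=100))  # 1.2
```

The point is that both risk and regulatory compliance are computed the same way as prices: as formulas over cells.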

Policy & News Integration

Financial news is also tied right into the spreadsheet logic.

=FOMC_TARGET_RATE() – current Fed funds target.

=ECON_RELEASE(CPI_US, DATE) – inflation print automatically populates.

=NEWS_SENTIMENT(FED, WINDOW=1D) – text mining of Fed statements, outputting a score.

This allows policy shocks to instantly cascade through formulas across FX, repo, and swaps.

Optimization & Allocation

Workspace incorporates optimizers, which are like Solver in Excel:

=ALLOCATE(FUNDING, CONSTRAINTS={DURATION<2Y, COST<…}) – allocation solver subject to duration and cost constraints.

Eurodollar-Specific Grid Structure

If you were to visualize Workspace as a spreadsheet, its eurodollar tabs would include:

Tab 1 – Market Inputs: Treasuries, SOFR, repo, FX spot.
Tab 2 – Funding Curves: OIS, cross-currency basis, term structure.
Tab 3 – Collateral: Haircuts, repo spreads, RRP anchors.
Tab 4 – Balance Sheet: Offshore USD assets/liabilities in grid form.
Tab 5 – Optimizer: Allocation solver for cheapest-to-deliver funding.
Tab 6 – News/Policy: FOMC targets, Fedwire flows, central bank interventions.
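The optimizer tab’s logic can be approximated with a greedy cheapest-first fill, a toy stand-in for Solver-style optimization; the source names, rates, and capacities are hypothetical:

```python
# Greedy "cheapest-to-deliver" funding allocation -- a toy stand-in
# for the stylized =ALLOCATE optimizer. Source names, capacities,
# and rates are hypothetical.
def allocate(need, sources):
    """Fill a funding need from the cheapest sources first."""
    plan = []
    for name, rate, capacity in sorted(sources, key=lambda src: src[1]):
        take = min(need, capacity)
        if take > 0:
            plan.append((name, take, rate))
            need -= take
        if need == 0:
            break
    return plan

sources = [("repo_UST", 0.0530, 400), ("fx_swap_EUR", 0.0534, 300),
           ("unsecured_CD", 0.0545, 500)]
for name, amount, rate in allocate(700, sources):
    print(name, amount, rate)
```

A real desk optimizer would be a constrained linear program, but the cheapest-first intuition is the same.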

Conclusion

LSEG’s Workspace is the newest gridmatic infrastructure for the eurodollar system. It transforms fragmented offshore USD trading into a cell-based, time-stamped spreadsheet interface, where Treasury yields, FX swaps, and repo collateral are continuously aggregated and recalculated.

LSEG Workspace works like a giant live spreadsheet that organizes the eurodollar system into cells and formulas. Treasuries and Fed rates are the anchor cells. Repo, FX swaps, and offshore deposits are formula-driven references. Optimizers and risk tools act like Solver, wiring symbolic computing into the market grid.

The “gridmatic infrastructure” of Workspace is what allows offshore USD lending to stay synchronized with Fed policy and Treasury collateral — keeping the U.S. dollar anchored as the central cell of global finance.

Not intended to be investment advice of any kind.

Notes

[1] I started my research on eurodollars for my MA thesis in 1986, which investigated how deregulation in finance and telecommunications was allowing new financial services to emerge, such as SWIFT, CHIPS, and Reuters Monitor and Dealing. I continued to write about electronic money using the term digital monetarism, but drawing on my PhD, what we call digital monetarism is better understood as spreadsheet capitalism: a system where gridmatic infrastructures (Bloomberg, Aladdin, Workspace) continuously synchronize global liquidity, collateral, and pricing to keep the U.S. dollar at the core of world money.
[2] Chat GPT was an important research aide in this inquiry. In accordance with the determinations set by the US Copyright Office, I claim the results of prompts as property. I guide the inquiry and carefully edit any LLM returns. Prompt 1: Does the USD operate globally with network effects?
Prompt 2: The US dollar’s global power is not just about US GDP or military strength. It’s about network effects built into financial infrastructure, where the dollar sits at the center cell of the global spreadsheet, anchoring liquidity, collateral, and pricing. Continuing the analysis of spreadsheet capitalism, elaborate on how technological systems like BlackRock’s Aladdin, Bloomberg’s “Box,” and Thomson Reuters’ Eikon work with treasury markets, foreign exchange trading, financial news, and central banks to keep the USD anchored as the world’s primary reserve and payment system.
Prompt 3: Is LSEG Workspace the new central gridmatic organizing spreadsheet system for the eurodollar market?

Citation APA (7th Edition)

Pennings, A.J. (2025, Aug 23) LSEG Workspace as the Central Spreadsheet Grid for Eurodollars. apennings.com https://apennings.com/technologies-of-meaning/lseg-workspace-as-the-new-central-spreadsheet-for-eurodollars/

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea and a Research Professor for Stony Brook University. He teaches AI and broadband policy. From 2002-2012 he taught digital economics and information systems management at New York University. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.

Excel vs. Java: Two Languages, Two Worlds of Thought: Symbolic Grid vs. Algorithmic Script

Posted on | August 12, 2025 | No Comments

Spreadsheets are often dismissed as “just” tables with lots of numbers. Still, their symbolic computing functions operate in a very different register than traditional programming, and that difference is why they’ve been so powerful in business, finance, and administration.

In this post, I examine the differences between Excel spreadsheets and traditional programming, primarily Java, but with some references to Python. Other computer languages, like C and Fortran, could be included, but that would make the analysis too complex. My major objective is to explore the unique symbolic computing capabilities of the spreadsheet.

This inquiry comes from my long history of inquiry into both areas of computing, mostly spreadsheets, but also from my more recent experience in a program that stresses that students learn Java and Python. It ends with some speculation about AI and its influence on both.[1]

Excel vs. Java/Python

When we talk about programming, it’s easy to imagine a single tradition — algorithms, variables, loops — the kind of logic you find in computer science textbooks. But spreadsheets operate in a parallel tradition. They use symbolic computing in a grid environment. And when you set Excel side-by-side with Java and Python, the contrasts reveal not just different tools, but two very different ways of thinking about computation, knowledge, and work.

Excel and Java represent two cultural histories of computing: the ledger-turned-simulator, serving the micro-decisions of commerce and finance, and the algorithm-turned-platform, serving the macro-architectures of global software development. Both shape our world, but through different epistemologies, and in service to different forms of power.

Excel is declarative and reactive. You don’t tell Excel how to compute something step-by-step; you state the relationship between values, such as =SUM(A1:A10), and let the spreadsheet engine figure out the rest. Users declare relationships between cells; the engine maintains them automatically. The primary metaphor for the spreadsheet is the ledger. Spatial placement and references produce meaning. The “program” is a living grid where changes ripple through automatically with updated information.

Java’s procedural lineage grew from Cold War software engineering. It is procedural and object-oriented. You write explicit instructions: declare variables, loop over arrays, branch with conditions. The “program” is a designed machine — it does nothing until you start it, and it runs in the sequence you prescribe. Likewise, Python is a multi-paradigm programming language that is imperative, procedural, and mostly object-oriented (with functional elements). Users specify step-by-step instructions in human-readable syntax. The primary metaphor is the script, a narrative sequence of commands to be executed.

Python was developed in the early 1990s in the tradition of general-purpose scripting languages like Unix shells and C. It was designed for readability, minimal syntax noise, and broad applicability — from system scripting to AI/ML. Python code runs from top to bottom, unless redirected by flow control. Scripts can be interactive (REPL/Jupyter) or batch-run, but they start from a blank state each run unless persisted.

Excel’s grid is both interface and execution environment. A cell like “B2” is a symbol in a two-dimensional space, not a memory address. Ranges (A1:C3) are symbolic collections, defined by their position in the grid. The layout is the program. In Java, memory is abstract and invisible. You construct arrays, lists, and objects, but they live behind the scenes. The UI, if you build one, is separate from the computation. The layout of your code is not the same as the data it processes.

In Excel, execution is continuous and event-driven. Change one value, and the dependency graph kicks in: only cells that rely on that value are recalculated. There’s no “Run” button for formulas — the sheet is always “on.” In Java, execution is discrete. You compile and run the program, it executes a defined sequence, and it stops. Event-driven behavior (like GUIs) is possible, but it must be explicitly coded with listeners and handlers.
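The difference is easy to see in a toy reactive grid written in Python. This is a sketch of the dependency-graph idea, not how Excel is actually implemented:

```python
# Minimal reactive grid: change one input and dependent "cells"
# recompute automatically, mimicking Excel's always-on dependency
# graph. A toy sketch, not Excel's real engine.
class Sheet:
    def __init__(self):
        self.values, self.formulas = {}, {}

    def set(self, cell, value):
        self.values[cell] = value
        self._recalc()

    def formula(self, cell, fn, *deps):
        self.formulas[cell] = (fn, deps)
        self._recalc()

    def _recalc(self):
        # Naive full pass in definition order; a real engine walks
        # only the dirty subgraph of dependents.
        for cell, (fn, deps) in self.formulas.items():
            self.values[cell] = fn(*(self.values[d] for d in deps))

s = Sheet()
s.set("A1", 10)
s.set("A2", 32)
s.formula("B1", lambda a, b: a + b, "A1", "A2")   # =A1+A2
print(s.values["B1"])  # 42
s.set("A1", 20)        # the edit ripples through automatically
print(s.values["B1"])  # 52
```

In Java, the equivalent behavior would require explicitly coded listeners; here the "sheet" is always on, recomputing whenever an input changes.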

In Excel a cell can hold a number, text, a boolean, or an error without predeclaring its type. This flexibility makes rapid modeling possible, but it also hides data-type errors until they matter. In Java, every variable must have a type before it can be used, and the compiler enforces it. This prevents certain categories of bugs but slows the fluid improvisation that spreadsheets enable.

The spreadsheet user thinks spatially — “This cell depends on that one; those three cells add up to this total.” Programming in Excel feels like maintaining a ledger or building a map — computation embedded in geography.

The Java programmer thinks algorithmically — “First I do this, then I do that.” Programming in Java feels like designing a machine — computation as a process unfolding in time, not space.

Economic and Cultural Lineages

Excel descends from accounting, tabulation, and ledgers. It carries the epistemology of the bookkeeper — numbers aligned in rows and columns, meaning attached to placement, relationships visible at a glance. Java descends from structured programming and software engineering. It carries the epistemology of the engineer — modularity, encapsulation, and abstraction layers designed to scale from small programs to global systems.

Why does it matter? Spreadsheets turned computation into a cultural form accessible to non-programmers, embedding symbolic computing into the everyday life of business, government, and finance. Java, and languages like it, built the backbone of global software infrastructure. Both changed the world — but they changed different worlds.

Excel’s “gridmatic” logic empowers rapid prototyping and live simulation, perfect for domains like finance where decisions are model-driven and iterative. Java’s “narrative” logic excels in building robust, reusable systems where execution order and data integrity are paramount.

The Role of AI

Excel’s symbolic, reactive grid and Python/Java’s procedural, text-based scripts represent two mature modes of computation. AI — especially large language models (LLMs) and generative systems — introduces a third mode: probabilistic, pattern-driven inference.

Instead of Excel’s symbolic mapping (“Cell B2 = sum of A1:A10”) and traditional programming’s procedural instruction (“Loop through this array and sum the numbers”), we now get pattern completion (“Given these values and my training data, here’s the sum, the trend, and a prediction — without you explicitly defining the steps”). AI doesn’t just run code — it generates it, interprets messy inputs, and adapts behavior without explicit reprogramming.

Excel is already being transformed into a natural-language modeling surface with Copilot for Excel that lets you type “Show me sales trends for Q3” and get formulas, pivot tables, or charts without knowing the syntax. AI can generate whole spreadsheets from plain text or PDFs, eliminating manual cell-by-cell building. Predictive modeling (e.g., forecasting, anomaly detection) is moving inside Excel’s grid without the user writing formulas.

AI is turning Excel into “spreadsheet as conversation.” The grid remains, but much of the symbolic formula-writing is offloaded to an AI intermediary. Excel’s tabular form will survive because the grid is a powerful cognitive interface. But its internal programming layer will be increasingly AI-written rather than human-written.

For procedural languages, AI is already functioning as a code generator, with LLMs writing Python and Java from natural language prompts. It also works in refactoring and as a debugging assistant, turning vague goals into structured, tested, deployable code. AI can also be seen as an integration orchestrator, pulling APIs, databases, and machine learning models together without the developer having to remember all the syntax and library calls.

Python in particular is symbiotic with AI — it’s the dominant language for training and deploying models, and AI frameworks (PyTorch, TensorFlow) are Python-native. Java will continue in large-scale systems and enterprise software, with AI increasingly writing parts of it (e.g., data-access layers, boilerplate). Procedural coding won’t vanish — but the human role will shift from “writing code” to “specifying requirements, constraints, and reviewing AI output.”

Excel democratized computing for business users without replacing procedural programming. It just absorbed a huge chunk of modeling work. AI will do the same. It will democratize and accelerate both symbolic and procedural programming, not erase them.

AI will continue to replace the rote coding of standard programming patterns, but humans will still be needed to define problems precisely, decide trade-offs between performance, accuracy, and interpretability, and ensure compliance, security, and business alignment.

Conclusion

Spreadsheets like Excel embody a distinct symbolic computing tradition, separate from conventional programming languages such as Java. Excel is declarative, reactive, spatial, and loosely typed, with computation embedded directly in its two-dimensional grid. Java is procedural, algorithmic, temporal, and strongly typed, with computation separated from its interface.

These differences reflect their cultural and economic lineages. Excel emerges from accounting and tabulation, enabling rapid, accessible modeling for business and finance, while Java descends from structured programming, supporting large-scale, robust software systems. Both have profoundly shaped modern life, but in different domains and through different epistemologies.

Understanding these perspectives side-by-side isn’t just academic. It’s a reminder that “programming” isn’t one thing — it’s a family of practices, each with its own history, cultural lineage, and vision of what computers are for. Digital spreadsheets will continue to be used because they offer a human-readable canvas for understanding and auditing results.

Excel and Python/Java will increasingly become AI-mediated layers in a bigger computational infosystem. Excel will be the visual decision surface while Python/Java are the execution engines and integration layers. AI will be the translator and generator between human intent and machine implementation.

Notes

[1] I wrote my MA thesis on eurodollars and telecommunications deregulation but didn’t appreciate the importance of the spreadsheet until my PhD student days, and my personal experience running a small Center for Modern Media at the University of Hawaii in the summers during graduate school. My overriding purpose is to develop a more sophisticated understanding of the digital spreadsheet and its impact on finance and other social areas.
[2] Many thanks to my Stony Brook colleague at SUNY Korea, Antonino N. Mione, who teaches the Java course, for reviewing this article. Any remaining mistakes are strictly mine. He made an interesting point that I have not yet integrated into the text: “You can write a spreadsheet application in Python or Java, but it would be very hard to develop a Python interpreter or Java compiler simply using an Excel application.”

Prompt: I used a very general ChatGPT prompt to get this conversation started: “Describe the symbolic computing functions in spreadsheets and how they differ from traditional computer science concepts.” I then followed up with requests to include Java and Python.

Citation APA (7th Edition)

Pennings, A. J. (2025, August 12). Excel vs. Java: Two languages, two worlds of thought: Symbolic grid vs. the algorithmic script. apennings.com. https://apennings.com/meaning-makers/digital-spreadsheets-power-over-people-and-resources/


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a professor at the Department of Technology and Society, State University of New York, Korea, and a Research Professor for Stony Brook University. He teaches AI and broadband policy. From 2002 to 2012 he taught digital economics and information systems management at New York University. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in Korea.
