Moving Economic and Financial Curves
Posted on March 9, 2025
I’ve previously written about how the historical development of “one price” and equilibrium turned political economy into economics through the development of market graphs. In the visualizations that empowered this new economics, supply and demand curves intersect at a “market clearing” price at which suppliers and buyers of a good or service are both willing to transact. This post lays out market logic and discusses what happens when prices change and when other factors influence supply and demand. The second part looks more closely at how supply and demand are both important and different in financial markets.
In financial markets, for example, the law of one price essentially states that identical or equivalent financial assets should trade at the same price, regardless of where they are traded. This is driven by the idea that if a discrepancy exists, arbitrageurs will exploit it, buying the asset where it’s cheaper and selling it where it’s more expensive, thus driving the price toward equality. If a company’s stock is traded on multiple exchanges, the law of one price suggests that its price should be the same across those exchanges (accounting for currency exchange rates, if applicable).
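To make the arbitrage mechanism concrete, here is a minimal sketch in Python. The prices, the fee rate, and the function itself are illustrative assumptions, not real market data or any actual trading system:

```python
# Minimal sketch of a law-of-one-price check across two trading venues.
# Prices and the fee rate are illustrative, not real market data.

def arbitrage_profit(price_a: float, price_b: float, fee_rate: float = 0.001) -> float:
    """Per-share profit from buying on the cheaper venue and selling on
    the dearer one, after proportional fees; 0.0 if no opportunity."""
    buy, sell = min(price_a, price_b), max(price_a, price_b)
    cost = buy * (1 + fee_rate)        # purchase price plus buy-side fee
    proceeds = sell * (1 - fee_rate)   # sale price minus sell-side fee
    return max(0.0, proceeds - cost)

# A 50-cent gap survives the fees, so arbitrageurs buy at 100.00 and
# sell at 100.50, and their trading pushes the two prices back together.
print(arbitrage_profit(100.00, 100.50))  # ≈ 0.30
```

When enough traders run this kind of check, any profitable gap is bought and sold away, which is exactly what drives the two prices toward one.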
The equilibrium price is not static. It constantly adjusts based on shifts in supply and demand. If demand increases, the equilibrium price will likely rise, and vice versa. This post looks at factors that increase or decrease economic and financial curves.
Alfred Marshall, pictured below, made a valuable contribution to our understanding of supply and demand with his visual representation of the equilibrium price. His framework provides a valuable graphical and mathematical foundation for understanding economic and financial market dynamics.
There are three basic causes of a price change: a shift in demand (an increase or decrease), a shift in supply (an increase or decrease), or both.
It’s important to distinguish between a “movement along” the demand curve, caused by a change in the good’s own price, and a “shift” of the demand curve, which can be caused by many other factors. The four basic cases are listed below, followed by a worked example.
- Demand shifts to the right – An increase in demand shifts the demand curve to the right. This raises the price and output.
- Demand shifts to the left – A decrease in demand shifts the demand curve to the left. This reduces price and output.
- Supply shifts to the right – An increase in supply shifts the supply curve to the right. This reduces price and increases output.
- Supply shifts to the left – A decrease in supply shifts the supply curve to the left. This raises price but reduces output.
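As a worked example of these four cases, here is a small Python sketch using linear curves, Qd = a − bP and Qs = c + dP, with made-up coefficients chosen only to make the comparative statics concrete:

```python
# Illustrative linear market: demand Qd = a - b*P, supply Qs = c + d*P.
# All coefficients are made up for the sake of the example.

def equilibrium(a: float, b: float, c: float, d: float) -> tuple[float, float]:
    """Solve a - b*P = c + d*P for the market-clearing price and quantity."""
    price = (a - c) / (b + d)
    quantity = a - b * price
    return price, quantity

p0, q0 = equilibrium(a=100, b=2, c=10, d=1)  # baseline market
p1, q1 = equilibrium(a=130, b=2, c=10, d=1)  # demand shifts right (a rises)

print(p0, q0)  # 30.0 40.0
print(p1, q1)  # 40.0 50.0 -> price and output both rise, the first case above
```

Raising the supply intercept c instead (a rightward supply shift) would lower the price and raise output, and the leftward shifts run the same logic in reverse.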
Factors That Shift the Demand Curve
– Income
For normal goods, an increase in income increases demand (a rightward shift), and a decrease in income decreases demand (a leftward shift). For inferior goods (like generic brands), an increase in income leads to a decrease in demand, and vice versa.
– Prices of Related Goods
If the price of a substitute good increases, the demand for the original good increases (a rightward shift). If the price of a complementary good increases, the demand for the original good decreases (a leftward shift).
– Tastes and Preferences
Changes in consumer tastes and preferences, often influenced by advertising, trends, or cultural shifts, can significantly impact demand. Increased preference leads to increased demand (a rightward shift). Decreased preference leads to decreased demand (a leftward shift).
– Expectations
Consumer expectations about future prices, income, or availability can influence current demand. If consumers expect prices to rise in the future, current demand increases (a rightward shift). If consumers expect prices to fall in the future, current demand decreases (a leftward shift).
– Number of Buyers
An increase in the number of buyers in a market increases overall demand (a rightward shift). A decrease in the number of buyers decreases overall demand (a leftward shift).
– Demographic Changes
Changes in the size and composition of the population also shift demand. For example, an increase in the elderly population increases the demand for healthcare.
Accordingly, any factor that changes consumers’ willingness or ability to purchase a good or service at a given price will cause the demand curve to shift.
In market graphs, the supply curve illustrates the relationship between the price of a good or service and the quantity that producers are willing to supply. Once again, it’s important to differentiate between a “movement along” the supply curve (caused by a change in price) and a “shift” of the supply curve (caused by other factors).
Factors That Shift the Supply Curve
– Costs of Production
Changes in the prices of inputs, such as labor, raw materials, and energy, directly affect the cost of production. Increased costs shift the supply curve to the left (a decrease in supply). Decreased costs shift the supply curve to the right (an increase in supply).
– Technology
Technological advancements can improve production efficiency, reducing costs and increasing output. New technology generally shifts the supply curve to the right.
– Government Policies
Taxes on production increase costs, shifting the supply curve to the left. Subsidies reduce production costs, shifting the supply curve to the right. Regulations can increase or decrease production costs, depending on their nature, and therefore shift the supply curve accordingly.
– Number of Sellers
An increase in the number of sellers in a market increases the overall supply, shifting the curve to the right.
A decrease in the number of sellers decreases supply, shifting the curve to the left.
– Expectations of Future Prices
If producers expect prices to rise in the future, they may reduce current supply to sell more later, shifting the curve to the left.
If they expect prices to fall, they may increase current supply, shifting the curve to the right.
– Prices of Related Goods
If the price of a related good that producers could also produce increases, they may shift production towards that good, decreasing the supply of the original good (shifting the supply curve to the left).
– Natural Disasters
Natural disasters can heavily affect the quantity of goods that can be produced. These events can therefore cause large, usually leftward, shifts in the supply curve.
Basically, any factor that changes the producers’ ability or willingness to supply a good or service at a given price will cause the supply curve to shift.
What Factors Change Demand and Supply Curves in Financial Markets?
In financial markets, like any other market, the interplay of supply and demand determines prices. However, the factors that shift these curves have some unique characteristics.
Factors Affecting Demand in Financial Markets
– Interest Rates
When interest rates fall, borrowing becomes cheaper, increasing the demand for loans and other debt instruments. Conversely, higher interest rates reduce borrowing and can increase the demand for interest-bearing assets like bonds.
– Investor Sentiment
Optimism about the economy or a particular asset can increase demand. Fear and uncertainty can lead to a decrease in demand, as investors seek safer havens.
– Economic Data
Strong economic indicators, like GDP growth or low unemployment, can increase demand for stocks and other risk assets. Weak economic data can have the opposite effect.
– Inflation Expectations
Rising inflation expectations can decrease demand for bonds, as their real return erodes.
Conversely, it can increase demand for assets that are expected to outpace inflation, like commodities or certain stocks.
– Government Policies
Fiscal policies, like tax cuts or increased government spending, can stimulate demand. Monetary policies, like changes in the money supply, can also influence demand.
– Changes in Risk Aversion
When investors’ risk aversion is low, they are more willing to purchase riskier assets, increasing demand. When risk aversion is high, demand shifts to safer assets.
Factors Affecting Supply in Financial Markets
– Central Bank Policies
Central banks influence the supply of money through open market operations, reserve requirements, and the discount rate. These actions directly impact the supply of credit and other financial instruments.
– Corporate Issuances
Companies issue stocks and bonds to raise capital, increasing the supply of these instruments.
The number of corporate issuances depends on factors like economic conditions and interest rates.
– Government Issuances
Governments issue bonds to finance their spending, adding to the supply of debt instruments.
– Investor Expectations
If investors expect the price of an asset to fall, they may increase their supply of that asset in order to sell before the price drops.
– Profitability Expectations
If a company expects high profitability, it may issue more stock, increasing supply.
Key Differences from Traditional Goods Markets
In reality, frictions in financial markets like transaction costs, taxes, and information asymmetries can prevent the law of one price from holding perfectly. Also, the degree to which the law of one price holds depends on the efficiency of the market. In highly efficient markets, price discrepancies are quickly eliminated. Lastly, some financial instruments are highly complex, making it difficult to determine whether they are truly identical.
In financial markets, expectations play a much larger role than in markets for physical goods. Information flows very rapidly, leading to quick adjustments in supply and demand. Psychological factors, like fear and greed, also have a significant impact on market dynamics.
Citation APA (7th Edition)
Pennings, A.J. (2025, March 10) Moving Economic and Financial Curves. apennings.com https://apennings.com/dystopian-economies/moving-the-curves-to-achieve-equillbrium-prices/
© ALL RIGHTS RESERVED

Markets and Prices: Pros, Cons
Posted on February 23, 2025
Economists argue that using a money system with flexible prices is the best way to ration scarce goods and services in a society. They point to alternative approaches – lotteries, political or physical force, random assignment, and queues/lines – as seriously flawed distribution strategies. The specter of people in Communist countries lining up for their commodities provided compelling images and narratives supporting the price mechanism and the social dynamic that many people call “markets.” This post examines how markets emerged and what is valuable and detrimental about them. It discusses the underlying economic understanding of markets and includes several critiques of this term and our allegiance to them.
The term “market” evokes imagery of a medieval city center or town square filled with merchants peddling food or wares and haggling over prices with interested visitors and potential customers. It has achieved considerable circulation as the dominant metaphor for understanding the modern “free enterprise” economy.
For economists, markets refer to arrangements, institutions, or mechanisms facilitating the contact between potential sellers and buyers. In other words, a market is any system that brings together the suppliers of goods or services with potential customers and ideally helps them negotiate and settle the terms of a transaction, such as the currency and the price. But what are the downsides to our reliance on markets?
If you spell market backward, you get “tekram,” an interesting, if hokey, reminder that markets are social technologies and need to be created and managed. RAM or random-access memory is a computer term that signifies how much memory or “working space” you have available on your device. The more RAM, the more applications you can keep running simultaneously without losing significant speed. Likewise, markets need environments accommodating many participants and providing safe, swift, and confidential transaction capabilities without downtime or other technical problems. The more buyers and sellers, the better a market can work. Economists have begun recognizing that digital markets require attention to several conditions, including privacy, interoperability, and safety, to facilitate transactions and make a digital economy work effectively.
Characteristics of Markets
Economists have identified specific characteristics of the market phenomenon. For one, a market depends on the conditions of voluntary exchange where buyers and sellers are free to accept or reject the terms offered by the other. Voluntary exchange assumes that trading between persons makes both parties to the trade subjectively better off than before the trade. Markets also assume that competition exists between sellers and buyers and that the market has enough participation by both to limit the influence of any one actor.
Effective economic models of markets are based on the idea of perfect competition, where no one seller or buyer can control the price of an “economic good.” In this vision of a somewhat Utopian economic system, the acts of individuals working in their self-interest will operate collectively to produce a self-correcting system. Prices move to an “equilibrium point” where producers of a good or service will be incentivized or motivated to supply an adequate amount to meet the demand of consumers willing to pay that price. Unless someone feels cheated, both parties end the transaction satisfied because the exchange has gained them some advantage or benefit.
Central to any market is a mutually acceptable means of payment. A crucial condition is the “effective demand” of consumers – do the buyers of economic goods have sufficient currency to make the purchase? Consumers must desire a good and have the money to back up that desire. While parties may exchange goods and services by barter, most markets rely on sellers offering their goods or services (including labor) in exchange for currency from buyers. The system depends on a process of determining prices.
A media perspective on economics starts with media, which includes money and other symbolic representations that influence economic processes. Markets only function well with the efficient flow of information about prices, quantities, and the availability of goods and services. Buyers need to know what’s available and at what cost, while sellers need to understand what consumers are willing to pay. Information systems and media help reduce information asymmetry, where one party has more information than the other, which can lead to unfair or inefficient outcomes.
Media, in various forms, play a crucial role in disseminating this information. Traditionally, this involved newspapers, trade publications, and physical marketplaces. Today, digital media like websites, e-commerce platforms, and social media play a dominant role in conveying price information and connecting buyers and sellers. Consumers can compare prices, research products, and read reviews, while producers can track market trends, analyze competitor behavior, and adjust their strategies accordingly. Information systems and media then facilitate interaction by providing platforms for communication, negotiation, and transaction processing.
The Price System
How do prices decrease or increase? The quick answer is that companies adjust their price sheets. However, some companies have more pricing power than others, so let’s go through the economic arguments and explanations about prices.
Central to the process of economic exchange is the determining of prices. A price is commonly known as the cost to acquire something, although it is important to keep some distance between the words “price” and “cost.” Both have specific accounting meanings. In accounting, cost represents the internal investment made by a business to produce or acquire something, while price represents the external value exchanged with customers in the marketplace. Price is the money (or other currency) a customer or buyer pays to acquire a good, service, or asset.
Prices are the value exchanged to benefit from the good or service. Prices can be figured in monetary terms, but other factors like time, effort, or even foregoing other opportunities are often considered. Businesses need to set prices that cover their costs and generate a profit while remaining competitive in the market.
Buyers determine what value they will get from the purchase. While prices are ultimately a factor in the one-on-one relationship between a buyer and a seller, prices are determined within a social context.
Besides helping to reconcile the transaction between a buyer and a seller, a system of prices helps signal what is scarce and what is abundant. It helps gauge the demand for something and the incentives for producers to supply it. A price system allocates resources in a society based on these numerical monetary assignments and is, hopefully, determined by supply and demand.
Prices are influenced by society’s communication systems. They are negotiated within the confines of languages, modes of interaction, and the ability to be displayed by signage and posted on various media. Reuters created the first online market for global currencies in the 1970s by linking up computer terminals in banks worldwide. It was a time shortly after the US dollar went off the gold standard and global currencies were in flux.
Reuters charged the banks to list their prices for various national monies and charged again for subscribing to the system. It was the early 1970s, so the numbers and letters were displayed in old-style ASCII computer characters, and traders concluded deals over the phone or through teletype. However, having the prices of each currency listed by each participating bank created a virtual market. By the 1980s, digital technology was dramatically transforming the globe’s financial system, which traded trillions of dollars daily in currencies and other financial instruments.
Amazon has a dynamic pricing system that changes the rates on thousands of products during a single day. Sellers can change the price of their goods through Amazon Seller Central or participate in a dynamic repricing system like xSellco or RepriceIT that automatically undercuts competitor prices. A seller can ensure that the amount will not incur a loss by setting a minimum price. If you’re a seller on Amazon.com, a critical factor for your online success is keeping your inventory priced right so it doesn’t stagnate or lose money. Innovations like Honey and Piggy provide free browser extensions that find better prices on Amazon and other e-commerce sites and apply coupons at checkout.
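As a hypothetical illustration of the undercutting logic such repricers apply, consider the following Python sketch. xSellco and RepriceIT are real services, but this rule and its parameters are assumptions made for illustration, not any vendor’s actual algorithm:

```python
# Hypothetical dynamic-repricing rule: undercut the cheapest competitor
# by a small increment, but never drop below the seller's floor price.
# This illustrates the idea only; it is not any vendor's real algorithm.

def reprice(competitor_prices: list[float], floor: float, undercut: float = 0.01) -> float:
    """Return a new listing price that undercuts the cheapest rival
    while respecting the seller's minimum (loss-avoiding) price."""
    if not competitor_prices:
        return floor                   # no rivals listed: fall back to the floor
    target = min(competitor_prices) - undercut
    return max(target, floor)          # the floor guarantees no sale at a loss

print(reprice([19.99, 21.50, 20.25], floor=18.00))  # 19.98
print(reprice([17.50, 19.00], floor=18.00))         # 18.00 (the floor binds)
```

Run across thousands of listings many times a day, rules like this are what make posted prices move continuously rather than in occasional, deliberate steps.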
Economists consider allocating a society’s goods and services by price setting to be the most efficient system of rationing. The role of rationing increases when scarcity emerges. The rationing process can be either price or non-price based. However, the latter approaches, including lotteries, queues, coupons, force, or even sharing, impose additional costs such as waiting times and may give rise to black markets. The price system tends to be responsive but, as will be discussed later, imposes other social costs.
Equilibrium and the Turn from Political Economy
In economic theory, a working and efficient market is based on prices converging. All identical items must have only one price. Otherwise, buyers will continue to search for a better price, or opportunities for arbitrage arise. In other words, items could easily be purchased from the merchant offering the lower price, or middlemen will buy the product at the lower price and resell it at the higher one.
William Stanley Jevons, a professor of political economy at University College London, formulated the Law of One Price in the mid-19th century, but it goes more often by its contemporary formulation, the “equilibrium price” or market-clearing price. When a good or service reaches such a price, it will be attractive for buyers to buy and sellers to sell. Ideally, the market will clear: all customers will get the quantities of the product they want at that price, and all items will be sold, with the suppliers satisfied with the price as well.[1]
With the contributions of his Theory of Political Economy (1871), Jevons helped separate modern “neoclassical” economics from political economy. With the one-price formulation, mathematics replaced history as the central vehicle for constructing theories about economics, providing an internally coherent system for producing models of market processes and a logic for predicting human economic behaviors.
The reader should search the Internet for images of supply and demand charts while reading this section.
With the one price theory came a new emphasis on the supply-demand framework and graphs that allowed prices to be plotted and optimized. These graphs use supply and demand curves that describe the relationship between the quantity of goods or services supplied and demanded. They describe not just the equilibrium price but suggest how much of a good or service would be sold at various prices.
Alfred Marshall wrote the Principles of Economics (1890) and explained how the supply-and-demand logic could be graphed. He described supply and demand curves and how they connected to various prices, including the market equilibrium price. An increase in the price of a good is associated with a fall in the quantity demanded of that good and an increase in the amount that will be supplied by producers. As a product gets more expensive, less of it sells.[2]
Conversely, a decline in the price of a good is associated with an increase in the quantity demanded of that good and a decline in the quantity supplied by producers. This last point is important: the lower the price of the good, the less incentive to produce it. These dynamics are represented by a law of supply depicted as an upward-sloping curve and a law of demand depicted as a downward-sloping curve. The equilibrium price can be found at the point where the two curves intersect. This is the magical “clearing point” where all goods are sold. Price discovery is a common term in economics and finance for the process of determining the price of an asset through the interaction of buyers and sellers. It is a key function of a marketplace, even digital markets. When the “one price” is discovered, it leads to a clearing of the market.
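As a minimal worked example, with coefficients chosen purely for illustration, suppose the demand and supply curves are linear:

\[
Q_d = a - bP, \qquad Q_s = c + dP .
\]

Setting \(Q_d = Q_s\) and solving gives the clearing point:

\[
P^{*} = \frac{a - c}{b + d}, \qquad Q^{*} = a - bP^{*}.
\]

With \(a = 120\), \(b = 3\), \(c = 20\), and \(d = 2\), the market clears at \(P^{*} = 20\) and \(Q^{*} = 60\). Any higher price leaves unsold surplus; any lower price leaves unmet demand.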
An important concept in understanding prices is a product’s degree of elasticity. This refers to the influence of price changes on the quantities of a product that consumers desire. Will a significant change in the price of a movie ticket influence audience attendance at theaters? And by how much? Will the launch of a new Apple iPhone that comes with a substantial price increase result in decreased sales, or will factors such as brand loyalty or customer captivity diminish the influence of the price increase? Understanding a product’s price elasticity tells us something about the sensitivity of that product’s market to changes in price. A business can risk charging higher prices if demand for the product is price-inelastic. If demand is elastic, a change in price is likely to result in major changes in consumption.
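The standard measure is the price elasticity of demand, the ratio of the percentage change in quantity demanded to the percentage change in price:

\[
\varepsilon_d = \frac{\%\,\Delta Q_d}{\%\,\Delta P}.
\]

As a quick illustrative calculation: if a 10% rise in ticket prices cuts attendance by 20%, then \(\varepsilon_d = -20\%/10\% = -2\) and demand is elastic, so revenue falls. If attendance drops only 3%, \(\varepsilon_d = -0.3\), demand is inelastic, and the theater collects more revenue despite selling slightly fewer tickets.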
The pros and cons of markets are hotly debated today. Some believe markets are an ideal system to organize society. Proponents often cite Adam Smith’s famous “invisible hand” as the God-given mechanism that organizes a harmonious society based on market activity. But Smith used the term only once in The Wealth of Nations (1776) and was more focused on the production of economic activity by working people.
Others believe markets are prone to failure, give rise to unequal conditions, and challenge democratic participation. It is no surprise that Karl Marx was a voracious reader of Adam Smith and his theories that the working population was the source of economic wealth. Marx just didn’t think the people who did the work got a fair deal in the process of economic production. Marx believed that the conditions of capitalist markets meant that the wealth of economic activity went mainly to the owners of “the means of production” through profits, while the people doing the work were forced to compete against each other in the labor market, driving wages down. The following section looks at other concerns about markets from contemporary economists.
Evaluating the Market System: Pros and Cons
One of the best explanations of the strengths and weaknesses of the market system came from The Business of Media: Corporate Media and the Public Interest (2006) by David Croteau and William Hoynes. They pointed to the strengths of markets, such as efficiency, responsiveness, flexibility, and innovation. They also discussed the limitations of markets: enhancing inequality, amorality, failure to meet social needs, and failure to meet democratic needs.[3]
The market provides efficiency by forcing suppliers to compete with each other and into a relationship with consumers that requires their utmost attention. The suppliers of goods and services compete with one another to provide the best products, and the competition among them forces them to bring down prices and improve quality. Firms become organized around cutting costs and finding advantages over other companies. They have immediate incentives to produce efficiencies as sales and revenue numbers from market activities provide an important feedback mechanism.
Responsiveness is another feature of markets that draws on the dynamics of supply and demand. Companies strive to adapt to challenges in the marketplace. New technologies and expectations, changes in incomes as well as tastes and preferences of consumers require companies to make alterations in their products, delivery methods, and retail schedules.
Likewise, consumers respond to new conditions in their ability to shop for bargains, find substitute goods, and adapt to new trends. Going online has meant new options for discovering, researching, and purchasing products. Combined with logistical innovations by companies like FedEx and TNT, e-commerce has shifted the consumption dynamic, making it easier for customers to search for products, read other consumers’ experiences with them, and have them delivered right to their homes.
Flexibility refers to the ability of companies to adapt to changing conditions. In the absence of a larger regulatory regime, companies can manufacture new products, new versions of products, or move in entirely new directions. In a market environment, companies can compete for consumers by making changes within their organizational structure, including adjustments in production, marketing, and finance.
Lastly, markets stimulate innovation in that they provide rewards for new ideas and products. The potential for rewards, and necessities of gaining competitive advantages, drive companies to innovate. Rewards can include market share, but also increased profits. Without competition, firms tend to avoid risk, an essential component of the innovation calculus, as many experiments fail.
Croteau and Hoynes and others point out serious concerns about markets that economists do not generally address. The tendency of markets to reproduce inequality is one important drawback. While some inequality produces contrast and incentives to work hard or to be entrepreneurial, a society with a major divide between haves and have-nots will tend towards decreasing opportunities and incentives to work and innovate. Places with major divisions in wealth tend to be dystopic, a “sick” place. Sir Thomas More’s classic Utopia (1516) told of a mythical island where money was outlawed. Having gold was ridiculed, and it was used for urinals and chains for slaves. It was a mythical place meant to be a social critique of the inequalities of England. It denounced private property and advocated a type of socialism.
Thomas Piketty’s Capital in the Twenty-First Century (2014) addressed this issue of inequality head-on and warned that investment money gravitates towards more inequality. He targeted the trickle-down effects of capitalism and its tendency to lead to a slower and slower drip of money to those on the lower rungs. Neo-elites benefiting from the rolling back of the estate tax have advantages that others don’t have while often contributing less to the economy. We now have a generation inheriting money from their parents’ windfalls during the digital revolution.
“One dollar, one vote” is a common metaphor, and an area of research, referring to the advantages the rich have over the poor. Because they have many more dollars, the rich have many more votes to influence the political economy. Countries with a greater concentration of wealth at the upper incomes tend to have less progressive tax and welfare policies, while countries whose poor are relatively better off tend to have more government support for poorer people.
The second concern they have about markets is that they are amoral. Not necessarily immoral, but the market system only registers purchases and prices; it doesn’t make moral distinctions between, for example, human trafficking, drug trafficking, and oil trafficking. The commerce in a drug to cure malaria does not register differently from a recreational drug that provides a temporary physical stimulation (illegal drugs are not registered in GDP). Markets do not judge products; they only register changes in demand. They do not favor child care, healthy foods, or fuel-efficient cars unless customers make their claims in currency via increased demand.
Can markets meet social needs? A number of services, and sometimes goods, should probably be provided by some level of government – defense, education, family care and planning, fire protection, food safety, law enforcement, traffic management, roads, and parks. More complicated are issues related to electricity and telecommunications. A pressing question for the last thirty years has been the increasing privatization of activities that governments had actively engaged in. The telecommunications system, for example, was at first considered a natural monopoly in order to protect its mission to provide universal telephone service, usually through government agencies called PTTs (Post, Telephone, and Telegraph) or through heavily regulated monopolies like AT&T in the US. Through the 1980s and 1990s, these entities were deregulated, opened to competition, and sold off to private investors. This allowed a global transformation to Internet Protocols (IP), but it has challenged longstanding commercial traditions such as net neutrality and common carriage that restrict telecommunications and transportation organizations from discriminating against any customer.
Can markets meet democratic needs? Aldous Huxley warned of becoming a society with too many distractions, too much trivia, steeped in drugged numbness and pleasures. Because markets are amoral, they can become saturated with economic goods that service vices rather than public spirit. Competition, in this case, may result in a race to the lowest common denominator (sugary foods) rather than higher social ideals. Rather than political dialogue that would enhance democratic participation, the competition among media businesses tends to drive content towards sensationalist entertainment. This includes social media that allow participants to share information from a variety of news sources that are biased, one-sided, and often distorted.
Comedian Robin Williams once quipped, “Cocaine is God’s way of telling you that you are making too much money.” Markets provide powerful coordination systems for material production and creative engagement, but they also generate inequalities, often with products and services that are of dubious social value. How a society enhances and/or tempers market forces continues to be a major challenge for countries around the world.
For a market to function effectively, it needs several dynamics to succeed. One of the most important factors for a market to prosper is a successful currency. A medium of exchange will depend on trust in the monetary mechanism as buyers and sellers must readily accept and part with it. Money has had a long history of being things, most notably gold. Gold has striking physical attributes: it doesn’t rust, it doesn’t chip, and it can be melted into a variety of shapes. Other metals such as silver and platinum have also served as money. Credit cards, third party payment systems such as Paypal, and new digital wallets like Apple Pay and Samsung Pay provide new conveniences that facilitate economic transactions.
It is interesting that societies gravitate towards the use of some symbolic entity to facilitate these transactions. As discussed in the previous chapter, money can be anything that a buyer and seller agree is money. At times, commodities such as rice, tobacco, and even alcohol have served the role of money. Market enthusiasts often overlook the importance of money, focusing instead on the behaviors of market participants. But money has proved to be central to market activities.
Summary
This essay explores the concept of markets in economics, highlighting their characteristics, the role of the price system, and the shift towards a mathematical understanding of markets with the concepts of equilibrium and marginal analysis. It also discusses the strengths and weaknesses of the market system, including its potential to enhance inequality, its amoral nature, and its limitations in meeting social and democratic needs.
The essay begins by defining markets as mechanisms that facilitate the exchange between buyers and sellers, emphasizing that they are social technologies that need to be created and managed. It then discusses the characteristics of markets, such as voluntary exchange, competition, and the use of a mutually acceptable means of payment.
The essay then delves into the price system, explaining how prices are determined by supply and demand and how they act as signals of scarcity and abundance. It also provides examples of dynamic pricing systems used by companies like Amazon.
It then discusses the shift from political economy to neoclassical economics, highlighting the contributions of William Stanley Jevons and Alfred Marshall in developing the concepts of equilibrium price and marginal analysis. This shift led to a more mathematical and technical approach to economics, focusing on market mechanics rather than broader social and political factors.
Finally, the essay evaluates the strengths and weaknesses of the market system, drawing primarily on the work of David Croteau and William Hoynes. It discusses the efficiency, responsiveness, flexibility, and innovation fostered by markets, but also acknowledges their potential to enhance inequality, their amoral nature, and their limitations in meeting social and democratic needs.
In summary, the essay provides an overview of the concept of markets in economics, highlighting their characteristics, the role of the price system, the shift towards a mathematical understanding of markets, as well as the ongoing debate about their strengths and weaknesses.
Notes
[1] Jevons, W. S. (1871) The Theory of Political Economy. Macmillan and Co.
[2] Marshall, A. (1890) Principles of Economics. Macmillan and Co.
[3] Croteau, D. and Hoynes, W. (2006) The Business of Media: Corporate Media and the Public Interest.
Citation APA (7th Edition)
Pennings, A.J. (2025, Feb 23) Markets and Prices: Pros, Cons. apennings.com https://apennings.com/dystopian-economies/markets-and-prices-pros-cons/
© ALL RIGHTS RESERVED

Tags: Alfred Marshall, Bloomberg box, Reuters Monitor, William Stanley Jevons
AI and Government: Concerns Shaped from Classic Public Administration Writings
Posted on February 9, 2025
Recent events in US politics have highlighted tensions over our conception of government, the role of business in public affairs, and even how artificial intelligence (AI) should be used in the bureaucratic systems of the US federal government. This post covers some of the primary reasons why government, even in its search for efficiency, differs from business, drawing on historical writings in public administration. Insights from these analyses can begin to specify constraints and goals for government AI systems so that they are designed to be transparent, explainable, and regularly audited to ensure fairness, avoid bias and discrimination, and protect citizen privacy. This means that the data used, the algorithms employed, and the reasoning behind decisions should be clear and understandable to human overseers and the public. This is crucial for accountability and for building trust in governmental AI mechanisms.
Public administration is the art and science of managing public programs and policies and coordinating bureaucratic strategies and public affairs. Public administration was an important part of my PhD in Political Science, and I was particularly interested in the role of information technology (IT) and networking, including its use in financial tasks. As we move forward with IT and even AI in government, it is critical that they be designed and programmed with ideals gleaned from years of public administration deliberations.
The debate over whether government should be run like a business has been a long-standing issue in public administration. The historical writings of public administration offer compelling reasons why government is fundamentally different from business. Scholars such as Paul Appleby, Harlan Cleveland, Dwight Waldo, Max Weber, and even US President Woodrow Wilson have articulated key differences between government and business, emphasizing the distinct purposes, structures, and constraints that define the public administration of government. Also important is the role of politics, a fundamental component of the democratic agenda, but one that is not always conducive to the efficiencies and values present in the private sector.
This post covers some of the primary reasons why government differs from business, drawing on historical writings in public administration, including political constraints, public interest vs. profit maximization, accountability and transparency, decision-making and efficiency constraints, monopoly vs. market competition, legal and ethical constraints, and distinctions between service vs. the consumer model.
By carefully considering these challenges and drawing on the wisdom of the classics of public administration, it may be possible to start to train the power of AI to create “smart” but ethical government systems that serve the public interest and promote the well-being of all citizens. Currently, the Trump administration with Elon Musk seems to be building a “digital twin” of the payment system at the Treasury and other parts of the administrative heart of the US government, probably in the new datacenter called Colossus built in Memphis.
Digital twins are a powerful tool for training AI models, as they can help to generate data, simulate scenarios, and explain AI models. A digital twin mimics current systems while training a new AI engine, with the goal of developing new types of digital bureaucracies and services. As digital twin technology develops with faster chips and larger data centers, it will likely play an even greater role in training AI government models. This approach is unprecedented and should only be pursued with the highest intentions and a solid basis in democratic and public administration understanding.
Political Constraints and Efficient Bureaucracies
Woodrow Wilson (1887), in “The Study of Administration,” addressed the issue of government efficiency and argued that public administration should be distinct from politics. For him, government is ultimately driven by the public good, not financial gain. He emphasized the need for a professional and efficient bureaucracy to implement public policy. Wilson’s emphasis on the separation of politics and administration highlighted the need for a professional and impartial bureaucracy.
Paul Appleby (1945) reinforced this position by stating that government serves a broad public interest rather than a select group of stakeholders. Government’s core purpose is to serve the public interest and promote the general welfare of society. This includes providing essential services, protecting citizens, and promoting social equity.
Governments often operate with a longer-term perspective, considering the needs of future generations and the long-term sustainability of policies and programs. Businesses, while also concerned with long-term success, often prioritize shorter-term financial goals. Businesses prioritize profit, efficiency, and shareholder value, whereas governments must balance equity, justice, and service delivery even when it’s not profitable (e.g., social security, public education). For example, the government provides social services like healthcare for seniors, unemployment relief, and welfare, which businesses would find unprofitable.
Businesses are legally required to maximize profits for their shareholders. In contrast, government exists to serve the public interest and promote the general welfare of society. By keeping Appleby’s insight at the forefront, AI development in government can be guided by a commitment to serving the broad public interest and strengthening democratic values.
Accountability, Transparency, and Legitimacy
Max Weber emphasized that government agencies operate under legal-rational authority, meaning they follow laws, regulations, and procedures that are meant to ensure transparency and accountability. Businesses operate under market competition and corporate governance, where decisions can be made with greater discretion without public oversight. Weber’s work on bureaucracy underscores the importance of formal rules, clear procedures, and hierarchical structures in government organizations. This translates to AI systems needing well-defined architectures, clear lines of authority for decision-making, and specific functions for each component. These frameworks may ensure accountability and prevent AI from overstepping its intended role.
In his seminal work, Economy and Society (1922), Weber articulated fundamental differences between government and business.
His analysis highlighted the structural, operational, and accountability-based distinctions between the two domains. He distinguished government from business in several ways. Government bureaucracy operates under legal authority, meaning it follows a fixed set of laws and regulations, while business bureaucracy is primarily driven by profit motives and market competition, with more flexibility in decision-making. Government officials also follow formal rules and legal mandates, while business executives can make discretionary decisions based on market conditions. For example, a government agency must adhere to strict procurement laws when purchasing supplies, whereas a business can choose vendors based on cost efficiency alone.
Dwight Waldo (1948) in The Administrative State highlighted that government accountability is complex because it must answer to multiple stakeholders (citizens, courts, legislatures), unlike businesses that primarily answer to investors. For example, governments hold public hearings and legislative reviews before making budgetary decisions, whereas businesses do not require public approval before adjusting financial strategies.
Waldo challenged the traditional view that public administration could be purely technical and neutral. Governments are accountable to the public and operate under greater transparency requirements than businesses. This includes open records laws, public hearings, and legislative oversight. Public officials are also held to higher ethical standards than private sector employees, with expectations of impartiality, fairness, and integrity in their decision-making.
Waldo argued that bureaucracy is not just an administrative tool but a political institution, shaped by values, ideologies, and democratic principles. This makes accountability more complex than in business, where efficiency and profit are the primary concerns. His main points were:
– Bureaucracy is inherently political, not just administrative.
– Government agencies must answer to multiple, often conflicting, stakeholders.
– Bureaucratic power must be controlled through democratic institutions.
– Efficiency must be balanced with justice, ethics, and public values.
Governments possess coercive power, including the ability to tax, regulate, and enforce laws. Businesses, while also subject to regulations, primarily rely on market forces and voluntary transactions. Governments derive their legitimacy from democratic processes and the consent of the governed. Businesses, while also subject to societal expectations, primarily focus on satisfying customer demand and generating profits for investors.
Decision-Making and Efficiency Constraints
Herbert Simon (1947) in Administrative Behavior introduced the concept of “bounded rationality,” challenging the notion of perfect rationality in decision-making and explaining that government decisions are constrained by political pressures, competing interests, and complex regulatory environments.
Bounded rationality is often considered a more realistic model of human decision-making in organizations, recognizing the inherent limitations individuals face. Understanding bounded rationality can inform organizational design, promoting structures and processes that support effective decision-making within these constraints. Developing decision support tools and technologies can help overcome some of the limitations of bounded rationality, providing decision-makers with better information and analysis.
This concept recognizes that individuals, particularly in organizational settings, face inherent limitations preventing them from making perfectly rational decisions. These include limitations due to limited cognitive capacity and the inability to process all available information or consider every possible alternative when making decisions. Decision-makers also lack complete information about the situations, the potential consequences of their choices, or the preferences of others involved. Individuals are also prone to cognitive biases, such as confirmation bias (seeking information that confirms existing beliefs) and anchoring bias (over-relying on the first piece of information received), which can distort their judgment.
Simon argued that officials often “satisfice” instead of optimize. They make “good enough decisions” due to these limitations. They often choose the first option that meets their minimum criteria, rather than searching for the absolute best solution. Satisficing is often a more efficient approach, as it conserves cognitive resources and allows for quicker decision-making. However, it may not always lead to the optimal outcome.
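A small Python sketch can make Simon’s contrast concrete. The options, scores, and threshold below are illustrative assumptions, not anything from Simon’s text:

```python
# Sketch of Simon's distinction: an optimizer evaluates every option to
# find the best; a satisficer stops at the first "good enough" option.
# Options, scores, and the threshold are illustrative assumptions.

def optimize(option_scores: list[float]) -> float:
    """Exhaustive search: examine all options and return the best score."""
    return max(option_scores)

def satisfice(option_scores: list[float], threshold: float) -> float | None:
    """Bounded search: accept the first option meeting the minimum criterion."""
    for score in option_scores:    # options considered in the order encountered
        if score >= threshold:
            return score           # stop searching and conserve effort
    return None                    # no acceptable option was found

options = [0.4, 0.7, 0.9, 0.6]
print(optimize(options))           # 0.9, but required evaluating everything
print(satisfice(options, 0.65))    # 0.7, found after only two evaluations
```

The satisficer settles for 0.7 rather than the 0.9 an exhaustive search would find, trading a somewhat worse outcome for a much cheaper decision process, which is Simon’s point.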
By acknowledging the limitations of human rationality and designing AI systems that work within those constraints, governments can leverage AI to make more informed, efficient, and effective decisions. It’s about creating AI that assists human decision-makers in navigating the complexities of the real world, rather than attempting to achieve an unrealistic ideal of perfect rationality.
Philip Selznick (1949) in TVA and the Grass Roots conducted an important case study that showed how government decision-making is influenced by political negotiation and social considerations rather than just economic rationality. It challenged the traditional view of bureaucracy as a purely neutral and rational system. Instead, Selznick demonstrated that bureaucratic organizations are deeply political and shaped by social forces. His analysis of the Tennessee Valley Authority (TVA) revealed how local power dynamics, institutional culture, and informal relationships influence public administration.
The TVA was a New Deal-era federal agency created in 1933 to promote regional economic development through infrastructure projects like dams and electricity generation. It was originally designed as an apolitical, technocratic institution that would implement policy based on expertise rather than political considerations.
However, Selznick’s study showed that the TVA had to negotiate with local elites, businesses, and community groups to gain support for its programs. Rather than being a neutral bureaucracy, the TVA absorbed the interests and values of local stakeholders over time.
Political compromises often weakened the agency’s original mission of social reform and economic equality. For example, the TVA partnered with local conservative agricultural interests, even though these groups resisted social reforms that would have empowered poor farmers.
Selznick introduced the concept of “co-optation,” which describes how bureaucratic organizations incorporate external groups to maintain stability and legitimacy. Instead of enforcing policies rigidly, agencies often have to adjust their goals to align with influential local actors. Co-optation helps agencies gain support and avoid resistance, but it can also dilute their original purpose. This explains why public organizations often fail to deliver radical change, even when they are designed to do so. For example, the TVA originally aimed to empower small farmers and promote land reform, but over time, it aligned itself with local business leaders and preserved existing power structures instead.
By embracing the lessons of co-optation, governments can develop AI systems that serve the broader public interest, with development guided by community engagement, transparency, and collaboration. AI development in government should involve active engagement with a wide range of stakeholders, including citizens, community groups, experts, and advocacy organizations. Co-optation can be used to address concerns and objections raised by external groups. By incorporating their feedback and adjusting AI systems accordingly, governments can mitigate potential opposition and build consensus.
Monopoly vs. Market Competition
Governments often hold a monopoly over essential services (e.g., national defense, law enforcement, public infrastructure) where competition is neither feasible nor desirable. Governments have broader responsibilities than businesses, encompassing national defense, social welfare, environmental protection, and infrastructure development. Technological changes, however, can change the dynamics of specific utilities. Telecommunications, for example, were primarily government-run facilities that worked to ensure universal service. To upgrade to the global Internet, however, these operations were largely deregulated or sold off to the private sector to invest in innovative new services. More recent discussions have pointed to “net neutrality” and even “cloud neutrality” to address the monopolization of services at the Internet’s edge, such as AI.
Leonard White (1926) in Introduction to the Study of Public Administration pointed out that government agencies do not face direct market competition, which affects incentives and operational efficiency. In contrast, businesses operate in a competitive market where consumer choice determines success. For example, a police department does not compete with private security firms the way Apple competes with Samsung in the smartphone market.
White also believed that public administration is the process of enforcing or fulfilling public policy. Since profit is not the primary goal, it’s crucial to define what constitutes “success” for AI systems in government. This might include citizen satisfaction, efficiency gains, improved outcomes, or reduced costs. By carefully considering the unique dynamics of government agencies and incorporating AI in a way that addresses the challenges of limited market feedback and different incentive structures, governments can leverage AI to create more effective, responsive, and citizen-centric services.
Legal and Ethical Constraints
Governments must operate under constitutional and legal constraints, ensuring adherence to democratic principles and human rights. Frank Goodnow (1900) in Politics and Administration argued that public administration is shaped by legal frameworks and public policy goals rather than market forces.
Public officials must follow strict ethical codes and conflict-of-interest regulations that go beyond corporate ethics policies. For example, a government agency should not arbitrarily cut services to boost its budget surplus, whereas a corporation can cut unprofitable product lines without legal repercussions.
Goodnow was one of the first scholars to formally separate “politics” from “administration,” arguing that politics involves the creation of laws and policies through elected representatives. Administration is the implementation of those laws and policies by bureaucratic agencies. Public administration should be neutral, professional, and guided by legal rules, rather than influenced by political pressures.
For example, Congress (politics) passes a law to regulate environmental pollution, and the Environmental Protection Agency (EPA) or the Federal Communications Commission (FCC) (administration) enforces and implements their laws and regulations through technical expertise and bureaucratic processes. Goodnow emphasized that public administration derives its legitimacy from legal and constitutional frameworks, not from market competition.
He argued that government agencies must operate within the rule of law, ensuring fairness, justice, and accountability. Laws define the scope of administrative power, unlike businesses, which act on profit incentives. Bureaucrats should be trained professionals who follow legal principles rather than respond to political or market forces. A tax agency must enforce tax laws uniformly, even if doing so is inefficient, whereas a private company can adjust its pricing strategies to maximize profit.
Unlike businesses, which prioritize efficiency and profitability, Goodnow argued that government agencies serve the public interest. They provide services that markets might ignore (e.g., public health, education, law enforcement). Public agencies must prioritize equity, justice, and democratic values rather than cost-cutting. The effectiveness of government is measured not just by efficiency but by fairness and public trust. For example, governments fund public schools to ensure universal education, even if private schools might cater to specific family or community preferences.
By adhering to strict ethical principles and conflict-of-interest regulations, governments can ensure that AI is used in a way that builds trust, promotes fairness, and serves the public interest. It’s about creating AI systems that are not only effective but also ethical and accountable.
Service vs. Consumer Model
Citizens are not “customers” in the traditional sense because they do not “choose” whether to participate in government services (e.g., paying taxes, following laws). Harlan Cleveland, a distinguished diplomat and, in his later years, president of the University of Hawaii, argued in his 1965 article “The Obligations of Public Power” that public administration must ensure universal access to critical services, regardless of financial status. Businesses, on the other hand, serve paying customers and can exclude non-paying individuals from services. For example, a government hospital must treat all patients, including those who cannot afford to pay, whereas a private hospital can refuse service based on financial capacity.
His arguments focused on the ethical, political, and practical challenges faced by government officials in wielding public power. The ethical responsibility of public officials included holding power on behalf of the people, meaning they must act with integrity and accountability. Cleveland warned against abuse of power and the temptation for bureaucrats to act in self-interest rather than the public good. He stressed the need for ethical decision-making in government to prevent corruption and misuse of authority. For example, a government official responsible for allocating funds must ensure fairness and avoid favoritism, even when pressured by political influences.
Public administration should strive to be effective but must not sacrifice democratic values in the pursuit of efficiency. He argued that bureaucratic decision-making should be transparent and participatory, ensuring citizens have a voice in government actions. Efficiency is important, but equity, justice, and citizen involvement are equally critical. For example, governments should not cut social programs simply because they are expensive; public welfare must be weighed alongside financial considerations.
Cleveland emphasized that public power must be answerable to multiple stakeholders, including the public (through elections and civic engagement), legislatures (through oversight and funding), and the courts (through legal constraints and judicial review). Unlike businesses, which are accountable mainly to shareholders, government agencies must navigate complex and often conflicting demands from different groups. For example, a public health agency must justify its policies to elected officials (who determine budgets) and citizens (who expect effective services).
Cleveland also pointed to the growing complexity of governance, a term he was one of the first to use. Government agencies were becoming more complex and specialized, requiring public administrators to manage technological advancements and expanding regulations as well as international relations and globalization. Cleveland worried that bureaucracies might become too rigid and disconnected from the people, creating a gap between government and citizens.
By keeping Cleveland’s principle at the forefront, governments can leverage AI to create a more just and equitable society where everyone has access to the services they need to thrive. It’s about using technology to empower individuals, reduce disparities, and ensure that everyone has the opportunity to reach their full potential.
As government agencies adopt AI and data-driven decision-making, they must ensure that technology serves human interests and does not lead to excessive bureaucracy or loss of personal agency. Cleveland called for adaptive, innovative leadership in public administration to keep up with social, political, and technological changes. He criticized government agencies that resist reform or fail to evolve with society’s needs. Public administrators must be proactive, responsive, and forward-thinking rather than merely following routine procedures. For example, climate change policies require public agencies to anticipate future risks, rather than simply reacting to disasters after they occur.
For Cleveland, public service was a moral obligation, not just a technical or managerial function. He believed that serving the public is an ethical duty, requiring commitment to justice, fairness, and the common good. Bureaucrats must see themselves as stewards of public trust, not just rule enforcers.
Harlan Cleveland’s emphasis on universal access to critical services, regardless of financial status, is a fundamental principle that must guide the design of AI mechanisms in government. Cleveland argued that public administration, unlike business, has a fundamental obligation to serve all citizens regardless of their ability to pay, and must balance efficiency with democratic values like equity, justice, and citizen participation. He stressed the ethical responsibility of public officials to act in the public interest, be accountable to multiple stakeholders, and adapt to the growing complexity of governance.
These principles are crucial for guiding AI development in government. AI systems should be designed to provide universal access to critical services, overcoming barriers like financial constraints, location, and digital literacy. They should avoid sacrificing democratic values in the pursuit of efficiency while maintaining transparency and accountability, allowing citizens to understand and participate in AI-driven decision-making. Ultimately, AI in government should be a tool for enhancing public service and promoting the common good, not just a means to increase efficiency.
Conclusion: The Unique Role of Government and the Implications of AI
Public administration scholars have consistently emphasized that government is not simply a business; it operates under different principles, constraints, and objectives. While efficiency is valuable, the government’s primary goal is to serve the public good, uphold democracy, and ensure fairness and justice, even at the cost of financial efficiency. The writings of Appleby, Cleveland, Waldo, Weber, and Wilson continue to reinforce the fundamental distinction between governance and business management.
Drawing on the classics of public administration, we can start to specify some constraints and goals for artificial intelligence (AI) and develop a “smart” but ethical government that is efficient but also responsive to public concerns.
Possibilities and Oversight
– AI systems used in government should be transparent, with open access to the data and algorithms used to make decisions. This allows for public scrutiny and accountability.
– Regular audits and oversight of AI systems are vital to ensure they function as intended and do not produce unintended consequences.
– AI systems should be designed to protect the privacy of citizens and ensure that their data is used responsibly and ethically.
– While AI can automate many tasks, human oversight is essential to ensure that AI systems are used in a way that aligns with ethical principles and societal values.
– AI can make government information more accessible to citizens, providing clear and concise explanations of policies and programs.
– AI can gather and analyze citizen feedback, providing valuable insights for policymaking and service delivery.
– AI can facilitate participatory governance, enabling citizens to contribute to decision-making processes and shape public policy.
Challenges and Considerations
– AI systems can perpetuate existing biases if not carefully designed and monitored. It’s essential to ensure that AI systems are fair and non-discriminatory.
– The automation of specific government tasks may lead to job displacement. It’s important to develop strategies for workforce transition and retraining.
– Building and maintaining public trust in AI is crucial for its successful adoption in government. This implementation requires a commitment to transparency, explainability, and accountability in all AI-related processes and decisions.
By carefully considering these opportunities and challenges and drawing on the wisdom of the classics of public administration, we can start to harness the power of AI to create a “smart” but ethical government that serves the public interest and promotes the well-being of all citizens. In future posts, I plan to draw on subsequent generations of public administration practitioners and scholars who provide more critical perspectives on the more complex government structures that have emerged in the last century. Women’s voices, such as Kathy Ferguson’s critique of bureaucracy and Stephanie Kelton’s critique of government budgeting, are extremely valuable perspectives going forward. AI is undoubtedly on the near horizon for government services, and it should be approached with the understanding that such systems can be designed for the public good, but that outcome is not guaranteed.
Citation APA (7th Edition)
Pennings, A.J. (2025, Feb 09) AI and Government: Concerns from Classic Public Administration Writings. apennings.com https://apennings.com/digital-geography/ai-and-government-concerns-from-classic-public-administration-writings/
Bibliography
Appleby, P. H. (1945). Big Democracy. Alfred A. Knopf.
Cleveland, H. (1965). The Obligations of Public Power. Public Administration Review, 25(1), 1–6.
Ferguson, K. (1984). The Feminist Case Against Bureaucracy. Temple University Press.
Goodnow, F. J. (1900). Politics and Administration: A Study in Government. Macmillan.
Kelton, S. (2020). The Deficit Myth: Modern Monetary Theory and the Birth of the People’s Economy. PublicAffairs.
Selznick, P. (1949). TVA and the Grass Roots: A Study in the Sociology of Formal Organization. University of California Press.
Simon, H. A. (1947). Administrative Behavior: A Study of Decision-Making Processes in Administrative Organizations. Macmillan.
Waldo, D. (1948). The Administrative State: A Study of the Political Theory of American Public Administration. Ronald Press.
Weber, M. (1922). Economy and Society: An Outline of Interpretive Sociology (G. Roth & C. Wittich, Eds.). University of California Press.
White, L. D. (1926). Introduction to the Study of Public Administration. Macmillan.
Wilson, W. (1887). The Study of Administration. Political Science Quarterly, 2(2), 197–222. https://doi.org/10.2307/2139277
Note: Several AI requests were prompted and parsed for this post.
© ALL RIGHTS RESERVED

Tags: bounded rationality > Dwight Waldo > Harlan Cleveland > Herbert Simon > Max Weber > Paul Appleby > Philip Selznick > Tennessee Valley Authority (TVA) > Woodrow Wilson
AI Governance and the Public Management of Transportation
Posted on | February 6, 2025 | No Comments
I’m doing an audit of my work on AI Governance of the Automatrix for publication. It’s my collection of posts on Artificial Intelligence (AI) Governance and the Automatrix (or maybe “Robomatrix”?). It is about the public management of the future transportation infrastructure as it becomes increasingly “smart” and electric.
Pennings, A.J. (2025, Feb 09) AI and Government: Concerns from Classic Public Administration Writings. apennings.com https://apennings.com/digital-geography/ai-and-government-concerns-from-classic-public-administration-writings/
Pennings, A.J. (2024, Nov 21) Google: Monetizing the Automatrix – Rerun. apennings.com https://apennings.com/global-e-commerce/google-monetizing-the-automatrix-2/
Pennings, A.J. (2024, Oct 10). All Watched over by Systems of Loving Grace. apennings.com https://apennings.com/how-it-came-to-rule-the-world/all-watched-over-by-systems-of-loving-grace/
Pennings, A.J. (2024, Jun 24). AI and Remote Sensing for Monitoring Landslides and Flooding. apennings.com https://apennings.com/space-systems/ai-and-remote-sensing-for-monitoring-landslides-and-flooding/
Pennings, A.J. (2024, Jun 22). AI and the Rise of Networked Robotics. apennings.com https://apennings.com/technologies-of-meaning/the-value-of-science-technology-and-society-studies-sts/
Pennings, A.J. (2024, Jan 19). How Do Artificial Intelligence and Big Data Use APIs and Web Scraping to Collect Data? Implications for Net Neutrality. apennings.com https://apennings.com/technologies-of-meaning/how-do-artificial-intelligence-and-big-data-use-apis-and-web-scraping-to-collect-data-implications-for-net-neutrality/
Pennings, A.J. (2024, Jan 15). Networking Connected Cars in the Automatrix. apennings.com https://apennings.com/telecom-policy/networking-in-the-automatrix/
Pennings, A.J. (2022, Apr 22). Wireless Charging Infrastructure for EVs: Snack and Sell? apennings.com https://apennings.com/mobile-technologies/wireless-charging-infrastructure-for-evs-snack-and-sell/
Pennings, A.J. (2019, Nov 26). The CDA’s Section 230: How Facebook and other ISPs became Exempt from Third Party Content Liabilities. apennings.com https://apennings.com/telecom-policy/the-cdas-section-230-how-facebook-and-other-isps-became-exempt-from-third-party-content-liabilities/
Pennings, A.J. (2021, Oct 14). Hypertext, Ad Inventory, and the Use of Behavioral Data. apennings.com https://apennings.com/global-e-commerce/hypertext-ad-inventory-and-the-production-of-behavioral-data/
Pennings, A.J. (2012, Nov 8) Google, You Can Drive My Car. apennings.com https://apennings.com/mobile-technologies/google-you-can-drive-my-car/
Pennings, A.J. (2014, May 28) Google, You Can Fly My Car. apennings.com https://apennings.com/ditigal_destruction/disruption/google-you-can-fly-my-car/
Pennings, A.J. (2020, Feb 9). It’s the Infrastructure, Stupid. apennings.com https://apennings.com/democratic-political-economies/from-new-deal-to-green-new-deal-part-3-its-the-infrastructure-stupid/
Pennings, A.J. (2010, Nov 20). How “STAR WARS” and the Japanese Artificial Intelligence (AI) Threat Led to the Internet. apennings.com https://apennings.com/how-it-came-to-rule-the-world/star-wars-creates-the-internet/
Pennings, A.J. (2018, Sep 27) How “STAR WARS” and the Japanese Artificial Intelligence (AI) Threat Led to the Internet, Part II. apennings.com https://apennings.com/how-it-came-to-rule-the-world/how-star-wars-and-the-japanese-artificial-intelligence-ai-threat-led-to-the-internet-japan/
Pennings, A.J. (2011, Jan 2) How “STAR WARS” and the Japanese Artificial Intelligence (AI) Threat Led to the Internet, Part III: NSFNET and the Atari Democrats. apennings.com https://apennings.com/how-it-came-to-rule-the-world/how-%e2%80%9cstar-wars%e2%80%9d-and-the-japanese-artificial-intelligence-ai-threat-led-to-the-internet-part-iii-nsfnet-and-the-atari-democrats/
Pennings, A.J. (2017, Mar 23) How “STAR WARS” and the Japanese Artificial Intelligence (AI) Threat Led to the Internet, Part IV: Al Gore and the Internet. apennings.com https://apennings.com/how-it-came-to-rule-the-world/how-%e2%80%9cstar-wars%e2%80%9d-and-the-japanese-artificial-intelligence-ai-threat-led-to-the-internet-al-gor/
Pennings, A.J. (2014, Nov 11). IBM’s Watson AI Targets Healthcare. apennings.com https://apennings.com/data-analytics-and-meaning/ibms-watson-ai-targets-healthcare/
Pennings, A.J. (2011, Jun 19) All Watched Over by Machines of Loving Grace – The Poem. apennings.com https://apennings.com/how-it-came-to-rule-the-world/all-watched-over-by-machines-of-loving-grace-the-poem/
Pennings, A.J. (2011, Dec 04) The New Frontier of “Big Data”. apennings.com https://apennings.com/technologies-of-meaning/the-new-frontier-of-big-data/
Pennings, A.J. (2013, Feb 15). Working Big Data – Hadoop and the Transformation of Data Processing. apennings.com https://apennings.com/data-analytics-and-meaning/working-big-data-hadoop-and-the-transformation-of-data-processing/
Pennings, A.J. (2014, Aug 30) Management and the Abstraction of Workplace Knowledge into Big Data. apennings.com https://apennings.com/technologies-of-meaning/management-and-the-abstraction-of-knowledge/
Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea, teaching financial economics and ICT for sustainable development, holding a joint appointment as a Research Professor at Stony Brook University. From 2002 to 2012, he was on the faculty of New York University, where he taught digital economics and information systems management. When not in Korea, he lives in Austin, Texas.
Citation APA (7th Edition)
Pennings, A.J. (2025, Feb 6) AI Governance and the Public Management of Transportation. apennings.com https://apennings.com/digital-coordination/ai-governance-and-the-public-management-of-transportation/
© ALL RIGHTS RESERVED

Tags: Artificial Intelligence (AI) > Connected Cars
Lotus 1-2-3, Temporal Finance, and the Rise of Spreadsheet Capitalism
Posted on | February 3, 2025 | No Comments
One of the books I read during my PhD years was Barbarians at the Gate: The Fall of RJR Nabisco (1989), about the $25 billion leveraged buyout (LBO) of the iconic tobacco-and-snacks conglomerate by Kohlberg Kravis Roberts & Co. (KKR). An LBO is the purchase of a company using high amounts of short-term debt, with the target company’s assets as collateral for the loans. The purchaser or “raider” plans to pay off the debt using future cash flow. KKR’s LBO of RJR Nabisco became the largest and most famous LBO of its time and a major influence on my thinking about the role of digital spreadsheets and private equity in the economy.[1]
Barbarians at the Gate rarely mentions the role of spreadsheets, but I also had my interest sparked by Oliver Stone’s movie Wall Street (1987), which had a similar theme. It is a story about a fictional corporate raider called Gordon Gekko taking over a fictional company called Bluestar Airlines. Gekko mentors Bud Fox, a young financial analyst who is anxious to be successful. The movie draws on the gangster genre, with Stone replacing the iconic guns and cars of the traditional genre with spreadsheets and cellular telephones. I wrote about the movie in my 2010 post “Figuring Criminality in Oliver Stone’s Wall Street,” where I identified digital spreadsheets as one of the “weapons” used by financial raiders.
This post looks at the early use of digital spreadsheets and two aspects of modern capitalism that emerged during the 1980s with the personal computer and a significant spreadsheet that was dominant before Microsoft’s Excel. The LBO and the use of “junk bonds” emerged in conjunction with digital technology and new financial techniques that reshaped major corporations and the organization of modern finance.
The concept of time is central to the “temporal finance” of spreadsheet capitalism. It empowers techniques like forecasting, modeling, risk analysis, and decision-making that depend on time-based variables, such as cash flows, interest rates, investment returns, or market trends. These techniques translate into debt instruments like bonds, mortgages, and loans that promise future repayment with interest, acknowledging the time value of money.
Temporal finance plays a critical role in the development of spreadsheet capitalism. In disciplines like corporate finance, portfolio management, and financial engineering, spreadsheets drew on established temporal practices and alphanumeric culture yet shaped a PC-enabled technical knowledge that transformed the global political economy and filtered out into general use.
I am using the term “spreadsheet capitalism” to refer to a type of rationality brought to the political economy due to the computer application’s confluence of capabilities exemplified by Lotus 1-2-3. The spreadsheet integrated previous political technologies such as the list and table combined with accelerated processing speeds, gridmatic visibility, and the interactivity and intimacy enabled by the microprocessing “chip” in the affordable PC.
Spreadsheets integrated temporal variables into financial decision-making and accelerated the influence of innovations in financial modeling, software, and decision support systems. Early spreadsheet software, like Lotus 1-2-3 and later Excel, allowed users to automate calculations across time periods using formulas, macros, and built-in functions. Financial analysts could efficiently calculate and project periodic growth rates, depreciation schedules, or portfolio returns without manually recalculating each period.
Lotus 1-2-3 effectively leveraged the PC’s keyboard and cathode-ray display to provide a clean, ASCII text-based interface that was easy for financial professionals to learn and use. While it was less feature-rich than later tools like Microsoft Excel, several functions and formulas in Lotus 1-2-3 were particularly valuable for LBO modeling. Not surprisingly, a key function was @SUM(), which added the values in a range of cells. For example, @SUM(D1..D7) would total cells D1 through D7, such as revenue, costs, or aggregate cash flow over time. @ROUND() was often used to clean up financial outputs for reporting, ensuring that figures in the Indo-Arabic positional numbering system were rounded to the nearest dollar, thousand, or million. More functions and formulas used by financial raiders and private equity firms will be discussed below.
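To make the worksheet logic concrete, here is a minimal sketch in Python of what @SUM() and @ROUND() accomplished in such a model. The quarterly cash-flow figures and the “D1..D8” range are hypothetical, used only for illustration.

```python
# A rough Python equivalent of the two Lotus functions named above.
quarterly_cash_flows = [12.4, 13.1, 11.8, 14.2,
                        14.9, 15.3, 15.0, 16.1]   # $ millions, "cells" D1..D8

total = sum(quarterly_cash_flows)                 # @SUM(D1..D8)
print(f"Aggregate cash flow: ${round(total, 1)}M")  # @ROUND(total, 1) for reporting
```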
Reaganomics and the Digital Spreadsheet
The larger context for the emergence of the spreadsheet economy was the “Reagan Revolution” and its emphasis on deregulation, dollar strength, tax incentives, and a pro-finance climate. These changes created fertile ground for the rapid growth of the spreadsheet economy. The Economic Recovery Tax Act of 1981 lowered capital gains taxes, making investments in LBOs more attractive to private equity firms and individual investors. Legislative changes in depreciation schedules also allowed firms to write off investments faster, improving cash flow and making acquisitions more financially feasible. Relaxed corporate tax laws allowed companies to deduct significant interest payments on debt, a cornerstone of LBO financing, and incentivized the heavy use of leverage in buyouts.
In this environment, temporal forecasting, modeling, and risk analysis during the 1980s and early 1990s enabled the complex calculations and scenarios that made corporate raiding and other types of mergers and acquisitions (M&A) possible. Users could automate calculations across future time periods using formulas, macros, and built-in functions, computations that had previously been too cumbersome to perform manually. These capabilities made spreadsheets instrumental in structuring and executing many major business deals and LBOs during the Reagan era.
Spreadsheets worked with several new technologies that emerged at the time to fundamentally transform corporate America and diffuse into many industries and social activities. This post will focus more on the basic capabilities of the PC that enabled the CP/M and MS-DOS spreadsheets. Other posts will track the developments in GUI environments that enabled Mac and Windows-based spreadsheets.
Standardization of IBM PCs and the IBM “Compatibles” for Lotus 1-2-3
While the Reagan Revolution of the 1980s created an economic and regulatory environment that made LBOs particularly attractive, the IBM PC and its compatibles became their initial workhorses. IBM had been the dominant mainframe computer producer but noticed it was “losing the hearts and minds” of computer users to the Apple II “microcomputer” and its software, such as the game Alien Rain, the AppleWriter word processor, and especially the VisiCalc spreadsheet. In response to popular demand in the business world, IBM created its own microcomputer, based on a new Intel microprocessor.
IBM released the Personal Computer (PC) in 1981 to entice the business community back to “Big Blue,” as the New York-based computer company was sometimes called. After procuring an operating system (OS) from “Micro-Soft,” it went on sale in August 1981. Early PCs were powered by Intel’s 8088, a 16-bit microprocessing chip used for its central processing unit (CPU). Although limited by the era’s hardware, the 8088 allowed Lotus 1-2-3 to process larger datasets than previous-generation microprocessors, enabling businesses to manage more comprehensive financial information.
The combination of Lotus 1-2-3’s features and the 8088’s performance made the software versatile for various financial tasks, from simple bookkeeping to advanced financial modeling. The 8088, running at 4.77 MHz in the original PC and up to 10 MHz in later compatibles, delivered significant computational power for its time, enabling fast data processing and calculations.
The 8088 represented roughly a 50-fold speed increase over the revolutionary Intel 4004, the chip that inspired Bill Gates to leave Harvard and start “Micro-Soft.” Although primarily focused on developing software, Microsoft took advantage of an opportunity to buy and configure an operating system, MS-DOS, for IBM. In a historic move, Gates would outmaneuver the computer giant, however, and offer MS-DOS to the many new “IBM-compatible” microcomputers that were based on reverse-engineering the PC.
Working on many new PCs such as the Compaq and Dell, MS-DOS allowed Lotus 1-2-3 to dominate the market, despite Microsoft’s release of its Multiplan spreadsheet in 1982. Despite using the old-style command-line interface, the new spreadsheets could handle real-time financial updates, giving users the ability to recalculate entire spreadsheets almost instantly. With MS-DOS, Lotus 1-2-3 became the de facto spreadsheet tool for businesses.
The widespread use of the 8088 established the PC as a standard computing platform, encouraging software developers like Lotus to optimize their products for this architecture. The popularity of the 8088 and Lotus 1-2-3 fostered a growing arsenal of compatible software, add-in boards, and other hardware for storing data and printing charts, further amplifying its utility for financial purposes.
The 8088 could comfortably run Lotus 1-2-3’s integration of spreadsheet, charting, and database functions in a single program. Financial professionals could perform calculations, visualize data, and manage records without needing additional tools.
In early 1983, Lotus 1-2-3 was released after a long incubation period during which it was programmed in assembly language for faster performance. Lotus could also run on “IBM-compatible” machines, such as the Compaq portable computer that came out a few months later. Lotus 1-2-3 became known for its ability to (1) visually calculate formulas, (2) function like a database, and (3) turn data into charts. These features made the software versatile for various financial tasks, from simple bookkeeping to advanced financial modeling. Lotus 1-2-3 played a pivotal role in the emergence of spreadsheet capitalism during the 1980s, and its functions and logic informed the principles of modern LBO modeling.
Spreadsheets Empower Corporate Raiding
The consolidation of corporations in the 1960s and early 1970s, mainly through mergers, created the conditions that fueled the rise of corporate raiders in the 1980s. Expanding conglomerates often created inefficient bureaucracies with poor management by acquiring companies in unrelated industries. Slow decision-making, short-term planning, and internal competition for resources hid the value of their subsidiary companies. Leveraged buyouts, enabled by Lotus 1-2-3 and junk bonds, provided the financial firepower for corporate raiders to execute hostile takeovers and break up these companies for big profits.
Corporate raiders would model the complex financing of LBOs, in which a company is acquired primarily with borrowed money. These raiders would input a target company’s financial data into a spreadsheet to assess the company’s value, analyze different scenarios, and identify areas where costs could be cut or assets sold off to increase profitability. They would adjust variables like interest rates, debt levels, and projected cash flows to determine the feasibility and profitability of the LBO, as in the sketch below. The spreadsheet results served as compelling presentations to banks and investors, showcasing the potential returns of the LBO and convincing them to provide the necessary funding. The 1980s saw a wave of high-profile takeovers, often leading to significant restructuring and changes in the corporate landscape.
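As a rough illustration of that workflow, the following Python sketch projects whether a debt-financed purchase can service its interest and pay down principal under different assumptions. The model is deliberately simplified and every figure is hypothetical; it stands in for the far more detailed worksheets an analyst would actually build.

```python
# Toy LBO feasibility model: sweep excess cash flow into debt paydown
# and report the ending balance. All inputs are hypothetical, in $ millions.

def project_lbo(purchase_price, equity_pct, interest_rate,
                year1_cash_flow, growth, years=7):
    """Return (ending_debt, covered): covered is False if cash flow
    ever failed to cover the interest due in a year."""
    debt = purchase_price * (1 - equity_pct)   # debt-financed portion
    cash_flow = year1_cash_flow
    for _ in range(years):
        interest = debt * interest_rate
        if cash_flow < interest:               # cannot even service the debt
            return debt, False
        debt = max(debt - (cash_flow - interest), 0.0)  # pay down principal
        cash_flow *= 1 + growth                # projected cash-flow growth
    return debt, True

# Adjust one lever, as an analyst would, and watch feasibility change.
for rate in (0.09, 0.12, 0.15):
    debt_left, covered = project_lbo(purchase_price=1000, equity_pct=0.10,
                                     interest_rate=rate,
                                     year1_cash_flow=110, growth=0.04)
    print(f"interest {rate:.0%}: ending debt {debt_left:6.1f}M, "
          f"interest covered: {covered}")
```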
LBOs in the 1980s included several prominent cases. Notable were the breakups of RJR Nabisco (1988) and the Beatrice Companies (1986), conducted by KKR, an up-and-coming investment firm founded in 1976 by Jerome Kohlberg Jr. and cousins Henry Kravis and George R. Roberts, all of whom had previously worked together at Bear Stearns. KKR became a leader in the LBO space and relied heavily on computer analysis and spreadsheets to structure its deals.
RJR-Nabisco
The RJR Nabisco buyout was one of the most famous and largest leveraged buyouts in history, and it perfectly illustrates how these deals worked, including the subsequent asset sales to repay debt. RJR Nabisco was a massive conglomerate, owning tobacco brands (Winston, Salem) and food brands (Nabisco, Oreo). KKR borrowed heavily (mainly through junk bonds) to finance the acquisition, minimizing the amount of its own capital needed. The final price was a staggering $25 billion, a record at the time. This massive figure was only possible due to financial analysis tools such as the spreadsheet and the high-yield, high-risk junk bonds that will be discussed in a future post.
KKR’s core strategy was similar to other LBOs: take control of the company through an LBO paid for with large loans; identify non-core assets and divisions that could be sold off and divest them to generate cash; then use the proceeds from asset sales to pay down the often massive debt incurred in the LBO. KKR and its investors would profit from any remaining value after debt repayment.
The sheer size of the RJR Nabisco deal meant KKR had to raise an enormous amount of debt. This borrowing was facilitated by investment bankers and, increasingly, the junk bond market. KKR proceeded to sell off various RJR Nabisco assets, including several food brands and overseas operations; anything that wasn’t considered core to the remaining business was on the table. The money from these sales went directly to paying down the principal and interest on the LBO debt. While the deal was controversial, KKR and its investors made a substantial profit, and RJR Nabisco emerged significantly smaller and more focused after the asset sales.
The firm’s ability to efficiently model debt financing and equity returns gave it a competitive edge. The deal’s complexity required detailed modeling of debt structures, cash flow scenarios, and potential equity returns. These could be calculated and managed using Lotus 1-2-3, which enabled models of loan amortization schedules showing how much of each payment goes toward principal and interest. Key functions included @PMT() to calculate the fixed periodic loan payment, @IPMT() to calculate the interest portion of a payment, and @PPMT() to calculate the principal portion. These formulas could also model different types of debt, such as bullet repayments, fixed-rate loans, or variable-rate loans, and the analysis could include early repayments or refinancings in the schedule to determine total debt cost. A compact re-creation of this amortization logic follows.
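Here is a minimal Python sketch of the @PMT/@IPMT/@PPMT logic applied to a fixed-rate amortization schedule. The loan terms are invented for illustration; Lotus’s own functions took additional arguments and conventions not reproduced here.

```python
# Fixed-rate amortization in Python, mirroring @PMT/@IPMT/@PPMT.
# Hypothetical terms: $500M of acquisition debt at 12% over 8 years.

def pmt(rate, nper, pv):
    """Fixed periodic payment on a loan (the @PMT() calculation)."""
    return pv * rate / (1 - (1 + rate) ** -nper)

def amortization_schedule(rate, nper, pv):
    payment = pmt(rate, nper, pv)
    balance = pv
    for period in range(1, nper + 1):
        interest = balance * rate        # @IPMT(): interest portion
        principal = payment - interest   # @PPMT(): principal portion
        balance -= principal
        yield period, payment, interest, principal, max(balance, 0.0)

for yr, pay, inte, prin, bal in amortization_schedule(0.12, 8, 500.0):
    print(f"yr {yr}: pay {pay:6.1f}M  interest {inte:6.1f}M  "
          f"principal {prin:6.1f}M  balance {bal:6.1f}M")
```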
The RJR Nabisco LBO became a symbol of the excesses of the 1980s, highlighting the power (and risks) of leveraged buyouts and junk bonds. It also led to increased scrutiny of these types of deals, although they continued as spreadsheet capitalism spread.
Beatrice Foods
Beatrice Foods was a massive conglomerate with a diverse portfolio of food brands (Hunt’s Ketchup, Wesson Oil, etc.) and other businesses.
KKR borrowed heavily to finance the acquisition, allowing them to purchase a large company with relatively little of their own capital. The Beatrice acquisition was another of the largest LBOs at the time, valued at approximately $6.2 billion.
Using computer analysis and Lotus 1-2-3, they modeled Beatrice’s sprawling operations and assessed the feasibility of breaking the company into smaller, more manageable pieces. The deal’s complexity again required detailed modeling of debt structures, cash flow scenarios, and potential equity returns, which were conducted using the PC-enabled spreadsheet.
After extended deliberations, KKR completed the purchase of Beatrice Companies in April 1986 and proceeded to break it up. KKR sold off many of Beatrice’s non-core businesses, including those in areas like chemicals and construction, and focused on strengthening Beatrice’s core businesses, primarily in the food and beverage sector. The portfolio included brands like Chef Boyardee, Samsonite, and Tropicana.
Promus
Another important deal was Promus Companies’ acquisition of Harrah’s Entertainment, a 1989 LBO transaction that depended on detailed modeling of casino revenues and operational expenses. Again, this was made feasible by Lotus 1-2-3, given its dominance at the time. LBOs require complex financial modeling to project cash flows and analyze the target company’s future earnings potential. Analysts also needed to determine the optimal debt levels and repayment schedules and to assess the impact of different assumptions, such as interest rates and revenue growth, on the deal’s profitability.
Raiding financiers and associated firms leveraged Lotus 1-2-3 to simulate financial outcomes and quickly adjust models as negotiations or deal terms changed. Estimating the company’s value under different scenarios and determining whether the target company could generate enough cash to service debt under different economic and operational conditions was crucial. This estimation required precise tracking of various tranches of debt, their repayment terms, and interest coverage.
The Blackstone Group
Blackstone also started with the help of spreadsheets like Lotus 1-2-3 for its early buyouts, including real estate and private equity deals. Lotus 1-2-3 provided the necessary tools for these complex financial analyses, which were used to organize data, build complex economic models, and perform calculations. Macros were available to automate repetitive tasks and improve efficiency in the modeling process. Blackstone also used Lotus to generate charts, graphs, and other visualizations to help analyze investment performance and make presentations in the boardroom.
LBO models can become complex, requiring intricate formulas and linkages between spreadsheet sections, and their accuracy relies heavily on the accuracy of the underlying data inputs. Lotus 1-2-3 could perform sensitivity analyses by changing key assumptions (e.g., interest rates, revenue growth) to understand their impact on the model’s output.
Spreadsheet Formulas and the Temporal Imperative
Time is a crucial factor in capitalism and its financial investments, but it was only after the West’s social transformation of time and sacrifice that investment took its current priority. Religious discipline, which structured earthly time for heavenly reward, met with the Reformation in 16th-century Europe to produce a new calculative rationality: financial investment.[Pennings Dissertation] Also, by solidifying time in alphanumeric base-12 and base-60 measures (60-minute hours, 24-hour days, 360-day commercial years), a new correlation, investment over time, gained prominence. Sacrificing spending in the present for payoffs in the future was the cultural precondition for spreadsheet capitalism.[4]
The analysis of the time value of money (TVM) was critical for LBOs, particularly for valuing a target company, determining debt service, and return on investment, as well as understanding and managing the risks associated with the LBO. TVM calculations were time-consuming and tedious, often requiring financial tables or manual calculations using formulas.
Digital spreadsheets significantly accelerated and improved the analysis of TVM by automating calculations, enabling “what-if” analysis, increasing accessibility, and enhancing visualization. Lotus 1-2-3 introduced built-in financial functions, such as PV (Present Value), FV (Future Value), PMT (Payment), RATE, and NPV (Net Present Value). These functions simplified TVM calculations that would otherwise require extensive manual work or financial calculators. Instead of manually solving the compound interest formula to find future value, users could simply input values (e.g., interest rate, periods, and payment) into the FV function. Spreadsheets allow users to quickly change input variables (interest rates, cash flows, and time periods) and instantly see the impact on the TVM calculations.
TVM is based on the notion that a dollar today is worth more than a dollar in the future due to its earning potential. The compounding formula FV = PV × (1 + i/f)^(n·f), where PV is present value, i the annual interest rate, f the number of compounding periods per year, and n the number of years, has empowered individuals and businesses to make more informed financial decisions. Spreadsheets allow users to create charts and graphs to visualize TVM concepts, such as the impact of compounding interest over time or the relationship between present value and future value.
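For concreteness, the compounding formula and its inverse can be expressed as two small Python functions; the dollar amounts and rates below are illustrative only.

```python
# The compounding formula above, and its inverse, as Python functions.

def future_value(pv, i, n, f=1):
    """FV = PV * (1 + i/f) ** (n*f): value of pv after n years
    at annual rate i, compounded f times per year."""
    return pv * (1 + i / f) ** (n * f)

def present_value(fv, i, n, f=1):
    """Discount fv back n years at annual rate i (the inverse of FV)."""
    return fv / (1 + i / f) ** (n * f)

print(round(future_value(1000, 0.08, 10), 2))        # 2158.92, annual compounding
print(round(future_value(1000, 0.08, 10, f=12), 2))  # 2219.64, monthly compounding
print(round(present_value(2158.92, 0.08, 10), 2))    # ~1000.00, back to PV
```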
A vital formula converted to the Lotus spreadsheet was Present Value, @PV(), a crucial tool for analyzing companies. It provided a foundation for evaluating, in present terms, the worth of future cash flows from raided companies or their parts. Companies generate cash flows over time, and analyzing them with PV ensures that delayed returns are appropriately considered and valued. PV helps distinguish between high-growth opportunities that justify higher valuations and overvalued prospects with limited potential.
PV quantifies this by discounting future cash flows to reflect their value today. The calculation is critical in decision-making, whether assessing investments, valuing a company, or comparing financial alternatives. Present Value also underlies the internal rate of return (IRR) and net present value (NPV), the latter being the difference between the present value of cash inflows and the present value of cash outflows over a period of time. NPV is used in capital budgeting and investment planning to analyze a project’s projected profitability.
A related temporal technique is Future Value, @FV(), which was developed to project future cash or investment values. It calculates what money is expected to be worth at a future date based on current growth trends, and it is particularly useful for debt paydown schedules and residual equity valuation. A companion function, @IRR(), computed the Internal Rate of Return. These calculations were crucial for evaluating the return on investment for equity holders, a core metric in LBOs.
Net Present Value, @NPV(), helped assess the profitability of an investment by calculating the value of projected cash flows discounted at the required rate of return. @NPV was crucial because it allowed users to input a discount rate (representing the cost of capital) and a series of future cash flows, and the function would calculate the present value of those cash flows.
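Assuming the usual end-of-period convention, the discounting behind @NPV(), and the root-finding behind @IRR(), can be sketched in a few lines of Python. The cash flows are invented for the example; note that, like Lotus’s @NPV, this function discounts every flow, so the period-0 outlay sits outside it.

```python
# Discounting and root-finding in the spirit of @NPV() and @IRR().

def npv(rate, cash_flows):
    """Present value of flows arriving at the end of periods 1..n."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, 1))

def irr(outlay, cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Rate at which the flows' NPV equals the initial outlay (bisection)."""
    f = lambda r: npv(r, cash_flows) - outlay
    mid = (lo + hi) / 2
    for _ in range(200):
        mid = (lo + hi) / 2
        if abs(f(mid)) < tol:
            break
        lo, hi = (mid, hi) if f(mid) > 0 else (lo, mid)
    return mid

flows = [30, 35, 40, 45, 50]   # projected inflows, $ millions (hypothetical)
outlay = 120                   # period-0 purchase price, $ millions
print(round(npv(0.10, flows) - outlay, 2))  # NPV at a 10% discount rate
print(round(irr(outlay, flows), 4))         # ~0.18, i.e., roughly an 18% IRR
```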
@IF() could determine whether a debt covenant had been breached or whether excess cash should be used for debt repayment. Payment, @PMT(), was useful for calculating the periodic payment required for a loan, considering principal, interest, and term.
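A compact example of that @IF() covenant test, re-expressed in Python with hypothetical figures:

```python
# @IF(coverage < 2, "breach", "sweep cash"), roughly. Figures in $ millions.
ebitda, interest, covenant_min = 150.0, 60.0, 2.0   # 2.0x coverage floor
coverage = ebitda / interest
if coverage < covenant_min:
    print(f"coverage {coverage:.2f}x: covenant breached")
else:
    print(f"coverage {coverage:.2f}x: sweep {ebitda - interest:.0f}M to paydown")
```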
Conclusion
Lotus 1-2-3’s capabilities on IBM and “IBM-compatible” personal computers allowed private equity firms to confidently pursue larger and more complex deals by providing a reliable platform for financial forecasting and decision-making. The tool’s role in shaping LBO strategies contributed to the emergence of private equity as a dominant force in corporate finance. Many fundamental modeling practices in these landmark deals continue to underpin private equity and LBO analyses today, albeit with more advanced tools like Microsoft Excel.
By providing the computational power needed for sophisticated spreadsheet software, the Intel 8088 chip enabled Lotus 1-2-3 to become a powerful tool for financial analysis, transforming how businesses managed and analyzed financial data in the 1980s. The 8088’s arithmetic capabilities allowed Lotus 1-2-3 to execute complex financial formulas and algorithms quickly, making it suitable for forecasting, budgeting, and economic modeling tasks.
Summary
This article explores the role of digital spreadsheets, particularly Lotus 1-2-3, in the rise of leveraged buyouts (LBOs) and “junk bonds” during the 1980s, a phenomenon termed “spreadsheet capitalism.” The author argues that spreadsheets, combined with the economic policies of the Reagan era, enabled the complex financial modeling and analysis necessary for these deals, transforming corporate finance and the broader political economy.
Spreadsheets like Lotus 1-2-3 allowed financiers to analyze target companies, model LBO financing, and present compelling cases to investors. This facilitated the wave of LBOs in the 1980s, exemplified by deals like RJR Nabisco and Beatrice Foods.
Spreadsheets enabled sophisticated financial modeling across time periods, incorporating factors like cash flows, interest rates, and investment returns. This “temporal finance” became crucial for LBOs and other financial instruments.
The IBM PC and its compatibles, powered by Intel’s 8088 microprocessor, provided the hardware platform for Lotus 1-2-3 to thrive. The spreadsheet’s features, combined with the 8088’s processing power, made it a versatile tool for financial professionals.
Reaganomics is another key point: the article argues that the “Reagan Revolution,” with its emphasis on deregulation, tax cuts, and a pro-finance climate, created a favorable environment for LBOs and the use of spreadsheets in finance.
The article highlights specific Lotus 1-2-3 functions and formulas, such as @SUM, @ROUND, @PV, @FV, @NPV, and @IF, that were crucial for LBO modeling and financial analysis.
Spreadsheets automated complex calculations related to the time value of money, making it easier to evaluate investments and compare financial alternatives.
The author concludes that spreadsheets played a pivotal role in the rise of LBOs and the transformation of corporate finance in the 1980s. The ability to model complex financial scenarios and analyze the time value of money empowered financiers to pursue larger and more complex deals, contributing to the emergence of private equity as a major force in the economy.
Citation APA (7th Edition)
Pennings, A.J. (2025, Feb 3) Lotus 1-2-3, Temporal Finance, and the Rise of Spreadsheet Capitalism. apennings.com https://apennings.com/how-it-came-to-rule-the-world/digital-monetarism/lotus-1-2-3-temporal-finance-and-the-rise-of-spreadsheet-capitalism/
Notes
[1] Barbarians at the Gate was written by investigative journalists Bryan Burrough and John Helyar, based upon a series of articles written for The Wall Street Journal. The book was also made into a made-for-TV movie by HBO in 1993. The book centers on F. Ross Johnson, the CEO of RJR Nabisco, who planned to buy out the rest of the Nabisco shareholders.
[2] Lotus 1-2-3 v2.3 Functions and Macros Guide.
[3] Differences between Microsoft Excel and Lotus 1-2-3.
[4] The concept of time is central to the creation of debt instruments like bonds, mortgages, and loans. These instruments promise future repayment with interest, acknowledging the time value of money. The Reformation, particularly in 16th-century Europe, profoundly transformed the concept of religious sacrifice, redirecting its focus from traditional spiritual practices such as indulgences and pilgrimages to a more personal, moral, and communal framework of financial responsibility and economic participation. Driven by figures like Martin Luther and John Calvin, the Reformation emphasized salvation through faith alone (sola fide), as opposed to salvation through works or financial contributions to the Church.
Sacrificial acts, such as indulgences (payments to reduce punishment for sins) or pilgrimages, were denounced; personal piety and moral rectitude became the markers of faith instead.
The emergent Protestantism emphasized a form of asceticism that discouraged excessive spending on luxuries and instead encouraged investment in one’s household, community, and vocation as acts of divine service. Calvinist teachings in particular associated hard work, frugality, and the accumulation of wealth with signs of God’s favor, framing secular work and financial investment as forms of religious duty and legitimizing economic activity as an expression of faith. Financial stewardship, managing wealth responsibly for the benefit of family and society, was seen as a spiritual obligation, transforming economic practices into acts of religious significance. This reframing of religious sacrifice as financial responsibility and moral investment influenced economic development; the encouragement of disciplined financial behavior and reinvestment contributed to the rise of capitalist economies in Protestant regions. The transformation redefined the role of the individual in the faith community, linking personal piety with economic productivity and reshaping the societal understanding of sacrifice as a moral and practical investment in the future rather than a direct transaction with the divine.
Hypertext References (APA Style)
Burrough, B. and Helyar, J. (1990). Barbarians at the Gate: The Fall of RJR Nabisco. New York: Harper & Row.
Corporate Finance Institute. (n.d.). Reaganomics. Retrieved from https://billofrightsinstitute.org/essays/ronald-reagan-and-supply-side-economics
Investopedia. (n.d.). Net Present Value (NPV). Retrieved from https://www.investopedia.com/terms/n/npv.asp
Investopedia. (n.d.). Internal Rate of Return (IRR). Retrieved from https://www.investopedia.com/terms/i/irr.asp
Investopedia. (n.d.). Residual Equity Theory. Retrieved from https://www.investopedia.com/terms/r/residual-equity-theory.asp
Pennings, A. (2010). Figuring Criminality in Oliver Stone’s Wall Street. Retrieved from https://apennings.com/2010/05/01/
The Economist. (2024, October 15). Why Microsoft Excel Won’t Die. Retrieved from https://www.economist.com/business/2024/10/15/why-microsoft-excel-wont-die
© ALL RIGHTS RESERVED

Tags: IBM PC > Intel 8088 > LBOs > leveraged buyouts > Lotus 1-2-3 > Reagan Revolution > time value of money (TVM)
Steve Jobs’ Pioneering Work in Human-Machine Interaction (HMI) and Human-Machine Knowledge (HMK)
Posted on | January 24, 2025 | No Comments
Hailing from Silicon Valley during the age of the counter-cultural movement, Steve Jobs was a keen observer of social and technological events. As a result, he significantly impacted the development of human-machine interaction (HMI) and Human-Machine Knowledge (HMK) through his work at Apple and NeXT.
Humans, at their best, excel at creativity, judgment, and complex reasoning, while machines excel at accuracy, speed, storage, processing, and speedily transmitting vast amounts of data. HMI, also known as MMI or man–machine interaction, is defined as interaction between human operators and devices through multiple interfaces.
Human-Machine Knowledge (HMK) is an emerging interdisciplinary field that explores how humans and machines can effectively collaborate, construct, and share knowledge to achieve outcomes that surpass what either could accomplish alone. HMK emphasizes creating systems where humans and machines complement each other’s strengths. HMK seeks to create a symbiotic relationship between humans and machines, where they can leverage each other’s strengths to achieve unprecedented levels of knowledge, innovation, learning, and progress.
Steve Jobs provides an interesting point of departure and discussion vehicle for understanding these two areas. He played a pivotal role in advancing human-machine interactivity and reshaping how we engage with technology and knowledge. His contributions spanned hardware, software, and design, fundamentally changing the way people interact with computers and other devices.
Working with Steve Wozniak, Jobs emphasized craftsmanship, simplicity, and purpose, inspiring a generation of designers, engineers, and technologists to prioritize simplicity and user-centric design, first with the Apple II and later with other products. Jobs believed in the power of simplicity, stripping away unnecessary complexity to create products that were easy to understand and use.
Jobs popularized the Graphical User Interface (GUI) with the Lisa and Macintosh “microcomputers.” Drawing on WIMP (Windows, Icons, Menus, and Pointers) technologies developed at Xerox PARC for the Alto computer, he worked to make computers more intuitive and accessible to the masses. This revolutionized how people interacted with technology, moving beyond text commands to hand movements.
The Lisa and Mac introduced WYSIWYG (What You See Is What You Get) editing, allowing users to see their work on screen as it would appear in print, significantly improving the user experience. Paired with the laser printer, this made desktop publishing possible. Jobs also had a great appreciation for different fonts and prioritized their integration into the Mac.
Jobs championed the use of the mouse, making computer interaction more natural and intuitive. Doug Engelbart’s famous 1968 “Demo” showcased the work of the Augmentation Research Center at Stanford Research Institute (SRI), which developed computer technologies to enhance or “augment” human performance. The center innovated the mouse and early prototypes of many of the interface technologies that would be in common use in personal computers by the late 1980s. Jobs was instrumental in adopting and refining the mouse as a critical tool for interacting with GUIs. By reducing its cost and simplifying its design, the Macintosh made the mouse a standard input device.
Jobs emphasized the importance of user-centric design, focusing on creating products that were not only functional but also aesthetically pleasing and enjoyable to use. Jobs championed design thinking, emphasizing that technology should be intuitive and beautiful. His mantra, “It just works,” drove Apple to create devices that required minimal technical expertise, democratizing technology for the masses. It’s the subtle way that Apple builds its software and hardware to work together in a seamless experience for users.
Apple software, from Mac OS to iOS, also strove for ease of use and intuitive design, making technology more accessible to a wider range of users. Jobs pioneered a seamless integration of hardware and software, ensuring that devices were not only functional but delightful to use. This holistic approach elevated user experience to new heights.
The iPod (2001) and then the iPhone (2007) revolutionized human-machine interaction, the iPhone with a multi-touch interface that eliminated physical buttons in favor of gestures like swiping, pinching, and tapping. Jobs refined the use of multi-touch gestures on the iPhone and iPad, making interactions more fluid and natural. The iPad blurred the line between consumption and creation, enabling users to read, write, draw, and interact in entirely new ways. The App Store extended this revolution, creating a vibrant online technosystem for accessing apps that enhanced knowledge, tools, and entertainment.
Jobs incorporated and refined technologies like Siri, Apple’s virtual assistant, which introduced natural language processing and voice interaction, making machines more responsive to human needs. Siri brought voice control to the mainstream, enabling more intuitive and hands-free interaction with devices.
Jobs instilled a deep appreciation for user experience across the technology industry, leading to a greater focus on creating products that are not only functional but also enjoyable and intuitive to use.
These contributions have fundamentally changed how we interact with technology, making it more accessible, intuitive, and enjoyable for billions of people worldwide.
Summary
Steve Jobs, a counter-cultural visionary from Silicon Valley, profoundly influenced Human-Machine Interaction (HMI) and Human-Machine Knowledge (HMK) through his work at Apple and NeXT. He emphasized simplicity, user-centric design, and seamless integration of hardware and software, fundamentally changing how people interact with technology and access knowledge.
His contributions to Human-Machine Interaction (HMI) included simplified design, GUI interaction, WYSIWYG editing, as well as touch (haptic) and voice interaction. Jobs prioritized intuitive, user-friendly designs, removing unnecessary complexity to make technology accessible to all. By using GUI and integrating WIMP technologies in the Lisa and Macintosh, Jobs replaced text-based commands with intuitive visual interactions. Inspired by Doug Engelbart’s innovations, Jobs refined and popularized the mouse, making it affordable and central to personal computing.
The Lisa and Macintosh introduced “What You See Is What You Get” editing, enabling users to see their work as it would appear in print and advancing tools like desktop publishing. The iPhone and iPad revolutionized interaction with multi-touch gestures such as swiping and pinching, offering a more natural and fluid user experience. With Siri, Jobs introduced natural language processing to mainstream devices, enabling hands-free, intuitive interactions.
His contributions to Human-Machine Knowledge (HMK) included democratizing knowledge, creating the App Store technosystem, and enhancing creativity with technology to empower users. Devices like the iPod, iPhone, and iPad provided tools for learning, creativity, and communication in a portable and accessible format. Jobs fostered an ecosystem of applications, enabling users to access tools for innovation, education, and entertainment, thereby enhancing knowledge-sharing and creation. He also blended artistic design with technological innovation, making products that were both functional and inspiring. His designs empowered individuals to create, manipulate, and share knowledge, bypassing traditional technical or bureaucratic barriers.
Humans, at their best, excel at creativity, judgment, and complex reasoning. In contrast, machines excel at accuracy, speed, storage, processing, and speedily transmitting vast amounts of data. HMI, also known as MMI or man-machine interaction, is defined as the interaction between human operators and devices through multiple interfaces.
Human-Machine Knowledge (HMK) is an emerging interdisciplinary field that explores how humans and machines can effectively collaborate, construct, and share knowledge to achieve outcomes that surpass what either could accomplish alone. HMK emphasizes creating systems where humans and machines complement each other’s strengths. HMK seeks to create a symbiotic relationship between humans and machines, where they can leverage each other’s strengths to achieve unprecedented levels of knowledge, innovation, learning, and progress.
Steve Jobs provides an interesting point of departure and discussion vehicle for understanding these two areas. He was pivotal in advancing human-machine interactivity and reshaping how we engage with technology and knowledge. His contributions spanned hardware, software, and design, fundamentally changing how people interact with computers and other devices.
Working with Steve Wozniak, Jobs’ emphasis on craftsmanship, simplicity, and purpose inspired a generation of designers, engineers, and technologists to prioritize simplicity and user-centric design with the Apple II and later other products. Jobs believed in the power of simplicity; he stripped away unnecessary complexity to create products that were easy to understand and use.
Jobs popularized the Graphical User Interface (GUI) with the Lisa and Macintosh “microcomputers.” Drawing on WIMP (Windows, Icons, Menus, and Pointers) technologies developed at Xerox PARC for the Aloha Alto computer, he worked to make computers more intuitive and accessible to the masses. This innovation revolutionized how people interacted with technology, moving beyond command line interfaces and text commands to hand movements.
The Lisa and Mac introduced WYSIWYG (What You See Is What You Get) editing, allowing users to see their work on screen as it would appear in print, significantly improving the user experience. Together with the laser printer, this made desktop publishing possible. Jobs also had a great appreciation for typefaces and prioritized their integration into the Mac.
Jobs championed the mouse, making computer interaction more natural and intuitive. Doug Engelbart’s famous 1968 “Demo” showcased work from the Augmentation Research Center (ARC) at Stanford Research Institute (SRI), which developed computer technologies to enhance or “augment” human performance. The ARC team invented the mouse and early prototypes of many interface technologies that would become common in personal computers by the late 1980s. Jobs was instrumental in adopting and refining the mouse as a critical tool for interacting with GUIs. By reducing its cost and simplifying its design, the Macintosh made the mouse a standard input device.
Jobs emphasized the importance of user-centric design, focusing on creating products that were not only functional but also aesthetically pleasing and enjoyable to use. Jobs championed design thinking, emphasizing that technology should be intuitive and beautiful. His mantra, “It just works,” drove Apple to create devices that required minimal technical expertise, democratizing technology for the masses. It’s the subtle way that Apple builds its software and hardware to work together in a seamless experience for users.
Apple software, from Mac OS to iOS, also strove for ease of use and intuitive design, making technology more accessible to a broader range of users. Jobs pioneered a seamless integration of hardware and software, ensuring that devices were functional and delightful to use. This holistic approach elevated user experience to new heights.
The iPod (2001) simplified interaction with its click wheel, and the iPhone (2007) revolutionized human-machine interaction with its multi-touch interface, eliminating physical buttons in favor of gestures like swiping, pinching, and tapping. Jobs refined multi-touch gestures on the iPhone and iPad, making interactions more fluid and natural. The iPad blurred the line between consumption and creation, enabling users to read, write, draw, and interact in entirely new ways. The App Store extended this revolution, creating a vibrant online ecosystem for accessing apps that enhanced knowledge, tools, and entertainment.
Jobs incorporated and refined technologies like Siri, Apple’s virtual assistant, which introduced natural language processing and voice interaction, making machines more responsive to human needs. Siri brought voice control to the mainstream, enabling more intuitive and hands-free interaction with devices.
Jobs instilled a deep appreciation for user experience across the technology industry, leading to a greater focus on creating products that are not only functional but also enjoyable and intuitive to use.
These contributions have fundamentally changed how we interact with technology, making it more accessible, intuitive, and enjoyable for billions of people worldwide.
Steve Jobs, a counter-cultural visionary from Silicon Valley, profoundly influenced Human-Machine Interaction (HMI) and Human-Machine Knowledge (HMK) through his work at Apple and NeXT. He emphasized simplicity, user-centric design, and seamless integration of hardware and software, fundamentally changing how people interact with technology and produce knowledge.
His contributions to Human-Machine Interaction (HMI) included simplified design, GUI interaction, WYSIWYG editing, as well as touch (haptic) and voice interaction. Jobs prioritized intuitive, user-friendly designs, removing unnecessary complexity to make technology accessible. By using GUI and integrating WIMP technologies in the Lisa and Macintosh, Jobs replaced text-based commands with intuitive visual interactions. Inspired by Doug Engelbart’s innovations, Jobs refined and popularized the mouse, making it affordable and central to personal computing.
The Lisa and Macintosh introduced “What You See Is What You Get” editing, enabling users to see their work as it would appear in print and advancing tools like desktop publishing. The iPhone and iPad revolutionized interaction with multi-touch gestures such as swiping and pinching, offering a more natural and fluid user experience. With Siri, Jobs introduced natural language processing to mainstream devices, enabling hands-free, intuitive interactions.
Contributions to Human-Machine Knowledge (HMK) include democratizing knowledge, creating an App Store ecosystem, and enhancing creativity with technology to empower users. Devices like the iPod, iPhone, and iPad provided tools for learning, creativity, and communication in a portable and accessible format. Jobs fostered an ecosystem of applications, enabling users to access tools for innovation, education, and entertainment, enhancing knowledge-sharing and creation. Jobs also blended artistic design with technological innovation, making functional and inspiring products. His designs empowered individuals to create, manipulate, and share knowledge, bypassing traditional technical or bureaucratic barriers.
Jobs’ work laid the foundation for more intuitive, accessible, and enjoyable technology. His contributions bridged the gap between humans and machines, enabling collaboration and knowledge-sharing at an unprecedented scale. His focus on HMI and, to a lesser extent, HMK principles has fundamentally changed how billions of people worldwide interact with technology.
Citation APA (7th Edition)
Pennings, A.J. (2025, Jan 24) Steve Jobs Pioneering Work in Human-Machine Interaction (HMI) and Human-Machine Knowledge (HMK). apennings.com https://apennings.com/artificial-intelligence/steve-jobs-pioneering-work-in-human-machine-interaction-hmi-and-human-machine-knowledge-hmk/
Links
Pennings, A.J. (2017, May 27) Not Like 1984: GUI and the Apple Mac. apennings.com https://apennings.com/how-it-came-to-rule-the-world/not-like-1984-gui-and-the-apple-mac/
Pennings, A.J. (2018, Oct 10) TIME Magazine’s “Machine of the Year”. apennings.com https://apennings.com/financial-technology/digital-spreadsheets/time-magazines-machine-of-the-year/
Pennings, A.J. (2018, Oct 19) Apple’s GUI and the Creation of the Microsoft’s Excel Spreadsheet Application. apennings.com
Pennings, A.J. (2023, April 16). The Digital Spreadsheet: Interface to Space-Time and Beyond? apennings.com https://apennings.com/technologies-of-meaning/the-digital-spreadsheet-interface-to-space-time-and-beyond/
Pennings, A.J. (2024, Dec 7) The Framing Power of Digital Spreadsheets. apennings.com https://apennings.com/meaning-makers/the-framing-power-of-digital-spreadsheets/
© ALL RIGHTS RESERVED

Tags: Human-Machine Interaction (HMI) > Human-Machine Knowledge (HMK)
Connecting a Dangerous World: Border Gateway Protocol (BGP) and National Concerns
Posted on | December 9, 2024 | No Comments
This post discusses what Border Gateway Protocol (BGP) does and some of the risks it poses. While BGP is necessary for the functioning of the Internet, it also presents several security concerns due to its decentralized and trust-based nature. This raises national concerns and has drawn the attention of regulatory agencies such as the Federal Communications Commission (FCC) in the US. This post examines how governments and malicious actors can exploit BGP vulnerabilities for censorship, surveillance, and traffic control. Techniques include implementing blacklists, rerouting traffic, and partitioning networks, as seen with China’s Great Firewall. Such actions enable monitoring, filtering, or isolating traffic and raise concerns about privacy, Internet freedom, and global access.
The Internet is a network of interconnected networks. People with devices like PCs or mobile phones (hosts) connect to the Internet via Internet Service Providers (ISPs), which in turn connect to other ISPs. Applications running on these devices communicate with other devices through a maze of interconnections that pass data from router to router within an ISP’s domain and on through other ISPs, enterprises, campuses, and so on. The resulting network of networks is quite complex, but it carries the world’s Internet traffic with the help of specific routing protocols.
Border Gateway Protocol (BGP) is one such protocol and is integral to the global Internet. Sometimes known as the “Post Office of the Internet,” it was developed in the 1980s to help networks exchange information about how to reach other networks. BGP routers determine the best path for data packets to reach their destinations. BGP enables network administrators to manage and optimize network traffic by advertising the routes they offer and the ones they can reach. With BGP, they can prioritize certain routes, influence the path-selection process to balance traffic loads between different servers, and adapt to changing network conditions.
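To make the path-selection idea concrete, here is a deliberately simplified Python sketch, not the actual BGP decision process: real BGP weighs many more attributes (origin type, MED, eBGP versus iBGP, router IDs), and the route records, tie-breaking order, and private-use AS numbers below are illustrative assumptions.

```python
# Simplified sketch of BGP-style path selection (illustrative only).
# Real BGP compares many more attributes than local preference and AS-path length.

def best_path(routes):
    """Prefer the highest local preference, then the shortest AS path."""
    return max(routes, key=lambda r: (r["local_pref"], -len(r["as_path"])))

# Two advertised routes to the same destination prefix (documentation prefix,
# private-use AS numbers).
routes = [
    {"prefix": "203.0.113.0/24", "as_path": [64512, 64513, 64514], "local_pref": 100},
    {"prefix": "203.0.113.0/24", "as_path": [64515, 64514], "local_pref": 100},
]

print(best_path(routes))  # the two-hop path wins on AS-path length
```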
Despite its usefulness, many nations are worried about BGP’s security vulnerabilities.
In the US, the Federal Communications Commission (FCC) is concerned that the Internet presents security threats to the US economy, defense capabilities, public safety, and critical utilities such as energy, water, and transportation. These concerns are echoed by other countries. Malicious or sometimes incompetent actors can exploit or mismanage BGP vulnerabilities through hijacking or spoofing attacks. They can also reroute traffic to intercept or disrupt data flows. The FCC says that while efforts have been made to mitigate the Internet’s security risks, more work needs to be done, especially on BGP.
How Does BGP Work?
Jim Kurose does a great job explaining BGP and its importance in this video:
BGP connects the world by enabling communication and cooperation among autonomous systems (ASes), ensuring that data packets can traverse the vast, interconnected network of networks that makes up the Internet. It bridges separate but interlinked ASes such as campuses, companies, and countries. BGP ensures that data packets originating in one location can cross between ISPs and other WANs (Wide Area Networks) to reach their destination anywhere else on the planet. Network routers match each packet’s destination address against advertised prefixes to determine which way it should be sent.
BGP’s ability to adapt to changing network conditions, manage data traffic, and facilitate redundant paths is crucial for the stability and reliability of the global Internet, but it also poses several dangers. Unless supplemented by software-defined networking (SDN), BGP organizes routing tables locally throughout the network, and the information for routing calculations rests on trust relationships with other ASes. Ideally, this results in connections that can quickly reroute traffic through alternative paths to maintain network integrity. If one route becomes unavailable due to a network outage, maintenance, or policy, BGP can quickly find another.
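A small continuation of the earlier sketch shows this failover behavior: when the preferred route is withdrawn, re-running the selection over the remaining routes picks the next-best path. Again, the data structures and AS numbers are hypothetical simplifications.

```python
# Sketch: failover after a route withdrawal (hypothetical, simplified data).

def best_path(routes):
    # Same simplified selection rule as the earlier sketch.
    return max(routes, key=lambda r: (r["local_pref"], -len(r["as_path"])))

routes = [
    {"as_path": [64515, 64514], "local_pref": 100},         # current best
    {"as_path": [64512, 64513, 64514], "local_pref": 100},  # backup path
]

def withdraw(routes, as_path):
    """Drop the withdrawn route; the survivors are re-evaluated."""
    return [r for r in routes if r["as_path"] != as_path]

routes = withdraw(routes, [64515, 64514])  # e.g., an outage upstream
print(best_path(routes))  # traffic falls back to the three-hop path
```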
BGP is designed for interdomain routing, meaning it focuses on routing decisions between different autonomous systems. This is in contrast to intradomain routing protocols like Open Shortest Path First (OSPF) or Enhanced Interior Gateway Routing Protocol (EIGRP), which operate within a single AS. BGP is the protocol of choice for interdomain routing, which can mean routing between countries and even between large Tier-1 ISPs.
Who Uses Border Gateway Protocols?
BGP users include telecommunications companies such as ISPs, Content Delivery Networks (CDNs) such as Akamai and Cloudflare, and Internet Exchange Points (IXPs) that bypass multiple networks and allow specific networks to interconnect directly rather than routing through various ISPs. Cloud providers like Amazon’s AWS, Google Cloud, and Microsoft Azure also use BGP to manage the traffic between their data centers and ISPs, allowing them to provide reliable cloud services globally.
Many large enterprises with extensive networks operate their own ASes and have BGP access to control routing policies across their internal networks and connections to external services. Universities and research institutions often employ their own ASes and use BGP when connecting to national and international research networks supporting scientific collaboration.
Top-tier (“Tier-1”) commercial Internet Service Providers (ISPs) use BGP as well. Tier-1 ISPs are considered the top providers in the global Internet hierarchy. They own and operate extensive networks and are responsible for a significant portion of the global Internet’s infrastructure. BGP is crucial for them to route traffic and exchange network reachability information with other ASes, and it plays a central role in how these Tier-1 ISPs manage their networks, interact with other ASes, and implement routing policies that align with their business strategies and network-management goals.
A Tier-1 ISP can reach an entire Internet region, such as Singapore, solely through settlement-free, reciprocal peering agreements, with BGP as the glue. Examples include AT&T in the US and KDDI in Japan. BGP allows them to announce their IP prefixes to the wider Internet and receive routing updates from other ASes. Tier-1 ISPs use BGP to make routing decisions based on various criteria, including network policies, path attributes, and reachability information, determining the best path for routing traffic through their networks while considering factors like latency, cost, and available capacity.
Tier-1 ISPs can establish BGP peer relationships with other ASes. These relationships can take the form of settlement-free peering or transit agreements. Peering involves the mutual exchange of traffic between two ASes, while transit agreements typically involve the provision of Internet connectivity to a customer AS in exchange for a fee. Network effects increase the importance and centrality of existing network hubs, giving them a stronger “gravitational pull,” making it more difficult for new entrants to establish themselves in the market. Effective relationships enable the global Internet to function as a connected network of networks.
BGP allows the managers of autonomous systems to consider various factors when selecting the best path, including network policies, routing metrics, and the reliability and performance of available routes. BGP helps maintain Internet reachability by constantly updating routing tables and responding to changes in network topology. It identifies the most efficient path for data packets to travel from source to destination and allows ISPs to advertise what routes they are able to offer other ISPs. BGP empowers network managers to control how data is routed, manage traffic, and enforce security policies.
How Governments Use and Abuse BGP
Military agencies usually maintain BGP access within their data infrastructure, especially to secure sensitive networks or manage national Internet traffic and, in some cases, control public Internet access to their networks. BGP allows militaries to define specific routing policies, such as prioritizing certain types of traffic (e.g., command-and-control data) or restricting traffic to trusted allies. In field operations, militaries use deployable communication systems that rely on satellite links and mobile base stations. BGP allows these systems to dynamically integrate into broader military networks. Militaries increasingly rely on drones and Internet of Things (IoT) devices, which require efficient routing of data. BGP works to ensure that data from these systems is routed optimally within military infrastructures.
A study of the early Russian-Ukrainian conflict revealed that Russian and separatist forces modified BGP routes to establish a “digital frontline” that mirrored the military one. This strategy involved diverting local internet traffic from Ukraine, the Donbas region, and the Crimean Peninsula. The research focused on analyzing the strategies employed by actors manipulating BGP, categorizing these tactics, and mapping digital borders at the routing level. Additionally, the study anticipated future uses of BGP manipulations, ranging from re-routing traffic for surveillance to severing Internet access in entire regions for intelligence or military objectives. It underscored the critical role of Internet infrastructure in modern conflict, illustrating how BGP manipulations can serve as tools for strategic control in both cyber and physical domains.
Governments and other malicious actors can manipulate the Internet through multiple techniques, including BGP hijacking, IP blacklisting and filtering, network partitioning and isolation, content monitoring and traffic analysis, traffic throttling and prioritization, shutdowns and access control, and border-routing policies and compliance requirements.
By influencing or manipulating BGP routes, governments or actors with access to BGP-enabled networks can reroute traffic through specific regions or servers. This is often done by injecting false BGP announcements that redirect traffic to specific routers, allowing governments to block, intercept, or monitor certain data flows. Such an approach has been seen in incidents in various countries where traffic was rerouted through state-managed systems.
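One reason false announcements succeed is longest-prefix matching: routers forward traffic along the most specific route that covers a destination, so a bogus, more-specific prefix captures the traffic. The sketch below is hypothetical, using documentation addresses and private-use AS numbers.

```python
import ipaddress

# Sketch of a prefix hijack: routers pick the longest (most specific) matching
# prefix, so the bogus /25 beats the legitimate /24 (hypothetical data).

announcements = {
    ipaddress.ip_network("203.0.113.0/24"): "AS64512 (legitimate origin)",
    ipaddress.ip_network("203.0.113.0/25"): "AS64666 (hijacker, more specific)",
}

def forwarding_choice(destination):
    addr = ipaddress.ip_address(destination)
    matches = [net for net in announcements if addr in net]
    return announcements[max(matches, key=lambda net: net.prefixlen)]

print(forwarding_choice("203.0.113.10"))  # -> the hijacker's route is chosen
```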
Governments worldwide can influence or control national ASes and various network providers. They can use BGP to dictate the paths data takes across the Internet on a large scale to manage, manipulate, or filter traffic for their own ends. This capability provides a point of control that governments can leverage for regulatory, security, or censorship purposes.
Governments can mandate that ISPs refuse to announce or accept specific IP prefixes or routes associated with restricted sites or content. By implementing BGP blacklists, they can prevent access to certain websites or services entirely by removing or altering the BGP routes that lead to these destinations, effectively blocking them at the network level.
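A blacklist of this kind can be sketched as a filter applied to incoming route announcements; the prefixes and data structure below are hypothetical, and real implementations work inside router policy configurations rather than application code.

```python
import ipaddress

# Sketch: a mandated blacklist strips routes to restricted prefixes, so those
# destinations vanish from the routing table (hypothetical prefixes).

BLACKLIST = {ipaddress.ip_network("203.0.113.0/24")}

def filter_routes(routes):
    """Drop any route whose prefix falls inside a blacklisted network."""
    return [r for r in routes
            if not any(ipaddress.ip_network(r["prefix"]).subnet_of(blocked)
                       for blocked in BLACKLIST)]

incoming = [
    {"prefix": "203.0.113.0/24", "as_path": [64512]},
    {"prefix": "198.51.100.0/24", "as_path": [64513]},
]
print(filter_routes(incoming))  # only the 198.51.100.0/24 route survives
```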
Some governments impose strict routing policies that partition national networks from the global Internet. By requiring ISPs to use BGP filtering rules that isolate local traffic, they can keep Internet activity confined within national borders. China’s Great Firewall is an example, where BGP filtering and routing policies prevent certain global routes and confine users to government-approved internet spaces.
Governments can influence routing so that Internet traffic passes through surveillance or monitoring points. By injecting specific BGP routes, traffic can be directed to infrastructure where deep packet inspection (DPI) or other monitoring techniques are applied. This enables governments to analyze or even censor content in real time.
Through BGP route manipulation, governments can slow down or prioritize traffic to specific networks. For example, they may route traffic through slower networks or specific filtering points to control Internet speeds to certain services or prioritize government-approved traffic sources.
In extreme cases, governments can mandate ISPs to withdraw BGP routes to cut off access entirely, effectively disconnecting regions, communities, or entire countries from the global Internet. This can be seen in certain political scenarios or during unrest when governments initiate BGP route withdrawals, isolating the local Internet temporarily.
Governments can also enforce policies that restrict data to specific geographic boundaries, requiring ISPs to adjust BGP configurations to comply with data residency or border policies. This limits data flows outside national borders and aligns with regulatory frameworks on data sovereignty.
Concerns
Nations worldwide have growing concerns regarding the security and resilience of BGP, which is fundamental to Internet routing. While critical for directing Internet traffic between ASes, BGP has vulnerabilities that can pose significant risks to national security, data integrity, and overall network resilience.
Through these mechanisms, governments can exercise significant influence over network behavior and access at a national level, using BGP as a powerful tool for traffic control, monitoring, and regulation. Such actions raise concerns over Internet freedom, privacy, and access rights on a global scale.
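One widely discussed mitigation is Resource Public Key Infrastructure (RPKI, noted in the tags below), which lets address holders publish Route Origin Authorizations (ROAs) stating which AS may originate a prefix. The sketch below is a conceptual simplification with hypothetical data; real validators such as Routinator fetch and cryptographically verify ROAs and also handle covering prefixes and maximum-length rules.

```python
# Conceptual sketch of RPKI route-origin validation (hypothetical ROA table).
# Only the basic origin comparison is shown; real validation is richer.

ROAS = {
    "203.0.113.0/24": 64512,  # ROA: only AS64512 may originate this prefix
}

def origin_validation(prefix, origin_as):
    if prefix not in ROAS:
        return "unknown"  # no ROA covers the prefix
    return "valid" if ROAS[prefix] == origin_as else "invalid"

print(origin_validation("203.0.113.0/24", 64512))  # valid
print(origin_validation("203.0.113.0/24", 64666))  # invalid: drop or deprefer
```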
Citation APA (7th Edition)
Pennings, A.J. (2024, Dec 9). Connecting a Dangerous World: Border Gateway Protocol (BGP) and National Concerns. apennings.com https://apennings.com/global-e-commerce/connecting-a-dangerous-world-border-gateway-protocol-bgp-and-national-concerns/
© ALL RIGHTS RESERVED

Tags: Amazon's AWS > AT&T > autonomous system (AS) > Border Gateway Protocol (BGP) > Content Delivery Networks (CDNs) > Denial-of-Service (DoS) > Distributed Denial-of-Service (DDoS) > Federal Communications Commission (FCC) > Google Cloud > Internet Exchange Points (IXPs) > Microsoft Azure > Open Shortest Path First (OSPF) > Resource Public Key Infrastructure (RPKI) > Tier-1 ISPs
The Framing Power of Digital Spreadsheets
Posted on | December 7, 2024 | No Comments
Digital spreadsheets like Excel have framing power because they shape how information is chosen, organized, interpreted, and presented. These capabilities directly influence decision-making and resource prioritization within organizations. The power of framing arises from the ability to define what data is included, how it is processed by the spreadsheet’s functions or formulas, and the visual or numerical emphasis placed on specific inputs and outcomes. Spreadsheets exert framing power through selecting and prioritizing data, building formula logic and embedded assumptions, standardizing norms and templates, simplifying complex realities, and selectively presenting results.
This post continues my inquiry into the remediation of digital spreadsheets and the techno-epistemological production of organizational knowledge. This includes a history of spreadsheet technologies, including VisiCalc, Lotus 1-2-3, and Microsoft’s Excel, as well as the functions and formulas they integrated over time. Digital spreadsheets built on the history of alphabetical letteracy and the incorporation of Indo-Arabic numerals, including zero (0), and on calculative abilities built up through administrative, commercial, and scientific traditions.
Spreadsheets frame discussions by determining which data is included or excluded, consequently controlling narratives and resource decisions. For instance, only presenting revenue figures without costs can create a biased perspective on the financial health of an organization. By selecting and emphasizing certain key performance indicators (KPIs) over others, spreadsheets prioritize specific organizational goals (e.g., profitability over sustainability). A budget sheet that highlights “Cost Savings” as a primary metric frames spending decisions in a cost-minimization mindset. Those designing spreadsheets gain control and power by deciding what aspects of reality are quantified and analyzed.
Spreadsheet formulas embed certain assumptions about relationships between variables (e.g., Profit = Revenue – Costs assumes no other factors like opportunity costs). The logic built into formulas can obscure biases or simplify complexities, shaping decision-making paths. For example, a financial projection using =RevenueGrowthRate * PreviousRevenue assumes a constant growth rate and potentially oversimplifies future uncertainties. What-if scenario analysis in spreadsheets often reflects the biases or priorities of the person constructing the formulas, and these biases can frame potential outcomes in specific ways.
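To see how such a formula smuggles in its assumption, here is a small Python sketch with hypothetical figures: reapplying the growth formula each period commits the projection to a single fixed trajectory.

```python
# Sketch: a growth formula reapplied each period bakes in a constant-rate
# assumption (hypothetical figures).

growth_rate = 1.10       # the embedded assumption: 10% growth, every period
revenue = 1_000_000.0    # starting revenue

for year in range(1, 6):
    revenue = growth_rate * revenue  # the spreadsheet formula, reapplied
    print(f"Year {year}: projected revenue = ${revenue:,.0f}")

# Nothing here can express a downturn, saturation, or shock; the assumption
# stays invisible unless a reader inspects the cell logic.
```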
Spreadsheets can become templates for recurring processes, setting standards for what is considered “normal” or “important” in an organization. Those who create and control spreadsheet templates have the power to define organizational norms and expectations and codify power dynamics. Establishing standardization reinforces certain frames, perpetuating specific ways of viewing and evaluating organizational performance over time. A standardized sales report may continuously emphasize gross sales, neglecting other factors like customer churn.
Spreadsheets distill complex realities into numbers, tables, and graphs, reducing nuanced issues to quantifiable elements. An example is reducing employee performance to a numeric score (e.g., Productivity = TasksCompleted / HoursWorked) that may overlook qualitative contributions. Critical contextual factors, such as market volatility or employee morale, may be excluded, shifting focus to what is easily measurable. By reducing complexity, spreadsheets prioritize some perspectives while marginalizing others.
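A toy example with hypothetical data shows how the ratio silently discards anything it cannot count:

```python
# Toy sketch: Productivity = TasksCompleted / HoursWorked reduces performance
# to one ratio; qualitative contributions never enter it (hypothetical data).

employees = [
    {"name": "Ana", "tasks": 40, "hours": 40, "mentors_juniors": True},
    {"name": "Ben", "tasks": 48, "hours": 40, "mentors_juniors": False},
]

for e in employees:
    e["productivity"] = e["tasks"] / e["hours"]

for e in sorted(employees, key=lambda e: e["productivity"], reverse=True):
    print(f"{e['name']}: {e['productivity']:.2f} tasks/hour")
# Ben ranks first; Ana's mentoring is invisible to the metric.
```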
Those who design spreadsheets decide how data is framed and can influence organizational decision-making disproportionately; control over spreadsheet design shapes how others interpret and interact with the data, reinforcing the designer’s influence and power. In an organization with limited transparency, complex formulas or hidden sheets can obscure understanding of the data, disadvantaging other users and consolidating power with those who have the skills and tools to create the organization’s spreadsheets.
The order in which data is presented (e.g., revenue before costs) in spreadsheets influences how stakeholders mentally process the information. Structural and layout choices affect how data is understood and what conclusions are drawn. Visualizations like pie charts, bar graphs, and trend lines direct focus to certain comparisons or patterns, and frame how priorities are perceived.
Finally, presentation decisions about formatting (e.g., conditional highlights, bold text) and visualization (e.g., charts) draw attention to specific elements. For example, highlighting a profit margin cell in green or showing a declining revenue trend in red emphasizes success or urgency. Choosing to present aggregated data (e.g., total revenue) versus granular details (e.g., revenue by product) influences how complexity is perceived and addressed. Presentation choices affect data interpretation, often steering stakeholders toward certain conclusions.
In conclusion, digital spreadsheets are powerful technologies that frame knowledge in organizations and consequently they produce power by determining how data is selected, processed, and presented. Their influence lies not just in the data they hold but in how they structure understanding and shape decision-making. Those who design spreadsheets can wield significant control over organizational narratives and perspectives, making it critical to use these tools with awareness of their potential limitations and strengths.
Citation APA (7th Edition)
Pennings, A.J. (2024, Dec 7) The Framing Power of Digital Spreadsheets. apennings.com https://apennings.com/meaning-makers/the-framing-power-of-digital-spreadsheets/
© ALL RIGHTS RESERVED

Tags: digital spreadsheet > Excel > remediation theory