AI and Government: Concerns Shaped from Classic Public Administration Writings
Posted on | February 9, 2025 | No Comments
Recent events in US politics have highlighted tensions over our conception of government, the role of business in public affairs, and even how artificial intelligence (AI) should be used in the bureaucratic systems of the US federal system. This post covers some of the primary reasons why government, even in its search for efficiency, differs from business by drawing on historical writings in public administration. Insights from these analyses can start to specify some constraints and goals for government AI systems so that they are designed to be transparent, explainable, and regularly audited to ensure fairness, avoid bias and discrimination, and protect citizen privacy. This means that the data used, the algorithms employed, and the reasoning behind decisions should be clear and understandable to human overseers and the public. This is crucial for accountability and building trust in governmental AI mechanisms.
Public administration is the art and science of managing public programs and policies and coordinating bureaucratic strategies and public affairs. Public administration was an important part of my PhD in Political Science, and I was particularly interested in the role of information technology (IT) and networking, including its use in financial tasks. As we move forward with IT and even AI in government, it is critical that they be designed and programmed with ideals gleaned from years of public administration deliberations.
The debate over whether government should be run like a business has been a long-standing issue in public administration. The historical writings of public administration offer compelling reasons why government is fundamentally different from business. Scholars such as Paul Appleby, Harlan Cleveland, Dwight Waldo, Max Weber, and even US President Woodrow Wilson have articulated key differences between government and business, emphasizing the distinct purposes, structures, and constraints that define the public administration of government. Also important is the role of politics, a fundamental component of the democratic agenda, but one that is not always conducive to efficiencies and values present in the private sector.
This post covers some of the primary reasons why government differs from business, drawing on historical writings in public administration, including political constraints, public interest vs. profit maximization, accountability and transparency, decision-making and efficiency constraints, monopoly vs. market competition, legal and ethical constraints, and distinctions between service vs. the consumer model.
By carefully considering these challenges and drawing on the wisdom of the classics of public administration, it may be possible to start to harness the power of AI to create “smart” but ethical government systems that serve the public interest and promote the well-being of all citizens. Currently, the Trump administration, with Elon Musk, seems to be building a “digital twin” of the payment system at the Treasury and other parts of the administrative heart of the US government, probably in the new Colossus data center built in Memphis.
Digital twins are a powerful tool for training AI models, as they can help generate data, simulate scenarios, and explain AI models. A digital twin mimics current systems while training a new AI engine, with the goal of developing new types of digital bureaucracies and services. As digital twin technology develops with faster chips and larger data centers, it will likely play an even greater role in training AI government models. This approach is new and unprecedented and should only be pursued with the highest intentions and a solid grounding in democratic and public administration understanding.
Political Constraints and Efficient Bureaucracies
Woodrow Wilson (1887), in “The Study of Administration,” addressed the issue of government efficiency and argued that public administration should be distinct from politics. For him, government is ultimately driven by the public good, not financial gain. He emphasized the need for a professional and efficient bureaucracy to implement public policy. Wilson’s emphasis on the separation of politics and administration highlighted the need for a professional and impartial bureaucracy.
Paul Appleby (1945) reinforced this position by stating that government serves a broad public interest rather than a select group of stakeholders. Government’s core purpose is to serve the public interest and promote the general welfare of society. This includes providing essential services, protecting citizens, and promoting social equity.
Governments often operate with a longer-term perspective, considering the needs of future generations and the long-term sustainability of policies and programs. Businesses, while also concerned with long-term success, often prioritize shorter-term financial goals. Businesses prioritize profit, efficiency, and shareholder value, whereas governments must balance equity, justice, and service delivery even when it’s not profitable (e.g., social security, public education). For example, the government provides social services like healthcare for seniors, unemployment relief, and welfare, which businesses would find unprofitable.
Businesses are generally expected to maximize profits for their shareholders. In contrast, government’s core purpose is to serve the public interest and promote the general welfare of society. This includes providing essential services, protecting citizens, and promoting social equity. By keeping Appleby’s insight at the forefront, AI development in government can be guided by a commitment to serving the broad public interest and strengthening democratic values.
Accountability, Transparency, and Legitimacy
Max Weber emphasized that government agencies operate under legal-rational authority, meaning they follow laws, regulations, and procedures that are meant to ensure transparency and accountability. Businesses operate under market competition and corporate governance, where decisions can be made with greater discretion without public oversight. Weber’s work on bureaucracy underscores the importance of formal rules, clear procedures, and hierarchical structures in government organizations. This translates to AI systems needing well-defined architectures, clear lines of authority for decision-making, and specific functions for each component. These frameworks may ensure accountability and prevent AI from overstepping its intended role.
In his seminal work, Economy and Society (1922), Weber articulated fundamental differences between government and business.
His analysis highlighted the structural, operational, and accountability-based distinctions between the two domains. He distinguished government from business in several ways: Government bureaucracies operate under legal authority, meaning they follow a fixed set of laws and regulations. Business bureaucracies are primarily driven by profit motives and market competition, with more flexibility in decision-making. Government officials also follow formal rules and legal mandates, while business executives can make discretionary decisions based on market conditions. For example, a government agency must adhere to strict procurement laws when purchasing supplies, whereas a business can choose vendors based on cost efficiency alone.
Dwight Waldo (1948) in The Administrative State highlighted that government accountability is complex because it must answer to multiple stakeholders (citizens, courts, legislatures), unlike businesses that primarily answer to investors. For example, governments hold public hearings and legislative reviews before making budgetary decisions, whereas businesses do not require public approval before adjusting financial strategies.
Waldo challenged the traditional view that public administration could be purely technical and neutral. Governments are accountable to the public and operate under greater transparency requirements than businesses. This includes open records laws, public hearings, and legislative oversight. Public officials are also held to higher ethical standards than private sector employees, with expectations of impartiality, fairness, and integrity in their decision-making.
Waldo argued that bureaucracy is not just an administrative tool but a political institution, shaped by values, ideologies, and democratic principles. This makes accountability more complex than in business, where efficiency and profit are the primary concerns. His main points were:
– Bureaucracy is inherently political, not just administrative.
– Government agencies must answer to multiple, often conflicting, stakeholders.
– Bureaucratic power must be controlled through democratic institutions.
– Efficiency must be balanced with justice, ethics, and public values.
Governments possess coercive power, including the ability to tax, regulate, and enforce laws. Businesses, while also subject to regulations, primarily rely on market forces and voluntary transactions. Governments derive their legitimacy from democratic processes and the consent of the governed. Businesses, while also subject to societal expectations, primarily focus on satisfying customer demand and generating profits for investors.
Decision-Making and Efficiency Constraints
Herbert Simon (1947) in Administrative Behavior introduced the concept of “bounded rationality,” challenging the notion of perfect rationality in decision-making and explaining that government decisions are constrained by political pressures, competing interests, and complex regulatory environments.
Bounded rationality is often considered a more realistic model of human decision-making in organizations, recognizing the inherent limitations individuals face. Understanding bounded rationality can inform organizational design, promoting structures and processes that support effective decision-making within these constraints. Developing decision support tools and technologies can help overcome some of the limitations of bounded rationality, providing decision-makers with better information and analysis.
This concept recognizes that individuals, particularly in organizational settings, face inherent limitations preventing them from making perfectly rational decisions. These include limitations due to limited cognitive capacity and the inability to process all available information or consider every possible alternative when making decisions. Decision-makers also lack complete information about the situations, the potential consequences of their choices, or the preferences of others involved. Individuals are also prone to cognitive biases, such as confirmation bias (seeking information that confirms existing beliefs) and anchoring bias (over-relying on the first piece of information received), which can distort their judgment.
Simon argued that officials often “satisfice” instead of optimize. They make “good enough decisions” due to these limitations. They often choose the first option that meets their minimum criteria, rather than searching for the absolute best solution. Satisficing is often a more efficient approach, as it conserves cognitive resources and allows for quicker decision-making. However, it may not always lead to the optimal outcome.
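The contrast between satisficing and optimizing is easy to see in code. The sketch below is a minimal illustration in Python; the options, scores, and acceptance threshold are hypothetical and are not drawn from Simon’s text.

```python
# A minimal sketch contrasting satisficing with optimizing.
# The options, "public benefit" scores, and threshold are hypothetical.

def satisfice(options, score, threshold):
    """Return the first option whose score meets the minimum criterion."""
    for option in options:
        if score(option) >= threshold:
            return option
    return None  # no acceptable option found

def optimize(options, score):
    """Return the single best-scoring option (requires evaluating them all)."""
    return max(options, key=score)

options = ["option A", "option B", "option C", "option D"]
benefit = {"option A": 62, "option B": 78, "option C": 91, "option D": 55}

print(satisfice(options, benefit.get, threshold=70))  # "option B" -- good enough, found quickly
print(optimize(options, benefit.get))                 # "option C" -- the best, but costlier to find
```

The satisficer stops at the first acceptable option, conserving effort; the optimizer must score every alternative before choosing.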
By acknowledging the limitations of human rationality and designing AI systems that work within those constraints, governments can leverage AI to make more informed, efficient, and effective decisions. It’s about creating AI that assists human decision-makers in navigating the complexities of the real world, rather than attempting to achieve an unrealistic ideal of perfect rationality.
Philip Selznick (1949) in TVA and the Grass Roots conducted an important case study that showed how government decision-making is influenced by political negotiation and social considerations rather than just economic rationality. It challenged the traditional view of bureaucracy as a purely neutral and rational system. Instead, Selznick demonstrated that bureaucratic organizations are deeply political and shaped by social forces. His analysis of the Tennessee Valley Authority (TVA) revealed how local power dynamics, institutional culture, and informal relationships influence public administration.
The TVA was a New Deal-era federal agency created in 1933 to promote regional economic development through infrastructure projects like dams and electricity generation. It was originally designed as an apolitical, technocratic institution that would implement policy based on expertise rather than political considerations.
However, Selznick’s study showed that the TVA had to negotiate with local elites, businesses, and community groups to gain support for its programs. Rather than being a neutral bureaucracy, the TVA absorbed the interests and values of local stakeholders over time.
Political compromises often weakened the agency’s original mission of social reform and economic equality. For example, the TVA partnered with local conservative agricultural interests, even though these groups resisted social reforms that would have empowered poor farmers.
Selznick introduced the concept of “co-optation,” which describes how bureaucratic organizations incorporate external groups to maintain stability and legitimacy. Instead of enforcing policies rigidly, agencies often have to adjust their goals to align with influential local actors. Co-optation helps agencies gain support and avoid resistance, but it can also dilute their original purpose. This explains why public organizations often fail to deliver radical change, even when they are designed to do so. For example, the TVA originally aimed to empower small farmers and promote land reform, but over time, it aligned itself with local business leaders and preserved existing power structures instead.
By embracing the principles of co-optation, governments can develop AI systems that serve the broader public interest and that development is guided by community engagement, transparency, and collaboration. AI development in government should involve active engagement with a wide range of stakeholders, including citizens, community groups, experts, and advocacy organizations. Co-optation can be used to address concerns and objections raised by external groups. By incorporating their feedback and making adjustments to AI systems, governments can mitigate potential opposition and build consensus.
Monopoly vs. Market Competition
Governments often hold a monopoly over essential services (e.g., national defense, law enforcement, public infrastructure) where competition is neither feasible nor desirable. Governments have broader responsibilities than businesses, encompassing national defense, social welfare, environmental protection, and infrastructure development. Technological changes, however, can change the dynamics of specific utilities. Telecommunications, for example, were primarily government-run facilities that worked to ensure universal service. To upgrade to the global Internet, however, governments largely deregulated these operations or sold them off to the private sector to invest in innovative new services. More recent discussions have pointed to “net neutrality” and even “cloud neutrality” to address the monopolization of services at the Internet’s edge, such as AI.
Leonard White (1926) in Introduction to the Study of Public Administration pointed out that government agencies do not face direct market competition, which affects incentives and operational efficiency. In contrast, businesses operate in a competitive market where consumer choice determines success. For example, the police department does not compete with private security firms in the way that Apple competes with Samsung in the smartphone market.
White also believed that public administration is the process of enforcing or fulfilling public policy. Since profit is not the primary goal, it’s crucial to define what constitutes “success” for AI systems in government. This might include citizen satisfaction, efficiency gains, improved outcomes, or reduced costs. By carefully considering the unique dynamics of government agencies and incorporating AI in a way that addresses the challenges of limited market feedback and different incentive structures, governments can leverage AI to create more effective, responsive, and citizen-centric services.
Legal and Ethical Constraints
Governments must operate under constitutional and legal constraints, ensuring adherence to democratic principles and human rights. Frank Goodnow (1900) in Politics and Administration argued that public administration is shaped by legal frameworks and public policy goals rather than market forces.
Public officials must follow strict ethical codes and conflict-of-interest regulations that go beyond corporate ethics policies. For example, a government agency should not arbitrarily cut services to boost its budget surplus, whereas a corporation can cut unprofitable product lines without legal repercussions.
Goodnow was one of the first scholars to formally separate “politics” from “administration,” arguing that politics involves the creation of laws and policies through elected representatives. Administration is the implementation of those laws and policies by bureaucratic agencies. Public administration should be neutral, professional, and guided by legal rules, rather than influenced by political pressures.
For example, Congress (politics) passes a law to regulate environmental pollution, and an agency such as the Environmental Protection Agency (EPA) or the Federal Communications Commission (FCC) (administration) enforces and implements those laws and regulations through technical expertise and bureaucratic processes. Goodnow emphasized that public administration derives its legitimacy from legal and constitutional frameworks, not from market competition.
He argued that government agencies must operate within the rule of law, ensuring fairness, justice, and accountability. Laws define the scope of administrative power, unlike businesses that act based on profit incentives. Bureaucrats should be trained professionals who follow legal principles rather than respond to political or market forces. A tax agency must enforce tax laws uniformly, even if doing so is inefficient, whereas a private company can adjust its pricing strategies according to its profit goals.
Unlike businesses, which prioritize efficiency and profitability, Goodnow argued that government agencies serve the public interest. They provide services that markets might ignore (e.g., public health, education, law enforcement). Public agencies must prioritize equity, justice, and democratic values rather than cost-cutting. The effectiveness of government is measured not just by efficiency but by fairness and public trust. For example, governments fund public schools to ensure universal education, even if private schools might cater to specific family or community preferences.
By adhering to strict ethical principles and conflict-of-interest regulations, governments can ensure that AI is used in a way that builds trust, promotes fairness, and serves the public interest. It’s about creating AI systems that are not only effective but also ethical and accountable.
Service vs. Consumer Model
Citizens are not “customers” in the traditional sense because they do not “choose” whether to participate in government services (e.g., paying taxes, following laws). Harlan Cleveland, a distinguished diplomat and, in his later years, President of the University of Hawaii, argued in his 1965 article “The Obligations of Public Power” that public administration must ensure universal access to critical services, regardless of financial status. Businesses, on the other hand, serve paying customers and can exclude non-paying individuals from services. For example, a government hospital must treat all patients, including those who cannot afford to pay, whereas a private hospital can refuse service based on financial capacity.
His arguments focused on the ethical, political, and practical challenges faced by government officials in wielding public power. The ethical responsibility of public officials included holding power on behalf of the people, meaning they must act with integrity and accountability. Cleveland warned against abuse of power and the temptation for bureaucrats to act in self-interest rather than the public good. He stressed the need for ethical decision-making in government to prevent corruption and misuse of authority. For example, a government official responsible for allocating funds must ensure fairness and avoid favoritism, even when pressured by political influences.
Public administration should strive to be effective but must not sacrifice democratic values to pursue efficiency. He argued that bureaucratic decision-making should be transparent and participatory, ensuring citizens have a voice in government actions. Efficiency is important, but equity, justice, and citizen involvement are equally critical. For example, government programs should not cut social programs simply because they are expensive—public welfare must be prioritized alongside financial considerations.
Cleveland emphasized that public power must be answerable to multiple stakeholders, including the public (through elections and civic engagement), legislatures (through oversight and funding), and the courts (through legal constraints and judicial review). Unlike businesses, which are accountable mainly to shareholders, government agencies must navigate complex and often conflicting demands from different groups. For example, a public health agency must justify its policies to elected officials (who determine budgets) and citizens (who expect effective services).
Cleveland also pointed to the growing complexity of governance, a term he was one of the first to use. Government agencies were becoming more complex and specialized, requiring public administrators to manage technological advancements and expanding regulations as well as international relations and globalization. Cleveland worried that bureaucracies might become too rigid and disconnected from the people, creating a gap between government and citizens.
By keeping Cleveland’s principle at the forefront, governments can leverage AI to create a more just and equitable society where everyone has access to the services they need to thrive. It’s about using technology to empower individuals, reduce disparities, and ensure that everyone has the opportunity to reach their full potential.
As government agencies adopt AI and data-driven decision-making, they must ensure that technology serves human interests and does not lead to excessive bureaucracy or loss of personal agency. Cleveland called for adaptive, innovative leadership in public administration to keep up with social, political, and technological changes. He criticized government agencies that resist reform or fail to evolve with society’s needs. Public administrators must be proactive, responsive, and forward-thinking rather than merely following routine procedures. For example, climate change policies require public agencies to anticipate future risks, rather than simply reacting to disasters after they occur.
For Cleveland, public service was a moral obligation, not just a technical or managerial function. He believed that serving the public is an ethical duty, requiring commitment to justice, fairness, and the common good. Bureaucrats must see themselves as stewards of public trust, not just rule enforcers.
Harlan Cleveland’s emphasis on universal access to critical services, regardless of financial status, is a fundamental principle that must guide the design of AI mechanisms in government. Cleveland argued that public administration, unlike business, has a fundamental obligation to serve all citizens regardless of their ability to pay, and must balance efficiency with democratic values like equity, justice, and citizen participation. He stressed the ethical responsibility of public officials to act in the public interest, be accountable to multiple stakeholders, and adapt to the growing complexity of governance.
These principles are crucial for guiding AI development in government. AI systems should be designed to provide universal access to critical services, overcoming barriers like financial constraints, location, and digital literacy. AI systems should avoid sacrificing democratic values in the pursuit of efficiency while maintaining transparency and accountability, allowing citizens to understand and participate in AI-driven decision-making. Ultimately, AI in government should be a tool for enhancing public service and promoting the common good, not just a means to increase efficiency.
Conclusion: The Unique Role of Government and the Implications of AI
Public administration scholars have consistently emphasized that government is not simply a business; it operates under different principles, constraints, and objectives. While efficiency is valuable, the government’s primary goal is to serve the public good, uphold democracy, and ensure fairness and justice, even at the cost of financial efficiency. The writings of Appleby, Cleveland, Waldo, Weber, and Wilson continue to reinforce the fundamental distinction between governance and business management.
Drawing on the classics of public administration, we can start to specify some constraints and goals for artificial intelligence (AI) and develop a “smart” but ethical government that is efficient but also responsive to public concerns.
Possibilities and Oversight
– AI systems used in government should be transparent, with open access to the data and algorithms used to make decisions. This allows for public scrutiny and accountability.
– Regular audits and oversight of AI systems are vital to ensure they function as intended and do not produce unintended consequences.
– AI systems should be designed to protect the privacy of citizens and ensure that their data is used responsibly and ethically.
– While AI can automate many tasks, human oversight is essential to ensure that AI systems are used in a way that aligns with ethical principles and societal values.
– AI can make government information more accessible to citizens, providing clear and concise explanations of policies and programs.
– AI can gather and analyze citizen feedback, providing valuable insights for policymaking and service delivery.
– AI can facilitate participatory governance, enabling citizens to contribute to decision-making processes and shape public policy.
Challenges and Considerations
– AI systems can perpetuate existing biases if not carefully designed and monitored. It’s essential to ensure that AI systems are fair and non-discriminatory.
– The automation of specific government tasks may lead to job displacement. It’s important to develop strategies for workforce transition and retraining.
– Building and maintaining public trust in AI is crucial for its successful adoption in government. This implementation requires a commitment to transparency, explainability, and accountability in all AI-related processes and decisions.
By carefully considering these opportunities and challenges and drawing on the wisdom of the classics of public administration, we can start to harness the power of AI to create a “smart” but ethical government that serves the public interest and promotes the well-being of all citizens. In future posts, I plan to draw on subsequent generations of public administration practitioners and scholars who provide more critical perspectives on the more complex government structures that have emerged in the last century. Women’s voices, such as Kathy Ferguson’s critique of bureaucracy and Stephanie Kelton’s critique of government budgeting, are extremely valuable perspectives going forward. AI is undoubtedly on the near horizon for government services, and it should be approached with the understanding that such systems can be designed for the public good, but that this outcome is not guaranteed.
Citation APA (7th Edition)
Pennings, A.J. (2025, Feb 09) AI and Government: Concerns from Classic Public Administration Writings. apennings.com https://apennings.com/digital-geography/ai-and-government-concerns-from-classic-public-administration-writings/
Bibliography
Appleby, P. H. (1945). Big Democracy. Alfred A. Knopf.
Cleveland, H. (1965). The Obligations of Public Power. Public Administration Review, 25(1), 1–6.
Ferguson, K. (1984). The Feminist Case Against Bureaucracy. Temple University Press.
Goodnow, F. J. (1900). Politics and Administration: A Study in Government. Macmillan.
Kelton, S. (2020). The Deficit Myth: Modern Monetary Theory and the Birth of the People’s Economy. PublicAffairs.
Selznick, P. (1949). TVA and the Grass Roots: A Study in the Sociology of Formal Organization. University of California Press.
Simon, H. A. (1947). Administrative Behavior: A Study of Decision-Making Processes in Administrative Organizations. Macmillan.
Waldo, D. (1948). The Administrative State: A Study of the Political Theory of American Public Administration. Ronald Press.
Weber, M. (1922). Economy and Society: An Outline of Interpretive Sociology (G. Roth & C. Wittich, Eds.). University of California Press.
White, L. D. (1926). Introduction to the Study of Public Administration. Macmillan.
Wilson, W. (1887). The Study of Administration. Political Science Quarterly, 2(2), 197–222. https://doi.org/10.2307/2139277
Note: Several AI requests were prompted and parsed for this post.
© ALL RIGHTS RESERVED

Tags: bounded rationality > Dwight Waldo > Harlan Cleveland > Herbert Simon > Max Weber > Paul Appleby > Philip Selznick > Tennessee Valley Authority (TVA) > Woodrow Wilson
AI Governance and the Public Management of Transportation
Posted on | February 6, 2025 | No Comments
I’m doing an audit of my work on the topic of AI Governance of the Automatrix for publication. It’s my collection of posts on Artificial Intelligence (AI) Governance and the Automatrix or maybe “Robomatrix?” It is about the public management of the future transportation infrastructure as it becomes increasingly “smart” and electric.
Pennings, A.J. (2025, Feb 09) AI and Government: Concerns from Classic Public Administration Writings. apennings.com https://apennings.com/digital-geography/ai-and-government-concerns-from-classic-public-administration-writings/
Pennings, A.J. (2024, Nov 21) Google: Monetizing the Automatrix – Rerun. apennings.com https://apennings.com/global-e-commerce/google-monetizing-the-automatrix-2/
Pennings, A.J. (2024, Oct 10). All Watched over by Systems of Loving Grace. apennings.com https://apennings.com/how-it-came-to-rule-the-world/all-watched-over-by-systems-of-loving-grace/
Pennings, A.J. (2024, Jun 24). AI and Remote Sensing for Monitoring Landslides and Flooding. apennings.com https://apennings.com/space-systems/ai-and-remote-sensing-for-monitoring-landslides-and-flooding/
Pennings, A.J. (2024, Jun 22). AI and the Rise of Networked Robotics. apennings.com https://apennings.com/technologies-of-meaning/the-value-of-science-technology-and-society-studies-sts/
Pennings, A.J. (2024, Jan 19). How Do Artificial Intelligence and Big Data Use APIs and Web Scraping to Collect Data? Implications for Net Neutrality. apennings.com https://apennings.com/technologies-of-meaning/how-do-artificial-intelligence-and-big-data-use-apis-and-web-scraping-to-collect-data-implications-for-net-neutrality/
Pennings, A.J. (2024, Jan 15). Networking Connected Cars in the Automatrix. apennings.com https://apennings.com/telecom-policy/networking-in-the-automatrix/
Pennings, A.J. (2022, Apr 22). Wireless Charging Infrastructure for EVs: Snack and Sell? apennings.com https://apennings.com/mobile-technologies/wireless-charging-infrastructure-for-evs-snack-and-sell/
Pennings, A.J. (2019, Nov 26). The CDA’s Section 230: How Facebook and other ISPs became Exempt from Third Party Content Liabilities. apennings.com https://apennings.com/telecom-policy/the-cdas-section-230-how-facebook-and-other-isps-became-exempt-from-third-party-content-liabilities/
Pennings, A.J. (2021, Oct 14). Hypertext, Ad Inventory, and the Use of Behavioral Data. apennings.com https://apennings.com/global-e-commerce/hypertext-ad-inventory-and-the-production-of-behavioral-data/
Pennings, A.J. (2012, Nov 8) Google, You Can Drive My Car. apennings.com https://apennings.com/mobile-technologies/google-you-can-drive-my-car/
Pennings, A.J. (2014, May 28) Google, You Can Fly My Car. apennings.com https://apennings.com/ditigal_destruction/disruption/google-you-can-fly-my-car/
Pennings, A.J. (2020, Feb 9). It’s the Infrastructure, Stupid. apennings.com https://apennings.com/democratic-political-economies/from-new-deal-to-green-new-deal-part-3-its-the-infrastructure-stupid/
Pennings, A.J. (2010, Nov 20). How “STAR WARS” and the Japanese Artificial Intelligence (AI) Threat Led to the Internet. apennings.com https://apennings.com/how-it-came-to-rule-the-world/star-wars-creates-the-internet/
Pennings, A.J. (2018, Sep 27) How “STAR WARS” and the Japanese Artificial Intelligence (AI) Threat Led to the Internet, Part II. apennings.com https://apennings.com/how-it-came-to-rule-the-world/how-star-wars-and-the-japanese-artificial-intelligence-ai-threat-led-to-the-internet-japan/
Pennings, A.J. (2011, Jan 2) How “STAR WARS” and the Japanese Artificial Intelligence (AI) Threat Led to the Internet, Part III: NSFNET and the Atari Democrats. apennings.com https://apennings.com/how-it-came-to-rule-the-world/how-%e2%80%9cstar-wars%e2%80%9d-and-the-japanese-artificial-intelligence-ai-threat-led-to-the-internet-part-iii-nsfnet-and-the-atari-democrats/
Pennings, A.J. (2017, Mar 23) How “STAR WARS” and the Japanese Artificial Intelligence (AI) Threat Led to the Internet, Part IV: Al Gore and the Internet. apennings.com https://apennings.com/how-it-came-to-rule-the-world/how-%e2%80%9cstar-wars%e2%80%9d-and-the-japanese-artificial-intelligence-ai-threat-led-to-the-internet-al-gor/
Pennings, A.J. (2014, Nov 11). IBM’s Watson AI Targets Healthcare. apennings.com https://apennings.com/data-analytics-and-meaning/ibms-watson-ai-targets-healthcare/
Pennings, A.J. (2011, Jun 19) All Watched Over by Machines of Loving Grace – The Poem. apennings.com https://apennings.com/how-it-came-to-rule-the-world/all-watched-over-by-machines-of-loving-grace-the-poem/
Pennings, A.J. (2011, Dec 04) The New Frontier of “Big Data”. apennings.com https://apennings.com/technologies-of-meaning/the-new-frontier-of-big-data/
Pennings, A.J. (2013, Feb 15). Working Big Data – Hadoop and the Transformation of Data Processing. apennings.com https://apennings.com/data-analytics-and-meaning/working-big-data-hadoop-and-the-transformation-of-data-processing/
Pennings, A.J. (2014, Aug 30) Management and the Abstraction of Workplace Knowledge into Big Data. apennings.com https://apennings.com/technologies-of-meaning/management-and-the-abstraction-of-knowledge/
Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea, teaching financial economics and ICT for sustainable development, holding a joint appointment as a Research Professor at Stony Brook University. From 2002 to 2012, he was on the faculty of New York University, where he taught digital economics and information systems management. When not in Korea, he lives in Austin, Texas.
Citation APA (7th Edition)
Pennings, A.J. (2025, Feb 6) AI Governance and the Public Management of Transportation. apennings.com https://apennings.com/digital-coordination/ai-governance-and-the-public-management-of-transportation/
© ALL RIGHTS RESERVED

Tags: Artificial Intelligence (AI) > Connected Cars
Lotus 1-2-3, Temporal Finance, and the Rise of Spreadsheet Capitalism
Posted on | February 3, 2025 | No Comments
One of the books I read during my PhD years was Barbarians at the Gate: The Fall of RJR Nabisco (1989), about the $25 billion leveraged buyout (LBO) of the iconic conglomerate (tobacco and snacks) company by Kohlberg Kravis Roberts & Co. (KKR). An LBO is the purchase of a company using high amounts of short-term debt and the target company’s assets as collateral for the loans. The purchaser or “raider” plans to pay off the debt using future cash flow. KKR’s LBO of RJR Nabisco became the largest and most famous LBO of its time and a major influence on my thinking about the role of digital spreadsheets and private equity in the economy.[1]
Barbarians at the Gate rarely mentions the role of spreadsheets, but I also had my interest sparked by Oliver Stone’s movie Wall Street (1987), which had a similar theme. It is a story about a fictional corporate raider called Gordon Gekko taking over a fictional company called Bluestar Airlines. Gekko mentors Bud Fox, a young financial analyst who is anxious to be successful. The movie draws on the gangster genre, with Stone replacing the iconic guns and cars of the traditional genre with spreadsheets and cellular telephones. I wrote about the movie in my 2010 post “Figuring Criminality in Oliver Stone’s Wall Street,” where I identified digital spreadsheets as one of the “weapons” used by financial raiders.
This post looks at the early use of digital spreadsheets and two aspects of modern capitalism that emerged during the 1980s with the personal computer and a significant spreadsheet that was dominant before Microsoft’s Excel. The LBO and the use of “junk bonds” emerged in conjunction with digital technology and new financial techniques that reshaped major corporations and the organization of modern finance.
The concept of time is central to the “temporal finance” of spreadsheet capitalism. It empowers techniques like forecasting, modeling, risk analysis, and decision-making that depend on time-based variables, such as cash flows, interest rates, investment returns, or market trends. These techniques translate into debt instruments like bonds, mortgages, and loans that promise future repayment with interest, acknowledging the time value of money.
Temporal finance plays a critical role in the development of spreadsheet capitalism. In disciplines like corporate finance, portfolio management, and financial engineering, spreadsheets drew on established temporal practices and alphanumeric culture yet shaped a PC-enabled technical knowledge that transformed the global political economy and filtered out into general use.
I am using the term “spreadsheet capitalism” to refer to a type of rationality brought to the political economy due to the computer application’s confluence of capabilities exemplified by Lotus 1-2-3. The spreadsheet integrated previous political technologies such as the list and table combined with accelerated processing speeds, gridmatic visibility, and the interactivity and intimacy enabled by the microprocessing “chip” in the affordable PC.
Spreadsheets integrated temporal variables into financial decision-making and accelerated the influence of innovations in financial modeling, software, and decision support systems. Early spreadsheet software, like Lotus 1-2-3 and later Excel, allowed users to automate calculations across time periods using formulas, macros, and built-in functions. Financial analysts could efficiently calculate and project periodic growth rates, depreciation schedules, or portfolio returns without manually recalculating each period.
Lotus 1-2-3 effectively leveraged the PC’s keyboard and cathode-ray display to provide a clean, ASCII text-based interface that was easy for financial professionals to learn and use. While it was less feature-rich than later tools like Microsoft Excel, several functions and formulas in Lotus 1-2-3 were particularly valuable for LBO modeling. Not surprisingly, a key function was @SUM(), which added the values in a range of cells. For example, @SUM(D1..D7) would calculate totals from D1 through D7, such as revenue, costs, or aggregate cash flow over time. @ROUND() was often used to clean up financial outputs for reporting, ensuring that figures in the positional, Indo-Arabic numbering scheme were rounded to the nearest dollar, thousand, or million. More functions and formulas used by financial raiders and private equity will be discussed below.
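For readers who think more readily in modern code than in Lotus syntax, the short Python sketch below mirrors the two calls just described; the cash-flow figures are invented purely for illustration.

```python
# A rough Python analogue of the Lotus calls described above:
# summing a range of cells and rounding the result for reporting.
# The quarterly cash-flow figures are hypothetical.

quarterly_cash_flows = [1_234_567.89, 1_310_221.40, 1_275_008.12, 1_402_944.55]

total = sum(quarterly_cash_flows)   # in the spirit of @SUM(D1..D4)
reported = round(total, -3)         # round to the nearest thousand for reporting, like @ROUND()

print(f"Raw total:      {total:,.2f}")
print(f"Reported total: {reported:,.0f}")
```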
Reaganomics and the Digital Spreadsheet
The larger context for the emergence of the spreadsheet economy was the “Reagan Revolution” and its emphasis on deregulation, dollar strength, tax incentives, and a pro-finance climate. These changes created fertile ground for the rapid growth of the spreadsheet economy. The Economic Recovery Tax Act of 1981 lowered capital gains taxes, making investments in LBOs more attractive to private equity firms and individual investors. Legislative changes in depreciation schedules also allowed firms to write off investments faster, improving cash flow and making acquisitions more financially feasible. Relaxed corporate tax laws allowed companies to deduct significant interest payments on debt, a cornerstone of LBO financing, and incentivized the heavy use of leverage in buyouts.
In this environment, temporal forecasting, modeling, and risk analysis during the 1980s and early 1990s enabled the complex calculations and scenarios that made corporate raiding and other types of mergers and acquisitions (M&A) possible. Users could automate calculations across future time periods using formulas, macros, and built-in functions, handling computations that had previously been too cumbersome to perform manually. These capabilities made spreadsheets instrumental in structuring and executing many major business deals and LBOs during the Reagan era.
Spreadsheets worked with several new technologies that emerged at the time to fundamentally transform corporate America and diffuse into many industries and social activities. This post will focus more on the basic capabilities of the PC that enabled the CP/M and MS-DOS spreadsheets. Other posts will track the developments in GUI environments that enabled Mac and Windows-based spreadsheets.
Standardization of IBM PCs and the IBM “Compatibles” for Lotus 1-2-3
While the Reagan Revolution of the 1980s created an economic and regulatory environment that made LBOs particularly attractive, the IBM PC and its compatibles became its initial workhorses. IBM had been the dominant mainframe computer producer but noticed it was “losing the hearts and minds” of computer users to the Apple II “microcomputer” and its games like Alien Rain, AppleWriter word processor, and especially the VisiCalc spreadsheet. In response to popular demand in the business world, IBM created its own microcomputer, based on a new Intel microprocessor.
IBM released the Personal Computer (PC) in 1981 to entice the business community back to “Big Blue,” as the New York-based computer company was sometimes called. After IBM procured an operating system (OS) from “Micro-Soft,” the PC went on sale in August 1981. Early PCs were powered by Intel’s 8088, a 16-bit microprocessing chip used as the central processing unit (CPU). Although limited by the era’s hardware, the 8088 allowed Lotus 1-2-3 to process larger datasets than previous-generation microprocessors, enabling businesses to manage more comprehensive financial information.
The combination of Lotus 1-2-3’s features and the 8088’s performance made the software versatile for various financial tasks, from simple bookkeeping to advanced financial modeling. The 8088, clocked at 4.77 MHz in the original IBM PC and up to 8-10 MHz in later variants, delivered significant computational power for its time, enabling fast data processing and calculations.
The 8088 represented roughly a 50-fold speed increase over the revolutionary Intel 4004, the chip line whose progress had inspired Bill Gates to leave Harvard and start “Micro-Soft.” Although primarily focused on developing software, Gates and Paul Allen took advantage of an opportunity to buy and configure an operating system, MS-DOS, for IBM. In a historic move, however, Gates outmaneuvered the computer giant by offering MS-DOS to the many new “IBM-compatible” microcomputers that were based on reverse engineering of the PC.
Running on many new PCs, such as those from Compaq and later Dell, MS-DOS allowed Lotus 1-2-3 to dominate the market, despite Microsoft’s release of its Multiplan spreadsheet in 1982. Despite using the old-style command-line interface, the new spreadsheets could handle real-time financial updates, giving users the ability to recalculate entire spreadsheets almost instantly. With MS-DOS, Lotus 1-2-3 became the de facto spreadsheet tool for businesses.
The widespread use of the 8088 established the PC as a standard computing platform, encouraging software developers like Lotus to optimize their products for this architecture. The popularity of the 8088 and Lotus 1-2-3 fostered a growing arsenal of compatible software, add-in boards, and other hardware for storing data and printing charts, further amplifying its utility for financial purposes.
The 8088’s ability to address far more memory than earlier eight-bit chips allowed Lotus 1-2-3 to integrate spreadsheet, charting, and database functions in a single program. Financial professionals could perform calculations, visualize data, and manage records without needing additional tools.
In early 1983, Lotus 1-2-3 was released after a long incubation period during which it was programmed in assembly language for faster performance. Lotus could also run on “IBM-compatible” machines, such as the Compaq portable computer that came out a few months later. Lotus 1-2-3 became known for its ability to (1) visually calculate formulas, (2) function like a database, and (3) turn data into charts for visual representation. These features made it versatile for financial tasks ranging from simple bookkeeping to advanced financial modeling, and Lotus 1-2-3 played a pivotal role in the emergence of spreadsheet capitalism during the 1980s. Its functions and logic informed the principles of modern LBO modeling.
Spreadsheets Empower Corporate Raiding
The consolidation of corporations in the 1960s and early 1970s, mainly through mergers, created the conditions that fueled the rise of corporate raiders in the 1980s. Expanding conglomerates often created inefficient bureaucracies with poor management by acquiring companies in unrelated industries. Slow decision-making, short-term planning, and internal competition for resources hid the value of their subsidiary companies. Leveraged buyouts, enabled by Lotus 1-2-3 and junk bonds, provided the financial firepower for corporate raiders to execute hostile takeovers and break up these companies for big profits.
Corporate raiders would model the complex financing of LBOs, where a company is acquired primarily with borrowed money. These raiders would input a target company’s financial data into a spreadsheet and assess the company’s value, analyze different scenarios, and identify areas where costs could be cut or assets sold off to increase profitability. They would adjust variables like interest rates, debt levels, and projected cash flows to determine the feasibility and profitability of the LBO. The spreadsheet results served as compelling presentations to banks and investors, showcasing the potential returns of the LBO and convincing them to provide the necessary funding. The 1980s saw a wave of high-profile takeovers, often leading to significant restructuring and changes in the corporate landscape.
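A toy model can make these mechanics concrete. The Python sketch below is a minimal, hypothetical version of the feasibility check described above; every figure (purchase price, debt share, interest rate, growth) is an invented assumption for illustration and does not describe any actual deal.

```python
# A minimal, hypothetical LBO feasibility sketch: project cash flows,
# pay interest, sweep the surplus into debt paydown, and compare the
# equity put in with the equity left at exit. All inputs are assumptions.

purchase_price = 1_000.0      # $ millions
debt_share = 0.80             # 80% of the price financed with debt
interest_rate = 0.11          # cost of the acquisition debt
cash_flow = 120.0             # year-1 free cash flow, $ millions
growth = 0.04                 # assumed annual cash-flow growth
years = 5

debt = purchase_price * debt_share
equity = purchase_price - debt

for year in range(1, years + 1):
    interest = debt * interest_rate
    surplus = cash_flow - interest          # cash left after interest
    debt = max(debt - surplus, 0.0)         # sweep the surplus into debt paydown
    print(f"Year {year}: cash flow {cash_flow:6.1f}, interest {interest:5.1f}, "
          f"remaining debt {debt:6.1f}")
    cash_flow *= 1 + growth

# Crude exit assumption: sell at the same price and keep whatever debt was repaid.
exit_equity = purchase_price - debt
print(f"Equity in: {equity:.1f}, equity out: {exit_equity:.1f}")
```

Changing the interest rate, debt share, or growth assumption and rerunning the loop is exactly the kind of “what-if” adjustment raiders performed cell by cell in a spreadsheet.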
LBOs in the 1980s included several prominent cases. Notable were the buyouts of RJR Nabisco (1988) and the Beatrice Companies (1986), conducted by KKR, an up-and-coming investment firm founded in 1976 by Jerome Kohlberg Jr. and cousins Henry Kravis and George R. Roberts. They had all previously worked together at Bear Stearns. KKR became a leader in the LBO space and relied heavily on computer analysis and spreadsheets to structure its deals.
RJR-Nabisco
The RJR Nabisco buyout was one of the most famous and largest leveraged buyouts (LBOs) in history, and it perfectly illustrates how these deals worked, including the subsequent asset sales to repay debt. RJR Nabisco was a massive conglomerate, owning tobacco brands (Winston, Salem) and food brands (Nabisco, Oreo). KKR borrowed heavily (mainly through junk bonds) to finance the acquisition. These loans minimized the amount of their own capital needed. The final price was a staggering $25 billion, a record at the time. This massive figure was only possible due to the availability of financial analysis tools such as the spreadsheet and of the high-yield, high-risk junk bonds that will be discussed in a future post.
KKR’s core strategy was similar to that of other LBOs: take control of the company through an LBO paid for by large loans, identify non-core assets and divisions that could be sold off, and divest those assets to generate cash. They would then use the proceeds from asset sales to pay down the often massive debt incurred in the LBO. KKR and its investors would profit from any remaining value after debt repayment.
The sheer size of the RJR Nabisco deal meant KKR had to raise an enormous amount of debt. This borrowing was facilitated by investment bankers and, increasingly, the junk bond market. KKR proceeded to sell off various RJR Nabisco assets, including several food brands and overseas operations. Anything that wasn’t considered core to the remaining business was on the table. The money from these sales went directly to paying down the principal and interest on the LBO debt. While the deal was controversial, KKR and its investors made a substantial profit. RJR Nabisco was significantly smaller and more focused after the asset sales.
The firm’s ability to efficiently model debt financing and equity returns gave it a competitive edge. The deal’s complexity required detailed modeling of debt structures, cash flow scenarios, and potential equity returns. These could be calculated and managed using Lotus 1-2-3, which enabled models of loan amortization schedules showing how much of each payment goes toward principal and interest. Key functions included PMT, to calculate the fixed periodic loan payment; IPMT, to calculate the interest portion of a payment; and PPMT, to calculate the principal portion of a payment. These formulas could also model different types of debt, such as bullet repayments, fixed-rate loans, or variable-rate loans, and the analysis could include early repayments or refinancings in the schedule to determine total debt cost.
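To make the amortization logic concrete, the following Python sketch reproduces the arithmetic behind those payment functions with hypothetical loan terms; it illustrates the technique rather than reconstructing any firm’s actual models.

```python
# A short sketch of the amortization logic behind spreadsheet payment
# functions. The loan amount, rate, and term are hypothetical.

def pmt(rate, nper, pv):
    """Fixed periodic payment on a fully amortizing loan (PMT-style)."""
    return pv * rate / (1 - (1 + rate) ** -nper)

principal = 500_000.0   # loan amount
annual_rate = 0.10
years = 5

payment = pmt(annual_rate, years, principal)
balance = principal
for year in range(1, years + 1):
    interest = balance * annual_rate          # the IPMT-style interest portion
    principal_part = payment - interest       # the PPMT-style principal portion
    balance -= principal_part
    print(f"Year {year}: payment {payment:,.0f}, interest {interest:,.0f}, "
          f"principal {principal_part:,.0f}, balance {balance:,.0f}")
```

Each row of output corresponds to one line of the kind of amortization schedule an analyst would have laid out across spreadsheet rows.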
The RJR Nabisco LBO became a symbol of the excesses of the 1980s, highlighting the power (and risks) of leveraged buyouts and junk bonds. It also led to increased scrutiny of these types of deals, although they continued as spreadsheet capitalism spread.
Beatrice Foods
Beatrice Foods was a massive conglomerate with a diverse portfolio of food brands (Hunt’s Ketchup, Wesson Oil, etc.) and other businesses.
KKR borrowed heavily to finance the acquisition, allowing them to purchase a large company with relatively little of their own capital. The Beatrice acquisition was another of the largest LBOs at the time, valued at approximately $6.2 billion.
Using computer analysis and Lotus 1-2-3, they modeled Beatrice’s sprawling operations and assessed the feasibility of breaking the company into smaller, more manageable pieces. The deal’s complexity again required detailed modeling of debt structures, cash flow scenarios, and potential equity returns, which were conducted using the PC-enabled spreadsheet.
After extended deliberations, KKR purchased Beatrice Companies in April 1986, in a transaction valued at up to $8.7 billion once assumed debt is included, and proceeded to break it up. KKR sold off many of Beatrice’s non-core businesses, including those in areas like chemicals and construction. They focused on strengthening Beatrice’s core businesses, primarily in the food and beverage sector. These included brands like Chef Boyardee, Samsonite, and Tropicana.
Promus
Another important deal was Promus Companies’ acquisition of Harrah’s Entertainment in a 1989 LBO transaction that depended on detailed modeling of casino revenues and operational expenses. Again, this was made feasible by Lotus 1-2-3 because of its dominance at the time. LBOs require complex financial modeling to project cash flows and analyze the target company’s future earnings potential. The modeling also needed to determine the optimal debt levels and repayment schedules, as well as assess the impact of different assumptions, such as interest rates and revenue growth, on the deal’s profitability.
Raiding financiers and associated firms leveraged Lotus 1-2-3 to simulate financial outcomes and quickly adjust models as negotiations or deal terms changed. Estimating the company’s value under different scenarios and determining whether the target company could generate enough cash to service debt under different economic and operational conditions was crucial. This estimation required precise tracking of various tranches of debt, their repayment terms, and interest coverage.
The Blackstone Group
Blackstone also started with the help of spreadsheets like Lotus 1-2-3 for its early buyouts, including real estate and private equity deals. Lotus 1-2-3 provided the necessary tools for these complex financial analyses, which were used to organize data, build complex economic models, and perform calculations. Macros were available to automate repetitive tasks and improve efficiency in the modeling process. Blackstone also used Lotus to generate charts, graphs, and other visualizations to help analyze investment performance and make presentations in the boardroom.
LBO models can become complex, requiring intricate formulas and linkages between spreadsheet parts. The accuracy of the LBO model heavily relies on the accuracy of the underlying data inputs. Lotus 1-2-3 could perform sensitivity analyses by changing key assumptions (e.g., interest rates, revenue growth) to understand their impact on the model’s output.
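The sketch below illustrates this kind of sensitivity analysis in Python rather than in Lotus macros: a small two-way table that varies the discount rate and revenue growth and reports a simple net-present-value output. All inputs are hypothetical.

```python
# A toy sensitivity table: vary the discount rate and growth assumption
# and observe the effect on a simple NPV output. All inputs are hypothetical.

def npv(rate, cash_flows):
    """Discount a list of year-1..year-n cash flows at the given rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

base_cash_flow = 100.0  # year-1 cash flow, $ millions
years = 5

print("growth \\ rate |    8%      10%      12%")
for growth in (0.02, 0.04, 0.06):
    flows = [base_cash_flow * (1 + growth) ** t for t in range(years)]
    row = [npv(rate, flows) for rate in (0.08, 0.10, 0.12)]
    print(f"     {growth:.0%}       | " + "   ".join(f"{v:6.1f}" for v in row))
```

Each cell of the printed table answers one “what-if” question, which is what an analyst got by swapping a single assumption in a spreadsheet and letting it recalculate.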
Spreadsheet Formulas and the Temporal Imperative
Time is a crucial factor in capitalism and its financial investments, but it was only after the West’s social transformation of time and sacrifice that investment took its current priority. Religious discipline, which structured earthly time for heavenly reward, met with the Reformation in 16th-century Europe to produce a new calculative rationality – financial investment. [Pennings Dissertation] Also, by solidifying time in base-12 and base-60 measures (60-minute hours, 24-hour days, and the 360-day financial year), a new correlation, investment over time, gained prominence. Sacrificing spending in the present for payoffs in the future was the cultural precondition for spreadsheet capitalism.[4]
The analysis of the time value of money (TVM) was critical for LBOs, particularly for valuing a target company, determining debt service, and return on investment, as well as understanding and managing the risks associated with the LBO. TVM calculations were time-consuming and tedious, often requiring financial tables or manual calculations using formulas.
Digital spreadsheets significantly accelerated and improved the analysis of TVM by automating calculations, enabling “what-if” analysis, increasing accessibility, and enhancing visualization. Lotus 1-2-3 introduced built-in financial functions, such as PV (Present Value), FV (Future Value), PMT (Payment), RATE, and NPV (Net Present Value). These functions simplified TVM calculations that would otherwise require extensive manual work or financial calculators. Instead of manually solving the compound interest formula to find future value, users could simply input values (e.g., interest rate, periods, and payment) into the FV function. Spreadsheets allow users to quickly change input variables (interest rates, cash flows, and time periods) and instantly see the impact on the TVM calculations.
TVM is based on the notion that a dollar today is worth more than a dollar in the future because of its earning potential. The compound interest formula FV = PV × (1 + i/f)^(n×f), where i is the annual interest rate, f is the number of compounding periods per year, and n is the number of years, has empowered individuals and businesses to make more informed financial decisions. Spreadsheets allow users to create charts and graphs to visualize TVM concepts, such as the impact of compounding interest over time or the relationship between present value and future value.
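As a quick illustration of the relationship, the short Python snippet below computes a future value under periodic compounding and then discounts it back to the present. The rate, compounding frequency, and horizon are arbitrary example values.

    # Future value and present value under periodic compounding (illustrative values).
    PV = 1_000.00   # present value in dollars
    i = 0.08        # annual interest rate
    f = 12          # compounding periods per year
    n = 10          # years

    FV = PV * (1 + i / f) ** (n * f)          # FV = PV x (1 + i/f)^(n*f)
    PV_back = FV / (1 + i / f) ** (n * f)     # discounting reverses the compounding

    print(f"Future value:  {FV:,.2f}")        # about 2,219.64
    print(f"Present value: {PV_back:,.2f}")   # recovers 1,000.00

A spreadsheet user gets the same result from @FV and @PV with far less effort than working through the formula by hand, which was precisely the point.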
A vital formula converted to the Lotus spreadsheet was Present Value, @PV(), a crucial tool for analyzing companies. It provided a foundation for evaluating the worth of future cash flows from raided companies, or their parts, in present terms. Companies generate cash flows over time, and analyzing them with PV ensures that delayed returns are appropriately considered and valued. PV helps distinguish between high-growth opportunities that justify higher valuations and overvalued prospects with limited potential.
PV quantifies this by discounting future cash flows to reflect their value today. The calculation is critical in decision-making, whether assessing investments, valuing a company, or comparing financial alternatives. Present value also underlies the internal rate of return (IRR) and net present value (NPV), the difference between the present value of cash inflows and the present value of cash outflows over a period of time. NPV is used in capital budgeting and investment planning to analyze a project’s projected profitability.
A related temporal technique is Future Value, @FV(), which was developed to project future cash or investment values. It calculates what money is expected to be worth at a future date based on current growth trends and is particularly useful for debt paydown schedules and residual equity valuation. A companion function, @IRR(), the Internal Rate of Return, was crucial for evaluating the return on investment for equity holders, a core metric in LBOs.
Net Present Value @NPV() helped assess the profitability of an investment by calculating the value of projected cash flows discounted at the required rate of return. @NPV was crucial as it allowed users to input a discount rate (representing the cost of capital) and a series of future cash flows, and the @NPV function would calculate the present value of those cash flows.
@IF() determined whether a debt covenant had been breached or whether excess cash should be used for debt repayment. Payment, @PMT(), was useful for calculating the periodic payment required on a loan, considering principal, interest, and term.
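For readers who want to see the arithmetic behind these functions, here is a hedged Python sketch of rough equivalents to @NPV, @IRR, and @PMT. The cash flows and loan terms are invented for illustration, and the @NPV convention assumed here (first cash flow discounted by one full period) follows common spreadsheet behavior.

    # Rough Python equivalents of the spreadsheet TVM functions discussed above.
    # Cash flows, rate, and loan terms are hypothetical illustrations.

    def npv(rate, cashflows):
        """Rough @NPV equivalent: discount flows, assuming the first arrives one period out."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

    def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-7):
        """Rough @IRR equivalent: bisect for the rate where the NPV of all flows is zero."""
        def f(r):
            return sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))
        # assumes the rate lies between lo and hi and that f changes sign on that bracket
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2

    def pmt(rate, nper, pv):
        """Rough @PMT equivalent: level payment that amortizes pv over nper periods."""
        return pv * rate / (1 - (1 + rate) ** -nper)

    equity_flows = [-800.0, 120.0, 140.0, 160.0, 180.0, 900.0]   # hypothetical: outlay, then returns
    print(f"NPV at 12%: {equity_flows[0] + npv(0.12, equity_flows[1:]):,.2f}")
    print(f"IRR: {irr(equity_flows):.2%}")
    print(f"Monthly loan payment: {pmt(0.10 / 12, 60, 50_000):,.2f}")

The point is not the specific numbers but the pattern: the same discounting logic sits behind every cell of an LBO worksheet, and the spreadsheet made it instantly recalculable.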
Conclusion
Lotus 1-2-3’s capabilities on IBM and “IBM-compatible” personal computers allowed private equity firms to confidently pursue larger and more complex deals by providing a reliable platform for financial forecasting and decision-making. The tool’s role in shaping LBO strategies contributed to the emergence of private equity as a dominant force in corporate finance. Many fundamental modeling practices in these landmark deals continue to underpin private equity and LBO analyses today, albeit with more advanced tools like Microsoft Excel.
By providing the computational power needed for sophisticated spreadsheet software, the Intel 8088 chip enabled Lotus 1-2-3 to become a powerful tool for financial analysis, transforming how businesses managed and analyzed financial data in the 1980s. The 8088’s arithmetic capabilities allowed Lotus 1-2-3 to execute complex financial formulas and algorithms quickly, making it suitable for forecasting, budgeting, and economic modeling tasks.
Summary
This article explores the role of digital spreadsheets, particularly Lotus 1-2-3, in the rise of leveraged buyouts (LBOs) and “junk bonds” during the 1980s, a phenomenon termed “spreadsheet capitalism.” The author argues that spreadsheets, combined with the economic policies of the Reagan era, enabled the complex financial modeling and analysis necessary for these deals, transforming corporate finance and the broader political economy.
Spreadsheets like Lotus 1-2-3 allowed financiers to analyze target companies, model LBO financing, and present compelling cases to investors. This facilitated the wave of LBOs in the 1980s, exemplified by deals like RJR Nabisco and Beatrice Foods.
Spreadsheets enabled sophisticated financial modeling across time periods, incorporating factors like cash flows, interest rates, and investment returns. This “temporal finance” became crucial for LBOs and other financial instruments.
The IBM PC and its compatibles, powered by Intel’s 8088 microprocessor, provided the hardware platform for Lotus 1-2-3 to thrive. The spreadsheet’s features, combined with the 8088’s processing power, made it a versatile tool for financial professionals.
Reaganomics is a key point: the article argues that the “Reagan Revolution,” with its emphasis on deregulation, tax cuts, and a pro-finance climate, created a favorable environment for LBOs and the use of spreadsheets in finance.
The article highlights specific Lotus 1-2-3 functions and formulas, such as @SUM, @ROUND, @PV, @FV, @NPV, and @IF, that were crucial for LBO modeling and financial analysis.
Spreadsheets automated complex calculations related to the time value of money, making it easier to evaluate investments and compare financial alternatives.
The author concludes that spreadsheets played a pivotal role in the rise of LBOs and the transformation of corporate finance in the 1980s. The ability to model complex financial scenarios and analyze the time value of money empowered financiers to pursue larger and more complex deals, contributing to the emergence of private equity as a major force in the economy.
Citation APA (7th Edition)
Pennings, A.J. (2025, Feb 3) Lotus 1-2-3, Temporal Finance, and the Rise of Spreadsheet Capitalism. apennings.com https://apennings.com/how-it-came-to-rule-the-world/digital-monetarism/lotus-1-2-3-temporal-finance-and-the-rise-of-spreadsheet-capitalism/
Notes
[1] Barbarians at the Gate was written by investigative journalists Bryan Burrough and John Helyar and based upon a series of articles written for The Wall Street Journal. The book was also made into a made-for-TV movie by HBO in 1993. The book centers on F. Ross Johnson, the CEO of RJR Nabisco, who planned to buy out the rest of the Nabisco shareholders.
[2] Lotus 1-2-3 v2.3 Functions and Macros Guide. Copyright: Attribution Non-Commercial (BY-NC).
[3] Differences between Microsoft Excel and Lotus 1-2-3.
[4] The concept of time is central to the creation of debt instruments like bonds, mortgages, and loans. These instruments promise future repayment with interest, acknowledging the time value of money. The Reformation, particularly in 16th-century Europe, profoundly transformed the concept of religious sacrifice, redirecting its focus from traditional spiritual practices such as indulgences and pilgrimages to a more personal, moral, and communal framework of financial responsibility and economic participation. Driven by figures like Martin Luther and John Calvin, the Reformation emphasized salvation through faith alone (sola fide), as opposed to salvation through works or financial contributions to the Church.
Sacrificial acts, such as indulgences (payments to reduce punishment for sins) or pilgrimages, were denounced; personal piety and moral rectitude became the markers of faith instead.
The emergent Protestantism emphasized a form of asceticism that discouraged excessive spending on luxuries and instead encouraged investment in one’s household, community, and vocation as acts of divine service. Calvinist teachings in particular associated hard work, frugality, and the accumulation of wealth with signs of God’s favor, framing secular work and financial investment as forms of religious duty and legitimizing economic activity as an expression of faith. Financial stewardship, managing wealth responsibly for the benefit of family and society, was seen as a spiritual obligation, transforming economic practices into acts of religious significance. This reframing of religious sacrifice as financial responsibility and moral investment influenced economic development, and the encouragement of disciplined financial behavior and reinvestment contributed to the rise of capitalist economies in Protestant regions. The transformation redefined the role of the individual in their faith community, linking personal piety with economic productivity and reshaping the societal understanding of sacrifice as a moral and practical investment in the future rather than a direct transaction with the divine.
Hypertext References (APA Style)
Burrough, B. and Helyar, J. (1990). Barbarians at the Gate: The Fall of RJR Nabisco. New York: Harper & Row.
Corporate Finance Institute. (n.d.). Reaganomics. Retrieved from https://billofrightsinstitute.org/essays/ronald-reagan-and-supply-side-economics
Investopedia. (n.d.). Net Present Value (NPV). Retrieved from https://www.investopedia.com/terms/n/npv.asp
Investopedia. (n.d.). Internal Rate of Return (IRR). Retrieved from https://www.investopedia.com/terms/i/irr.asp
Investopedia. (n.d.). Residual Equity Theory. Retrieved from https://www.investopedia.com/terms/r/residual-equity-theory.asp
Pennings, A. (2010). Figuring Criminality in Oliver Stone’s Wall Street. Retrieved from https://apennings.com/2010/05/01/
The Economist. (2024, October 15). Why Microsoft Excel Won’t Die. Retrieved from https://www.economist.com/business/2024/10/15/why-microsoft-excel-wont-die
© ALL RIGHTS RESERVED

Tags: IBM PC > Intel 8088 > LBOs > leveraged buyouts > Lotus 1-2-3 > Reagan Revolution > time value of money (TVM)
Steve Jobs Pioneering Work in Human-Machine Interaction (HMI) and Human-Machine Knowledge (HMK)
Posted on | January 24, 2025 | No Comments
Hailing from Silicon Valley during the age of the counter-cultural movement, Steve Jobs was a keen observer of social and technological events. As a result, he significantly impacted the development of human-machine interaction (HMI) and Human-Machine Knowledge (HMK) through his work at Apple and NeXT.
Humans, at their best, excel at creativity, judgment, and complex reasoning, while machines excel at accuracy, speed, storage, processing, and speedily transmitting vast amounts of data. HMI, also known as MMI or man–machine interaction, is defined as interaction between human operators and devices through multiple interfaces.
Human-Machine Knowledge (HMK) is an emerging interdisciplinary field that explores how humans and machines can effectively collaborate, construct, and share knowledge to achieve outcomes that surpass what either could accomplish alone. HMK emphasizes creating systems where humans and machines complement each other’s strengths. HMK seeks to create a symbiotic relationship between humans and machines, where they can leverage each other’s strengths to achieve unprecedented levels of knowledge, innovation, learning, and progress.
Steve Jobs provides an interesting point of departure and discussion vehicle for understanding these two areas. He played a pivotal role in advancing human-machine interactivity and reshaping how we engage with technology and knowledge. His contributions spanned hardware, software, and design, fundamentally changing the way people interact with computers and other devices.
Working with Steve Wozniak on the Apple II and later products, Jobs emphasized craftsmanship, simplicity, and purpose, inspiring a generation of designers, engineers, and technologists to prioritize user-centric design. Jobs believed in the power of simplicity, stripping away unnecessary complexity to create products that were easy to understand and use.
Jobs popularized the Graphical User Interface (GUI) with the Lisa and Macintosh “microcomputers.” Drawing on WIMP (Windows, Icons, Menus, and Pointers) technologies developed at Xerox PARC for the Alto computer, he worked to make computers more intuitive and accessible to the masses. This revolutionized how people interacted with technology, moving beyond text commands to hand movements.
The Lisa and Mac introduced WYSIWYG (What You See Is What You Get) editing, allowing users to see their work on screen as it would appear in print, significantly improving user experience. This helped make laser printing and desktop publishing possible. He also had a great appreciation for different fonts and prioritized their integration into the Mac.
Jobs championed the use of the mouse, making computer interaction more natural and intuitive. Doug Engelbart’s famous 1968 “Demo” showcased the work of the Augmentation Research Center at Stanford Research Institute (SRI), which developed computer technologies meant to enhance or “augment” human performance. That team innovated the mouse and early prototypes of many of the interface technologies that would be in common use on personal computers by the late 1980s. Jobs was instrumental in adopting and refining the mouse as a critical tool for interacting with GUIs. By reducing its cost and simplifying its design, the Macintosh made the mouse a standard input device.
Jobs emphasized the importance of user-centric design, focusing on creating products that were not only functional but also aesthetically pleasing and enjoyable to use. Jobs championed design thinking, emphasizing that technology should be intuitive and beautiful. His mantra, “It just works,” drove Apple to create devices that required minimal technical expertise, democratizing technology for the masses. It’s the subtle way that Apple builds its software and hardware to work together in a seamless experience for users.
Apple software, from Mac OS to iOS, also strove for ease of use and intuitive design, making technology more accessible to a wider range of users. Jobs pioneered a seamless integration of hardware and software, ensuring that devices were not only functional but delightful to use. This holistic approach elevated user experience to new heights.
The iPod (2001) and especially the iPhone (2007) revolutionized human-machine interaction, the latter with a multi-touch interface that eliminated physical buttons in favor of gestures like swiping, pinching, and tapping. Jobs refined the use of multi-touch gestures on the iPhone and iPad, making interactions more fluid and natural. The iPad blurred the line between consumption and creation, enabling users to read, write, draw, and interact in entirely new ways. The App Store extended this revolution, creating a vibrant online ecosystem for accessing apps that enhanced knowledge, tools, and entertainment.
Jobs incorporated and refined technologies like Siri, Apple’s virtual assistant, which introduced natural language processing and voice interaction, making machines more responsive to human needs. Siri brought voice control to the mainstream, enabling more intuitive and hands-free interaction with devices.
Jobs instilled a deep appreciation for user experience across the technology industry, leading to a greater focus on creating products that are not only functional but also enjoyable and intuitive to use.
These contributions have fundamentally changed how we interact with technology, making it more accessible, intuitive, and enjoyable for billions of people worldwide.
Summary
Steve Jobs, a counter-cultural visionary from Silicon Valley, profoundly influenced Human-Machine Interaction (HMI) and Human-Machine Knowledge (HMK) through his work at Apple and NeXT. He emphasized simplicity, user-centric design, and seamless integration of hardware and software, fundamentally changing how people interact with technology and access knowledge.
His contributions to Human-Machine Interaction (HMI) included simplified design, GUI interaction, WYSIWYG editing, as well as touch (haptic) and voice interaction. Jobs prioritized intuitive, user-friendly designs, removing unnecessary complexity to make technology accessible to all. By using GUI and integrating WIMP technologies in the Lisa and Macintosh, Jobs replaced text-based commands with intuitive visual interactions. Inspired by Doug Engelbart’s innovations, Jobs refined and popularized the mouse, making it affordable and central to personal computing.
The Lisa and Macintosh introduced “What You See Is What You Get” editing, enabling users to see their work as it would appear in print and advancing tools like desktop publishing. The iPhone and iPad revolutionized interaction with multi-touch gestures such as swiping and pinching, offering a more natural and fluid user experience. With Siri, Jobs introduced natural language processing to mainstream devices, enabling hands-free, intuitive interactions.
Contributions to Human-Machine Knowledge (HMK) include democratizing knowledge, creating an App Store ecosystem, and enhancing creativity with technology to empower users. Devices like the iPod, iPhone, and iPad provided tools for learning, creativity, and communication in a portable and accessible format. Jobs fostered an ecosystem of applications, enabling users to access tools for innovation, education, and entertainment, thereby enhancing knowledge-sharing and creation. Jobs also blended artistic design with technological innovation, making products that were both functional and inspiring. His designs empowered individuals to create, manipulate, and share knowledge, bypassing traditional technical or bureaucratic barriers.
Jobs’ work laid the foundation for more intuitive, accessible, and enjoyable technology. His contributions bridged the gap between humans and machines, enabling collaboration and knowledge-sharing at an unprecedented scale. His focus on HMI and, to a lesser extent, HMK principles has made technology more accessible, intuitive, and enjoyable for billions of people worldwide.
Citation APA (7th Edition)
Pennings, A.J. (2025, Jan 24) Steve Jobs Pioneering Work in Human-Machine Interaction (HMI) and Human-Machine Knowledge (HMK). apennings.com https://apennings.com/artificial-intelligence/steve-jobs-pioneering-work-in-human-machine-interaction-hmi-and-human-machine-knowledge-hmk/
Links
Pennings, A.J. (2017, May 27) Not Like 1984: GUI and the Apple Mac. apennings.com https://apennings.com/how-it-came-to-rule-the-world/not-like-1984-gui-and-the-apple-mac/
Pennings, A.J. (2018, Oct 10) TIME Magazine’s “Machine of the Year”. apennings.com https://apennings.com/financial-technology/digital-spreadsheets/time-magazines-machine-of-the-year/
Pennings, A.J. (2018, Oct 19) Apple’s GUI and the Creation of the Microsoft’s Excel Spreadsheet Application. apennings.com
Pennings, A.J. (2023, April 16). The Digital Spreadsheet: Interface to Space-Time and Beyond? apennings.com https://apennings.com/technologies-of-meaning/the-digital-spreadsheet-interface-to-space-time-and-beyond/
Pennings, A.J. (2024, Dec 7) The Framing Power of Digital Spreadsheets. apennings.com https://apennings.com/meaning-makers/the-framing-power-of-digital-spreadsheets/
© ALL RIGHTS RESERVED

Tags: Human-Machine Interaction (HMI) > Human-Machine Knowledge (HMK)
Connecting a Dangerous World: Border Gateway Protocol (BGP) and National Concerns
Posted on | December 9, 2024 | No Comments
This post discusses what Border Gateway Protocol (BGP) does and some of the risks it poses. While BGP is necessary for the functioning of the Internet, it also presents several security concerns due to its decentralized and trust-based nature. This raises national concerns and has drawn the attention of cybersecurity regulators such as the Federal Communications Commission (FCC) in the US. This post examines how governments and malicious actors can exploit BGP vulnerabilities for censorship, surveillance, and traffic control. Techniques include implementing blacklists, rerouting traffic, and partitioning networks, as seen with China’s Great Firewall. Such actions enable monitoring, filtering, or isolating traffic and raise concerns about privacy, Internet freedom, and global access.
The Internet is considered a network of interconnected networks. People with devices like PCs or mobile phones (hosts) connect to the Internet via networked Internet Service Providers (ISPs) that are in turn connected to other ISPs. Applications running on devices work with other devices through a maze of interconnections that pass data from router to router within the ISP’s domain and through or to other ISPs, enterprises, campuses, etc. The resulting network of networks is quite complex, but it connects the world’s Internet traffic with the help of specific routing protocols.
Border Gateway Protocol (BGP) is one such protocol and is integral to the global Internet. Sometimes known as the “Post Office of the Internet,” it was developed in the 1980s to help networks exchange information about how to reach other networks. BGP routers basically determine the best path for data packets to reach their destination. BGP enables network administrators to manage and optimize network traffic by advertising the Internet routes they offer and the ones they can reach. With BGP, they can prioritize certain routes, influence the path selection process to balance traffic loads between different servers, and adapt to changing network conditions.
Despite its usefulness, many nations are worried about BGP’s security vulnerabilities.
In the US, the Federal Communications Commission (FCC) is concerned that the Internet presents security threats to the US economy, defense capabilities, public safety, and critical utilities such as energy, water, and transportation. These concerns are echoed by other countries. Malicious or sometimes incompetent actors can exploit or mismanage BGP vulnerabilities through hijacking or spoofing attacks. They can also reroute traffic to intercept or disrupt data flows. The FCC says that while efforts have been made to mitigate the Internet’s security risks, more work needs to be done, especially on BGP.
How Does BGP Work?
Jim Kurose does a great job explaining BGP and its importance in this video:
BGP connects the world by enabling communication and cooperation among autonomous systems (ASes), ensuring that data packets can traverse the vast and interconnected network of networks that make up the Internet. It bridges separate but interconnected ASes such as campuses, companies, and countries. BGP works to ensure that data packets originating from one location can cross over between ISPs and other WANs (Wide Area Networks) to reach their destination anywhere else on the planet. Network routers read the destination prefixes on packets to determine which way they will be sent.
BGP’s ability to adapt to changing network conditions, manage data traffic, and facilitate redundant paths is crucial for the stability and reliability of the global Internet, but it also poses several dangers. Unless centralized through software-defined networking (SDN), BGP organizes routing tables locally and in a distributed fashion throughout the network, and the information used in routing calculations is based on trust relationships with other ASes. Ideally, this results in connections that can quickly reroute traffic through alternative paths to maintain network integrity. If one route becomes unavailable due to network outages, maintenance, or policy, BGP can quickly find another.
BGP is designed for interdomain routing, which means it focuses on routing decisions between different autonomous systems. This is in contrast to intradomain routing protocols like Open Shortest Path First (OSPF) or Enhanced Interior Gateway Routing Protocol (EIGRP), which operate within a single AS. BGP is the protocol of choice for interdomain routing, which can mean routing between countries and even large-scale Tier-1 ISPs.
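A simplified sketch of the start of BGP’s best-path decision helps show what “choosing a route” means in practice. The routes and attribute values below are invented, and real routers apply many more tie-breakers (origin type, MED, eBGP over iBGP, router ID) than this two-step comparison.

    # Simplified sketch of BGP best-path selection for a single destination prefix.
    # Routes and attribute values are illustrative; real routers use more tie-breakers.

    routes_to_prefix = [
        {"prefix": "203.0.113.0/24", "next_hop": "10.0.0.1",
         "local_pref": 100, "as_path": [64501, 64510, 65001]},
        {"prefix": "203.0.113.0/24", "next_hop": "10.0.0.2",
         "local_pref": 200, "as_path": [64502, 65001]},
        {"prefix": "203.0.113.0/24", "next_hop": "10.0.0.3",
         "local_pref": 200, "as_path": [64503, 64520, 65001]},
    ]

    def best_path(routes):
        """Prefer the highest local preference, then the shortest AS path."""
        return sorted(routes, key=lambda r: (-r["local_pref"], len(r["as_path"])))[0]

    chosen = best_path(routes_to_prefix)
    print(f"Install route to {chosen['prefix']} via {chosen['next_hop']} "
          f"(local_pref={chosen['local_pref']}, AS path={chosen['as_path']})")

In this toy example, the route via 10.0.0.2 wins: it shares the highest local preference and has the shortest AS path. Local preference is exactly where an operator’s policy choices enter the decision.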
Who Uses Border Gateway Protocols?
BGP users include telecommunications companies such as ISPs, Content Delivery Networks (CDNs) such as Akamai and Cloudflare, and Internet Exchange Points (IXPs) that bypass multiple networks and allow specific networks to interconnect directly rather than routing through various ISPs. Cloud providers like Amazon’s AWS, Google Cloud, and Microsoft Azure also use BGP to manage the traffic between their data centers and ISPs, allowing them to provide reliable cloud services globally.
Many large enterprises with extensive networks operate their own ASes and have BGP access to control routing policies across their internal networks and connections to external services. Universities and research institutions often employ their own ASes and use BGP when connecting to national and international research networks supporting scientific collaboration.
Top “Tier-1” commercial Internet Service Providers (ISPs) use BGP as well. Tier-1 ISPs are considered the top-tier providers in the global Internet hierarchy. They own and operate extensive networks and are responsible for a significant portion of the global Internet’s infrastructure. BGP is essential for them to route traffic and exchange network reachability information with other ASes, and it plays a crucial role in how these Tier-1 ISPs manage their networks and interact with other ASes. Tier-1 ISPs use BGP to implement routing policies that align with their business strategies and network management goals.
A Tier-1 ISP can reach an entire Internet region, such as Singapore, solely via free and reciprocal peering agreements, with BGP as the glue. Examples include AT&T in the US and KDDI in Japan. BGP allows them to announce their IP prefixes to the wider Internet and receive routing updates from other ASes. Tier-1 ISPs use BGP to make routing decisions based on various criteria, including network policies, path attributes, and reachability information. BGP allows them to determine the best path for routing traffic through their networks, considering factors like latency, cost, and available capacity.
Tier-1 ISPs can establish BGP peer relationships with other ASes. These relationships can take the form of settlement-free peering or transit agreements. Peering involves the mutual exchange of traffic between two ASes, while transit agreements typically involve the provision of Internet connectivity to a customer AS in exchange for a fee. Network effects increase the importance and centrality of existing network hubs, giving them a stronger “gravitational pull,” making it more difficult for new entrants to establish themselves in the market. Effective relationships enable the global Internet to function as a connected network of networks.
BGP allows the managers of autonomous systems to consider various factors when selecting the best path, including network policies, routing metrics, and the reliability and performance of available routes. BGP helps maintain Internet reachability by constantly updating routing tables and responding to changes in network topology. It identifies the most efficient path for data packets to travel from source to destination and allows ISPs to advertise what routes they are able to offer other ISPs. BGP empowers network managers to control how data is routed, manage traffic, and enforce security policies.
How Governments Use and Abuse BGP
Military agencies usually maintain BGP access within their data infrastructure, especially to secure sensitive networks or manage national Internet traffic and, in some cases, control public Internet access to their networks. BGP allows militaries to define specific routing policies, such as prioritizing certain types of traffic (e.g., command-and-control data) or restricting traffic to trusted allies. In field operations, militaries use deployable communication systems that rely on satellite links and mobile base stations. BGP allows these systems to dynamically integrate into broader military networks. Militaries increasingly rely on drones and Internet of Things (IoT) devices, which require efficient routing of data. BGP works to ensure that data from these systems is routed optimally within military infrastructures.
A study of the early Russian-Ukrainian conflict revealed that Russian and separatist forces modified BGP routes to establish a “digital frontline” that mirrored the military one. This strategy involved diverting local internet traffic from Ukraine, the Donbas region, and the Crimean Peninsula. The research focused on analyzing the strategies employed by actors manipulating BGP, categorizing these tactics, and mapping digital borders at the routing level. Additionally, the study anticipated future uses of BGP manipulations, ranging from re-routing traffic for surveillance to severing Internet access in entire regions for intelligence or military objectives. It underscored the critical role of Internet infrastructure in modern conflict, illustrating how BGP manipulations can serve as tools for strategic control in both cyber and physical domains.
Government and other malicious actors can manipulate the Internet through multiple techniques, including BGP hijacking, IP blacklisting and filtering, network partitioning and isolation, content monitoring and traffic analysis, traffic throttling and prioritization, shutdowns and access control, and border routing policies and compliance.
By influencing or manipulating BGP routes, governments or actors with access to BGP-enabled networks can reroute traffic to go through specific regions or servers. This is often done by injecting false BGP announcements to redirect traffic to specific routers. This can allow governments to block, intercept, or monitor certain data flows. Such an approach was seen in incidents in various countries where traffic was rerouted through state-managed systems.
Governments worldwide can influence or control national ASes and various network providers. They can use BGP to dictate the paths data takes across the Internet on a large scale to manage, manipulate, or filter traffic for their own ends. This capability provides a point of control that governments can leverage for regulatory, security, or censorship purposes.
Governments can mandate that ISPs refuse to announce or accept specific IP prefixes or routes associated with restricted sites or content. By implementing BGP blacklists, they can prevent access to certain websites or services entirely by removing or altering the BGP routes that lead to these destinations, effectively blocking them at the network level.
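The effect of such a blacklist can be illustrated with a short Python sketch that drops any announcement falling inside a blocked prefix before it reaches the routing table. The prefixes and data structures are hypothetical and simply stand in for the policy filters a real ISP would configure on its routers.

    import ipaddress

    # Hypothetical blacklist of prefixes an ISP has been ordered to block.
    blacklist = [ipaddress.ip_network("198.51.100.0/24"),
                 ipaddress.ip_network("192.0.2.0/25")]

    announcements = [
        {"prefix": "198.51.100.0/24", "as_path": [64500, 65010]},
        {"prefix": "203.0.113.0/24",  "as_path": [64500, 65020]},
        {"prefix": "192.0.2.0/26",    "as_path": [64500, 65030]},
    ]

    def accept(announcement):
        """Reject any announcement for a blacklisted prefix or a more specific part of one."""
        net = ipaddress.ip_network(announcement["prefix"])
        return not any(net.subnet_of(blocked) for blocked in blacklist)

    routing_table = [a for a in announcements if accept(a)]
    for route in routing_table:
        print("accepted:", route["prefix"])

Only the 203.0.113.0/24 route survives the filter; the other destinations simply disappear from the table, which is how route-level blocking becomes invisible to ordinary users.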
Some governments impose strict routing policies that partition national networks from the global Internet. By requiring ISPs to use BGP filtering rules that isolate local traffic, they can keep Internet activity confined within national borders. China’s Great Firewall is an example, where BGP filtering and routing policies prevent certain global routes and confine users to government-approved internet spaces.
Governments can influence routing so that Internet traffic passes through surveillance or monitoring points. By injecting specific BGP routes, traffic can be directed to infrastructure where deep packet inspection (DPI) or other monitoring techniques are applied. This enables governments to analyze or even censor content in real time.
Through BGP route manipulation, governments can slow down or prioritize traffic to specific networks. For example, they may route traffic through slower networks or specific filtering points to control Internet speeds to certain services or prioritize government-approved traffic sources.
In extreme cases, governments can mandate ISPs to withdraw BGP routes to cut off access entirely, effectively disconnecting regions, communities, or entire countries from the global Internet. This can be seen in certain political scenarios or during unrest when governments initiate BGP route withdrawals, isolating the local Internet temporarily.
Governments can also enforce policies that restrict data to specific geographic boundaries, requiring ISPs to adjust BGP configurations to comply with data residency or border policies. This limits data flows outside national borders and aligns with regulatory frameworks on data sovereignty.
Concerns
Nations worldwide have growing concerns regarding the security and resilience of BGP, which is fundamental to Internet routing. While critical for directing Internet traffic between ASes, BGP has vulnerabilities that can pose significant risks to national security, data integrity, and overall network resilience.
Through these mechanisms, governments can exercise significant influence over network behavior and access at a national level, using BGP as a powerful tool for traffic control, monitoring, and regulation. Such actions raise concerns over Internet freedom, privacy, and access rights on a global scale.
Citation APA (7th Edition)
Pennings, A.J. (2024, Dec 9). Connecting a Dangerous World: Border Gateway Protocol (BGP) and National Concerns. apennings.com https://apennings.com/global-e-commerce/connecting-a-dangerous-world-border-gateway-protocol-bgp-and-national-concerns/
© ALL RIGHTS RESERVED

Tags: Amazon's AWS > AT&T > autonomous system (AS) > Border Gateway Protocol (BGP) > Content Delivery Networks (CDNs) > Denial-of-Service (DoS) > Distributed Denial-of-Service (DDoS) > Federal Communications Commission (FCC) > Google Cloud > Internet Exchange Points (IXPs) > Microsoft Azure > Open Shortest Path First (OSPF) > Resource Public Key Infrastructure (RPKI) > Tier-1 ISPs
The Framing Power of Digital Spreadsheets
Posted on | December 7, 2024 | No Comments
Digital spreadsheets like Excel have framing power because they shape how information is chosen, organized, interpreted, and presented. These capabilities directly influence decision-making and resource prioritization within organizations. The power of framing arises from the ability to define what data is included, how it is processed by the spreadsheet’s functions or formulas, and the visual or numerical emphasis placed on specific inputs and outcomes. Spreadsheets exert framing power through selecting and prioritizing data, building formula logic and embedded assumptions, standardizing norms and templates, simplifying complex realities, and selectively presenting results.
This post continues my inquiry into the remediation of digital spreadsheets and the techno-epistemological production of organizational knowledge. This includes a history of spreadsheet technologies, including VisiCalc, Lotus 1-2-3, and Microsoft’s Excel, as well as the functions and formulas they integrated over time. Digital spreadsheets built on the history of alphabetical letteracy and the incorporation of Indo-Arabic numerals, including zero (0), and on calculative abilities built up through administrative, commercial, and scientific traditions.
Spreadsheets frame discussions by determining which data is included or excluded, consequently controlling narratives and resource decisions. For instance, only presenting revenue figures without costs can create a biased perspective on the financial health of an organization. By selecting and emphasizing certain key performance indicators (KPIs) over others, spreadsheets prioritize specific organizational goals (e.g., profitability over sustainability). A budget sheet that highlights “Cost Savings” as a primary metric frames spending decisions in a cost-minimization mindset. Those designing spreadsheets gain control and power by deciding what aspects of reality are quantified and analyzed.
Spreadsheet formulas embed certain assumptions about relationships between variables (e.g., Profit = Revenue – Costs assumes no other factors like opportunity costs). The logic built into formulas can obscure biases or simplify complexities, shaping decision-making paths. For example, a financial projection using =RevenueGrowthRate * PreviousRevenue assumes linear growth and potentially oversimplifies future uncertainties. What-if scenario analysis in spreadsheets often reflects the biases or priorities of the person constructing the formulas. These biases can frame potential outcomes in specific ways.
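A small example shows how the growth assumption baked into a projection formula frames the outcome. The revenue figure and rates below are hypothetical, and the two projections simply contrast a linear reading of the formula with a compounding one.

    # How an embedded assumption frames a projection (hypothetical figures).
    previous_revenue = 1_000_000.0
    growth_rate = 0.05
    years = 5

    # Assumption A: a fixed increment of growth_rate * previous_revenue added each year
    linear = previous_revenue + previous_revenue * growth_rate * years

    # Assumption B: compounding growth applied to each year's new base
    compounded = previous_revenue * (1 + growth_rate) ** years

    print(f"Linear projection:     {linear:,.0f}")      # 1,250,000
    print(f"Compounded projection: {compounded:,.0f}")  # about 1,276,282

Neither number is wrong; the point is that whoever writes the formula quietly decides which future the organization plans around.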
Spreadsheets can become templates for recurring processes, setting standards for what is considered “normal” or “important” in an organization. Those who create and control spreadsheet templates have the power to define organizational norms and expectations and codify power dynamics. Establishing standardization reinforces certain frames, perpetuating specific ways of viewing and evaluating organizational performance over time. A standardized sales report may continuously emphasize gross sales, neglecting other factors like customer churn.
Spreadsheets distill complex realities into numbers, tables, and graphs, a reductionism that simplifies nuanced issues into quantifiable elements. An example is reducing employee performance to numeric scores (e.g., Productivity = TasksCompleted / HoursWorked) that may overlook qualitative contributions. Critical contextual factors, such as market volatility or employee morale, may be excluded, shifting focus to what is easily measurable. By reducing complexity, spreadsheets prioritize some perspectives while marginalizing others.
Those who design spreadsheets often decide how data is framed and can influence organizational decision-making disproportionately. Those who gain access and control over the spreadsheet design have an inside track to the power dynamics of an organization. Control over spreadsheet design shapes how others interpret and interact with the data, reinforcing the designer’s influence and power. In an organization with limited transparency, complex formulas or hidden sheets can obscure understanding of the data. This can disadvantage other users and consolidate power with those with the skills and tools to create the organization’s spreadsheets.
The order in which data is presented (e.g., revenue before costs) in spreadsheets influences how stakeholders mentally process the information. Structural and layout choices affect how data is understood and what conclusions are drawn. Visualizations like pie charts, bar graphs, and trend lines direct focus to certain comparisons or patterns, and frame how priorities are perceived.
Finally, presentation decisions about formatting (e.g., conditional highlights, bold text) and visualization (e.g., charts) draw attention to specific elements. For example, highlighting a profit margin cell in green or showing a declining revenue trend in red emphasizes success or urgency. Choosing to present aggregated data (e.g., total revenue) versus granular details (e.g., revenue by product) influences how complexity is perceived and addressed. Presentation choices affect data interpretation, often steering stakeholders toward certain conclusions.
In conclusion, digital spreadsheets are powerful technologies that frame knowledge in organizations and consequently they produce power by determining how data is selected, processed, and presented. Their influence lies not just in the data they hold but in how they structure understanding and shape decision-making. Those who design spreadsheets can wield significant control over organizational narratives and perspectives, making it critical to use these tools with awareness of their potential limitations and strengths.
Citation APA (7th Edition)
Pennings, A.J. (2024, Dec 7) The Framing Power of Digital Spreadsheets. apennings.com https://apennings.com/meaning-makers/the-framing-power-of-digital-spreadsheets/
© ALL RIGHTS RESERVED

Tags: digital spreadsheet > Excel > remediation theory
Google: Monetizing the Automatrix – Rerun
Posted on | November 21, 2024 | No Comments
I originally wrote this in October 2010 but recently showed it to a student working on a paper about the economics of FSD (full self-driving). It was my first post on autonomous driving.
Google recently announced its work on a driverless car to mixed reviews. While a technical success, with only one mishap in 140,000 miles of testing, many felt that Google was losing its focus. I think this latter view underestimates Google’s strategy – to monetize the road. As we move towards the “Automatrix,” the newly forming digital environment for self-driving and wireless charging transportation, Google looks to situate its search/advertising business at its center.
Let’s face it; the car is almost synonymous with shopping and consumerism. Whether going to the mall to buy some new shoes, picking up groceries, or going out to look for a new washing machine – the car transports both our bodies and our booty.
Nothing in the fridge? Drive out to the nearest Applebee’s, Denny’s, or Olive Garden for some nachos and a diet Coke. Got kids? Try the drive-in for a Happy Meal or some Chuck E. Cheese’s pizza after a day at the water park. You get the point: have car, will spend. It’s American.
Google, which “wants to organize the world’s information”, clearly sees your car as a major generator of that data and the car occupants as major traffic generators – the good kind of traffic – on the web, not the road. They want the passenger to focus on the navigation, not the road. They want to provide destinations, pit stops, and other places to rest and refresh. The car will provide the movement while “the fingers do the walking,” to draw on a famous Yellow Pages ad.
While Nielsen, famous for its ratings business, has championed the three-screen advertising measurement (TV, PC, mobile phone), you could say Google is going for a four-screen strategy: PC, mobile, TV, and now the dashboard. Talk about a captured audience! It has the potential to pay off big, adding billions more to Google’s bottom line by tying moving passengers to the web.
Can driving through downtown Newark, sitting at a light, or leaving a movie theater parking lot really compete with the latest user-generated video on YouTube? As you drive to the airport, wouldn’t you rather be making dinner reservations or checking out entertainment on your flight or destination? No, Route 66 is going to be route66.com because, well, Pops Restaurant bought the ad word, and you would rather be enjoying a Coke and burger anyway.
Actually, I’m all for computers driving my car, as long as they are doing it for other drivers as well. Yes, I enjoy the occasional thrill of driving and, probably more, the relaxing feel of the directed focus the activity requires. However, I prefer looking out the window, listening to music, or even reading a book. I’m good at reading in a moving vehicle.
GPS has already rescued me from the travel maps, and I now need reading glasses to see them anyway. Besides, the road is dangerous. It’s really scary passing that zigzagging car because the driver is zoning out in a conversation with his ex-wife, or some teenager is texting the girl he has a crush on.
Sure, I have mixed feelings about sliding into the Automatrix. Taking over the steering wheel seems like a bit of a stretch, even for modern-day microprocessors riding Moore’s Law. It will require a whole new framework for car safety testing. However, it has been over 50 years since computers guided the Apollo spacecraft to the Moon, so it makes sense to replace the haphazard meat grinders we currently rely on.
The next in this series is Google, You can drive my car.
Citation APA (7th Edition)
Pennings, A.J. (2024, Nov 21) Google: Monetizing the Automatrix – Rerun. apennings.com https://apennings.com/global-e-commerce/google-monetizing-the-automatrix-2/
© ALL RIGHTS RESERVED

Tags: Google Maps and Google Mobile > search/advertising business > three-screen advertising measurement
The Lasting Impact of ALOHAnet and Norman Abramson
Posted on | November 13, 2024 | No Comments
Professor Norman Abramson was a pioneering engineer and computer scientist best known for creating ALOHAnet, one of the first wireless packet-switched networks, at the University of Hawaii’s College of Engineering in 1971. His work laid the foundation for local area networks (LANs), wireless networking, satellite transmission, and data communication protocols, all crucial to modern digital communications.[1]
When I was doing my internship at the East-West Center’s Communication Institute and eventually graduate work at the University of Hawaii, I had the chance to interact with Professor “Norm” Abramson. At first, it was while interning with the National Computerization Policy Program, and then I audited his Satellite Communications class as that was a significant focus of my MA thesis. As an intern and master’s degree student, I did not exactly command his attention, but it was good to meet him and occasionally cross paths on the beaches and in the waters off of Diamond Head and Waikiki. Now, I teach a broadband course that covers local area network protocols and wireless communications where ALOHA and slotted ALOHA protocols are fundamental technologies.
Here, Dr. Abramson gives a talk at the East-West Center on the history of ALOHAnet, including how it used Intel’s first microprocessor, the 4004. He also talks about how it continues to influence new developments in wireless communications. He is introduced in the video by David Lassner, the CIO and future president of the University of Hawaii.
Originally designed as a radio communication network, ALOHAnet, which connected computers on different Hawaiian Islands, pioneered techniques for handling data collisions over shared communication channels. Using microwave radio transmissions, ALOHAnet protocols allowed multiple devices to share the same communication channel. It introduced a simple but effective way of managing the “collisions” that occur when two devices transmit data simultaneously.
If a collision occurred, the Aloha protocol allowed devices to detect and retry transmission after a random delay, significantly improving the efficiency of shared communication channels. The ALOHA protocol was also one of the first “random access” methods, enabling devices to send data without waiting for authorization or a fixed schedule. This innovation was a breakthrough for networks where many devices, such as wireless and satellite networks, shared the same medium.
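The core idea can be captured in a short simulation. The Python sketch below models slotted ALOHA with illustrative parameters: each station transmits in a slot with some probability, a slot succeeds only when exactly one station transmits, and colliding stations simply try again later.

    import random

    # Compact slotted ALOHA simulation: N stations each transmit in a slot with probability p.
    # A slot succeeds only if exactly one station transmits; otherwise senders retry later.
    # Parameters are illustrative.

    def slotted_aloha_throughput(n_stations=20, p_transmit=0.05, n_slots=100_000, seed=42):
        random.seed(seed)
        successes = 0
        for _ in range(n_slots):
            transmitters = sum(1 for _ in range(n_stations) if random.random() < p_transmit)
            if transmitters == 1:       # exactly one sender: the frame gets through
                successes += 1
            # zero senders: idle slot; two or more: collision, senders back off and retry
        return successes / n_slots

    print(f"Observed throughput: {slotted_aloha_throughput():.3f} frames per slot")
    # With about one transmission attempt per slot on average, the result sits near the
    # well-known slotted ALOHA ceiling of 1/e, roughly 0.37 frames per slot.

Simple as it is, this random-access behavior is the lineage that runs from ALOHAnet through Ethernet’s contention scheme to Wi-Fi’s channel access methods.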
Ethernet
Bob Metcalfe cited the ALOHA protocol as directly influencing Ethernet’s design. While working on his PhD at Harvard, Metcalfe went to Hawaii to study the ALOHA protocols. He then went to Xerox PARC in Palo Alto, California, to develop networking technology for the Alto computer systems being built there. More famous for the graphical user interface that Apple would “pirate” for the Lisa and Mac a few years later, the Alto needed a local network, and Metcalfe’s “Alto Aloha Network” adapted and enhanced ALOHAnet’s channel-sharing techniques for wired connections. With this, Metcalfe laid the groundwork for what became the Ethernet standard, which went on to dominate LAN technology.
Ethernet’s approach to transmitting data in packets was inspired by the ARPAnet’s packet-based structure, which sent data in discrete units. ALOHAnet demonstrated how packet-switching could enable more efficient and flexible communication between multiple users over a shared channel, a principle central to Ethernet and other LAN technologies. ALOHAnet’s use of radio channels showed how digital data could be transmitted wirelessly, paving the way for both Ethernet’s early development and later wireless local area network (WLAN) technologies.
Although the ALOHA protocols initially handled collisions simply by retransmitting after a random delay, this concept was expanded upon by Ethernet’s CSMA/CD (Carrier Sense Multiple Access with Collision Detection) protocol. CSMA/CD built on ALOHA’s collision-handling mechanism to allow faster, more reliable data transfer in wired networks. Rather than transmitting indiscriminately, a CSMA station “listens” to make sure no one else is transmitting before sending its packets. Collisions can still occur, primarily because of signal propagation delays, so CSMA/CD was designed to detect them as they happen. To free up the channel sooner, a station that detects a collision stops sending the rest of its packet (called a frame in LANs) and backs off for a random interval before retrying.
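The retry rule that classic Ethernet standardized is truncated binary exponential backoff: after each successive collision a station doubles the range of random slot times it may wait, caps the exponent at 10, and gives up after 16 attempts. The fragment below is an illustrative Python sketch of that backoff calculation, not an excerpt from any actual Ethernet implementation.

```python
import random

# Sketch of Ethernet-style truncated binary exponential backoff (illustrative only).
# After the n-th consecutive collision, a station waits a random number of slot
# times drawn from 0 .. 2^min(n, 10) - 1 before retrying, and drops the frame
# after 16 failed attempts.

SLOT_TIME_US = 51.2   # classic 10 Mbps Ethernet slot time, in microseconds
MAX_ATTEMPTS = 16
BACKOFF_CAP = 10      # the exponent stops growing after 10 collisions

def backoff_delay_us(collision_count, rng=random):
    """Return the random wait (microseconds) after `collision_count` collisions."""
    if collision_count >= MAX_ATTEMPTS:
        raise RuntimeError("excessive collisions: frame dropped")
    k = min(collision_count, BACKOFF_CAP)
    return rng.randrange(0, 2 ** k) * SLOT_TIME_US

if __name__ == "__main__":
    for n in (1, 2, 3, 10, 15):
        print(f"after collision {n}: wait {backoff_delay_us(n):.1f} microseconds")
```

The doubling window is what lets a busy segment spread retries out automatically: light contention resolves in a slot or two, while heavy contention pushes stations toward longer, less overlapping waits.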
Ethernet has become indispensable for high-speed networking and high-capacity applications in data centers, corporate networks, and manufacturing automation. Standard speeds have grown from the original 10 Mbps to 100 Mbps Fast Ethernet, 1 Gigabit Ethernet, 10 Gigabit Ethernet, and higher-speed variants up to 400 Gbps, primarily for advanced networks and data centers. While cabling can be cumbersome and expensive, Ethernet supports high data transfer rates with low latency and minimal data loss and can scale from small home networks to enterprise-level setups while remaining compatible across devices from different manufacturers.
Wireless Communications
ALOHAnet also demonstrated the potential of wireless networking, inspiring future wireless data transmission systems. By showing how to handle contention and collisions in a shared radio network, it became a foundational model for mobile and wireless data communication systems where devices contend for the same wireless spectrum. Many modern wireless technologies, including Wi-Fi networks, rely on concepts that originated with ALOHAnet for managing access to shared channels and retransmitting after collisions. Many principles used in Wi-Fi, such as carrier-sensing and collision handling, were modeled on ALOHA’s methods, adapted for higher-speed and more complex wireless environments.
Techniques derived from ALOHA’s collision handling have been integrated into cellular network standards. Mobile networks across generations (2G, 3G, LTE, and 5G) have adapted variants of ALOHA for handling random access requests at base stations and for managing multiple simultaneous transmissions over limited spectrum. For example, spread spectrum techniques in cellular networks and the CSMA/CA methods in Wi-Fi borrow principles from ALOHA, adapted to high-density environments. Spread spectrum systems spread a signal over a wider frequency band than the original message while transmitting at the same signal power, making it harder to intercept or jam.
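As a rough, textbook-style illustration (not drawn from Abramson's paper), the benefit of spreading is usually summarized as processing gain, the ratio of the spread bandwidth to the original information bandwidth:

$$ G_p \;=\; \frac{B_{\text{spread}}}{B_{\text{info}}}, \qquad G_p\,[\text{dB}] \;=\; 10\log_{10}\!\left(\frac{B_{\text{spread}}}{B_{\text{info}}}\right) $$

For instance, spreading a 9.6 kbps signal across a 1.25 MHz channel gives a processing gain of roughly 130, or about 21 dB, which is what makes the despread signal resistant to narrowband interference and jamming.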
Abramson’s approach to data transmission over ALOHAnet helped popularize the concept of decentralized network architectures. Unlike traditional telecommunications systems, which often relied on centralized control, the ALOHA protocol allowed for a more flexible and robust form of data transmission. This decentralized model influenced satellite communications, where networks often need to function independently in distributed environments, and it helped shape the architecture of distributed, resilient communication networks in which remote or isolated nodes (e.g., satellites or base stations in remote areas) must handle communication independently.
Satellite Communications
Abramson’s work also contributed to the advancement of satellite Internet, including Starlink, where random access protocols allow for the efficient use of satellite bandwidth, especially in remote and rural areas where terrestrial infrastructure is limited. Random access techniques became crucial for enabling multiple users to communicate through a shared satellite link. Slotted ALOHA and spread spectrum ALOHA also allowed for more efficient and flexible satellite communication systems. These methods have been used extensively in satellite-based messaging, data collection from remote sensors, and early VSAT (Very Small Aperture Terminal) systems.
Many IoT systems that use satellite networks for low-cost, wide-coverage communication rely on techniques similar to ALOHA for uplink data transmission from devices to satellites. This is particularly important for applications like remote sensing, environmental monitoring, and asset tracking, where many devices need to transmit sporadic bursts of data over long distances.
Lasting Impact
Abramson’s work at the University of Hawaii inspired a generation of engineers and researchers in telecommunications to explore new, more efficient methods of managing bandwidth and addressing congestion. His contributions continue to inform R&D in telecommunications, from satellite communications and mobile networks to next-generation IoT and 5G protocols, which employ refined versions of random access and collision avoidance methods. Abramson’s work effectively laid the groundwork for these standard practices. His protocols became foundational in textbooks and engineering curricula, influencing fields as diverse as digital communications theory, networking equipment, and data transmission protocols.
Robert X. Cringely interviewed Norm Abramson about why he moved to the University of Hawaii.
The success and influence of ALOHAnet proved that multiple devices could share the same communication medium effectively, ultimately helping shape the modern landscape of wired and wireless networking. Abramson’s impact on satellite communications and telecommunications can be seen in the widespread adoption of random access protocols, the development of mobile and wireless standards, and the rise of decentralized communication models. His foundational work with the ALOHA protocol allowed for efficient use of shared communication channels and inspired innovations that are integral to modern satellite networks, mobile communications, and IoT applications.
Citation APA (7th Edition)
Pennings, A.J. (2024, Nov 14) The Lasting Impact of ALOHAnet and Norman Abramson. apennings.com https://apennings.com/how-it-came-to-rule-the-world/the-lasting-impact-of-alohanet-and-norman-abramson/
Notes
[1] Abramson, N. (2009, Dec) THE ALOHAnet — Surfing for Wireless Data. IEEE Communications Magazine. https://www.eng.hawaii.edu/wp-content/uploads/2020/06/THE-ALOHANET-%E2%80%94-SURFING-FOR-WIRELESS-DATA.pdf
© ALL RIGHTS RESERVED

Tags: Aloha protocol > ALOHANET > Alto Aloha Net > Bob Metcalfe > Carrier Sense Multiple Access (CSMA) > CSMA/CD > Ethernet > Norman Abramson