Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

“Run to Goshen Regardless of Opposing Train”

Posted on | September 29, 2016 | No Comments

When I was growing up, on summer nights with the bedroom windows open, I could often hear the far-off whistle of a train warning of its approach to a nearby road crossing. The trains were leaving or entering my hometown of Goshen, New York, a small community known for its regional government center, its proximity to New York City, and a history of horse racing. The railroad track is gone now, replaced by a hiking and biking path called the Heritage Trail, and the downtown station now serves as the town police station. Much of the history of those railroad tracks is lost or forgotten, despite their surprising historical importance.

The quote “Run to Goshen regardless of opposing train,” from a superintendent of the New York & Erie Railroad (see map), marked an event that had a significant impact on US history, as it spurred the development of both railroads and the telegraph. The first recorded use of the telegraph in the US to coordinate railroads occurred in 1851, when superintendent Charles Minot telegraphed fourteen miles ahead to Goshen to delay a train.[1] This electric communication marked the beginning of a radical convergence of telegraph and railroad technology that would have a lasting influence on the accelerating development of both, and that would ripple out into many other aspects of American life.

Recently, this article appeared in the Goshen Independent newspaper.


The convergence of the telegraph and railways would change the future of organizational management and pave the way for the modern corporation. It would also have a major influence on American life through the standardization of time zones, the national distribution of agriculture and industrially produced goods, as well as the regional coordination and arbitrage of commodity prices.

An appropriate historical starting point for this post is the chartering of the New York & Erie Railroad in 1832 to build a rail line across the length of New York. Enos T. Throop, the Governor of New York, approved the Act passed by the state’s Senate allowing the company to build the 447-mile line. It would connect the southern part of the state at Piermont, New York, a small town with a good harbor on the Hudson River just south of the Tappan Zee Bridge, with Dunkirk, a small village on the eastern shore of Lake Erie. The plan was to connect the ocean at New York City’s harbor to the Great Lakes and the natural bounties of America’s Midwest, elevating New York’s commercial status to that of the nation’s major entrepôt. Innovations in transportation infrastructure would make New York City the country’s center of import and export, of finance and insurance, and of the collection and distribution of goods and services.

Construction of the New York and Erie Railroad began in 1835. The line was meant to spur economic growth along the southern part of New York and to provide some balance to the state’s expenditures on the Erie Canal in the north. The first section, from Piermont to Goshen, New York, was completed in 1841 at a cost of $20 million. Goshen was the government center of the area, known for dairy products (especially “Goshen butter”) and increasingly famous for its harness horse racing. While funding and building the railroad line had not been easy, the entire line to Lake Erie was finally completed in the spring of 1851.

On May 14, 1851, a train carrying President Millard Fillmore, Secretary of State Daniel Webster, and 300 other dignitaries set off on a celebratory tour of this historic railroad accomplishment. The party took a steamship up the Hudson to Piermont on the river’s west bank and departed amid much fanfare. But by the time the train reached Goshen, the locomotive was having engine trouble. At the next stop, in Middletown, New York, Minot had to telegraph ahead to Port Jervis on the Delaware River to have another engine ready. After changing locomotives, the train passed through the Southern Tier of New York and puffed on to its destination without incident.

The mishap left an impression on Minot, who came to value the telegraph and its relationship with the railway. When engineers later refused to move trains under telegraphic direction, Minot authoritatively issued an order that the telegraph would henceforth be used to coordinate all train movements.

During the intervening time, the telegraph was becoming more prominent. Ezra Cornell, who had worked with Samuel Morse in 1844 on the very first telegraph line, had followed up that historic accomplishment in 1845 with the construction of a major portion of the line between New York and Albany. Cornell, the benefactor of the famous university that bears his name, was most involved in the construction side of telegraphy and even patented machinery for laying cable underground. In 1848, he helped put together the New York & Erie Telegraph Company to build a telegraph line from New York to Dunkirk along roads in the southern border counties of the state. The completed line was celebrated in 1849 but soon found its fate intertwined with the railroads.

    The record of the very first train order sent in the U.S. was documented by Edward H. Mott in his history of the Erie, Between the Ocean and the Lakes: The Story of Erie. According to William H. Stewart, a retired Erie Railroad conductor, in the “fall of 1851” Charles Minot was on a westbound train stopped at Turner, N.Y., waiting for an eastbound train coming from Goshen, N.Y., fourteen miles to the west. The impatient Minot telegraphed Goshen to see if the train had left yet. Upon receiving a reply of “no,” Minot wrote out the order: “To Agent and Operator at Goshen: Hold the train for further orders, signed, Charles Minot, Superintendent.” Minot then gave Stewart, who was the conductor of Minot’s train, a written order to be handed to the engineer: “Run to Goshen regardless of opposing train.” The engineer, Isaac Lewis, refused Minot’s order because it violated the time interval system. Minot proceeded to verbally direct Lewis to move the train, but he again refused. Lewis then became a passenger on the rear seat of the rear car, and Minot, who had experience as an engineer, took control of the train and proceeded safely to Goshen. Shortly thereafter, the Erie adopted the train order for the movement of its trains, and within a few years the telegraph was adopted by railroads throughout the U.S.[2]

The telegraph industry was still struggling to take off. The next year, 1852, the New York and Erie telegraph line failed amid competition and declining prices. It was the longest line in the country at the time and hard to manage, but a partnership with the railroads soon changed its fortunes. At Minot’s urging, Ezra Cornell bought the telegraph business back and renamed it the New York & Western Union Telegraph Company.

With Minot’s help, the company moved its poles from the roadways and placed them along the railroad tracks of the Erie. The two companies also agreed that railroad depot employees would serve as telegraph operators in exchange for the railroad’s unlimited use of the communication lines. The telegraph made a single-track railroad more efficient, less expensive, and safer for passengers: trains could be coordinated safely in both directions, even on one track. Cornell took on the role of Superintendent of the company, and after it joined with other partners it became Western Union in 1856.

The New York and Erie was the first railroad in the U.S. to utilize the telegraph in its management activities. Minot promoted Luther Tillotson to superintendent of telegraphs for the eastern half of the route, even though Tillotson was only 19 years old. Tillotson became one of the pioneers of telegraphic train dispatching and later founded his own company manufacturing and selling telegraph instruments to the railroad industry.

The railroad company also hired civil engineer Daniel McCallum to coordinate its railroad routes. He realized that managing a line nearly 500 miles long was a different venture than managing one only 50 miles long. His solution was a data processing and management information system based on telegraphy, record-keeping, and regularized reports. He also drew up one of the first organizational charts and required all employees to wear uniforms indicating their rank in the organizational hierarchy. But he likewise recognized the importance of reversing the hierarchy of information flow, stressing that communications should go from subordinates up to managers, not just the opposite.[3]

The telegraph was soon being used to transmit hourly reports on the progress of trains and to precisely coordinate the movement of railroad cargo and passengers. Britain’s Great Western Railway had experimented with the telegraph as early as 1837, but interest waned, probably because the technology was in its infancy and the codes used were cumbersome. The standardization of Morse code was crucial to the telegraph’s eventual success.

Railroads had been reluctant to expand territorially because of the difficulties of managing such vast and complex movements. However, responding to many tragedies involving train collisions, the railroads took steps to synchronize their routes. Trains required precisely timed movements through spatial landscapes and needed the capability of telegraphy for communicating ahead for coordination.

In October 1861, Western Union connected telegraph lines from the east and west at Salt Lake City in Utah, providing the first nearly instantaneous transcontinental communication between San Francisco and Washington, DC. With westward expansion, railroads and telegraph lines soon crisscrossed the country. As the Civil War erupted, both the North and South attempted to capitalize on these technological developments. President Lincoln, who had been a railroad lawyer, studied the railroad and telegraph strategies used by Napoleon III to help unify Italy in 1859. He set up a communications post near the White House to direct the war, mobilizing troops and organizing supplies. Many military officers in both armies became logistics experts and went on to help create the post-war national economy.

These two technologies became highly interdependent. As railroads became the major mode of transportation for most of the United States, they also aided the construction of telegraph lines. The combination of the railroad and the telegraph allowed the modern corporation to emerge. Companies like the National Biscuit Company (Nabisco), Sears, and Standard Oil used railroads and telegraphs to expand beyond their originating regions. R.W. Sears, a telegrapher, quit his job to form a catalog supply company, the Amazon of its time.

Britain also developed a telegraph network throughout its empire, with undersea links to India, Indo-China, and Australia. Expansion brought on new problems of coordination. The information flow of the telegraph networks stimulated new data collection and processing technologies that were crucial for managing logistics, marketing, and production. The telegraph, together with the complexities of organizing information over geographical space brought on by the railroads (and steam shipping), built the foundations of the modern data processing industry.

To create this new type of organization, a standard “time” needed to be determined. In 1884, the International Meridian Conference convened in Washington, DC and chose Greenwich, England as the zero (prime) meridian, dividing the Earth into twenty-four one-hour time zones to facilitate the coordination of transportation and commercial transactions.[4] With a standard “railroad time” in place, complex tables could be produced cross-listing train movements with cities, towns, and ports, regularizing transportation schedules. This brought “time” to rural America and the western frontier.
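The 1884 scheme can be illustrated with a small calculation: in the idealized system, each of the twenty-four zones spans 15 degrees of longitude, so a location’s nominal offset from Greenwich is just its longitude divided by 15. This is a sketch of the geometry only (real zone boundaries follow political borders, and the function name is my own):

```python
def nominal_utc_offset(longitude_deg: float) -> int:
    """Nominal one-hour zone offset for a longitude, per the idealized
    1884 scheme: 24 equal 15-degree zones centered on meridians at
    multiples of 15 degrees from Greenwich."""
    # Each zone spans 15 degrees; round to the nearest central meridian.
    return round(longitude_deg / 15.0)

print(nominal_utc_offset(0.0))     # Greenwich: 0
print(nominal_utc_offset(-74.0))   # roughly New York's longitude: -5
print(nominal_utc_offset(135.0))   # roughly Japan's longitude: 9
```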

Innovations in timetabling worked with the telegraph system to choreograph the complex scheduling of railroad transport. Schedules increased the number of journeys possible in a given period and provided consistency for passengers and freight shippers. This mode of sequencing time and transport proved crucial for the development of regional and then nationwide markets for mass-produced goods.[4]

If the railroads provided the muscle, the telegraph provided a rudimentary nervous system. Information about train delays, obstacles, needed repairs, and passenger numbers all flowed through the telegraph to keep the railroad system running smoothly.[5] Eventually, another innovation proved crucial for the management of railroads: the punch-card tabulator, developed by future IBM founder Herman Hollerith.

The tabulating machine was inspired by the tickets that train conductors punched to record characteristics of ticket-holders such as hair color and sex. Hollerith developed a way to run electrical charges through the holes of punched cards to turn dials, tally numbers, and tabulate results. His punch-card and tabulating technology would be used for the 1890 US census, the decennial count specified in the US Constitution, a task that manual labor could no longer complete within ten years given population growth and the pace of immigration from Europe.
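The tallying principle behind Hollerith’s machine can be sketched in a few lines: each card carries a fixed set of punched fields, and every sensed hole advances a counter. The field names and data here are illustrative, not the actual 1890 census columns:

```python
from collections import Counter

# Each "card" records a person's traits in fixed fields, much as a
# conductor's punch ticket did; the tabulator advances a counter for
# every hole it senses. Sample data is invented for illustration.
cards = [
    {"sex": "F", "hair": "brown"},
    {"sex": "M", "hair": "black"},
    {"sex": "F", "hair": "black"},
]

tallies = Counter()
for card in cards:
    for field, value in card.items():
        tallies[(field, value)] += 1   # one dial advance per punched hole

print(tallies[("sex", "F")])       # 2
print(tallies[("hair", "black")])  # 2
```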

Hollerith’s tabulating machine and punched cards were later used by the railroads in conjunction with telegraphy to manage cars and cargo. J.P. Morgan became a big believer in both technologies, and they became central to his strategy of “Morganizing” companies. His rise to financial prominence began with railroads, including the New York Central, and he was also known for consolidating other companies such as AT&T and US Steel. As American companies like Nabisco, Standard Oil, the meatpacker Swift, and Western Union began to operate wide-scale businesses, the combination of punch-card tabulators and the telegraph became crucial management tools and early forerunners of the Internet age.

So, what happened to the train track running through Goshen?
The Erie was originally built with a unique broad gauge meant to keep its trains operating only in New York. When this proved to inhibit profitability, the entire line was eventually regauged, allowing trains to run into New Jersey, down to the Hoboken station, and as far as Chicago.

In 1984, with the approval of Orange County, the Metro-North Railroad ripped up the Erie Mainline between Harriman and Middletown, New York. The Metro-North Railroad chose not to recondition the historic line for passenger service. After more than 140 years trekking through Goshen, the railroad line was abandoned in favor of a longer route between Hoboken and Port Jervis for passengers commuting to and from New York City.

Notes

[1] Beniger, J.R. (1986) The Control Revolution: Technological and Economic Origins of the Information Society (pdf). Cambridge, MA: Harvard University Press. p. 230. Beniger’s reference for this information was from Edward Harold Mott’s Between the Ocean and the Lakes: The Story of Erie published in 1901 by John S. Collins in New York.
[2] “A Monument to Charles Minot.” Accessed September 26, 2016. A version of this article was originally published in the October 2006 issue of The AWA Journal, a quarterly publication of the Antique Wireless Association, a non-profit historical society.
[3] Information on Daniel McCallum’s organization of the Erie Railroad from Beniger, pp. 228-233.
[4] Anthony Giddens tied together the relationship between the telegraph and transportation in his work on administrative power and internal pacification in The Nation-State and Violence. (1987) San Francisco, CA: University of California Press. pp. 174-175.
[5] This is from a short story about how the railroads and telegraph started working together.
© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor at the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he was on the faculty of New York University. Previously, he taught at Hannam University in Korea and Marist College in New York. He started his career at Victoria University in New Zealand. He spent a decade as a Fellow at the East-West Center in Honolulu, Hawaii. Originally from Goshen, New York, he now keeps a home in Austin, Texas.

CISCO SYSTEMS: FROM CAMPUS TO THE WORLD’S MOST VALUABLE COMPANY, PART TWO: Starting Up the Tech

Posted on | September 13, 2016 | No Comments

Len Bosack and Sandy Lerner left Stanford University in December 1984 to launch Cisco Systems, a California-based company known for innovative networking devices. Data communications, and particularly packet switching, the key technology of the Internet, were in their infancy. Leading computer companies like IBM were slow to make the major innovations needed for its success. So the recently married duo took a chance and started their own company to develop and market data networking technology.

In a previous post, I discussed the formation of Cisco and its contribution to universities connecting to the NSFNET. In this post, I examine how Cisco Systems emerged around the “Blue Box” network software and technology, and how the company went on to become the primary supplier of key Internet technologies for the emerging World Wide Web.

The key to Cisco’s strategy was to develop routing technology that could direct the packets of data along the network. In a sense, routing technology would act like a traffic cop helping to move automobiles through a busy intersection. Similarly, data routers facilitate the movement of packets of digital information – 1s and 0s – through the intersections of the Internet. Packets of digital data are constructed by the Transmission Control Protocol (TCP) and individually addressed by the Internet Protocol (IP) to traverse various networks. Routers were created to read an IP address and direct the packet toward the next stop in the network, or to another network, on its “route” to the intended destination.
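The forwarding step described above can be sketched as a longest-prefix-match lookup, which is how IP routers choose among overlapping routes. The toy forwarding table and the `next_hop` helper below are illustrative assumptions, not Cisco code, built on Python’s standard `ipaddress` module:

```python
import ipaddress

# A toy forwarding table mapping prefixes to next hops. Entries are
# invented for illustration; real tables hold hundreds of thousands.
table = {
    ipaddress.ip_network("0.0.0.0/0"): "default-gateway",
    ipaddress.ip_network("10.0.0.0/8"): "router-A",
    ipaddress.ip_network("10.1.0.0/16"): "router-B",
}

def next_hop(dst: str) -> str:
    """Forward to the most specific (longest) prefix containing dst."""
    addr = ipaddress.ip_address(dst)
    matches = [net for net in table if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)
    return table[best]

print(next_hop("10.1.2.3"))   # router-B (the /16 beats the /8 and /0)
print(next_hop("8.8.8.8"))    # default-gateway
```

The longest-match rule is what lets a router hold both a broad route and a more specific exception for part of that address space.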

This video is a little advanced, but discusses the differences between switches and routers.

The Internet was originally designed as a military network that could compensate if an enemy attack destroyed some network nodes. Early data communication equipment was designed to sense whether a network node was offline (i.e., destroyed) and choose another route to direct the data toward its final destination. As networks could also become congested with too much traffic, modern routers were developed to determine the best path for an email, document, file, or web page, often in terms of the “cost” of transmission as well.
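The “best path” selection described here amounts, in modern link-state routing, to a least-cost shortest-path computation. The sketch below applies Dijkstra’s algorithm to a hypothetical toy network; note how the cheapest route wins even when a more direct link exists, echoing the cost-based selection above:

```python
import heapq

def least_cost_path(graph, src, dst):
    """Dijkstra's algorithm over a cost-weighted adjacency dict, the
    kind of computation link-state routing protocols perform."""
    queue = [(0, src, [src])]   # (accumulated cost, node, path so far)
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in graph.get(node, {}).items():
            if neighbor not in seen:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float("inf"), []     # destination unreachable

# Toy network: the direct-looking A->B->D path costs 6, but A->C->D costs 5.
net = {"A": {"B": 1, "C": 4}, "B": {"D": 5}, "C": {"D": 1}}
print(least_cost_path(net, "A", "D"))  # (5, ['A', 'C', 'D'])
```

If a node in the chosen path goes offline, rerunning the computation on the reduced graph yields the next-cheapest surviving route.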

The Cisco founders did not invent the router technology. Instead, they drew together important work by many people at Xerox PARC and Stanford University that became the basis for the “Blue Box” (1981). This portable computer was originally designed to increase the distance between networked computers but turned out to be much more. The Blue Box incorporated three crucial innovations: the 3Mb Ethernet transceiver and adapter, workstation network boards, and the software that became the foundation for the Cisco operating system.

3Mb Ethernet Transceiver and Adapter

Xerox PARC is more popularly known for developing the graphical user interfaces that became the basis for the Apple Macintosh and Microsoft’s Windows environment. PARC donated many computers and much network technology to Stanford University, creating a dynamic, fertile environment for innovation and the development of many of our current digital technologies.

Robert Metcalfe brought the AlohaNet technology to PARC, where it resulted in Ethernet. The new version was standardized in 1983 as IEEE 802.3. Initially it was used widely for local area networks (LANs) on campuses and in companies; it is now used for many services well beyond the LAN. Metcalfe went on to form 3Com Corporation, which became a leader in client-server networking and expanded into product areas such as digital switches, internetworking routers, modems, network hubs, network management software, network interface cards, and remote access systems.

Workstation Network Boards

Andy Bechtolsheim, with other Stanford graduate students, produced a network board based on the Xerox Alto computer. This technology connected the computer to the Ethernet and became the prototype of the SUN workstation that was later marketed by another spin-off company, Sun Microsystems.

Software for Multiple Routing Protocols

Bill Yeager, who wrote software for a number of network connections on campus, including an ARPANET Interface Message Processor (IMP), contributed crucial code to the Blue Box. His software guided data traffic from different LANs using multiple routing protocols. Networks at the time were almost exclusively proprietary, designed to connect equipment from a single manufacturer. Yeager designed protocols that permitted data to be exchanged among different types of mainframe terminals, printers, and workstations. His software initially linked Xerox Alto workstations, mainframes, minicomputers, and printers, but was rewritten to connect different networks. It became the foundational operating system of Cisco’s routers and, consequently, of modern local and wide area networking, including the Internet.

In the next section I will focus on Cisco’s push into TCP/IP protocols.





Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Vacation

Posted on | August 28, 2016 | No Comments

Enjoyed a month of vacation, with some work thrown in of course. San Francisco to meet with my favorite venture capitalist, and home to Austin. Took a desert road trip with family from Phoenix to Las Vegas with a lovely stop in Sedona. Took my daughter to Miami, the Bahamas, Cape Canaveral and Cocoa Beach, Florida. Then New York for a few days of lake life in the Catskills, minus a day trip to Stony Brook University on Long Island. Manhattan and back to Washington Square Park where we lived for 9 years. Farewell dinner with friends at the restaurant in Koreatown where my daughter decided to be born 10 days early (spicy food!). Got an extra day in Austin when my flight was canceled due to a typhoon in Japan. Many thanks to all who contributed to our trip. Too many to mention, but so much for which to be thankful.

Anthony



Xbox One – Extending Virtual Reality and Multi-Player Games

Posted on | August 2, 2016 | No Comments

Today we got the first look at the new Xbox game console, the Xbox One S. It’s been three years since the original Xbox One was introduced in a broadcast live on Spike TV from the Microsoft campus in Bellevue, Washington, when representatives from Microsoft’s Xbox team and strategic partners such as Activision and EA demonstrated and hyped the new system. The new Xbox One S is smaller and lighter, with more processing power and 2TB of internal storage. More relevant for this discussion are the backward compatibility (including Xbox 360 classics) and the continuing trend toward immersion and augmented/virtual environments.

It’s clear that Microsoft doubled down on the living room and the large-screen 1080p HDTV. Despite the recent popularity of mobile games, the Xbox One series was designed to return us to an immersive gaming experience within the home environment, one that integrates games with movies, music, live TV, Skype, and web browsing.

It’s too early to evaluate the Xbox One S, but I wanted to review one related technological trajectory, virtual reality (VR) and its relationship to the gaming experience and related industries. The new Xbox One Architecture combined the Xbox OS with Windows and a new connective tissue that works with Kinect to respond to voice, gestures, and body movements. With games such as Call of Duty: Ghosts designed for it, Microsoft promised a whole new level of immersive gameplay.

For a variety of reasons, Kinect was cancelled by Microsoft in late 2017.

Virtual reality began as an idealistic notion of the early 1990s, popularized through avant-garde magazines such as MONDO 2000, non-fiction best-sellers like Howard Rheingold’s (1991) Virtual Reality, sci-fi novels like Neal Stephenson’s (1992) Snow Crash, and cyberfiction movies like The Lawnmower Man (1992). Star Trek: The Next Generation provided the most dramatic example of what virtual reality could become with its Holodeck. But VR’s future, at least its immediate future, was in the gaming industry.

Drawing on flight simulation technology and research, VR captured the imagination of pre-Web techno-enthusiasts deliberating the future of what William Gibson termed “cyberspace.” Conceptualized with electronic accessories such as high-definition LCD goggles, surround sound, fiber-optic-laced gloves, and pressure-sensitive body suits, VR was designed to simulate the world in a computerized artificial reality. It was conceived as a system that would suspend the viewer’s awareness that the environment is produced and immerse them in a highly responsive, multi-sensory apparatus. The Renaissance invention of perspectival art, with its vanishing point creating a first-person view, proved to be one of the most important drivers of VR, as it focuses attention and reinforces the ego.
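The first-person view that VR and games rely on rests on perspective projection: a point’s screen position is scaled by the inverse of its depth, so parallel lines converge toward a vanishing point. A minimal sketch (an illustrative pinhole model, not code from any particular engine):

```python
def project(x: float, y: float, z: float, focal: float = 1.0):
    """Pinhole perspective projection: screen coordinates shrink with
    depth z, so parallel lines converge toward a vanishing point."""
    if z <= 0:
        raise ValueError("point must be in front of the camera")
    return (focal * x / z, focal * y / z)

# The same offset appears half as large at twice the depth.
print(project(2.0, 1.0, 1.0))  # (2.0, 1.0)
print(project(2.0, 1.0, 2.0))  # (1.0, 0.5)
```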

While most of the technology and associated software was developed for various simulation devices, it was the digital game industry that fully capitalized on this innovation. Castronova pointed to three reasons for this new path.

  1. One was that the digital game environment focused for the most part on software, not hardware. From Magnavox’s Odyssey in 1972 to Microsoft’s Xbox 360 in 2005, the game console proved to be a crucial platform for video game play, but it was “killer app” software like Pong and Pac-Man that propelled the industry’s success. VR development, on the other hand, was dominated by gadgetry such as the goggled helmet, the force-feedback glove, and the sensor-laden body suit; it was a series of good games, not hardware, that propelled the development of virtual game environments.
  2. The second reason VR was less successful than game virtual environments was that the virtual reality industry was pushed more by research concerns than by commercial concerns. The game industry on the other hand had no compunctions about its profit-making origins and goals.
  3. The third reason “the game version of VR” proved more successful was that it focused “on communities, not hardware.” From shoot-em-up Quake II free-for-alls on networked PCs to the programmed pandemonium of Atari Test Drive on Xbox Live, the social experience has been central to the success of the game experience.[1]

It was a young company named id Software that pioneered many of the virtual environment features that characterize the contemporary game environment. The small Texas-based company used the ego-centric perspective to create the first-person shooter (FPS) with Wolfenstein 3D in May of 1992. Id followed with the extraordinarily successful DOOM in December 1993. The game extended the image of a weapon toward the vanishing point to orient the player’s perspective as they hunted a variety of monsters through a research facility on a Martian moon. DOOM combined a shareware business model with the nascent distribution capabilities of the Internet. As browsers opened up the Web, DOOM enthusiasts by the droves were downloading the game by FTP to their PCs, many over a 14.4 kbps modem. The first third of the game was freeware, while the remaining levels and several new weapons could be purchased for a modest sum.


In a prescient move, id decided to make DOOM’s source code available to its users. This allowed fans to create their own 2.5D (not quite 3-D) levels and distribute them to other players. Making the code available also enabled new modifications of the game called “mods,” including a popular one in which characters from The Simpsons ran around the DOOM environment, with Homer Simpson able to renew his health by finding and eating donuts. The US military created a version called Marine Doom designed to desensitize soldiers to the idea of killing. Many of the company’s new employees were recruited because they had developed expertise designing their own mods.

Id’s innovation streak didn’t stop there: the company also pioneered multiplayer capability. While other games had offered an interactive mode between two players, DOOM allowed four players to compete over local area networks (LANs) or modems. Its next few games, QUAKE and QUAKE II, increased the player capacity further while using true 3-D graphics to create a virtual world of stunningly immersive environments and player mobility.

Multiplayer games took off with QUAKE II and have since morphed into multiple variations, including the Massively Multiplayer Online Game (MMOG), which can involve hundreds of players at a time. One of the major early innovators of the MMOG was Archetype Interactive, which conceived Meridian 59 using DOOM graphics technology from id and sold it to 3DO, which in turn coined the term “Massively Multiplayer” to market the innovative game.[3] But it was Ultima Online that proved there was a market for online multiplayer games. Based on the popular Ultima series, its subscriber base grew to over 200,000 in over 100 countries. Ultima Online was also the first to face a number of technical and community problems, including synchronizing the game experience for all participants and establishing a system of player etiquette. In 1999, Sony Online Entertainment (SOE) opened up its EverQuest universe. It made national news when players started selling virtual items on eBay, and it established the viability of an online 3D role-playing game. Motivated by EverQuest’s success, Microsoft pushed up the release of Asheron’s Call on its Zone.com gaming site.

Virtual worlds have morphed into a wide variety of environments and games for all ages. MMOGs emerged as one of the biggest revenue producers of online games and are expected to remain so in the near future, with on-demand games running a fairly close second. These games connect hundreds to thousands of players in a virtual environment that often includes its own internal economy. The “fairies and elves” genre, and particularly World of Warcraft, reigned: at its peak it had upwards of 12 million subscribers sending in $30 million a month in subscription fees. But other games like RuneScape are challenging its dominance, and sci-fi games like Eve Online and Planet Calypso have also prospered lately.

So what is the future of gaming in virtual reality, or what the Xbox people are calling “living and persistent worlds”?[4] Nintendo released the Wii on November 19, 2006, notable for a remote that could be used as a handheld pointing device. Gamers flocked to its sports package, with games like tennis and baseball played virtually using the Wii Remote as a racket or a bat. Microsoft responded in 2010 with the Kinect, which could sense body movements; it immediately broke records by selling over 8 million units in its first two months. The Xbox One has an improved Kinect that reads its environment with an HD camera, taking in some 2 GB of photonic information every second with its Time of Flight (TOF) technology. Its algorithms register the details of each body it scans, gauging the direction and balance of the skeletal system, the energy of each motion, and the force of each impact, and even monitoring the heart rate of each player.
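Time-of-flight ranging itself reduces to a simple calculation: emitted light bounces off the subject and returns, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the helper name and sample timing are illustrative, not Kinect internals):

```python
C = 299_792_458.0  # speed of light in a vacuum, meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Time-of-flight ranging: light returns after bouncing off the
    subject, so distance is half the round trip at light speed."""
    return C * round_trip_seconds / 2.0

# A return after about 20 nanoseconds places the player roughly 3 m away.
print(round(tof_distance(20e-9), 2))  # 3.0
```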

The Xbox may not live up to VR ideals of sensory force feedback and other forms of haptic connectivity, but its popularity suggests that successful gameplay is often achieved. Kinect provides a level of bodily interaction that has made games like Dance Central and Dance Central 2 quite popular. The Xbox controller, despite a relatively steep learning curve and limited body engagement, provides a number of options that, once learned, add levels of complexity that reward those who master them.

For a successful virtual engagement, what appears most important is that a level of psychic/cognitive stimulation is achieved by participating in an artificial challenge or conflict that operates within defined parameters or rules and results in an observable change or quantifiable result. In other words, a game.[5] As long as these conditions are met, we can expect a rich pattern of future innovation in this area.

Notes

[1] From Castronova’s “Appendix: A Digression on Virtual Reality”, in Synthetic Worlds: The Business and Culture of Online Games. p. 285.
[2] Anthony J. Pennings, “The Telco’s Brave New World: IPTV and the “Synthetic Worlds” of Multiplayer Online Games” for the Pacific Telecommunications Council Proceedings. January 15-18, 2005 Honolulu, Hawaii.
[3] Information on the first MMOGs from Steven L. Kent, “Alternate Reality: The History of Massively Multiplayer Online Games,” Sept. 23, 2003. Accessed at GameSpy.com on November 28, 2005.
[4] Marc Whitten’s presentation on the technical aspects of Xbox One was broadcast live on Spike TV.
[5] For a great explanation of games and gameplay read Rules of Play: Game Design Fundamentals by Katie Salen and Eric Zimmerman.

Anthony


© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is Professor and Associate Chair of the Department of Technology and Society, State University of New York, Korea. Before joining SUNY, he taught at Hannam University in South Korea and from 2002-2012 was on the faculty of New York University. Previously, he taught at St. Edwards University in Austin, Texas, Marist College in New York, and Victoria University in New Zealand. He has also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

CONTENDING “INFORMATION SUPERHIGHWAYS”

Posted on | July 16, 2016 | No Comments

During the 1980s, before the reality of the Internet, a new communications infrastructure was initiated based on digital technologies. Propelled largely by growing demand for new microprocessor-based business services and fueled by the availability of low-grade “junk bonds,” companies like MCI, Tele-Communications Inc. (TCI), Turner Broadcasting, and McCaw Cellular raised over $20 billion to lay fiber-optic networks and implement new digital services such as videotext and interactive television.

Soon several modes of telecommunications were competing for the title of “information superhighway,” a popular metaphor for the changes happening to data communications and the potential for expanded telecommunications services such as interactive television. Generally attributed to then-Senator Al Gore, the term was co-opted by the Bell companies, which finally saw new opportunities coming with the digital revolution. For instance, Bell Atlantic and TCI attempted to form a merger that would offer interactive information services and video-on-demand over both cable and telephone lines.

Although the Internet was already twenty years old, it still had not achieved the type of technical robustness needed to capture popular and commercial attention. Wireless communication was growing, but still primarily used by an elite business class due to the lack of dedicated spectrum and wide-scale infrastructure problems. Cable TV was also a contender, with an extensive subscriber base and having built up its coaxial and fiber infrastructure during the late 1980s. Satellite was also in the running, with dramatically increased power capabilities resulting from the continuing development of solar power. The ability to efficiently transform sunlight into signal-radiating power allowed smaller and smaller “earth station” antennas to pick up broadcast and narrowcast signals. A longshot was the power utilities, which had developed technology to transmit data along their electrical lines. They lacked installed capacity but had good maintenance teams, billing systems, and ready access to homes and other buildings.

Which technology was going to rise to this status? Despite two decades of existence, the Internet was relatively archaic with no World Wide Web and few high-speed backbone networks. Wireless systems lacked the spectrum or infrastructure for broadband transmission over significant geographic domains. Interactive television was becoming a possibility as the FCC rolled back restrictions on the common carriers providing content, but despite ADSL over copper and fiber-to-the-home, software and content factors proved major limitations.

Interactive consumer services got their start with videotext offerings, but the terminals were large and awkward, and they displayed only textual information. Telephone companies soon began testing other electronic services.


Divested from AT&T in the early 1980s and deprived of the lucrative long-distance services, the Regional Bell Operating Companies (RBOCs) such as Ameritech, Bell Atlantic, and BellSouth sought to take advantage of their monopolies over local telecommunications by providing services such as ISDN and interactive television.

By the early 1990s, the Baby Bells were conducting tests using ADSL (Asymmetric Digital Subscriber Line) to provide video over existing copper lines to the home. Disillusioned by the costs of providing fiber to the home, telcos looked to leverage their existing plant. ADSL could send compressed video over established telephone lines. It was suited to this task because it could send data downstream to the subscriber faster (256 Kbps-9 Mbps) than upstream (64 Kbps-1.54 Mbps) to the provider.[1]
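The practical effect of that asymmetry is easy to see with a little arithmetic: the same payload moves roughly an order of magnitude faster downstream than upstream. A rough sketch, using illustrative rates within the ranges above:

```python
def transfer_seconds(size_megabytes: float, rate_kbps: float) -> float:
    """Seconds to move a payload at a given line rate (kilobits per second)."""
    bits = size_megabytes * 8 * 1_000_000  # decimal megabytes to bits
    return bits / (rate_kbps * 1000)

# A 10 MB video chunk downstream at 6 Mbps versus the same chunk
# upstream at 640 Kbps (both rates are illustrative).
down = transfer_seconds(10, 6000)
up = transfer_seconds(10, 640)
print(round(down, 1), round(up, 1))  # ~13.3 s down, 125.0 s up
```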

The Bell telcos were also intensively lobbying Washington DC to create a favorable regulatory environment, specifically trying to derail the Cable Communications Policy Act of 1984 that excluded them from offering television services. Bell Atlantic even attempted to merge with cable TV giant TCI in anticipation of their control of the new information highways.

The Cable Communications Policy Act of 1984 triggered strong concerns that the cable TV industry was becoming too strong in relation to other parts of communications/media industry. Growing horizontal and vertical integration as well as a subscriber rate encompassing over 60% of American homes threatened the telecommunications companies, which began to press their own claim to the household imagination.[2]

The information superhighway, as envisioned by the Bell Atlantic-TCI merger, ran into a roadblock when Congress overrode President Bush’s veto of the 1992 Cable Act. The new rules allowed Clinton’s FCC administration to lower cable rates. TCI’s stock dropped and the deal fell through. The FCC had implemented new econometric models that allowed it to reduce cable TV rates in select markets around the country without reducing cable companies’ revenue. When the FCC announced its new rulings in February 1994, both companies announced that the new regulations had killed their deal.[3]

Well, we know how this story turns out. With its TCP/IP software, the Internet became the world’s “information superhighway.” Its ability to connect differing computers and operating systems had given it unprecedented connectivity in the computer world, and over the course of the 1990s, it became the preferred conduit for communications and netcentric commerce.

Notes

[1] ADSL speed ranges reflect more current rates, as provided by Heidi V. Anderson, “Jump-Start Your Internet Connection with DSL,” in How the Internet Works, Part I. Smart Computing Reference Series. p. 105.
[2] Logue, T. (1990) “Who’s Holding the Phone? Policy Issues in an Age of Transnational Communications Enterprises,” PTC ’90 Pacific Telecommunications Council Annual Conference Proceedings, Honolulu, Hawaii. p. 96.
[3] Hundt, R.E (2000) You Say You Want a Revolution: A Story of Information Age Politics. Yale University Press. pp. 30-34.


Digital Games and Meaningful Play

Posted on | June 30, 2016 | No Comments

What is a game? What makes it fun? How can you design a game to provide a meaningful and rewarding experience? Rules of Play: Game Design Fundamentals by Katie Salen and Eric Zimmerman is a great blend of theory and practical application and helps us understand the importance of “gameplay” – the emotional relationship between player actions and game outcomes. The book helps explain what makes games, from baseball to virtual reality games, effective and meaningful. In this post, I look at some of the key ideas involved in understanding games, using baseball as a primary example.

A game is a structured form of play that involves choices of action. It involves an organized way of making choices, taking action, and experiencing some kind of feedback. In other words, game players take some visible action, and the game responds with information that provides feedback to the player and subsequently changes the status of the game. Below is a picture of my daughter playing a virtual reality game of baseball.

Baseball VR Game

The actions and subsequent outcomes need to be discernible – understandable and visible. And they need to be integrated into the game. In baseball, for example, a batter makes a decision to swing at a ball thrown by the pitcher. Several things can happen based on the trajectory of the pitch and the way the batter swings. She can swing and miss, or hit the ball for one of several results: foul ball, base hit, home run, pop out, etc.

The result of the action needs to be evident and contribute to the game. The foul ball is registered as a strike; a base hit can move runners or at least get the batter on base. A home run is an ultimate action in baseball as it adds immediately to the final score of the game.

The many recognizable actions of a game are one of the reasons “datatainment” is a prominent part of baseball. Hits, home runs, strikeouts, and ERA are all significant measures that can be distinguished and statistically registered on a baseball scoreboard or in the season “stats” of a player. Not only are players evaluated on these measures, but fans of professional sports often take a keen interest in these numbers as part of an identification process with players. Sports teams look to deepen fan engagement by going beyond box scores to digitally-enabled fantasy sports and other forms of social involvement and entertainment.
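Those statistics are simple ratios over the registered events. A quick sketch of two standard formulas (the sample numbers are hypothetical):

```python
def batting_average(hits: int, at_bats: int) -> float:
    """Hits divided by official at-bats, the classic batting measure."""
    return hits / at_bats

def earned_run_average(earned_runs: int, innings_pitched: float) -> float:
    """Earned runs allowed, normalized to a nine-inning game."""
    return 9 * earned_runs / innings_pitched

# A batter with 180 hits in 600 at-bats hits .300; a pitcher allowing
# 70 earned runs over 210 innings carries a 3.00 ERA.
print(round(batting_average(180, 600), 3))
print(round(earned_run_average(70, 210.0), 2))
```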

baseball scoreboard

Choices and actions change the game and create new meanings. They move the game forward. Strikes end batters; outs end innings. As the game moves forward, new meanings are created. Heroes emerge, a team pulls ahead, a team comes from behind. A good game drives emotional and psychological interest, either through a tribal allegiance to a team, an interest in a player, or a recognition of the stakes of a game, as in a championship such as the World Series. But in every case, the game must have discernible actions that have a meaningful impact on the progress and result of the game.
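The loop described in this post (discernible action, rule-governed outcome, changed game state) can be sketched with a toy game; the game itself is hypothetical, chosen only to make the feedback cycle concrete:

```python
class GuessingGame:
    """Toy game: guess a hidden number; every action yields discernible feedback."""

    def __init__(self, secret: int):
        self.secret = secret
        self.turns = 0
        self.over = False

    def act(self, guess: int) -> str:
        self.turns += 1                      # every action changes the game state
        if guess == self.secret:
            self.over = True                 # a quantifiable end condition
            return "correct"
        # Feedback integrated into the game: it narrows the player's next choice.
        return "too low" if guess < self.secret else "too high"

game = GuessingGame(secret=7)
print(game.act(4))   # "too low"  -> the player adjusts
print(game.act(7))   # "correct"  -> the game ends with an observable result
```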

Citation APA (7th Edition)

Pennings, A.J. (2016, Jun 30). Digital Games and Meaningful Play. apennings.com https://apennings.com/meaningful_play/games-and-meaningful-play/




Anthony J. Pennings, PhD is a Professor at the (SUNY) State University of New York, Korea. When not in Korea he lives in Austin, TX where he taught in the Digital Media MBA program at St. Edwards University. At New York University, where he spent most of his career, he created the BS in Digital Communications and Media. Previously, he taught at Victoria University in New Zealand and also spent time as a Fellow at the East-West Center in Honolulu, Hawaii.

Xanadu to World Wide Web

Posted on | June 11, 2016 | No Comments

Tim Berners-Lee, a British citizen and software consultant at CERN (Conseil Européen pour la Recherche Nucléaire), developed what came to be known as the World Wide Web (WWW). Located in Switzerland, CERN was Europe’s largest nuclear research institute, although its name was later changed to the European Laboratory for Particle Physics to avoid the stigma attached to nuclear research.

In March of 1989, Berners-Lee proposed a project to create a system for sharing information among CERN’s dispersed High Energy Physics research participants. This information management system would become the foundation of the World Wide Web, especially after 1994, when he founded the World Wide Web Consortium (W3C), a standards organization that began to guide the Web’s interoperable technologies with specifications, guidelines, software, and tools for web addresses (URLs), the Hypertext Transfer Protocol (HTTP), and the Hypertext Markup Language (HTML). These technologies allowed web browsers like Mosaic, Netscape, Internet Explorer, and later Firefox and Chrome, to access data and display web pages.

Berners-Lee wanted to create a system where information from various sources could be linked and accessed, creating a “pool of human knowledge.” Using a NeXT computer built by Steve Jobs’s post-Apple company, he wrote the prototype for the World Wide Web and a basic text-based browser called Nexus. The NeXT computer had a UNIX operating system, built-in Ethernet, and a version of the Xerox PARC graphical user interface that Jobs had transformed into the Apple Mac. Berners-Lee credited the NeXT computer with having the functionality to speed up the process, saving him perhaps a year of coding.

Dissatisfied with the limitations of the Internet, Berners-Lee developed this new software around the concept of “hypertext,” popularized in Ted Nelson’s Computer Lib, a 1974 manifesto about the possibilities of computers. Nelson warned against leaving the future of computing to a priesthood of computer center guardians who served the dictates of the mainframe computer.

Ted Nelson had coined the terms “hypertext” and “hypermedia” as early as 1965 in connection with his Xanadu project. Xanadu was the name of Kublai Khan’s mythical summer palace, described by the enigmatic Marco Polo: “There is at this place a very fine marble palace, the rooms of which are all gold and painted with figures of men and beasts and birds, and with a variety of trees and flowers, all executed with such exquisite art that you regard them with delight and astonishment.” Nelson strove to transform the computer experience with software and display technology that would make reading and writing an equally rich “Xanadu” experience.

An important transition technology was HyperCard, a computer application that allowed the user to create stacks of connected cards that could be displayed as visual pages on an Apple Macintosh screen. Using a scripting language called HyperTalk, each card could show text, tables, and even images. “Buttons” could be installed on each card that linked it to other cards within the stack with a characteristic “boing” sound clip. Later, images could be turned into buttons. HyperCard missed out on historical significance because of Apple’s “box-centric culture,” according to HyperCard inventor Bill Atkinson. He later lamented, “If I’d grown up in a network-centric culture, like Sun, HyperCard might have been the first Web browser.” [1]

Berners-Lee accessed the first web page, on the CERN web server, on Christmas Day, 1990. He spent the next year adding content and flying around the world to convince others to use the software. Concerned that a commercial company would copy the software and create a private network, he convinced CERN to release the source code under a general license so that developers could use it freely. One example was a group of students at the University of Illinois at Urbana-Champaign’s National Center for Supercomputing Applications, which was part of the NSFNET. Marc Andreessen and other students created the Mosaic browser, which they distributed for free using the Internet’s FTP (File Transfer Protocol). They soon left for Silicon Valley, where they got venture capital to create Netscape, a company designed around their Web browser, Netscape Navigator.[2]

Berners-Lee designed the WWW with several features that made it extremely effective.

First, it was based on open systems that allowed it to run on many computing platforms. It was not designed for a specific proprietary technology but rather would allow Apples, PCs, Sun Workstations, etc. to connect and exchange information. Berners-Lee compared it to a market economy where anyone can trade with anyone on a voluntary basis.

Second, it actualized the dream of hypertext, the linking of a “multiverse” of documents on the WWW. While Ted Nelson would criticize its reliance on documents, files, and traditional directories, the “web” would grow rapidly.[3]

Third, it used the Hypertext Transfer Protocol (HTTP) to create a direct connection from the client to the server. With this protocol, and taking advantage of packet-switched data communications, the request for a specific document is sent to the server, and either the requested document is returned or the client is notified that the document does not exist. The power of this system meant that the connection was closed quickly after the transaction, saving bandwidth and freeing the network for other connections.

Fourth, it also worked with the existing Internet infrastructure and integrated many of its basic protocols, including FTP, Telnet, Gopher, e-mail, and News. FTP was particularly important for the distribution of software, including browsers. Newsgroups informed people all around the NSFNET that the technology and associated browsers were available.

Another crucial feature was that content could be created using a relatively easy-to-use markup language called the Hypertext Markup Language (HTML). HTML was a simplified version of another markup language used by large corporations, the Standard Generalized Markup Language (SGML). HTML was geared more towards page layout and format, while SGML was better for document description: generalized markup describes the document to whatever system it works within. HTML and SGML would form a symbiotic relationship and eventually lead to powerful new languages for e-commerce and other net-centric uses, such as XML (eXtensible Markup Language) and HTML5.
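The point of markup is that tags describe structure that any conforming system can recover. A minimal illustration using Python’s standard-library parser (the sample page is invented):

```python
from html.parser import HTMLParser

# A tiny, invented HTML document: tags mark up the structure of the content.
page = "<html><body><h1>Hello</h1><a href='http://info.cern.ch'>link</a></body></html>"

class TagCollector(HTMLParser):
    """Walk the markup and record every opening tag it describes."""
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

parser = TagCollector()
parser.feed(page)
print(parser.tags)  # the structure recovered from the markup
```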

Finally, Berners-Lee developed the uniform resource locator (URL) as a way of addressing information. The URL gave every file on the WWW, whether it was a text file, an image file, or a multimedia file, a specific address that could be used to request and download it.[4]
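The anatomy of a URL can be shown in a few lines of Python; the address used here is the historically famous first CERN web page:

```python
from urllib.parse import urlparse

url = "http://info.cern.ch/hypertext/WWW/TheProject.html"
parts = urlparse(url)

print(parts.scheme)  # which protocol to speak ("http")
print(parts.netloc)  # which server to contact ("info.cern.ch")
print(parts.path)    # which document to request ("/hypertext/WWW/TheProject.html")
```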

Together, these features defined a simple transaction that was the basis of the World Wide Web. In summary, the user or “client” establishes a connection to the server over the packet-switched data network of the Internet. Using the address, or URL, the client issues a request to the server specifying the precise web document to be retrieved. Next, the server responds with a status code and, if available, the content of the information requested. Finally, either the client or the server disconnects the link.
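That transaction can be made concrete by writing out the text a simple client would send; this is a schematic HTTP/1.0-style sketch, not Berners-Lee’s original code. The explicit "Connection: close" header makes the short-lived nature of the exchange visible:

```python
def build_request(host: str, path: str) -> str:
    """Compose the raw text of a minimal HTTP/1.0 GET request."""
    return (
        f"GET {path} HTTP/1.0\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"   # the link is dropped after the response
        "\r\n"                    # blank line ends the request
    )

request = build_request("info.cern.ch", "/hypertext/WWW/TheProject.html")
print(request.splitlines()[0])  # the request line naming the document
```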

The beauty of the system was that its drain on the Internet was limited. Rather than tying up a whole telecommunications line as a telephone call would do, HTTP allowed the information to be downloaded (or not) and then the connection would be terminated. The World Wide Web began to allow unprecedented amounts of data to flow through the Internet, changing the world’s economy and its communicative tissue.

In another post I discuss how important hypertext clicks are to the advertising industry through the tracking of various metrics.

Notes

[1] Kahney, L. (2002, August 14). HyperCard: What Could Have Been. Retrieved June 10, 2016, from http://www.wired.com/2002/08/hypercard-what-could-have-been/
[2] Greenemeier, L. (2009, March 12). Remembering the Day the World Wide Web Was Born. Retrieved June 11, 2016, from http://www.scientificamerican.com/article/day-the-web-was-born/
[3] Banks, L. (2011, April 15). Hypertext Creator Says Structure of World Wide Web ‘Completely Wrong.’ Retrieved June 11, 2016. See also Greenemeier, L. (2009, March 12). Remembering the Day the World Wide Web Was Born. Scientific American.
[4] Richard, E. (1995) “Anatomy of the World Wide Web,” INTERNET WORLD, April. pp. 28-20.

Citation APA (7th Edition)

Pennings, A.J. (2016, Jun 11). From Xanadu to World Wide Web. apennings.com https://apennings.com/how-it-came-to-rule-the-world/xanadu-to-world-wide-web/


Cisco Systems: From Campus to the World’s Most Valuable Company, Part One: Stanford University

Posted on | May 24, 2016 | No Comments

Cisco Systems emerged from employees and students at Stanford University in the early 1980s to become the major supplier of the Internet’s plumbing. In the process, its stock value increased dramatically, and it briefly became the largest company in the world by market capitalization. Cisco originally produced homemade multi-platform routers to connect campus computers through an Ethernet LAN, and throughout the 1980s it built networking technology for the National Science Foundation’s NSFNET. As the World Wide Web took off during the 1990s, it helped countries around the world transition their telecommunications systems to Internet protocols. Cisco went public on February 4, 1990, with a valuation of $288 million. By March 2000, Cisco Systems was calculated to be the world’s most valuable company, worth $579.1 billion to second-place Microsoft’s $578.2 billion. Microsoft had taken over General Electric’s No. 1 ranking in 1998.

This post will present the early years of Cisco Systems development and the creation of networking technology on the Stanford University campus. The next post will discuss the commercialization and success of Cisco Systems as it helped to create the global Internet by first commercializing multi-protocol routers for local area networks.

Leonard Bosack and Sandra K. Lerner formed Cisco Systems in the early 1980s and were the driving forces of the young company. The couple met on the Stanford campus (Bosack earned a master’s in computer science in 1981; Lerner a master’s in statistics the same year) while managing the computer facilities of different departments, and they were incidentally (or perhaps consequently) dating. The two facilities were located at different corners of the campus, and the couple began working together to link them to each other and to other organizations scattered around the campus. Drawing on work being conducted at Stanford and in Silicon Valley, they developed a multi-protocol router to connect the departments. Bosack and Lerner left Stanford University in December 1984 to launch Cisco Systems.

Robert X. Cringely, author of Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition, and Still Can’t Get a Date, interviewed both founders for his PBS video series Nerds 2.0.1.

Bosack and Lerner happened on their university positions during a very critical time in the development of computer networks. The Stanford Research Institute (SRI) was one of the four original ARPANET nodes, and the campus later received technology from Xerox PARC, particularly the Alto computers and the Aloha Network technology that became known as Ethernet.[1] This technology, originally developed at the University of Hawaii to connect different islands, was improved by Robert Metcalfe and other Xerox PARC researchers and granted to Stanford University in late 1979.[2] Ethernet technologies needed routers to network effectively and interconnect different computers and Ethernet segments.

A DARPA-funded effort during the early 1970s at Stanford had involved research to design a new set of computer communication protocols that would allow several different packet networks to be interconnected. In June of 1973, Vinton G. Cerf started work on a novel network protocol with funding from the new IPTO director, Robert Kahn. DARPA was originally interested in supporting command-and-control applications and in creating a flexible network that was robust and could adjust to the changing situations to which military officers are accustomed. In July 1977, initial success led to a sustained effort to develop the Internet protocols known as TCP/IP (Transmission Control Protocol and Internet Protocol). DARPA and the Defense Communications Agency, which had taken over operational management of the ARPANET, supplied sustained funding for the project.[3]

The rapidly growing “Internet” was implementing the new DARPA-mandated TCP/IP protocols, and routers were needed to “route” packets of data to their intended destinations. Every packet of information carries an address that helps it find its way through the physical infrastructure of the Internet. Stanford had been one of the original nodes on ARPANET, the first packet-switching network. In late 1980, Bill Yeager was assigned to build a network router as part of the SUMEX (Stanford University Medical Experimental) initiative. Using a PDP-11, he first developed a router and TIP (Terminal Interface Processor). Two years later he developed a Motorola 68000-based router and TIP using experimental circuit boards that would later become the basis for the workstations sold by Sun Microsystems.[4]
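The core decision such a router makes, matching a packet’s destination address against known network prefixes and forwarding on the most specific match, can be sketched in a few lines; the routing table and interface names here are hypothetical:

```python
import ipaddress

# Hypothetical routing table: network prefix -> outgoing interface.
ROUTES = {
    "10.0.0.0/8": "eth0",
    "10.1.0.0/16": "eth1",
    "0.0.0.0/0": "uplink",   # default route: matches everything
}

def next_hop(dest: str) -> str:
    """Pick the outgoing interface by longest-prefix match."""
    addr = ipaddress.ip_address(dest)
    matches = [
        (net, iface)
        for net, iface in ((ipaddress.ip_network(p), i) for p, i in ROUTES.items())
        if addr in net
    ]
    # The most specific (longest) matching prefix wins.
    return max(matches, key=lambda m: m[0].prefixlen)[1]

print(next_hop("10.1.2.3"))   # /16 route is more specific than /8
print(next_hop("8.8.8.8"))    # only the default route matches
```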

Bosack and Lerner had operational rather than research or teaching jobs. Len Bosack was Director of Computer Facilities for Stanford’s Department of Computer Science, while Sandy Lerner was Director of Computer Facilities for Stanford’s Graduate School of Business. Despite the fancy titles, they had to run wires, install protocols, and make the computers and networks work for the university. Bosack had worked for DEC, helping to design the memory management architecture for the PDP-10. At Stanford, many different types of computers were in demand and used by faculty, staff, and students: mainframes, minis, and microcomputers.

Some 5,000 computers were scattered around the campus, with the Alto computer in particular proliferating. Thanks to Ethernet, computers were often connected locally, within a building or a cluster of buildings, but no overall network existed throughout the campus. Bosack, Lerner, and colleagues such as Ralph Gorin and Kirk Lougheed worked on “hacking together” some of these disparate computers into the multi-million dollar broadband network being built on campus, but it was running into difficulties. They needed to develop “bridges” between local area networks and then crude routers to move packets; at the time, routers were not offered commercially. Eventually, their “guerilla network” became the de facto Stanford University Network (SUNet).

Notes

[1] Stanford networking experiments included those in the AI Lab, at SUMEX, and the Institute for Mathematical Studies in the Social Sciences (IMSSS).
[2] Ethernet Patent Number: 04063220 , Metcalfe, et al.
[3] Vinton G. Cerf was involved at Stanford University in developing TCP/IP and later became Senior Vice President, Data Services Division at MCI Telecommunications Corp. His article “Computer Networking: Global Infrastructure for the 21st Century” was published on the www and accessed in June 2003.
[4] Circuit boards for the 6800 TIP developed by Andy Bechtolsheim in the Computer Science Department at Stanford University.


  • Referencing this Material

    Copyrights apply to all materials on this blog but fair use conditions allow limited use of ideas and quotations. Please cite the permalinks of the articles/posts.
    Citing a post in APA style would look like:
    Pennings, A. (2015, April 17). Diffusion and the Five Characteristics of Innovation Adoption. Retrieved from https://apennings.com/characteristics-of-digital-media/diffusion-and-the-five-characteristics-of-innovation-adoption/
    MLA style citation would look like: "Diffusion and the Five Characteristics of Innovation Adoption." Anthony J. Pennings, PhD. Web. 18 June 2015. The date would be the day you accessed the information. View the Writing Criteria link at the top of this page to link to an online APA reference manual.

  • About Me

    Professor at State University of New York (SUNY) Korea since 2016. Moved to Austin, Texas in August 2012 to join the Digital Media Management program at St. Edwards University. Spent the previous decade on the faculty at New York University teaching and researching information systems, digital economics, and strategic communications.

    You can reach me at:

    apennings70@gmail.com
    anthony.pennings@sunykorea.ac.kr

    Follow apennings on Twitter

  • Disclaimer

    The opinions expressed here do not necessarily reflect the views of my employers, past or present.