Here are my notes on “The Changing Structure of American Innovation: Some Cautionary Remarks for Economic Growth” by Arora, Belenzon, Patacconi, and Suh. I try to draw clear lines between the authors’ claims and my own responses. If there is ambiguity, anything that sounds smart is them and anything that sounds crazy and speculative is me.

The authors point out that you can divide the history of the “American innovation system” into roughly four periods: 1850-1900, 1900-1940, 1940-1980, and 1980-present. There are four threads to track in each period, each with its own relationship to producing new and impactful technology-based goods and services (which I will begrudgingly refer to as innovations, although we need a better word): universities, large incumbent corporations, startups/inventors, and the government.

1850-1900

Inventors + Incumbents

The period from 1850-1900 could be described as the age of the inventor. During this period, inventors were the main mechanism for creating innovations. These folks would usually sell their patents directly to large companies like the telegraph operators, railroads, or large chemical companies. The companies themselves did little R&D and primarily operated labs to test the patents that inventors brought them. There was a healthy market in patents. My hunch is that this world was enabled at least in part by the nature of the inventions. The cutting-edge inventions were for the most part based on tinkering and trial-and-error that an individual could do in their garage. This was the era of the Wright Brothers building an airplane in their bicycle shop, Charles Goodyear creating vulcanized rubber on his stove, and Alexander Graham Bell creating the telephone in his house. Even Thomas Edison’s Menlo Park was less of a corporate lab working on big projects and more of a bunch of individual inventors all working in the same place for the same person. Over time, ‘inventor’ became a more specialized profession, as indicated by a shrinking number of people accounting for a growing share of patents.

My hunch is also that the nature of the inventions is why you see the direct sale of patents instead of founding a company around the invention and then aiming to be acquired later. The inventions were simple enough that I could literally hand you the patent and you could figure out how to build and operate the thing. Contrast that with complex technologies that need human capital packaged with the patent in order to be useful to the acquirer. In a way, modern tech startups are packages of human capital and IP. (This will become important later.) Additionally, the patent was often issued after someone got the thing to work, rather than before it was actually implemented, as is often the case today. Companies understandably prefer to buy things that already work, so the shift from selling patents to selling companies may reflect the changing relationship of patents to the technologies they describe as well.

Small companies don’t come into the picture much during this period, though clearly many relevant companies were started:

  • 1849 - Pfizer
  • 1851 - Corning
  • 1877 - Bell Telephone
  • 1886 - Westinghouse
  • 1888 - Kodak
  • 1892 - General Electric
  • 1897 - DuPont
  • 1898 - Goodyear
  • 1902 - 3M
  • 1908 - General Motors

A piece of complete speculation on my part, but perhaps there was less of a distinction between big companies and small companies in this period. Beyond the railroads, steel companies, and oil companies, did any companies actually reach massive scale?

As Anton Howes points out in Arts and Minds, the common ad from this period, “Inventor seeks capitalist,” comes to mind.

Academia

American universities were mostly focused on practical arts, especially agriculture - Harvard and other older schools were the exception and emulated a more European model. Many universities were started during this period specifically to train people in useful skills and discover better ways of doing agriculture and manufacturing. That was the whole point of the land-grant universities like Cornell, which to this day has one of the best agriculture departments in the world. MIT was explicitly founded to teach a combination of professional and liberal arts.

Government

The government was almost entirely uninvolved in R&D, except during wartime (when it was legally allowed to fund only short-term research on pressing military needs) and in the creation of the land-grant colleges.

1900-1940

Innovations during this period shifted to chemicals and complex systems like cars and planes. The nature of these innovations made it harder to simply sell a company a piece of paper explaining how the thing worked and expect them to be able to replicate it. DuPont bought the patent for creating viscose rayon (processed cellulose used for making artificial silk and other fibers). However, DuPont was unable to replicate the process successfully and eventually had to partner with the original inventors to get it to work. Context is important and underrated for knowledge transfer! Additionally, these innovations started to require more specialized and expensive equipment, which was out of reach for individuals, small companies, and universities as well.

Incumbents + Labs

The federal government increased antitrust enforcement after a massive spike in M&A around 1900, so the authors argue that large corporations felt more pressure to expand markets by creating their own products. Additionally, the government started dipping its toes into funding university research. The Hatch Act (1887) and the Adams Act (1906) provided states $15,000 and then $30,000/year for agricultural research (which was presumably distributed to the state/land-grant colleges).

Most industrial labs were started in this period, partially inspired by German chemical companies. Note that the Germans were the first to step into the domain of complex/chemical innovations and the first to build industrial labs. The implied reasons for this were the complexity of new innovations and the fear of antitrust enforcement blocking M&A as a route to expansion. The German precedent suggests to me that the nature of the technology itself was a causal factor.

Founding dates of some famous labs:

  • GE Labs (1900)
  • DuPont Experimental Station (1903)
  • Kodak Research Laboratories (1912)
  • Bell Labs (1925)

Academia

Universities relied on state and industrial funding. Saying “universities did” anything is a bit anachronistic because before WWII, individual professors brought funds to their own labs with very little university overhead. (Pieces of the Action does a good job describing this system.) The need for industrial funding pushed many professors towards mission-oriented research. This direction irked some professors, especially those influenced by European academic philosophy that emphasized pure discipline-oriented research. Caltech was founded by these dissidents. Industrial labs attracted professors away from universities because they had equipment that only large companies could afford.

1940-1950: The great discontinuity

The paper actually skips the period of WWII and its immediate lead-up and aftermath. It does note that WWII saw federal R&D spending go from $83.2M to $1,313.6M by 1945 - roughly a 16x jump, a bit over an order of magnitude. Most of it went to universities.
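
For the record, here is the arithmetic on that jump, using the paper’s own figures:

```latex
\frac{\$1{,}313.6\,\mathrm{M}}{\$83.2\,\mathrm{M}} \approx 15.8,
\qquad \log_{10}(15.8) \approx 1.2\ \text{orders of magnitude}
```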

1950-1980

The technology story starts to shift towards drugs, materials, and even more complex systems (like early computers and jet engines). Drugs arguably have the most science-based bottlenecks. Development of these complex systems requires complementary engineering skills and a lot of tacit knowledge. For example, the transistor required a combination of physicists, metallurgists, and chemists all working with a specific mission focus. A notable fact about the technology of this era is that many scientific discoveries had clear uses - the maser is a good example: as soon as it was discovered, people realized it would be useful.

Government

The federal government continued to pump money into university research after WWII because it realized how coupled technological superiority and military superiority would be in a conflict with the USSR. This period also saw antitrust cases peak in the US. It’s only briefly mentioned in the paper, but this was also the period during which the National Labs were created.

Incumbents

Arguably, the combination of M&A-focused antitrust enforcement and the technology frontier’s need for complementary skills drove more corporate lab work. There was a lot of discovery-focused science going on at corporate labs. Corporate labs were able to act on new discoveries faster than academia.

Universities

The 1970s feel like a discontinuity at the end of this period. The Vietnam War and the associated cultural shifts and protests pushed universities to divest from aiding the ‘military-industrial complex’ - that is, from mission-focused federal R&D and R&D with large corporations. Universities as a whole turned towards more discipline-focused research.

Inventors+Startups

Venture capital and small companies also started taking off near the end of this period. Genetic engineering-based pharmaceutical development and semiconductor-based computing both have high upfront capital costs and low unit costs, which makes them ideal for the venture capital model.

1980-Present

The innovations of this period are concentrated in computing, the internet, software, and pharmaceuticals. Note that except for pharmaceuticals[1], these have lower and lower capital costs outside of the human capital. Many of the innovations were less ‘technology’ and more ‘tech’ - business models that take advantage of computers and the internet, shifting businesses online.

Andrew Odlyzko and the authors point out that there’s an increasingly small window of time between groups discovering the same thing, even on the technological frontier. If that gap is large, a company can use in-house research to build out a patent portfolio. They note how xerography was invented in 1937 and not commercialized by Xerox until 1950. The implied assertion (which I agree with) is that systems are exponentially more valuable than single technologies. Or, framed differently, a really useful technology is made up of a bunch of smaller inventions. In a world where all the pieces of an innovation are rapidly patented by multiple competing entities, it makes more sense to spend your money buying those patents or the entities themselves. The returns from in-house research are capped because even if you get to the first patent, it’s not that valuable if other people patent the other pieces of the puzzle. If this argument holds, slightly more slack around new inventions is important.

It also raises the question: why is there so little slack in mission-focused research? There are definitely more scientists, so it could be that the number of researchers has gone up faster than the number of areas of exploration, and you’re just seeing the effect of more researchers per area. This saturation would be exacerbated if the number of areas for cutting-edge patentable research has gone down. The other two possibilities are cultural. We may be stuck in an equilibrium where the correct move is to patent everything that seems vaguely valuable as soon as possible, instead of waiting for a higher-value patent or group of patents, because you know that everybody else will do the same thing (sketched below). This equilibrium may be exacerbated both by Bayh-Dole and by the cultural expectation of how valuable science should be. This cultural expectation didn’t really come about until after WWII and perhaps took years to saturate the system. The other cultural explanation for the missing slack might be incentives to do incremental but guaranteed-publishable research. If there are more incentives for groups to do research on known good ideas, then naturally there will be fewer discoveries that nobody else was working on.
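
To make that equilibrium concrete, here is a minimal sketch (mine, not the paper’s) of patent timing as a two-lab game. The payoff numbers are invented purely to illustrate the structure: rushing to patent is the best response no matter what the other lab does, even though both labs would be better off if everyone waited.

```python
# A toy model of the patent-timing equilibrium described above.
# "rush" = patent everything vaguely valuable as soon as possible;
# "wait" = hold out for a higher-value patent or portfolio.
# The payoff numbers are made up for illustration only.

# payoffs[(my_move, their_move)] = my payoff
payoffs = {
    ("wait", "wait"): 8,   # both hold out: each builds a valuable portfolio
    ("rush", "wait"): 10,  # I preempt the field while the other lab waits
    ("wait", "rush"): 1,   # my eventual portfolio is riddled with their patents
    ("rush", "rush"): 3,   # everyone holds fragments; nobody captures much value
}

def best_response(their_move: str) -> str:
    """Return my payoff-maximizing move, given the other lab's move."""
    return max(["wait", "rush"], key=lambda my_move: payoffs[(my_move, their_move)])

# Rushing dominates: it is the best response to either move...
assert best_response("wait") == "rush"
assert best_response("rush") == "rush"

# ...even though (rush, rush) leaves both labs worse off than (wait, wait).
assert payoffs[("rush", "rush")] < payoffs[("wait", "wait")]
print("Equilibrium: both labs rush, each earning", payoffs[("rush", "rush")])
```

This is just the prisoner’s dilemma wearing a lab coat: the everybody-rushes equilibrium is stable even though it destroys value relative to everybody waiting.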

In part, modern software companies solve the problem of patents standing in the way of building valuable systems by holding war chests of defensive patents. These war chests create a Mexican standoff where every tech company has the ability to sue every other company, so they all just cooperate and don’t sue at all. This situation keeps the patents from blocking development (as they do in other domains), but it also doesn’t enable any company to capture value from its patents.

Academia

The idea of a division of labor between academia and industry (which is deeply embedded in the language used in academia) grew strong. This separation can only be reinforced by the fact that the majority of university research money continues to come from the federal government. The authors argue that the government-aided division of labor between academia and industry removed the forcing functions that had prevented extreme specialization. Researchers are now free to subdivide problems into chunks that are easy to research and publish but harder to translate into something useful. Speculatively, this specialization may be a partial reason why discoveries feel less useful today: a splashy press release and then … nothing. The more specialized, discipline-focused nature of university research means that more work is required to turn it into a useful product.

Another of my own speculations: looking back to earlier periods when professors were incentivized to collaborate with corporations in order to access expensive equipment, it’s possible that the large amount of federal funding for universities also helps prop up this division of labor. University professors can now afford expensive equipment, or access federally owned equipment, where previously they were incentivized to join corporate labs to access more resources.

Inventors/Startups

During this period, the work to translate research into products is often done by startups. Universities are more incentivized to be involved in startups, and corporations are less incentivized to do translational research. The Bayh-Dole Act gives universities (not the professors) ownership of IP funded by federal money, incentivizing them to be directly involved in startup activity. Since the most profitable path for IP in the post-1980 period is for a startup to be built around it, universities find themselves pushed into the startup business. Other factors encouraging university spinoffs are perceptions of increasing institutional calcification in large companies and funding from both VCs and the government in the form of SBIRs. However, since 2005 university patents have continued to climb while their licensing fees have flattened out.

[Figure: increasing university patents but flat licensing revenues, from “The Changing Structure of American Innovation”]

The Government

In addition to funding the majority of university research, the federal government hasn’t been prosecuting antitrust cases and has been encouraging small business activity. The Bayh-Dole Act incentivized universities to create more spinoffs. SBIR grants directly subsidize small businesses working on technology. To me, all of these activities support the ‘innovation via startups commercializing university research and aiming to be acquired’ paradigm.

Incumbents

During this period, large corporations have found it easiest to expand through M&A. M&A activity has gone through the roof, probably caused at least in part by the absence of antitrust prosecution. If it’s easy to acquire products, there is presumably less incentive to do the translational work yourself. The nature of the most profitable innovations also lends itself to acquisition. The paper calls these disembodied knowledge products - fabless chip designs, software, and drug formulas. I wonder whether these products suffer from the same problems that DuPont experienced with viscose rayon, where there is a lot of tacit knowledge involved beyond the design or code. If the answer is “yes,” it fits with the massive M&A story because you need to acquire not just the IP but the human capital behind it. If the answer is “no,” you would expect to see a lot of organizations just focused on selling IP products. ARM is one example, but are there others?

The paper also tells a story where large corporations have experienced more pressure to specialize, both from investors and from increased competition, due at least in part to globalization forcing American companies to compete with firms around the world. This specialization makes corporate labs less valuable because you don’t always know what you’re going to get out of discovery-type research, and there’s a higher chance it will be ‘out of scope’ at a more focused company.

AI is the exception

Modern AI[2] research bucks the trend of the decline of corporate labs. Like other areas that enticed professors away from universities, AI research can require massive resources - in this case, thousands of dollars of compute for training models and the datasets to train those models on. AI research also benefits from having multiple disciplines in the same place (including the people who will put it into production). These factors mean you actually see healthy and productive corporate AI labs: Google Brain, DeepMind, FAIR, Microsoft Research. These organizations are well regarded and collaborate closely with academia in the same way that previous corporate labs did. Like the corporate labs of yore, they do discovery-focused work on AI that advances fundamental knowledge in the field.

To me, the existence of successful corporate AI labs suggests a couple of uncomfortable possibilities. Clearly the ability to acquire technology through M&A is not the only reason that corporate labs have declined: tech companies can and do acquire AI companies, and yet they still run labs. AI labs suggest that successful corporate labs can only exist in domains where there are large corporations with monopoly-like power. The decline of corporate labs in chemistry- and physics-related domains may have been caused by commoditization of those products. GE, DuPont, Kodak, and others’ share prices (and public perception) suggest that they are no longer monopolies. That story isn’t complete either - Boeing has a near-monopoly and doesn’t have an impressive corporate research lab. Corporate labs might be able to pursue AI and not other disciplines in the modern environment because AI development happens on a faster timescale than most atom-based research.

AI is also just perceived as being incredibly valuable, while new atom-based products are not. The key question is whether this perception is accurate or not.

I feel comfortable asserting that corporate labs can only operate successfully in disciplines where corporations perceive them creating value. The decline of corporate labs means corporations see them as less valuable than other uses of that same money. Since corporations are also returning money to shareholders, the implied value of corporate labs is probably even negative. As mentioned previously, this negative value could be due to specialized firms being unable to capture the value of broader technologies, shrinking timescales on which corporations need to capture value, M&A being a more attractive option for acquiring technology, and no longer needing to create peace offerings to avoid being broken up by the government.[3]

A terrifying possibility is that we’ve actually exhausted the physical phenomena that can be tapped to create valuable products. In the past, corporations clearly saw a connection between tapping new phenomena and their bottom line. Most of the arguments around the decline of corporate labs focus on the second half of that sentence - how businesses and the culture around them have changed. I hope I’m wrong, but what if new phenomena are just less exploitable? What if we’ve pushed the knowledge frontier into realms that are too big or too small to be relevant on a human scale? Recently, the ISS crew created a Bose-Einstein condensate that lasted almost a second! The only use the article suggested for a fifth state of matter was mineral prospecting. People have said for years that being in space should enable you to build materials and drugs you never could on Earth. So why isn’t there a corporate research lab in space? Either:

  1. Companies don’t believe those things are valuable and are right
  2. Companies don’t believe those things are valuable and are wrong
  3. Companies don’t believe those things would be valuable on the timescales they care about.

I hope it’s not #1.

The other uncomfortable possibility is that the era from 1900 to 1980 was a weird outlier and we’re now regressing to the mean. Or that the massive spending and focus of WWII and the Cold War was like Bane injecting himself with Venom: it made him really strong for a short amount of time but wrecked him afterwards.

Overall, this is a thought-provoking paper and I recommend that you read it yourself.

  1. The therapeutics industry has its own issues - see the book Science Business for many more details. 

  2. AI is used here as shorthand for extremely capable machine learning systems enabled by advances in deep learning, GPU technology, etc. 

  3. The last reason - corporate labs as a peace offering to the government - is from neither me nor the authors; I believe I first saw it in The Idea Factory.