The Origin Story of Silicon Valley—and Why We Shouldn't Try to Recreate It
In California, a 25-mile stretch of technology parks, offices, and even some garages attached to upper-middle-class homes produces as much economic output as entire industrialized nations. The Santa Clara Valley, known to the world as Silicon Valley, is synonymous with the unprecedented rate of technological innovation that gave birth to the modern Computer Age and made the United States the wealthiest country in human history.
Countless cities, regions, and nations around the world have tried and failed to recreate the economic dynamism of Silicon Valley for more than half a century, creating a techno-utopian aura around the region and the companies that inhabit it. The reality is not nearly so mystical, however. The history of the Valley isn't a difficult one to follow, and the recipe for creating another Silicon Valley isn't as complicated as many make it out to be; it just isn't one that anyone in their right mind should want to see repeated.
The Santa Clara Valley at the turn of the twentieth century
In 1900, if you knew where the Santa Clara Valley was, you probably either lived there or you were really, really into prunes.
Running southeast from the city of San Francisco and the San Francisco Bay, a valley carves a path between the East Bay foothills and the Santa Cruz mountains. Its soil is rich and fertile, and at the beginning of the 20th century mile after mile of fruit orchards filled it.
On a visit to the Santa Clara Valley in the 1890s, the legendary British field marshal—or notorious, depending on the perspective—Lord Horatio Kitchener, saw the miles of flowering fruit trees filling the valley and reportedly declared it "the valley of heart's delight," a name that seemed destined to stick to it for an eternity.
The premier crop of the valley was the French plum, which after being dried out and processed into a prune, was exported throughout the world. At one point, this small stretch of farmland 45 miles south of San Francisco produced 30% of the entire world's supply of the popular fruit. Cherries, pears, and apricots were also bumper crops for the region, and the ebb and flow of migrant agricultural laborers and the traditional farming life were the defining culture of the valley.
But just as the agricultural bounty of the Santa Clara Valley seemed destined to define it forever, the transformation of the world was already underway. The Industrial Revolution of the 19th century had produced a generation of incredibly wealthy business tycoons from New York to California, and the Santa Clara Valley was home to one such tycoon: Leland Stanford, who made his fortune in the railroads that started crisscrossing the country during the latter half of the century.
Stanford's only son, Leland Stanford Jr., was sent to Europe to receive a 'proper' education and contracted typhoid fever while abroad, dying at the age of 15. Distraught, Stanford Sr. opened a university in 1891 on an 8,100-acre ranch he owned in the valley, located in Palo Alto, in his son's memory. Stanford Sr. himself died two years later.
The university struggled financially after Stanford's death, but that wouldn't be the case for much longer. In 1909, Stanford University president David Starr Jordan made one of the most consequential venture capital investments in history by giving Lee de Forest $500 to develop his audion tube, which amplified electrical signals in an airless tube of glass.
Credited as the father of electronics, de Forest kicked off the electronic revolution of the early 1900s with his vacuum tubes, powering everything from radios to innovative new business machinery like adding machines and electronic time recorders. The company de Forest worked for, the Federal Telegraph Co., was founded in Palo Alto by people who themselves had close ties to Stanford University.
The Federal Telegraph Co. was an early preview of the kind of tech incubator that would come to define the valley in the decades to come, as its employees left to start their own companies in the region. One such pair of employees left Federal Telegraph Co. to found the company that first invented, developed, and sold the loudspeaker, eventually becoming the audio electronics giant Magnavox.
Frederick Terman returns to Stanford
One of the most consequential figures in the Silicon Valley story, though, is Frederick Terman, who arrived at Stanford University in 1925 having just finished a Ph.D. in electrical engineering at the Massachusetts Institute of Technology. He came back to Stanford—where he had earned his undergraduate degree—to teach a class in radio engineering, but he would go on to mentor, inspire, and invest in some of the founding companies and personalities of Silicon Valley.
Considered by many a founding father of Silicon Valley, Terman spent the decade after joining the faculty building the university's modest electrical engineering program into a top-tier one. This dedicated work grew frustrating, however, as he watched the university produce highly educated graduates only to see them leave town the day after earning their degrees for jobs with engineering firms on the east coast.
Terman wanted to see Stanford's graduates stay in the valley and create local businesses that would build a durable industrial base in the region. To that end, he used his position to encourage Stanford engineering graduates to start their own businesses in the Santa Clara Valley rather than head east for work.
The first and most consequential company to do so was Hewlett-Packard, founded by Stanford graduates William Hewlett and David Packard, whom Terman encouraged to partner together. They took his advice and became the original garage startup in 1939 when they formalized their partnership, making electrical test equipment out of a rented one-car garage in Palo Alto.
Soon, more graduates and faculty started listening to Terman and founded their own aerospace and electronics companies in the area. This established the first network of companies that would be bound together by their common connection to Stanford—and sometimes even each other personally—while Terman continued to develop the academic program that would produce a growing pool of highly-educated workers who would be hired in turn by local companies founded by Stanford graduates.
In this way, Terman started building up the pipeline that today continues to feed Stanford graduates to the biggest Silicon Valley firms like Google and Facebook, a pipeline that many consider a source of Silicon Valley's success. That alone, however, wouldn't have been enough to remake the valley so drastically in such a short period. For that, it would take much more than Terman's hard work and networking.
How winning the Second World War primed Silicon Valley for take-off
Even before his installation as Chancellor of Germany in 1933, Adolf Hitler was planning for war. Once he had the powers of the state and military in his control, German rearmament, which was prohibited by the Versailles treaty that ended the First World War, became Germany's top industrial priority.
Traumatized by the slaughter of over a million young men in the First World War, England and France were unable to effectively confront the increasingly aggressive moves by Hitler's Germany. They recognized far too late that war was almost upon them, and were caught in a race to modernize aging equipment and shore up defenses that would prove entirely inadequate after Germany invaded Poland on September 1, 1939. This invasion prompted France and Britain to declare war on Germany long before they were prepared to fight.
In the lead-up to the outbreak of the war, Nazi terror drove many academics, scientists, and artists to flee, many of them finding their way to the United States, a staggering 'brain drain' for the continent to cope with when it tried to rebuild after the war. Albert Einstein and John von Neumann were among the many notable scientific minds who emigrated from Europe to the US as the Nazis rose to power in the 1930s. Many more died during the war, unable to escape.
Across the Atlantic, President Franklin Roosevelt spent most of the 1930s guiding the United States through the Great Depression with his New Deal programs. These did everything from putting the unemployed to work doing just about any job imaginable to introducing some of the most important rules and regulations governing the conduct of the banking and finance industry.
Still, the 1930s were a difficult time for all Americans, and even though the worst carnage of the First World War had spared the US, it still lost more than 100,000 soldiers in about a year of fighting. With the Depression wearing on the spirits of American citizens, no one in the US wanted to fight in another European war.
So it's not surprising that isolationist sentiment ran strong in Congress during the Depression. Even before war broke out in 1939, Congress passed laws restricting the sale of military materiel to France, Britain, or Germany, not wanting to give one side or the other a casus belli for pulling the US into the conflict. Bowing to anti-war sentiment, Congress also prevented the US Army and Navy from effectively stockpiling materials for themselves. It wasn't until 1939 that US entry into the war began to look like a strategic certainty.
The America First Committee, with its American hero spokesperson, Charles Lindbergh, pushed hard to keep the US out of the war right up until the end, going so far as to blame American Jews for pushing the US into the war during his infamous Des Moines, Iowa speech on September 11, 1941. Lindbergh was widely denounced for his anti-Semitic remarks, having already earned himself a reputation as a Nazi sympathizer by accepting a medal from Hermann Göring, head of Nazi Germany's air forces, to commemorate Lindbergh's flight across the Atlantic.
Congress granted President Roosevelt the authority to begin directing wartime production and he pressed ahead as quickly as possible to mobilize American industry for war.
Thousands of radios, headsets, and radar systems would need to be designed and built, and in the 1930s there weren't many places capable of filling that demand. With the second most important electronics research and development center in the country located at Stanford University, US military funding poured into the area.
Silicon Valley's partnership with the US military began in earnest and never really stopped.
After the US was drawn into the conflict in 1941 by the bombing of Pearl Harbor, and Hitler's declaration of war on the United States a few days later, the military's earlier preparation gave way to the need to mobilize the entire industrial capacity of the United States for the war effort.
The Santa Clara Valley was just a stone's throw from the port of San Francisco, making it the easiest place to source the electronics, microwave, and radar equipment needed for the Pacific theater, where US naval and air power played a much more prominent role than in Europe. The region was also home to several major aerospace companies, only increasing its strategic importance.
Silicon Valley firms did their part, churning out radar, radio, and other electronic equipment as well as aircraft, which developed and strengthened the region's industrial capacity as the war progressed, while also producing new inventions and innovations to address specific needs brought on by the war.
The war left behind an endless field of rubble and death, except in the United States
When the war finally came to an end in August 1945, upward of 80 million people were dead, and the industrialized murder of 6 million Jews at the hands of the Nazis had fundamentally altered the character of Europe forever. Following a campaign of strategic bombing of industrial areas in German-controlled cities, and eventually of civilian population centers, whatever industrial capacity remained on the continent was all but destroyed.
The Eastern Front saw the most horrendous fighting of the entire war. The Soviet Union, which had only just achieved some small semblance of industrial parity with its western rivals when the Nazis invaded in June 1941, was driven to relocate hundreds of factories from the western part of the country to locations further east, ahead of the Nazi advance.
When the war was over, there were noticeably fewer Soviet citizens left to rebuild the country and man those factories than before the war broke out. In total, 26 million citizens of the Soviet Union, most of them working-age men and women, were killed between 1941 and 1945, a labor deficit the Soviet Union would never overcome during the Cold War.
After the war, Stalin transformed Eastern Europe into a series of Soviet client states to act as a buffer between the Soviet Union and Western Europe, with the fault line cutting right down the middle of the continent. After the Soviet Union detonated its own atomic bomb in August 1949, a divided Germany became the ideological border between two nuclear-armed superpowers, a standoff that kept the world up at night for just over forty years.
In the last year of the war, Japan suffered an intense American campaign of firebombing against its industrial cities and manufacturing centers. The map above, produced by the US Army after the war, shows which Japanese cities the US bombed, what percentage of each city was estimated to be destroyed, and a comparable US city in terms of population.
The firebombing of Tokyo alone, on March 10, 1945, is estimated to have killed 100,000 civilians in a single night. More than 70 cities would be firebombed with napalm and conventional explosives over the last five months of the war, killing as many as half a million people. By the time the bombing campaign culminated in the use of atomic bombs on the cities of Hiroshima and Nagasaki, the US had utterly destroyed nearly all of Japan's industrial capacity.
For Japan, rebuilding its industrial capacity was a significantly harder task than in Europe, where industry was more spread out and where the firebombing of cities was rarer and nowhere near the scale or severity of the sustained campaign Japan suffered.
Meanwhile, in America...
In America, the situation was very different. The attack on Pearl Harbor was the most damage any combatant managed to inflict on US infrastructure or industry, and all but three of the 16 ships that Japanese pilots sank in December 1941 were recovered and repaired. Not a single building in the United States was bombed after the Pearl Harbor attack, much less in the Santa Clara Valley.
The American economy, meanwhile, broke out of the war years into one of the most extraordinary expansions of prosperity the world had ever seen. Even more important, for the Santa Clara Valley at least, the US also invested heavily in servicemen coming back from the war through the GI Bill.
A program that let soldiers attend colleges and universities with support from the US government, the GI Bill provided a massive influx of new students into the freshman classes of every college and university in the country, all of them a few years older than the typical freshman and matured beyond their years by the war.
Stanford expanded its freshman class for the 1948-49 school year by more than 1,000 students, mostly through the GI Bill. The large influx of government investment into the university through this program allowed the school to expand its facilities, and the school of engineering that Terman had built up during his time on the faculty was full of promising young students who had spent several formative years working hand in glove with the US military.
When they graduated, they would join engineering graduates from across the country to form a corps of engineers the likes of which the world had never seen.
Terman, meanwhile, returned to Stanford in 1945 as the dean of the school of engineering, having spent the war years at Harvard University, working in the Radio Research Laboratory with the US military. His wartime service further solidified his ties to the government and the US military, a partnership he would promote to his students and graduates throughout the remainder of his life.
As the dean of Stanford's school of engineering, and later the university's provost until his retirement in 1965, Terman shepherded Stanford as it became one of the premier research institutions in the world while providing one last major contribution to the transformation of the Santa Clara Valley: the Stanford Industrial Park.
When Leland Stanford bequeathed his 8,100-acre ranch to the university named for his son, he stipulated that the university could never sell any of the land he'd given it. For more than 50 years, much of that land remained undeveloped, something Terman and the school would change in 1951. Terman took a 660-acre parcel of it and formed the Stanford Industrial Park, a sprawling space of research labs, offices, and manufacturing facilities that businesses could lease long-term and set up shop in.
With ready access to consulting expertise from Stanford University's faculty and a steady supply of bright, highly educated engineering graduates, the Stanford Industrial Park was too good an opportunity for businesses to pass up. Beginning with Hewlett-Packard and Varian Associates, this parcel of land became the epicenter of the region's transformation. That transformation would need a catalyst, though, and it would come in the form of a Nobel Prize winner with a natural capacity for mismanaging extraordinary talent.
Shockley Semiconductor Laboratory and the Traitorous Eight
The world changed forever in 1947 when William Shockley and his subordinates John Bardeen and Walter Brattain invented the 'point-contact transistor' at AT&T's Bell Labs in New Jersey. Though the idea was mainly Shockley's, he wasn't actually involved in the creation of the device, and he isn't named in the original patent filed by Bardeen and Brattain, who built the first working prototype of the transistor. Since Shockley was Bardeen and Brattain's supervisor, Bell Labs insisted he be credited as well.
Shockley, having come up with the original idea on his own, though struggling to implement it properly, apparently resented this immensely. It led him to develop an entirely different device, the 'junction transistor,' which worked better than the one Bardeen and Brattain had built, and to build the prototype on his own to cut them out of claiming credit. That should give you an idea of the kind of boss Shockley was. Whether or not the story is true in its details, it appears true in spirit: Shockley was apparently hell to work with.
Bardeen and Brattain got credit anyway, sharing the Nobel Prize in Physics with Shockley in 1956, a year after Shockley had moved out to Mountain View, California, to open Shockley Semiconductor Laboratory and commercialize his invention. There he hired the best talent in the area to help churn out transistors for the growing demand for easy-to-use, portable electronic switches. Among those hired were Gordon Moore and Robert Noyce, two of the most famous of what Shockley would soon call the 'traitorous eight.'
What followed seems like a petty fight, but it was a consequential one. Germanium and silicon are both semiconducting materials, but many of Shockley's young engineers felt germanium was a poor choice for a transistor since it starts to break down above roughly 180 degrees Fahrenheit, which isn't that hot when dealing with electricity. They wanted Shockley to switch to silicon for its higher heat tolerance, but Shockley refused.
With the backing of Fairchild Camera and Instrument of Long Island, NY, eight of Shockley's engineers, including Gordon Moore and Robert Noyce, resigned to form Fairchild Semiconductor in 1957. Led by Noyce, Fairchild would grow into the most important company in the history of the Santa Clara Valley after Noyce invented the integrated circuit in 1958, independently of Texas Instruments' Jack Kilby.
The integrated circuit is the most important invention of the Computer Age. Etching first thousands, then hundreds of thousands, then millions, and finally billions of transistors onto a single chip of silicon, the integrated circuit powers the modern computer, performing many trillions of switching operations a second that allow computers to pull off all kinds of incredible feats of calculation.
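The leap from on/off switches to calculation can be sketched in a few lines of Python. Treating each transistor as an ideal switch, a handful of them form a logic gate, and gates compose into arithmetic; the half-adder below is the textbook construction, not any particular chip's circuitry:

```python
# A transistor modeled as an ideal switch: NAND is a universal gate,
# so every other gate (and ultimately arithmetic) can be built from it.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def xor(a: int, b: int) -> int:
    # Classic four-NAND construction of XOR
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a: int, b: int) -> tuple:
    """Add two bits, returning (sum, carry)."""
    return (xor(a, b), and_(a, b))
```

Chain half-adders together and you have binary addition; a real chip does nothing conceptually different, just billions of times over, billions of times a second.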
The invention of the integrated circuit couldn't have come at a more opportune time for Fairchild Semiconductor and the other aerospace and electronics companies in this growing stretch of the Santa Clara Valley. By now, word had gotten out about the Stanford Industrial Park, and companies from all over the country were opening up offices in the valley. The aerospace companies, in particular, would soon take on an important new role that would catalyze the area's final transformation into the high-tech industrial behemoth we know today.
Sputnik changes everything
In 1957, the Soviet Union stunned the world by launching the Sputnik-1 satellite into orbit around the Earth, the first human-created object to do so.
'Freak out' would probably be the best way to describe the US government's response to being caught completely off-guard and flat-footed by the Soviets' achievement. Everybody knew that both the US and the Soviet Union were hoping to put a satellite into orbit by the end of the decade, but no one expected that the Soviets would do it first and with so large a satellite.
At 187 pounds, Sputnik-1 would barely register in the cargo manifest of SpaceX's Falcon Heavy, much less the Saturn V rocket that would send the Apollo 11 astronauts to the moon just over a decade later, but in 1957 getting something as heavy as Sputnik-1 into orbit was something the US genuinely had no idea how to do.
What the US had been planning to launch through Project Vanguard—the name of its satellite program—weighed only 3.5 pounds. When a launch vehicle was tested in December 1957 with the Vanguard satellite on board, the rocket lost thrust a few feet above the launch pad and fell back to the ground, exploding into a massive fireball before the assembled press. The papers called it alternately 'flopnik' and 'kaputnik,' the latter no doubt in honor of all the former Nazi rocket scientists the US government had enlisted into its service after the war.
Adding to the sense of crisis was the fact that the Soviets had successfully tested an intercontinental ballistic missile (ICBM) just weeks before the Sputnik-1 launch, one also built with the assistance of a stable of Nazi rocket scientists captured by Soviet forces after the war. The American public, as well as its government, began to panic about how far they had fallen behind the Soviet Union, a deficit that became known as the 'missile gap.'
President Eisenhower attempted to reassure the American public that Sputnik-1 was no real cause for alarm, but with the satellite passing overhead every 90 minutes and visible in the night sky, Americans couldn't escape the dread that the Soviet Union was beating them. Anyone who lived through it can probably tell you where they were when they first heard the news; it was that kind of shock to the American psyche.
Then-Senate Majority Leader Lyndon B. Johnson, Democrat from Texas, was hosting a BBQ when he heard the announcement of Sputnik's launch on the radio. He walked his guests down to the river by his Texas ranch that evening and kept coming back to Sputnik. He said of that night: "Now, somehow, in some new way, the sky seemed almost alien. I also remember the profound shock of realizing that it might be possible for another nation to achieve technological superiority over this great country of ours."
The nation turned its collective anger on Eisenhower, the Supreme Allied Commander of the Second World War, whom, until then, Americans had known only as the war hero who defeated Hitler. Now he was a doddering old fool playing golf while the Soviets took over space on his watch. The governor of Michigan, G. Mennen Williams—a Democrat, to be fair; Eisenhower was a Republican—wrote a poem that sums up the nation's mood:
Oh little Sputnik, flying high
With made-in-Moscow beep,
You tell the world it's a Commie sky
and Uncle Sam's asleep.
You say on fairway and on rough
The Kremlin knows it all,
We hope our golfer knows enough
To get us on the ball.
The birth of the US Military-Industrial Complex
It would take some time for the US to catch up to the Soviets, and in that time the Soviet Union would launch another satellite, Sputnik-2, in November 1957. This time there was a passenger on board, a dog named Laika, who became the first living being ever to reach outer space. She died within hours of the launch, but Americans didn't know that. All they knew was that the oceans that had protected the United States from the savage destruction of the two world wars couldn't protect them any longer.
The Soviet Union was supposed to be in ruins after the war—though it had a post-war boom of its own—and it didn't seem possible for the Russians to recover so quickly that they could shoot right past the Americans and take the technological lead. (They could, but they wouldn't hold onto it for very long.)
There was no way for Americans to know any of this, or that Soviet efforts to maintain the appearance of parity with the United States would eventually bankrupt the Soviet Union 34 years after Sputnik. All they knew was that Sputnik was out there, blinking away overhead in the night sky, and broadcasting some strange communist morse code into their radios.
There was a sense that mobilization would be needed, just as there'd been after Pearl Harbor.
In response, Eisenhower and Congress directed increased funding into the US space program and missile development, pouring money into what Eisenhower would later call the Military-Industrial Complex. With the space program getting bogged down in competing agencies and a lack of focus, Congress authorized the creation of the National Aeronautics and Space Administration (NASA) in 1958, along with the Advanced Research Projects Agency (ARPA)—the word Defense would be tacked onto the front later, making it DARPA—which would fund research into new and unproven technologies.
In Congress, no one would have believed in 1941 that all those university physicists with their incomprehensible theories about atomic structures—or whatever it was they were talking about—would end up holding the key to national survival. But that is precisely where Congress found itself in 1945, and in 1958 it turned to the scientists and engineers to do it all again. Only this time, circumstances would be much different than they were in the 1940s.
By 1958, the US military was essentially the military of the 'Free World.' France, Britain, and West Germany, along with the rest of NATO, contributed a fraction in real dollars of what the United States was spending. This was largely out of necessity, of course; none of these countries was in any real position to rebuild its military to its former strength, but it would prove consequential in time.
It was also seen by some to be in the interest of peace, at least as far as the thinking in the US went at the time, that the former belligerents—especially Germany—not build up their armed forces beyond what was necessary for immediate self-defense.
France and Britain, which still had colonial possessions after the war, maintained stronger militaries that were more or less colonial pacification forces, and both failed miserably at that task, revealing how weakened the two countries had become. Independence movements sprang up throughout the colonies of both empires, and most achieved independence by the late 1960s.
The US prohibited military rearmament in the constitution it wrote for Japan after the war—though the prohibition has been interpreted to allow a form of national guard called the Self-Defense Forces. And while it wasn't without resistance or controversy, ardent pacifism took root in the only nation ever to suffer a nuclear attack. Here, too, the United States would provide security guarantees against attack for Japan and other Asian nations like South Korea.
The world, understandably, was sick of war. Most nations were willing to follow the US's lead on military matters, and the US seemed more than willing to shoulder the weight of confronting the Soviet Union—or, in Japan's case, Communist China and later North Korea—militarily should it ever come to that.
As a result of this global dynamic, the US military never really demobilized the way the rest of the world did—other than the Soviet Union, obviously. With a new enemy on the horizon, the War Department was reorganized into the Department of Defense in 1947, and US military infrastructure wasn't dismantled and mothballed to await the next war, as it had been after the First World War.
Instead, while funding decreased from peak wartime levels for a few years after 1945, the start of the Cold War in earnest with the Soviet-backed coup in Czechoslovakia in 1948 saw funding begin to climb back toward its wartime highs.
As the US government adopted a policy of containment toward the Soviet Union and communism around the world, it engaged in relatively smaller-scale conflicts, starting with the Korean War in 1950. From then on, the military budget would be several times as large as it had been during the demobilization of 1946 to 1948, and it would only grow from there.
The other major difference between 1941 and 1958 was the explosion of American GDP after the war. In 1940, the US was still in a Depression; by 1958, it had more money available than the country had ever thought possible, and it was plowing much of it into the military—as it still does to this day—with few serious questions asked at the time about where the military was spending the money.
The official US response to Sputnik: Silicon Valley
This was the American political climate when Sputnik-1 launched in 1957, so when the US government made it official policy never to be caught technologically behind the Soviet Union again, it had both the will and an abundance of resources to make sure it never was. By 1958, the government had seen how important technology was to defense and how it had won the Allies the war. It also knew that you couldn't predict which discoveries would end up being game-changers, so it would invest in all of them without prejudging the outcome.
As far as research and development went, out in the growing boomtowns of Palo Alto, Mountain View, Sunnyvale, and Cupertino, there was always funding available for new projects from DARPA, NASA, or some other division of the Defense Department, and the Santa Clara Valley tech companies took full advantage.
The money would be there for anyone to develop a good idea as well as a bad one. With the national mood in a state of near hysteria in the years following Sputnik, the only answer Congress and Presidents Eisenhower, Kennedy, Johnson, and Nixon had to Sputnik and the 'missile gap' was to throw money at anything that looked like promising technology that could give the US a leg up on the Soviets.
This was instrumental in helping foster a culture of risk-taking and innovation in Santa Clara that the more established and conservative technology and aerospace firms in the eastern US could not replicate and, just as importantly, in a way that private investors would never have tolerated.
Most important of all, these same government agencies had demands for technology that consumer markets could never generate. Tech companies rightly fixate on user needs, experiences, and stories, but only the US military could have a product requirement like landing a human being on the moon and returning them safely. This is also where Europe's diminished military capacity after the war left it without a similar engine for technological innovation, one that only the US and the Soviet Union could sustain.
Britain, for instance, had built a digital computer before the end of the war, as early as 1943. It had one of the most brilliant computer scientists in history, if not the most, in Alan Turing, who quite literally developed the theoretical foundation for modern computing as a graduate student in 1936 and then used those insights to break the nearly unbreakable encryption on Nazi communications during the war.
But the Colossus never became a household name the way ENIAC and UNIVAC did, for two reasons. First, the British government kept it secret until the 1970s; second, and more importantly, the government didn't have the resources to invest heavily in the development of computer technology, and neither did British businesses. And the British were in much better shape than France, Germany, or Japan.
Britain would continue to play a role in the development of computer technology, but it was around this time, the 1960s, that the US simply pulled away from everyone else and never looked back, with DARPA funding as much as 70% of all research on computer technology in the early part of the decade.
For example, DARPA challenged researchers in the 1960s to develop a network of computer systems that could survive a Soviet attack, so that if Soviet missiles destroyed one university research center, its work would be preserved elsewhere. That led researchers to create ARPAnet, which became the Internet we know today. The ARPAbet, a series of symbols representing the sounds of spoken English, was developed with DARPA funding starting in 1971 and served as the bedrock research behind modern voice recognition and synthesis like Siri or Google's text-to-speech API. There are literally hundreds of programs like these that DARPA has funded.
Meanwhile, the Department of Defense was greatly expanding the Minuteman missile program and needed integrated circuits to build the guidance systems: lots of them.
"Santa Clara County," writes Thomas Heinrich, assistant professor of business and industrial history at Baruch College in New York, "produced all of the United States Navy's intercontinental ballistic missiles, the bulk of its reconnaissance satellites and tracking systems, and a wide range of microelectronics that became integral components of high-tech weapons and weapon systems."
"The Minuteman program was a godsend for us," said Charlie Sporck of Fairchild Semiconductor. "The military was willing to pay high prices for performance. How does the small company compete against the giant [Texas Instruments] or Motorola? It has to have something unique. And then it has to have an outlet. Certainly, the military market was very important for us."
Autonetics, a division of North American Aviation, had won the contracts for the new Minuteman II guidance computers, and it went all-in on integrated circuits over the discrete circuits that had been used exclusively in the Minuteman I guidance system. The new Minuteman II guidance computer used about 2000 integrated circuits alongside about 4000 discrete circuits, and Autonetics produced performance comparisons between the two missile generations to promote its design to the military.
Jack Kilby, co-inventor of the integrated circuit, who worked at Texas Instruments at the time—one of the top three suppliers of integrated circuits for the Minuteman project—said: "In the 1960s these comparisons seemed very dramatic, and probably did more than anything else to establish the acceptability of integrated circuits to the military."
As the Cold War tensions rose in the 1960s, production of the Minuteman II missiles ramped up considerably, with six to seven missiles being built every week in 1964. At that rate, the top three semiconductor suppliers for the program—Texas Instruments, Westinghouse, and RCA—alone needed to produce over 4000 integrated circuits every week to keep up with the demand.
And then there was NASA to consider. While not officially part of the military, NASA relied heavily on the same military contractors to supply the electronics for the space program, especially for Apollo. Fairchild Semiconductor, which was not as keen on military contracts as many other companies—though it still took them—had no hesitation when it came to NASA and the Apollo program.
In 1962, NASA announced that the Apollo program's guidance computers would use integrated circuits based on a design by Fairchild, and Fairchild would be the main supplier for these chips, with Texas Instruments and Philco-Ford as secondary production suppliers. Each Apollo guidance computer used about 5000 integrated circuits; about 75 computers were built over the next 13 years, and about 25 of these actually flew on missions.
Those weren't the only NASA systems that required integrated circuits, though. By the middle of the 1960s, NASA was buying 60% of all the integrated circuits made in the country. Fairchild sold NASA 100,000 integrated circuits for the Apollo program in 1964 alone.
This ferocious demand for integrated circuits in the 1960s provided both the pressure necessary to ramp up mass production of the expensive devices and the revenue needed to build the capacity to actually meet those production targets.
According to Paul Ceruzzi, curator of Aerospace Electronics and Computing at the Smithsonian Institution, over the course of the Apollo contract "from the initial purchase of prototype chips to their installation in production models of the Apollo computer, the price dropped from $1,000 a chip to between $20 and $30. The Apollo contract, like the earlier one for Minuteman, gave semiconductor companies a market for integrated circuits, which in turn they could now sell to a civilian market."
That enormous infusion of money overwhelmingly benefited the companies in the Santa Clara Valley. By 1961, the Pacific region led the country in military prime contract awards, receiving 27.5% of all Defense Department contracts. In 1963, nearly the entire market for integrated circuits consisted of these military- and space-related contracts, as did about 95% of the market in 1964. Over the course of the 1960s, California brought in a fifth of all defense prime contracts worth $10,000 or more, and almost half (44%) of all NASA subcontract awards went to California-based companies.
By the end of the decade, Americans had walked on the moon thanks to the companies of the Santa Clara Valley, and those same efforts had transformed the entire region. Stanford and UC-Berkeley expanded their master's and Ph.D. programs to supply the trained workers the industry needed, and business was so good that companies could start investing in new ventures themselves.
Easy money, win or lose, is what made Silicon Valley
Ultimately, this environment, free from the business consequences of failure, produced a distinct culture among the people who worked at these companies or studied engineering at Stanford or the nearby University of California, Berkeley. It trained an entire generation of industry leaders in the Santa Clara Valley to lead differently and to approach problems much differently than their counterparts at more conservative firms.
Companies on the east coast, like Digital Equipment Corp, IBM, and others, had more established traditions that they were able to maintain no matter how much money the military or NASA threw at them. The companies that filled the Santa Clara Valley, however, were newer and came to define themselves by the lessons they learned in the 10 to 15 years after Sputnik.
Theirs was a culture of personal networks built out of a decade of collaboration mixed with competition, wrapped up in the ability to hop from company to company without penalty: unlike Massachusetts, home of the tech-heavy Route 128 corridor, California bans non-compete clauses in employment contracts. Most important of all, they possessed the learned conviction that failure is just another step toward success, rather than the end of one's efforts.
The changes in the Santa Clara Valley in the 1960s were visible even to those who weren't paying attention. By 1960, the farm section of the local daily paper, the San Jose Mercury News, had been reduced to a one- to two-page update in the Sunday edition, and the paper's focus had decidedly shifted toward covering the latest developments in the growing tech industry.
In 1960, the paper reported that Stanford University was constructing a two-mile-long linear accelerator at a cost of $125 million—funded by the US Atomic Energy Commission, the forerunner of the US Department of Energy—and that the construction ensured Stanford would have the largest concentration of nuclear research facilities on the planet.
It reported in 1963 that Stanford Industrial Park had grown to include 40 companies employing 11,500 people, half of them in electronics. Stories that a decade earlier might have covered crop yields and plum prices now ran under headlines like "Tiny gadget helps woman's heartbeat after coronary," "Superheat reactor powers generator," and "San Jose engineers expand."
In 1968, Robert Noyce and Gordon Moore left Fairchild Semiconductor to co-found Intel, and three years later Intel marketed the world's first microprocessor, the Intel 4004. While the term integrated circuit covers all kinds of components, from memory circuits to input-output controllers to logic units, the microprocessor is different in that it combines those functions into the complete central processing unit of a modern computer on a single chip.
The microprocessor could do the work of an entire computer system, so that in 1975 an astronaut on the final Apollo mission carried in his pocket a calculator, the HP-65, with more raw processing power than the computer piloting his spacecraft. The radical pace of this change, driven by Moore's Law—the exponential growth in processing power from the compounding miniaturization of the silicon transistor—would govern the explosive increase in the microprocessor's computing power for the next 30 years.
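The arithmetic behind Moore's Law is worth pausing on. The following is a minimal sketch, assuming the commonly cited idealization that transistor counts double every two years and starting from the Intel 4004's roughly 2,300 transistors; the smooth doubling curve is a simplification for illustration, not historical data:

```python
# Idealized Moore's Law: transistor counts double roughly every two
# years. The ~2,300-transistor starting point is the commonly cited
# figure for the Intel 4004 (1971); the smooth curve is an idealization.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count for a given year under the idealized law."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001):
    print(year, round(transistors(year)))
```

Under this idealization each decade multiplies the count by 2^5 = 32, which is why three decades of the curve turn a few thousand transistors into tens of millions.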
The Santa Clara Valley was at the center of all of this. On January 11, 1971, the name that would forever define this stretch of the United States officially entered the lexicon with journalist Don Hoefler's article in the local trade newspaper, Electronic News, entitled "Silicon Valley, USA."
Pete Carey, a business and technology reporter for the San Jose Mercury News, wrote of the name: "At first it was a rather self-conscious term, requiring a lot of hubris to repeat with any conviction. But the phenomenal growth in size and importance of the area has made the term recognizable nearly everywhere. Outside northern California, a relative handful of people have heard of Palo Alto, Mountain View, Sunnyvale, Cupertino, and San Jose, but the world knows where to find Silicon Valley."
This transformation of computers from a strictly military technology into an industrial and commercial one began in earnest in the 1970s, as the cost of the integrated circuit—and by extension, the new microprocessors—fell to the point where non-military applications became affordable. As spending on NASA and the military began to slow in the 1970s, the companies that made up Silicon Valley were by then well-established, mature firms.
Over time, they found industrial and commercial applications for the new technology to replace the military contracts that had enabled it to reach maturity. Through the 1970s, a new generation of industry leaders, like Bill Gates and Steve Jobs, came up through the pipeline, with two generations of business and technology leaders there to mentor them.
The drastic reduction in the cost of microelectronics over the preceding decade also enabled this generation to build for the consumer electronics market—with the Apple II computer, for instance—without needing the kind of capital investment the previous generation had required. What's more, this meant that Silicon Valley companies and the very wealthy residents of the valley themselves were able to become the primary investors in these new ventures.
By the end of the 1970s, Silicon Valley was no longer the company town of NASA and the US military that it had been. The technologies that they were able to refine and perfect in the 1960s with US government funding were successfully commercialized into industrial, commercial, and consumer products over the next couple of decades, leading to the world we live in today. And, given the prosperity of the region and the national gains that Silicon Valley's technology has provided, it's no surprise then that people want to recreate the place in their own city, state, or even nation. Everyone, it seems, now wants to have their own Silicon Valley.
Forget a new Silicon Valley; we're still debating whether the first was a good idea
Wanting to recreate Silicon Valley is tempting, but it ignores what Silicon Valley is: a unique product of a unique time in human history, one that no one with any humanity could or should want to repeat. To recreate Silicon Valley, you would need another global upheaval like the one that followed the Second World War. Climate change could present that kind of opportunity, which should give you an idea of the enormous pressures required and the hardships involved.
Given those kinds of pressures, though, producing another Silicon Valley wouldn't be hard; it would just, ultimately, be the product of intense fear and anxiety and a destroyed world, built by shoveling your country's wealth into a single industry at the expense of almost everything else. That's assuming yours is the country with the resources to invest after all is said and done. Global calamities are unpredictable things, and we all stand behind the veil of ignorance when it comes to the future.
Moreover, like the original, those in this new Silicon Valley might well forget the circumstances that put them at the top of the world's technological hierarchy in the first place. Extreme concentrations of wealth inevitably create social tensions. Issues that may seem to have been settled long ago, like basic regulation, can become major controversies again.
A Silicon Valley company may be willing to invest in a startup or fund a coding boot camp, but it may be increasingly resistant to paying taxes that would fund public education. Some of the most vocal residents of the original Silicon Valley remain convinced that the government is and always has been an obstacle to their success, not the prime mover of it, and they act on that belief to the detriment of the social fabric.
In the end, you might have a perverse form of the 'resource curse' on your hands, where the immediate concentration of so much wealth does not enrich your society as expected but instead leads to the heightened wealth and income inequality, social unrest, corruption, and democratic backsliding often seen in the developing world.
The countries that recovered from World War II but missed out on their own Silicon Valley were able to invest instead in universal healthcare, education, and more generous social benefits. These countries consistently rank higher on the global happiness index than the United States, so, all things considered, having a Silicon Valley doesn't appear to add much to our quality of life. Quite the opposite, even: not a week seems to go by without a new study suggesting that these technologies may be increasingly incompatible with our basic human needs, to the point that even those closely connected to Silicon Valley have started to fear what they've created.
While producing a new Silicon Valley might sound like winning the lottery, it's a trade-off, and it always has been; we're only now starting to realize the consequences. In the end, these may balance out or tilt toward the beneficial, but we aren't there yet, so we don't know whether Silicon Valley will ultimately be judged a blessing or a scourge. We should probably figure that out before we go off trying to reproduce it somewhere else.