The Internet and Computing: Crash Course History of Science #43


We’ve talked a
lot about advances in biotech. But none of those could have happened without
advances in computing. It’s time to get back to data and explore
the unlikely birth, strange life, and potential futures of the Internet. The theme of the history of computing is that
what we mean by “computing” keeps changing. With the invention of the transistor in 1947,
the computer started to shrink! And speed up! And change meaning yet again, becoming a ubiquitous
dimension of contemporary life—not to mention a totally normal thing to yell at. Hey Google… can you roll the intro? [long pause] Google: I’m not sure. [Intro Music Plays] In 1965, Electronics Magazine asked computer
chip pioneer Gordon Moore to do something scientists are generally taught not to do: predict the
future. Moore guessed that, roughly every year, the
number of electronic switches that people could squeeze onto one computer chip would
double. This meant computer chips would continue to
become faster, more powerful, and cheaper—at an absolutely amazing rate. Which might have sounded suspiciously awesome
to readers. But Moore’s prediction came true! Although it took eighteen months for each
doubling, and, arguably, this was a self-fulfilling prophecy, since engineers actively worked
towards it. Moore went on to serve as CEO as Intel and
is now worth billions. His prediction is called “Moore’s law.” Think about what this means for manufacturers:
they keep competing to invent hot new machines that make their old ones obsolete. The same applies to methods of data storage.
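That doubling compounds quickly: at one doubling every eighteen months, transistor counts grow roughly a hundredfold per decade. Here is a minimal sketch of the arithmetic; the starting point (the Intel 4004's roughly 2,300 transistors in 1971) is an illustrative assumption, not a figure from the episode.

```python
# Project transistor counts under Moore's law: one doubling every 18 months.
# The 1971 starting point (Intel 4004, ~2,300 transistors) is illustrative.

def moores_law(start_count: int, years: float, doubling_months: float = 18.0) -> int:
    """Return the projected transistor count after `years` of steady doubling."""
    doublings = (years * 12.0) / doubling_months
    return round(start_count * 2 ** doublings)

if __name__ == "__main__":
    for year in (1971, 1981, 1991, 2001, 2011, 2021):
        print(f"{year}: ~{moores_law(2_300, year - 1971):,} transistors")
```

If you run it, the later projections overshoot real chips by a wide margin, one sign that the doubling cadence has slowed in recent years.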
Today, engineers face big questions about the physical limit of Moore’s law. Even with new tricks here and there, just
how small and fast can conventional chips get? Currently, teams at different chip manufacturers
are working to create transistors at the nanometer scale. IBM made a whole computer that’s only one
millimeter by one millimeter wide and is about as fast as a computer from 1990. As computers became smaller and cheaper, they
moved from military bases to businesses in the 1960s and to schools and homes by the
late 1970s and 1980s. And computers changed these spaces. People got used to using them for certain
tasks. But computers were pretty intimidating. Manufacturers had to make them work better
with people. So in 1970 the Xerox Corporation founded the
Palo Alto Research Center—known as Xerox PARC. Here, researchers invented many features of
modern computing. In 1973, they came up with the Xerox Alto,
the first personal computer… But Xerox didn’t think there was a market
for computers in the home yet. Other Xerox PARC inventions include laser
printing, the important networking standard called Ethernet, and even the graphical user
interface or GUI—which included folders, icons, and windows. But Xerox didn’t capitalize on these inventions. You probably know who did. In the 1970s, two nerds who dropped out of
college started selling computers you were meant to use at home, for fun and—you know,
to do… stuff, whatever you wanted. In retrospect, that was the genius of the
Apple II, released in 1977. Along with decades of shrewd engineering and
business moves, fun made video game designer and meditation enthusiast Steve Jobs and engineer
Steve Wozniak into mega-billionaires. They had a commitment to computing for play,
not always just work. And they weren’t alone. In 1981, IBM started marketing the PC powered
by the DOS operating system, which they licensed from Microsoft—founded by Harvard dropout
Bill Gates in 1975. By 1989, Microsoft’s revenues reached one
billion dollars. You can find out more about college dropouts-turned-billionaires
elsewhere. For our purposes, note that some of the inventors
who influenced the future of computing were traditional corporate engineers like Gordon
Moore. But increasingly, they were people like the
Steves who didn’t focus on discoveries in computer science, but on design and marketing:
how to create new kinds of interactions with, and on, computers. Compare this to the birth of social media
in the early 2000s: new social spaces emerged on computers. And connecting computers together allowed
for new communities to form—from Second Life to 4chan. For that, we have to once again thank U.S.
military research. ThoughtBubble, plug us in. Back in the late 1950s, the U.S. was really
worried about Soviet technologies. So in 1958, the Secretary of Defense authorized
a new initiative called the Advanced Research Projects Agency, or ARPA—which later gained a "D" for Defense, becoming DARPA. DARPA set about solving a glaring problem:
what happened if Soviet attacks cut U.S. telephone lines? How could information be moved around quickly,
even after a nuclear strike? A faster computer wouldn’t help if it was
blown to bits. What was needed was a network. So in part to defend against information loss
during a war—and in part to make researchers’ lives easier—DARPA funded the first true
network of computers, the Advanced Research Projects Agency Network, better known as ARPANET. People give different dates for the birthday
of the Internet, but two stand out. On September 2nd, 1969, ARPANET went online. It used the then-new technology of packet
switching, or sending data in small, independent, broken-up parts that can each find their own
fastest routes and be reassembled later. This is still the basis of our networks today! At first, ARPANET only linked a few universities. But it grew as researchers found that linking
computers was useful for all sorts of reasons, nukes aside! And then, on January 1st, 1983, several computer
networks including ARPANET were joined together using a standard way of requesting and sharing
information: TCP/IP. This remains the backbone of the Internet
today. Meanwhile, French engineers had created their
own computer network, Minitel, connected through telephone lines back in 1978—five
years before TCP/IP! Minitel was retired in 2012. And the Soviets developed their own versions
of ARPANET. But after 1991, these joined the TCP/IP-driven
Internet, and the virtual world became both larger and smaller. The Internet in the 1980s was literally that:
a network interconnecting computers. It didn’t look like a new space yet. For that, we can thank British computer scientist
Sir Tim Berners-Lee, who invented the World Wide Web in 1990. Berners-Lee pulled together existing ideas,
like hypertext and the internet, and built the first web browser to create the beginnings
of the functional and useful web we know today. The Web had profound effects. It brought the Internet to millions of people—and
brought them into it, making them feel like they had a home “online,” a virtual place
to represent themselves, meet strangers all over the world, and troll educational video
shows! The Web also democratized the tools of knowledge
making. From World War Two until 1990, building computers
and using them to do work was largely the domain of elites. A short time later, we can trade software
on GitHub, freely share 3D printing templates on Thingiverse, and benefit from the collective
wisdom of Wikipedia. It’s as if the Internet now contains not
one but several Libraries of Alexandria. They’ve radically changed how we learn and
make knowledge. Scientific journals, once invented as printed objects, have
since 1990 moved online—though often behind steep paywalls. Remarkably, Russian philosopher Vladimir Odoevsky predicted way back in 1837—in The Year 4338—that our houses
would be connected by “magnetic telegraphs.” But this came true only one hundred and fifty
years later—not two millennia! So what will happen in another hundred and
fifty years? Well, computing seems to be changing unpredictably. Not only because computers are still getting
faster, but because of at least three more fundamental shifts. One, scientists are experimenting with quantum
computers, which work in a different way than “classical,” binary ones: instead of bits that are strictly 0 or 1, they use qubits that can exist in a combination of both states at once. This is called superposition, and it has the
potential to make the computers of the future much faster than today’s. This could lead to major shifts in cryptography:
the current method of protecting our credit cards works because classical computers aren’t
strong enough to factor very large numbers quickly. But a quantum computer should be able to do
this kind of math easily. To date, however, quantum computers are not
yet finished technologies that engineers can improve, but epistemic objects: things that
scientists are still working to understand. So will quantum computing change everything? Or mostly remain a weird footnote to classical
computing? I don’t know… we’ll find out! Fundamental shift two: some researchers across
computing, history, and epistemology—the branch of philosophy that asks, what counts
as knowledge?—wonder if really really large amounts of data, called Big Data, will change
how we do science. One of the main jobs of being a scientist
has been to just collect data. But if Internet-enabled sensors of all kinds
are always transmitting back to databases, then maybe the work of science will shift
away from data collection, and even away from analysis—AI can crunch numbers—and into
asking questions about patterns that emerge from data, seemingly on their own. So instead of saying, I wonder if X is true
about the natural or social world, and then going out to observe or test, the scientist
of the future might wait for a computer to tell her, X seems true about the world, are
you interested in knowing more? This vision for using Big Data has been called
“hypothesis-free science,” and it would qualify as a new paradigm. But will it replace hypothesis-driven science? Even if AI is mostly “weak,” meaning not
like a human brain—but only, say, a sensor system that knows what temperature it is in
your house and how to adjust to the temp you want—once it’s very common, it could challenge
long-held assumptions about what thought is. In fact, many people have already entrusted
cognitive responsibilities such as knowing what time it is to AI scripts on computers
in their phones, watches, cars, and homes. Will human cognition feel different if we
keep giving AI more and more human stuff to take care of? How will society change? I don’t know… we’ll find out!!! And these are only some of the anxieties of
our hyper-connected world! We could do a whole episode on blockchain,
a list of time-stamped records which are linked using cryptography and (theoretically) resistant
to fraud, and the new social technologies it enables: like cryptocurrency, kinds of
money not backed by sovereign nations but by groups of co-invested strangers on the
Internet. Will blockchain change money, and fundamentally,
trust in strangers? Or is it just another shift in cryptography? A fad? I don’t know… we’ll find out! Let’s head back to the physical world to
look at the cost of these developments. One feature they have in common is they require
ever greater amounts of electricity and rare-earth metals. And older computers become e-waste: toxic
trash often recycled by impoverished workers at real cost to their health. Even as computers become so small they’re
invisible, so common they feel like part of our own brains, and so fast that they may
fundamentally change critical social structures like banking and buying animal hoodies on
Etsy… they also contribute to dangerous shifts of natural resources. Next time—we’ll wrap up our story of the
life sciences by asking questions about the future of medicine and the human brain that
remain unanswered as of early 2019. History isn’t finished! Crash Course History of Science is filmed
in the Cheryl C. Kinney Studio in Missoula, MT and it’s made with the help of all these
nice people. And our animation team is Thought Cafe. Crash Course is a Complexly Production. If you want to keep imagining the world complexly
with us you can check out some of our other channels like Animal Wonders, The Art Assignment, and
Scishow Psych. And if you would like to keep Crash Course
free forever for everyone, you can support the series on Patreon, a crowd funding platform
that allows you to support the content you love. Thank you to all our patrons for making Crash
Course possible with your continued support.
