Much of the technology we use daily was initially developed on college campuses. For a prime example of this, look no further than the internet. Many of today’s tech giants including Amazon (NASDAQ:AMZN) and Alphabet (NASDAQ:GOOG, NASDAQ:GOOGL) simply would not exist if it weren’t for the internet.
While we take the internet for granted, it wasn’t always here. It actually had its roots in a network called ARPANET, developed in the 1960s and 1970s to facilitate communication between the U.S. military and university-based researchers. From that backbone, the World Wide Web (which is now 25 years old) evolved. Even today, the World Wide Web Consortium, tasked with maintaining open web standards, is headed by World Wide Web inventor Sir Tim Berners-Lee at MIT.
The internet is one of the biggest and farthest-reaching examples, but there are plenty of other groundbreaking technologies that have been created at universities.
Better Lithium Ion Batteries
Lithium-ion batteries may have a bad rap thanks to incidents like Samsung’s exploding Galaxy Note 7, but their high energy density is critical to mobile devices like smartphones and wireless headphones. Lithium-ion batteries also make Tesla’s (NASDAQ:TSLA) electric cars possible.
MIT professor Yet-Ming Chiang is credited with making lithium-ion batteries safer, much more powerful, and faster-charging than early versions, thanks to his research in the university’s materials science labs. In 2002, he co-founded A123 Systems to commercialize the new lithium-ion technology, which was soon used in batteries for power tools, electric cars, and other devices.
The Hoana LifeBed
The Hoana LifeBed has been compared to “Doctor McCoy’s sick bay bed in Star Trek.” It uses non-contact sensors embedded in the cover of a hospital bed mattress to non-invasively provide critical patient monitoring data, including heart and respiratory rates.
The real-time vital signs data provided by the LifeBed helps medical professionals assess a patient’s health and emotional state without the use of cuffs or electrodes. Commercializing technology developed at the University of Hawaii that was originally funded by U.S. military grants, Hoana was spun off as a private venture in 2001.
Google Search
We’ve already established that Google wouldn’t exist if it weren’t for the internet, but it’s also true that Google itself probably wouldn’t exist if it weren’t for Stanford University.
While PhD students at Stanford, Google co-founders Sergey Brin and Larry Page created a search engine that used their PageRank algorithm to improve results. With the success of the search engine, which was originally hosted on Stanford’s website, the pair dropped out to launch Google as a commercial venture.
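The core insight behind Brin and Page’s ranking approach was that a page is important if other important pages link to it. A minimal sketch of that idea, using a hypothetical three-page link graph and textbook parameter values rather than anything from Google’s actual implementation, looks like this:

```python
def pagerank(links, damping=0.85, iterations=100):
    """Compute PageRank-style scores for a dict mapping
    each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    # Start with rank spread evenly across all pages.
    ranks = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page gets a small baseline share...
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # ...plus an equal slice of each page that links to it.
                share = damping * ranks[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new[p] += damping * ranks[page] / n
        ranks = new
    return ranks

# Hypothetical mini-web: A links to B and C, B links to C, C links back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
scores = pagerank(graph)
```

Here page C ends up with the highest score because both A and B link to it, which is exactly the property that made early Google results better than keyword matching alone.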
LCD Displays
LCD panels replaced CRTs to revolutionize televisions and made the laptop computer possible. Liquid crystals were discovered in 1888 and first used to create an LCD display in 1968. But it was in 1969 that researcher James Fergason at Kent State University created a “twisted nematic” LCD display that was durable and power-efficient enough to be practical.
This led to commercialization of LCD technology, starting with the first LCD watch display in the early 1970s. Kent State still operates the Glenn H. Brown Liquid Crystal Institute to further research into liquid crystal technology.
E Ink Displays
E-readers like Amazon’s Kindle and the Kobo Forma are built around E Ink displays. It’s the E Ink display that makes these devices popular, despite competition from tablets.
E Ink displays used by e-readers are high-resolution, offer ultra-long battery life (weeks instead of hours), and can be read in bright light and direct sunlight. In addition, e-readers are much lighter than tablets, and many are now waterproof as well.
E Ink technology was developed at MIT by associate professor Joseph Jacobson.
Facebook
Started by Mark Zuckerberg and several classmates while students at Harvard University as a social directory for Harvard students, Facebook exploded beyond campus to become what is now a $512 billion company with over 1 billion users. In an interview, Harvard’s Jonathan L. Zittrain (a computer science professor at the School of Engineering and Applied Sciences, in addition to being a law professor at Harvard Law School and the Harvard Kennedy School) commented on what made the university an ideal launching ground for the nascent social network:
“The college environment made for the ideal petri dish: lots of comparatively tech-savvy people eager to get to know one another, and not as guarded about privacy, especially since the early Facebook was indeed limited to those who could show a university email address.”
Artificial Intelligence
Artificial intelligence — or AI — has the potential to be the next game-changer in technology. AI is already making search better, making personal assistants like Siri and Alexa smarter, and helping automakers move toward autonomous driving.
AI is being developed by many tech companies, but the field is also constantly advanced by pioneering research at universities. Notable hotbeds for AI research include Carnegie Mellon, MIT, Stanford, and the University of Toronto. Researchers from these programs have also increasingly left campus to lead the AI divisions of tech giants; in 2015, Uber (NYSE:UBER) “gutted” Carnegie Mellon’s AI and robotics center, hiring away 50 of its staff.
GPS
Like the internet, GPS is one of those technologies we take for granted. It’s used by everything from the military to our smartphones, and companies like Garmin (NASDAQ:GRMN) have built successful businesses around GPS and GPS-related products like automobile and hand-held navigation systems.
Ivan Getting leveraged his experience at MIT’s Radiation Laboratory to eventually become a key figure in the development of the Global Positioning System (GPS).
Robotics
Technology doesn’t get much more groundbreaking than robotics, especially the quadruped robots from Boston Dynamics. YouTube videos showing these uncanny, four-legged robots in action look like science fiction, but the company has commercialized them to carry payloads up stairs, through industrial sites, and over rough terrain. Its intimidating BigDog robot was funded by DARPA for U.S. military use.
Boston Dynamics got its start as a spin-off from MIT before being acquired by Google, and then SoftBank.
Universities can’t be credited with creating entrepreneurship, but college campuses definitely drive technological innovation, foster entrepreneurs, and attract investment.
According to Carnegie Mellon University President Farnam Jahanian, faculty and students at that institution alone started 173 new companies between 2011 and 2016. Across the U.S., between 1996 and 2015, the transfer of technology from university research to the private sector contributed $1.3 trillion to U.S. gross industrial output and helped support 4.3 million jobs.
As of 2017, there were more than 200 universities and colleges with dedicated innovation or entrepreneurship centers, helping to ensure that the groundbreaking technology keeps coming.
As of this writing, Brad Moon did not hold a position in any of the aforementioned securities.