Getting a new technology out to consumers will usually require good people and boat loads of resources – including money. Generally, lots of money.
A lot of that money will be spent on research. And, more often than not, the hard research needed to translate a great idea into a usable product will be performed by someone whose job title isn't "entrepreneur." Sometimes, in fact, the research will be done by individuals who are barely aware that their innovations have any commercial value at all.
If you're here because you want to get the jump on cutting-edge technologies, then you may want to keep an eye on the organizations that are known to produce practical research.
Knowing who's big in research, who's funding it, and where the big bucks are being spent can give you useful insights into what might be coming next. From there, you're just a step away from, say, spending time learning the tools that'll come with the new tech or positioning yourself to profit when it finally shows up.
This chapter was taken from the book, Keeping Up: Backgrounders to All the Big Technology Trends You Can't Afford to Ignore. If you'd prefer to watch this chapter as a video, feel free to follow along here:
Who Funds Commercial Science and Why?
Once upon a time, major breakthroughs in serious scientific research were the products of private patronages. The Italian Medici family, for instance, famously supported many individuals whose work would prove to be pivotal, including Leonardo da Vinci and Galileo.
But the years leading up to the Second World War saw the scope and complexity of research projects growing far beyond the capacity of private support. The war's dependence on unprecedented technological complexity – exemplified by the work of the Manhattan Project building the atom bomb – pushed more and more research under government charge.
Government involvement in research has continued in the generations since the war. Still, it's been estimated that universities and governments are responsible for only 30% of research funding between them, with most of the rest provided by private industry (see this article for more info).
Let's see how that breaks down.
Funding from Taxpayers
Democratic governments, of course, don't spend their own money, of which they traditionally have none. Their many programs and services are funded by revenues raised, one way or another, from their capital assets and from their populations. In modern nation states, "populations" would mean those individuals and corporations who pay taxes.
Public research and development can be performed within government agencies. According to the terms of some agency mandates, research results must immediately enter the public domain.
But even those who retain rights to their research will often point their work towards businesses and institutions that can use it productively. The US National Science Foundation (NSF), for instance, uses its $8 billion annual budget to fund "approximately 25 percent of all federally supported basic research conducted by America's colleges and universities" (Source).
Other American agencies do much or all of their research in-house. Here are some examples:
- The National Institute of Standards and Technology (NIST) has a mandate to "promote innovation and industrial competitiveness." One very important part of that mission is maintaining the National Vulnerability Database (NVD), which plays a foundational role in the vulnerability assessment and detection systems that protect our IT infrastructure.
- The US military's Defense Advanced Research Projects Agency (DARPA) collaborates with private and public sector partners to aid in the development of emerging technologies. Work in recent years has included research into robotics and autonomous vehicles, but you might be more familiar with a DARPA innovation from a few decades ago: the internet.
- The National Institutes of Health (NIH) employs 6,000 research scientists across 27 research institutes and centers. Their "mission is to seek fundamental knowledge about the nature and behavior of living systems and the application of that knowledge to enhance health, lengthen life, and reduce illness and disability."
The complete list of US government research agencies makes for quite a read. Take a look for yourself.
Naturally, governments of other countries have their own research agencies. One example is Canada's National Research Council (NRC), which has evolved from its military technology origins through the two world wars, to its current focus on partnerships with private and public-sector technology companies.
The NRC now divides its work into four "business lines:"
- Strategic research and development
- Technical services
- Management of science and technology infrastructure
- NRC-Industrial Research Assistance Program (IRAP)
As I mentioned when discussing the NSF, a significant proportion of taxpayer funds directed towards research and development are granted to public and private colleges and universities.
But, from the college perspective, how much academic R&D funding comes from government sources?
A 2016 review of the 20 US colleges that spent the most on R&D found that they each spent between $837 million and $2.4 billion. It also found that between 47 and 87 percent of their total spending came from government sources of one sort or another (see this article for more info). By contrast, businesses provided only between 2 and 22 percent of that funding.
Private Charitable Funding
While we're on the subject of academic research, we shouldn't ignore a third source of funding: private endowments. Some – although not all – permanent endowments were targeted by their donors at research activities. Although the fund capital can't be spent each year, the income that capital generates can.
Harvard university famously – or perhaps infamously – has a total endowment greater than 40 billion dollars. Some of that undoubtedly finds its way to R&D.
According to that 2016 study, Harvard's total R&D spending that year – including activities funded by governments (52.1%), businesses (4.7%), and endowments – was just over one billion dollars.
Of course, donations support plenty of research outside of academic settings, too. Most serious diseases have associated charitable foundations that exist to raise money for both victim care and medical research.
Also, many thousands of registered non-profits exist throughout the world supporting non-medical causes, including many that involve technology-related research. The Bill & Melinda Gates Foundation is a particularly well-known example.
Funding from Corporations
Technology-oriented companies have a strong interest in getting their hands on innovations before their competition. To improve their chances, many will run their own research labs in-house.
The Bell Telephone Company, for instance – and its successors including American Telephone & Telegraph Company (AT&T) – maintained the active and enormously creative Bell Labs. Bell Labs, under various names, was responsible for many innovations, including the transistor, lasers, and the Unix operating system.
At some companies, individual technologists are themselves important sources of innovation. 3M, for instance, has what it calls a "15% Culture," where employees are allowed to use company time and space to pursue research based on their own ideas and interests. Over the years, the program has generated successful products for the company, including Post-it Notes.
In another example, Percy Spencer, working on radar for US defense contractor Raytheon, accidentally discovered that microwaves could cook food.
It should be noted that not all corporate innovation is truly home-grown. A lot of it is actually funded indirectly through government money in the form of tax incentives or credits. Under such programs, companies may be permitted to use research-related spending (including salary expenses) to reduce the income taxes they would otherwise pay.
Major Fields of Commercial Technology Research
Trying to grasp the full scope of technology development at this point in history is a hopeless task. There's serious innovation going on every minute of the day, in every time zone, in countless labs, office towers, warehouses, garages, basements, bedrooms and, of course, invisibly within creative people's minds.
No one's keeping track of it all because it's not possible. Not to mention the fact that much of that innovation happens under a thick shroud of secrecy.
But it's probably worth offering just a couple of examples to give you a feel for where to look.
Quantum Computing (and Why We Should Care)
A cousin of mine with an advanced physics degree from Cambridge University once tried to explain quantum mechanics to me. He failed. Miserably. My poor old brain just couldn't absorb it. So don't expect any full, measured descriptions of the underlying science here.
Instead, I'll try to show you how experimental compute technologies that depend on the physics might work, and what can be done with them.
The super-quick executive summary is that computers powered by one quantum technology or another will work far faster than even the fastest supercomputers we have now.
So much faster, in fact, that they may be able to solve problems that would be simply unfeasible using traditional computers (an achievement known as quantum supremacy). This would mean that some long-held assumptions about the way software works will no longer apply.
For instance, the best encryption tools we currently use to protect sensitive data work because breaking an encryption key would take hundreds or even thousands of hours of high-performance compute time. In most cases, it's just not worth the effort and expense.
But if you could easily buy time on a computer that processed operations exponentially faster, then two things would immediately happen:
- Cracking encryption algorithms would become trivial
- Honest folk would have to seriously look for a new way to protect their data
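To get a rough feel for the scale involved, here's a back-of-the-envelope sketch. All the numbers are illustrative assumptions (a classical machine testing one billion keys per second, a 128-bit symmetric key, and a quantum search needing only about the square root as many attempts, as Grover's algorithm promises – and note that it's Shor's algorithm, not Grover's, that threatens public-key encryption):

```python
import math

# Illustrative assumptions only – not benchmarks of any real system.
KEYSPACE = 2 ** 128           # possible 128-bit symmetric keys
KEYS_PER_SECOND = 1e9         # assumed classical brute-force rate
SECONDS_PER_YEAR = 3600 * 24 * 365

# Classical search must, on average, try on the order of the full keyspace.
classical_years = KEYSPACE / KEYS_PER_SECOND / SECONDS_PER_YEAR

# Grover's algorithm reduces the search to roughly sqrt(KEYSPACE) attempts.
grover_years = math.sqrt(KEYSPACE) / KEYS_PER_SECOND / SECONDS_PER_YEAR

print(f"Classical brute force: ~{classical_years:.1e} years")
print(f"Grover-style search:   ~{grover_years:.1e} years")
```

Even with the quadratic speedup, 128-bit keys hold up reasonably well here – which is why the practical advice is often "double your symmetric key lengths," while public-key schemes like RSA face the more fundamental threat.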
Currently, Google and IBM are among the major companies that have invested heavily in quantum compute research projects.
As best I can understand it, quantum computers measure the state of subatomic particles and use that binary measurement to represent a computational value. The description of that state is known as a qubit, which is effectively the quantum equivalent of traditional computing's bit.
But because a qubit can also exist in what's known as coherent superposition – meaning that, until it's measured, it holds a combination of both possible states at once – it can be used to represent a more complex range of values.
And that, I'm given to believe, means that such computers will be able to do stuff much, much faster than today's machines can. If this actually happens, it'll be big.
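The qubit idea above can be sketched in a toy model. Real quantum computing libraries (Qiskit, for example) represent qubits as complex vectors and gates as matrices; this stripped-down sketch just illustrates superposition and probabilistic measurement:

```python
import math

# Toy model: a single qubit as a pair of amplitudes (a, b) over the
# basis states |0> and |1>. Measurement probabilities are the squared
# amplitudes, so a and b must satisfy a^2 + b^2 = 1.

def hadamard(qubit):
    """Apply a Hadamard gate, putting a basis-state qubit into superposition."""
    a, b = qubit
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure_probabilities(qubit):
    """Probability of reading 0 or 1 is the squared amplitude of each state."""
    a, b = qubit
    return (a * a, b * b)

zero = (1.0, 0.0)            # the qubit starts in the definite state |0>
superposed = hadamard(zero)  # now an equal blend of |0> and |1>
p0, p1 = measure_probabilities(superposed)
print(p0, p1)                # each close to 0.5, within floating-point error
```

The point of the sketch: after the Hadamard gate, the qubit isn't "0 or 1" – it carries both possibilities at once, and only a measurement collapses it to a single bit. That richer intermediate state is what quantum algorithms exploit.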
Energy Technology Research

The modern world consumes an awful lot of energy. We're constantly moving about, controlling our indoor (and in-transit) climate conditions, exchanging information, and expecting that all the world's riches be delivered to our doorsteps. By tomorrow.
But those energy-thirsty activities come with costs, not the least of which come from the emissions they leave behind. The search for reliable, steady, and affordable energy sources that can help us find a healthy balance between consumption and emissions is ongoing – and unimaginably expensive.
Small modular nuclear reactors (SMRs) have been the focus of some serious developments in recent years. They appear to promise reliable, steady, and affordable energy in ways that their expensive and complex nuclear predecessors couldn't.
First and second generation reactors were, overall, reliable and steady – and they were clean – but their massive capital costs and large physical footprints made them more than a bit inflexible.
The idea behind SMRs is that highly efficient reactors can be manufactured off-site and delivered on trucks one module at a time for on-site assembly. The design makes power generation far cheaper per megawatt and project completion much faster. It also allows the deployment of nuclear power in smaller markets that previously couldn't consider it a realistic option.
As the name implies, SMRs are smaller than traditional reactors. They're designed to deliver between 50 and 300 MW of electricity each, compared with the 800 to 1,200 MW outputs that were previously common.
Companies heavily involved in this research include Britain's Rolls-Royce and an American company with historical connections to the US Department of Energy called NuScale Power. Various governments around the world have also invested in the technology one way or another.
Medical Technology Research
If you think we're spending a lot of money on energy, wait till you see how much health care costs.
Across the 37 Organisation for Economic Co-operation and Development (OECD) nations, health care spending accounts for around 10% of total gross domestic product. That's more than $3,000 a year for every person living in those countries.
On the one hand, with all that money being thrown around, there are undoubtedly many business opportunities waiting to be discovered. But there's also a lot of room for new and innovative technologies that can improve the delivery of health care while reducing the costs. Here are two excellent candidates:
Telehealth involves the provision of health services (like patient-doctor consultations) through a telecommunication medium. This might mean having a simple telephone conversation rather than a visit to the office, but it could also incorporate video conferencing tools or even the use of remote diagnostic equipment.
For example, small, remote communities can maintain imaging facilities and technicians even when the nearest medical labs and radiology specialists are many hundreds of miles away. Digital connections permit distant doctors to view, say, ultrasound results, speak directly with patients, and confidently reach diagnoses. And all without the need for anyone to undertake exhausting and expensive travel.
Telehealth also allows for meaningful patient-doctor contact without the risk of spreading disease.
Telesurgery is an extension of telehealth which can allow some surgical procedures even when doctors are many miles away from their patients. The technology makes use of high-definition video feeds and purpose-built robotic arms that can be controlled by doctors remotely.
Telesurgery tools have the potential to save money for cash-strapped health systems but, more importantly, they can improve health care and save lives.
So whether you're thinking about building the tech behind the next mega business model, learning a new technology, or looking for a good investment, having a good sense of where the serious tech research happens and who pays for it can be useful.
YouTube videos of all ten chapters from this book are available here. Lots more tech goodness - in the form of books, courses, and articles - can be had here. And consider taking my AWS, security, and container technology courses here.