To 2014 and Beyond

Last updated 05:00 24/12/2013


It's the end of the year, which means it's prediction season, when pundits put forth their perennial prognostications about all that shall be in the following 365 days and beyond.

Like any announcement that must compete for human attention in the public sphere, the strange, bold, and surprising predictions gather the most notice.

We at the Futurist magazine love predictions, but we consider them statements as much about the speaker and the time in which she lives as about the future.

With that in mind, we have assembled a list of more than 30 predictions made in 2013, originating from researchers, A-list actors, and industry titans.

In many of these, the person who is making the prediction is as significant as what is being said. With each, we've included a big "but" or countertrend that could get in the way.

The final list is available from the Futurist magazine. But here are 10 of my personal favourites from the past 12 months.

1. As much as 45 percent of the jobs that currently exist in the United States will be taken over by computers or artificial intelligence systems by 2045.

Who made it: Nick Bostrom, director of the Future of Humanity Institute at Oxford and the Oxford Martin Programme on the Impacts of Future Technology.

Why it's a strong prediction: Yes, it's bad news. But the difference between peril and opportunity is the time available to plan. Would you rather hear that half of the jobs in the United States will be gone in the next five minutes?

The team at Oxford ran detailed models on 702 different occupations to assess the effects of computerisation on U.S. labor. This report jibes with previous statements from other experts.

Most technology that's disruptive to labor has historically produced net employment gains within 10 years.

Some economists, most notably Erik Brynjolfsson and Andrew McAfee, have suggested that the trend may soon reverse and that increased productivity through technology could begin to hurt, rather than help, long-term employment. It's one reason why even libertarians are warming to the idea of a guaranteed income for everybody.

At least the Oxford folks had a hopeful takeaway: "Wages and educational attainment exhibit a strong negative relationship with an occupation's probability of computerisation." So the smarter you are, the safer your job.


BUT: The 19th century was also filled with anxious futurists. Karl Marx, heavily influenced by the Luddites, was a tech historian and was forever fretting about unemployment through automation, as Amy Wendling describes in her book Karl Marx on Technology and Alienation.

Even Mark Twain, in Life on the Mississippi, bemoans a future in which automation takes away every man's livelihood and dignity.

"Half a dozen sound-asleep steamboats where I used to see a solid mile of wide-awake ones! This was melancholy, this was woeful. The absence of the pervading and jocund steamboatman from the billiard saloon was explained. He was absent because he is no more. His occupation is gone, his power has passed away."

Yet history shows that the process of industrialisation in the 1900s produced more employment and broad-based economic gains than it destroyed. The premise that automation and computerisation are destined to be job killers remains controversial among some very smart people.

Robert Atkinson, president of the Information Technology and Innovation Foundation, for one, takes issue with McAfee and Brynjolfsson's conclusion, arguing on the MIT Technology Review website that "far from being doomed by an excess of technology, we are actually at risk of being held back by too little technology."

Atkinson's own prediction: By 2023, the United States will have 5 percent more jobs than today.

Bottom line: The best way to plan for either future, be it the one in which your career has been lost to a robot or the one in which automation continues to create jobs, is to get smarter about technology.

If you're in a rush, 2014 could be the year to pick up an in-demand programming language like Java.

Alternatively, you could take the advice that Google research director Peter Norvig laid out in his 2001 essay "Teach Yourself Programming in Ten Years" and dedicate the next 10 years to learning programming. After all, you've got some time.

2. Massive amounts of algae for food and fuel will be grown in places that we today consider wasteland.

Who made it: Jason Quinn, of Utah State University, speaking at the Algae Biomass Summit, in Orlando, Fla., in September. Quinn modelled the algae-producing capacity of 4,388 places around the globe. He told me that "the total lipid oil yield for the world using just non-arable land is 48,719 billion litres per year."

Why it's a strong prediction: For years, halophytic (saltwater) algae have been called the super fuel of the future. On paper, it's an extremely attractive replacement for oil.

Yes, to run your car on algae, you have to burn it, which might not smell great. But it's a plant, so you can mitigate the CO2 you are releasing into the environment by growing more algae than you're burning. Also, it doesn't use up valuable freshwater. That's extremely important in a future world where a majority of the human population lives in a water-stressed environment.

Furthermore, you can grow algae in places where you can't grow conventional crops. As Dennis Bushnell, chief scientist at the NASA Langley Research Facility, wrote for the Futurist, "The Great Salt Lake could conceivably be turned into an algae pond to produce something on the order of $250 billion a year in biofuels. People are looking at turning parts of the Pacific Ocean off of South America into algae ponds."

Even the Kingdom of Saudi Arabia, synonymous with oil wealth, has announced that it will begin to produce 30,000 tons of algae by 2014 (though it's intended for food, not fuel).

BUT: People have been looking for a scalable algae fuel model for more than a century. The firm Pike Research has forecast that the global algae biofuel industry will likely be producing just 61 million gallons per year by 2020, a far cry from an amount sufficient to replace petroleum.

One notable sceptic of a fast and easy path to full commercialisation also happens to be the man carrying out the field's most innovative research: geneticist J. Craig Venter.

In a 2011 interview, Venter told Scientific American writer David Biello, "These are huge challenges. Nobody has the yields, that I'm aware of, to make it economical, and, if it's not economical, it can't compete. It's going to be the ones with scientific innovation and deep-pocket partners that can see to making the long term investment to get someplace."

Venter is currently in a $600 million deal with Exxon Mobil to genetically engineer a form of halophytic algae that grows at a much lower cost than algae that exists naturally. If he succeeds, the world's energy landscape would be transformed overnight. "It's a 10-year plan," he cautions. "We're not promising new fuel for your car in the next 18 months."

Bottom line: A decade from now, algae may transform how we run our growing cities, cars, and gadgets. But for now you're still responsible for your own carbon footprint.

3. We are approaching a post-antibiotic era.

Who made it: Dr. Thomas Frieden, director of the Centers for Disease Control and Prevention, on The Diane Rehm Show, Sept. 18.

Why it's a strong prediction: Every year, 2 million people get sick (and at least 23,000 people die) from infections that have turned drug-resistant, according to the CDC.

There's evidence that overuse of antibiotics (not only in people but also in animals) is terrible for you even if you don't get an infection. It can harm everything from the helpful bacteria that live in your gut (your microbiome) to your DNA.

But a little DNA-denting may be the least of our worries. The CDC is more focused on preventing another Black Death.

One consequence of the overuse of antibiotics is the continued spread of the "nightmare" carbapenem-resistant Enterobacteriaceae (CRE) bacteria to more hospitals and health facilities. A CRE infection has a mortality rate of 50 percent when the germ infects the bloodstream.

BUT: It's not too late. Every hospital needs an antibiotic stewardship program, says Frieden.

Bottom line: One of the most common ways we treat illnesses today will soon be doing us more harm than good. You may want to pay more attention to the antibiotics that go into your food, and into you.

 

4. "New technologies could allow astronauts to live off the land as they explore the ancient valleys of Mars. The capability to manufacture breathable air, rocket fuel, water and more may forever change how we explore space."

Who made it: Jason Crusan, director of NASA's Advanced Exploration Systems Division, in a Sept. 27 press release.

Why it's an important prediction: The quote speaks to the ambitions of the Mars 2020 mission, which is itself part of President Barack Obama's goal of sending astronauts to Mars by 2030. The next step in getting boots on Martian soil is developing a new rover capable of detecting signs of past life.

The rover would also give scientists an understanding of how hard it would be to collect carbon dioxide and turn it into fuel, a key piece of information if we ever want to get the people we send to Mars back home.

BUT: If history is any guide, we're not going to Mars. We still don't know the scope of the technical or practical challenges to getting there.

Also, it's worth remembering that Obama is not the first president to make bold promises about the U.S. space program. Nor is he the first politician to date those promises far enough into the future to escape any accountability when said space aspirations fail to come to fruition. George H.W. Bush forecast a manned mission to Mars by 2019.

His son, George W. Bush, took expectations down a notch, promising a return to the moon by 2020. In the late 1960s and '70s, various politicians and NASA managers believed it possible to send a human to Mars by the mid-'80s. The cost of the Vietnam War and the Great Society programs relegated such ambitions to the backburner. Even during the Reagan boom years, the United States never did get around to building that Star Wars defence shield.

Bottom line: If we continue to invest in space exploration, we will soon know a great deal more about the history of Mars, its ability to sustain human life, and the feasibility of a manned Martian mission (with a return home). But don't pack your bags just yet.

5. Big business decisions will be made not by experts or intuition but by big data and predictive analytics.

Who made it: Virginia Rometty, CEO and chairwoman of IBM, speaking at the Council on Foreign Relations on March 7.

Why it's a strong prediction: She's right. Big-data analytics is poised to grow from a $14.87 billion market in 2013 to a $46.34 billion market by 2018, according to the research group MarketsandMarkets. And we will also be producing a lot more data: 4,300 percent more in 2020 than we did in 2009, according to the research group IDC.

Big data flouts the laws of basic economics: It becomes more valuable as more of it exists, because it's useless without the ability to collect, analyse, and execute on it.

BUT: The firm Gartner says that big data is just cresting the hype cycle. It's currently in a position Gartner calls the "peak of inflated expectations." Following the peak comes the "trough of disillusionment" (and then Mordor). Enthusiasm for big data could wane in 2014. But I'm still bullish.

Bottom line: Big data will heavily shape the next era of humanity and will determine tomorrow's winners and losers. But there will be losers.

6. Nissan will sell self-driving cars for the public by 2020.

Who made it: Nissan Executive Vice President Andy Palmer.

Why it's a strong prediction: He's right, and his comment shows courage. The emergence of self-driving cars promises to change the way we live, work, and design cities.

For instance, we'll need a lot fewer parking spaces in dense neighbourhoods when cars can drive themselves to a spot a few miles away. We'll have less traffic when our vehicles can communicate with one another. But, first and foremost, wide adoption of these systems will radically change the auto industry.

Self-driving cars can be shared much more easily since they can drive themselves wherever they are needed. That's why some researchers believe that in an autonomous vehicle era, we may need just one-tenth as many cars as we do today.

Not surprisingly, many car manufacturers, such as Ford, are resisting the change. Nissan is making the hard choice to embrace a smaller market. It has some company: Audi and Tesla have also pledged to make self-driving cars available within the next eight years.

BUT: The resistance to autonomous vehicles doesn't just come from companies worried about selling fewer cars. Many groups, from taxi drivers to street pavers, may soon start fighting to keep robot roadsters out of the United States.

Different countries and cultures will react to unmanned vehicle technology in a variety of ways. Japan is more comfortable with autonomous systems of all types, and there are far fewer legal restrictions on them.

Also, forget Jeff Bezos's announcement that Amazon will begin delivering goods via drone within four or five years. Chinese entrepreneurs are already using unmanned aerial vehicles for the delivery of food and merchandise, albeit not always legally.

Bottom line: The self-driving cars are coming, but the United States may not be their first stop.

7. Bitcoin is doomed to fail.

Who made it: Reuters economics editor Edward Hadas in the New York Times on Nov. 27: "The developers of bitcoin are trying to show that money can be successfully privatised. They will fail, because money that is not issued by governments is always doomed to failure."

Why it's a strong prediction: At the time that Hadas made that prediction, bitcoins were on a tear, rising from $200 per coin around Nov. 1 to a high of $1,200 per coin on Nov. 30. (It was at $1,000 on the day of his statement.)

In recent weeks, the digital currency has been the default method of exchange for all futuristic goods and services of recklessly overhyped value, everything from tickets on Virgin Galactic to electric cars. Take that exuberance to its irrational extreme, and you arrive at a future when bitcoins are financing the manmade islands of Silicon Valley super villains.

Not long after Hadas' prediction, China's central bank issued a warning that while individuals were welcome to play with cryptocoins, the government was not going to accept them as a method of payment, and it warned financial institutions to stay away.

Bitcoins quickly lost more than half of their value.

BUT: Bitcoin still has plenty of buyers, and not every analyst agrees with Hadas, or with the Chinese. On Dec. 5, Merrill Lynch issued a report assessing the value of a bitcoin at $1,300 and a maximum bitcoin market of $15 billion.

Bitcoin itself may not be the gold of tomorrow. (Remember when everyone thought Facebook credits were the cat's pyjamas?) But it has competitors, and virtual currencies, which began primarily as a means to exchange goods in massively multiplayer online role-playing games, have exhibited a far steadier rise as a group.

The research firm Javelin Strategy has charted the climb of the virtual currencies in the United States from about $600 million in 2009 to $4.65 billion in 2012 and projects it will top $10 billion by the end of 2013.

Bottom line: Invest in bitcoins the way you would in an extremely volatile stock. A basket of virtual currencies might be a smarter move. Just know that the government is actively looking at the tax implications of e-money, so if you think you can hide your digital fortune from Uncle Sam, you may soon be in for a nasty surprise.

 

8. Movies and television shows will converge into a new type of episodic, data-driven art form, accessible at all times, across all platforms.

Who made it: Kevin Spacey in his MacTaggart lecture at the Guardian Edinburgh International Television Festival:

I predict that in the next decade or two, any differentiation between these two platforms [film and television] will fall away. ... The device and the length are irrelevant. The labels are useless except for agencies, managers and lawyers who use these labels to conduct business deals. But for kids, there's no difference. It's all content. It's just story.

Why it's a strong prediction: He knows what he's talking about. The original Netflix series House of Cards, which stars Spacey, represents the next phase of entertainment, the creation of content on the basis of data collected constantly and consistently from users or interactive devices.

In a piece for Salon, Andrew Leonard gave a great rundown of how this worked for Netflix: "Netflix's data indicated that the same subscribers who loved the original BBC production also gobbled down movies starring Kevin Spacey or directed by David Fincher.

"Therefore, concluded Netflix executives, a remake of the BBC drama with Spacey and Fincher attached was a no-brainer, to the point that the company committed $100 million for two 13-episode seasons."

Unlike movie or television studios, Netflix has a lot of data, with 30 million streaming subscribers in the United States alone. Tim Wu pointed out in the New Republic that Americans reportedly spend more time watching television than French people do working (1,800 hours per year). What you watch, when, and on what device now creates data. You can expect more of that data to start showing up in the form of content.

BUT: The value of the data that Netflix is gathering on its users extends beyond just commissioning the production of new dramatic works.

Your viewing habits, your every viewing decision, recorded and analysed, will help marketers much more effectively target goods to you. Not everyone is or will be happy to simply give that information away. "I'm guessing this will be good for Netflix's bottom line," writes Leonard, "but at what point do we go from being happy subscribers, to mindless puppets?"

My bottom line: Television is going to get better. It's now watching you back.

Relatedly ...

9. By 2020, commercials and ads that interrupt your content experience will be gone, and you will be able to pay for things with your data.

Who made it: Media futurist Gerd Leonhard, speaking to Harvard Business Review reporter Dana Rousmaniere in May.

Why it's a strong prediction: It is the past, present, and future all at once. Leonhard isn't the first person to come up with this idea. Back in June of 2011, Neal Mohan, vice president of display advertising at Google, took the stage at the Innovation Days Internet Week to show that his company was working feverishly to improve ad features.

He described a "user-focused revolution, where people connect and respond to display ads in ways we've never seen before." In his most recent book, Who Owns the Future?, Jaron Lanier makes a powerful argument for why we should demand compensation for the data we give away to marketers. Don't be bought off with the promise of an "improved customer experience," he says.

Bottom line: Your data is yours first because you created it. Don't sell it cheap.

10. By 2028, your eyes and pulse will tell your teacher whether you are learning.

Who made it: Terry Heick, former teacher and director of TeachThought.com. He believes that educators (or education systems) will measure their students' biological responses, including pulse, sweat, and eye position, for a real-time understanding of how their students are mentally interacting with material.

Why it's a strong prediction: Educational institutions are not always eager to embrace the opportunities of technological change, yet no field will feel the influence of this change more over the next 10 years than education.

First, he says, in 2015, adaptive computer-based testing will replace those terrible No. 2 pencil tests, and game-based learning will truly catch on. Next, in 2018, open-sourced learning models will replace standard curriculums as we know them today.

BUT: The field of education has seen a number of fads come and go over the years. In decades past, some claimed that at-home correspondence-based education would close the schools; then it was the advent of at-home lectures and classes on VHS; then, computers for every student; today, MOOCs, which are facing a backlash already. 

All of these innovations are important, but they never succeeded in replacing the classroom model as we understand it today.

Bottom line: The world will always need educators, but the definition of education could change significantly, hopefully for the better, in the coming decade.

Many of these predictions won't come true, and we certainly don't want all of them to. But they tell us something about the hopes and dreads we will leave behind in 2013.

- SLATE
