Ninety percent of all the data human beings have ever created was generated in the last two years, according to Thomas Saueressig, chief information officer for SAP, addressing attendees at Huawei Connect in Shanghai, China, last week. And with 9 billion mobile devices forecast to be in use by 2020 – more than one for every person on the planet – the data usage explosion is just beginning, Saueressig said.
His comments set the stage for a broader discussion during the three-day event about how stakeholders – including enterprises, network operators and cloud providers – can thrive in this environment. Some common themes emerged: Artificial intelligence will be essential to making sense of all the data generated, and no single entity can tackle the task alone; instead, stakeholders will increasingly need to rely on partnerships.
The Impending Data Usage Explosion
Processing all of that data could help drive further adoption of cloud services. According to a forecast cited by Huawei, 85% of enterprise workloads will move to the cloud by 2020.
“We describe this relationship between things [that are] connected . . . and the cloud as one large virtuous cycle,” said Intel CEO Brian Krzanich, one of several heavy hitters speaking at the Huawei event. “It’s built on a set of technologies that reinforce [value] through a continuous feedback loop.”
Some people use the term “artificial intelligence” to describe that continuous feedback loop.
Artificial intelligence has only hit its stride in the last few years, according to Professor Andrew McAfee of the Sloan School of Management at the Massachusetts Institute of Technology. Before that, people had to tell computers what to do – and they had to do that in great detail, McAfee explained. But now “we don’t have to do this anymore,” McAfee said. “We’re building systems that can figure out how to do things themselves.”
He cited the example of the computer designed to play the game of Go, invented more than 2,000 years ago. Before artificial intelligence, “attempts to build software to play Go were failures for one simple reason,” McAfee said. Unlike with chess or checkers, “no human being can explain how to play at a high level. . . Good players can’t explain why they made a move” but instead rely on years of experience with playing the game.
That’s essentially what artificial intelligence enables a computer to do – and how developers were finally able to build a successful Go-playing computer. Developers “showed it hundreds of Go games,” and by learning from experience and sensing subtle patterns, the resulting system “is the best player of Go in human history,” said McAfee.
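The shift McAfee describes – from hand-coding a rule to letting a system infer it from examples – can be illustrated with a toy sketch. This is purely illustrative, not how the Go-playing system actually works (which relied on deep neural networks and self-play, not a perceptron): here a simple perceptron is shown labeled examples of a hidden rule and learns to reproduce it, even though the rule is never written into the learner.

```python
# Toy illustration of McAfee's point: instead of hand-coding a rule,
# a simple learner infers it from labeled examples. (Illustrative only --
# real Go programs used deep neural networks and self-play.)

def train(examples, epochs=100, lr=0.1):
    """Perceptron learning: nudge the weights whenever a prediction is wrong."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x, y), label in examples:
            pred = 1 if w[0] * x + w[1] * y + b > 0 else 0
            err = label - pred          # 0 when correct; +1/-1 when wrong
            w[0] += lr * err * x
            w[1] += lr * err * y
            b += lr * err
    return w, b

# Labeled examples of a hidden rule (here: x + y > 10). The rule is used
# only to generate the labels -- it never appears inside the learner.
data = [((x, y), 1 if x + y > 10 else 0)
        for x in range(11) for y in range(11)]

w, b = train(data)

def predict(x, y):
    return 1 if w[0] * x + w[1] * y + b > 0 else 0
```

After training, `predict` recovers the hidden rule across most of the grid from examples alone – the same move, at toy scale, as replacing explicit instructions with experience.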
Don’t Do It Alone
As the volume of data generated explodes and users increasingly turn to software to make sense of it, stakeholders should not underestimate the size of that task.
“There is too much software for any one company to write themselves,” observed Jim Zemlin, executive director of the Linux Foundation, at Huawei Connect.
To get to market faster, “everyone is leveraging Open Source,” Zemlin said.
Many developers have chosen to use the Linux platform because of its open nature, Zemlin said, noting that 10,800 lines of code are added every day.
Zemlin stressed, though, that “no single organization can keep up with that development pace.”
According to Zemlin, we’re seeing a “transformation from a world where companies built everything themselves to a world where you can’t compete if you try to build everything yourself.”
In today’s world, Zemlin argued that “companies that know how to manage outside developers will win.”
What Does This Mean for U.S. Telcos?
Traditionally, U.S. telecom companies were famous for doing everything in-house. But that’s changing.
As AT&T continues to move toward a software-defined network, the company is doing some development in-house but is also tapping Open Source resources and partnering with other companies. Verizon seems to be taking a similar approach in its efforts to move to a more agile, software-centric network.
Amid the data usage explosion, though, with the cloud becoming increasingly important, selling off data centers, as more and more telcos are doing, might not seem like a smart move. Will they miss out on important opportunities? Or have they determined that they don’t have the best skill set or the right partnerships to meet the demands that will be placed on data centers and the cloud as data usage explodes and artificial intelligence becomes critical?