Back in 1964, Bob Dylan sang, “The times they are a-changin'.” That same year IBM announced its System/360, DEC's PDP-8 was launched, and IBM launched the SABRE reservation system for American Airlines.
While technology was certainly changing fast back then, Dylan almost certainly wasn't singing about computer networking, because it barely existed at that point. For example, it wasn't until 1973 that Robert Metcalfe co-invented Ethernet at Xerox PARC.
The point is that technology has always changed at a fair old pace, so once networking became established it probably came as no surprise that the networking professional's job was, and still is, one that evolves rapidly.
So how will it change in 2021? One clue is provided by Gartner, in its recently published Magic Quadrant for Cloud Database Management Systems. Gartner predicts that by 2022, 75% of all databases will be deployed on or migrated to a cloud platform, with only 5% ever considered for repatriation to on-premises.
When you consider how much data will sit in those cloud databases, and how much less will therefore travel over enterprise storage and data networks between storage systems and CPUs, it's clear this will affect network architects' and administrators' day-to-day activities. It should make their lives less challenging.
The reverse is true for networking experts working for cloud providers like AWS, Google, and Microsoft, of course. That's because Gartner expects multi-cloud usage to grow, creating data governance and integration challenges, as well as the pure networking ones.
But what about all that data still sitting in corporate databases in enterprise data centers? That data, too, will travel around enterprise networks less and less in 2021 and beyond. Why? Because of computational storage.
There's a huge growth in the use of data for AI/ML and analytics applications which need very fast access to data. But there are problems. Storage systems are a bottleneck, and so are bandwidth-constrained networks. These problems make it hard to transport all of the required data from storage systems to CPUs fast enough.
Given that storage systems are computer systems in their own right, it makes sense to use the power built into these storage systems to do at least some of the necessary data processing. In-storage embedded processing is not a new idea, but it's an idea whose time is about to come. If done right it will make real-time data processing more real-time, it will reduce power costs, and fundamentally change network architectures as well as the way networking specialists do their jobs.
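To make the idea concrete, here is a minimal sketch of why in-storage processing cuts data movement. The classes and names are hypothetical, purely for illustration: a conventional drive forces the host to read every record before filtering, while a "computational" drive runs the filter where the data lives, so only matching records ever cross the network.

```python
class StorageDrive:
    """A conventional drive: the host must read every record."""
    def __init__(self, records):
        self.records = records

    def read_all(self):
        # Every record crosses the bus/network to the host CPU.
        return list(self.records)


class ComputationalDrive(StorageDrive):
    """A drive with embedded compute: filtering happens in-storage."""
    def query(self, predicate):
        # Only matching records leave the drive, cutting data movement.
        return [r for r in self.records if predicate(r)]


records = [{"id": i, "temp": 20 + i % 15} for i in range(100_000)]

# Conventional path: move 100,000 records, then filter on the host CPU.
host_side = [r for r in StorageDrive(records).read_all() if r["temp"] > 30]

# Computational-storage path: move only the matches.
in_storage = ComputationalDrive(records).query(lambda r: r["temp"] > 30)

assert host_side == in_storage  # same answer, far less data in flight
```

In this toy run only about a quarter of the records match, so the computational path moves roughly a quarter of the data; for the AI/ML and analytics workloads described above, that difference is what takes the pressure off bandwidth-constrained networks.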
Change is the Change
So what's the networking outlook for next year? It's hard to be precise, but let's just say that there will be less of some stuff, and more of other stuff. Or, to put it in a nutshell, the only thing that will stay the same is the rapid pace of change.