Steven is Principal Analyst for Enterprise Networking at GlobalData, covering networking hardware and software for the data center and enterprise, including switching, routing, SDN, SD-WAN, and related technologies. This includes NFV for the enterprise, automation, AI/ML for networking, location services, and the convergence of networking and security. Steven also covers the new edge, as SD-WAN and IoT evolve the network edge and companies moving toward digitization find opportunities in re-inventing it. Steven's technology career began over 25 years ago in Fortune 500 retail IT, where he was a network architect. Prior to GlobalData, Steven served as Managing Technology Editor at Network Computing Magazine, where he did his own testing and writing; as Principal Network Analyst for Network Infrastructure at Current Analysis; and, most recently, at Cisco Systems, where he worked in Data Center Switching, Cloud, and Enterprise Switching.
• The NVIDIA-Arm deal has interesting technological potential, but will likely chill competition
• Regulators worldwide are viewing big tech deals with an increasingly skeptical eye
In the ongoing saga of NVIDIA’s proposed purchase of the UK-based silicon design firm Arm Ltd., regulators have stepped in to stop the deal. Arm develops the architecture of the ARM processor and then licenses it to other companies for use in their designs. ARM-derived processors have become extremely popular, appearing in almost every modern smartphone design, in thousands of other proprietary devices and servers, and, probably most famously, as the latest CPU architecture for Apple’s Macintosh line of computers. Amazon’s AWS service runs servers that AWS developed on the ARM architecture. In short, ARM is essentially everywhere; only Intel’s x86 architecture has had more success, and ARM is the first processor architecture to get anywhere close, making it vital in the technology marketplace.
• New analytics around network and application performance are bringing great visibility to administrators
• These new data sources rarely work with other data sources, making piecing together a picture difficult for enterprises
This is the age of analytics. Well, perhaps the computer era has always been the age of analytics, but today we are seeing many more discussions of analytics in general. Networking and application performance are areas particularly suited to analytics because they deal, for the most part, in defined and known measurables. In networking, for instance, packet loss, latency, and the like are all known measurables. In application performance monitoring, you can measure application response time, error rates, and the number of application instances, among other application-specific variables. There is a whole market dedicated to application performance monitoring, to say nothing of the newer observability platforms designed to help DevOps practitioners monitor performance.
• The challenge is for IT to improve the experience for the work from home user
• Palo Alto Networks Okyo Garde addresses issues many enterprise work from home (WFH) solutions do not
The subject of WFH has had a lot of attention, with the COVID-19 pandemic forcing the issue for a lot of companies. Many companies have taken advantage of this and have moved their employees to permanent work from home status or only in the office a few select days of the week. The struggle since the beginning of the pandemic has been to support these home users. IT departments strived mightily and, in many cases, bootstrapped solutions that could at least get the home worker up and running. But now these same IT departments are taking a closer look at more effective and permanent solutions. Issues with home networks, particularly Wi-Fi, ergonomics, and even things as basic as monitors and chairs need to be addressed – they are all part of the WFH equation.
• Security and networking are converging; the evidence is clear from both a technological and a strategic standpoint, with security threats increasing.
• The enterprise needs tools to manage the human aspects of security and networking convergence, and the first instrument it needs is real industry examples proving the trend from vendors, ITSPs, carriers, and industry analysts.
By design and necessity, the security and networking industries are moving towards consolidation. Security companies are buying networking resources and networking companies are snapping up security vendors left and right. If you address a room full of vendors from the security and networking markets and proclaim that the two markets are converging, you will get heads nodding sagely. But the reality on the ground is much more complicated. Much like a stone arch, everything holds in place until something moves; only then do things begin to fall in the direction gravity is pulling them.
• IT security issues are being exacerbated by unregulated auto-update mechanisms.
• Systemic and fundamental change to a centralized, approval-based update system is necessary.
A simple rule of thumb for complex systems is that wherever simplicity is added, corresponding complexity appears elsewhere. In early PC computing, for instance, software was updated only when the latest version was bought. Bug fixes sent to existing users were exceedingly rare, as they required physical media. With the advent of the Internet, physical media was gradually shunted aside as bandwidth increased, and bug fixes were suddenly available to anyone who wanted to download and install them. Then came auto-updating: software began to reach out on its own to check whether it was up to date and, if not, updated itself. Bugs were eliminated and security enhanced. In turn, this enabled rapid-iteration software development and the so-called ‘fail fast’ mentality for startups and app developers. After all, if the app was flawed, a patch would simply be applied as fast as the developer could make it. Continue reading “Real Security Demands a Fundamental Change to Software Updates”→
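The progression described above, from self-updating software to the approval-based model the piece argues for, can be sketched in a few lines. Everything here is illustrative: the manifest shape, the field names, and the vetted-versions set are hypothetical, not any real vendor's update protocol.

```python
# Sketch of an auto-update check, assuming a hypothetical vendor manifest
# of the form {"latest": "...", "url": "..."}; all names are illustrative.

def parse_version(v: str) -> tuple:
    """Turn a dotted version like '2.1.0' into a comparable tuple (2, 1, 0)."""
    return tuple(int(part) for part in v.split("."))

def update_available(installed: str, manifest: dict) -> bool:
    """True when the manifest advertises a newer version than what is installed."""
    return parse_version(manifest["latest"]) > parse_version(installed)

def should_update(installed: str, manifest: dict, approved: set) -> bool:
    """The centralized twist: only apply updates IT has explicitly vetted."""
    return update_available(installed, manifest) and manifest["latest"] in approved

manifest = {"latest": "2.1.0", "url": "https://example.com/app-2.1.0.pkg"}
approved = {"2.0.5", "2.1.0"}  # versions a (hypothetical) approval board has cleared

print(should_update("2.0.3", manifest, approved))  # newer and approved
```

The difference between the two models is just the `approved` gate: today's auto-updaters stop at `update_available`, while an approval-based system inserts a human or policy checkpoint before anything is applied.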
• The spinoff of Dell from VMware is long-term good for VMware customers.
• Enterprise IT buyers and enterprise IT rivals to Dell need not make any changes for the immediate future.
On April 15, 2021, Dell announced plans to spin off its VMware business, which will likely net Dell around $9 billion (USD) that it can use to pay down debt or go on an acquisition spree. Dell currently owns 81% of VMware, which it gained during its acquisition of EMC in 2015. Beyond the obvious need to reduce its debt, it is widely believed that VMware (which still had its own stock) and Dell will both have higher values when they are evaluated as separate entities. Michael Dell will remain chairman of the board for both companies. Continue reading “Dell to Spin Off VMware – Steady as She Goes for Now”→
• Business and IT have long treated each other with suspicion, to the detriment of both.
• Ongoing changes to business and IT require a re-structuring that puts the business and IT hand in hand.
Corporate culture by its very nature requires people to specialize in specific areas of the business. People settle into their roles, becoming masters of the tasks, priorities, and policies that define their area of expertise in the company. The theory is that management has the broader overview and can provide the needed information to guide the overall company direction and ensure smooth operations. The reality is that most managers, including some members of the C-suite, also operate with their domain as their primary concern.
• Enterprises and organizations have long ignored business continuity / disaster recovery (BC/DR)
• BC/DR is a fundamental business duty like insurance, not an optional expense
Yesterday, French cloud provider OVH suffered a fire in one of its data center complexes in Strasbourg, France. It entirely destroyed one unit, damaged another and caused the shutdown of the rest of the units on site. Thankfully, no one was hurt and OVH is working on restoring service. But an entire data center is gone, along with parts of another. Not down, burned. Gone. Fried. No realistic chance of recovery, not anytime soon if at all. The fire was so hot the metal walls of the building deformed. Continue reading “After a Fire Isn’t the Time to Buy Extinguishers”→
• Edge computing is a real thing, but distorted and extended beyond reasonable use cases by FOMO.
• Smart edge computing plays are not generalized, but specialized, and they do not play on hype.
The first conversations around the concept of edge computing were both interesting and enlightening. The basic idea was that compute resources needed to be closer to the actual workload in situations where real-time or very near-real time decisions need to be made. Latency could not be tolerated, so cloud or even corporate data centers were out of the question. Examples given were automated materials handling, manufacturing, and – of course, technology marketers’ favorite old trope – self-driving/automated vehicles. All but the last example sounded perfectly reasonable and lined up with customer needs, both today and tomorrow. Continue reading “Already Over the Edge (Computing)”→
• Microsoft has placed its Azure Quantum service into public preview
• Learning and software development are the first step in a long quantum computing journey
In a blog post, Microsoft announced the public preview of the Azure Quantum cloud service, which has been in closed beta testing for a while. All three of the biggest cloud providers now offer some form of quantum cloud computing: Microsoft with Azure Quantum, Google Cloud (with IBM), and Amazon with Braket. The original vision of quantum computing centered much more on the idea that enterprises would buy quantum computers, but the operational and facility requirements for the current generation of quantum computers are too steep. Cloud computing is the natural choice for quantum computing, outside of the biggest research institutions and nation-states. Continue reading “Microsoft Opens Azure Quantum Cloud Service for Public Preview”→
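For readers starting the learning journey the summary mentions, the canonical first exercise on any of these platforms is putting a single qubit into equal superposition. The toy statevector simulation below does that math in plain Python, with no quantum SDK assumed; it is a teaching sketch, not how the cloud services are programmed.

```python
import math

# Toy statevector simulation of one qubit: apply a Hadamard gate to |0>
# and read out the measurement probabilities via the Born rule.

ket0 = [1.0, 0.0]            # |0> as amplitudes over the basis {|0>, |1>}
h = 1 / math.sqrt(2)
H = [[h, h], [h, -h]]        # Hadamard gate matrix

def apply(gate, state):
    """Multiply a 2x2 gate into a single-qubit statevector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

state = apply(H, ket0)
probs = [amp ** 2 for amp in state]   # amplitudes are real here
print(probs)  # each outcome lands near 0.5: an equal superposition
```

Measuring such a qubit yields 0 or 1 with equal probability, which is exactly the kind of result the real cloud services return from hardware runs; the services add scale, noise, and actual quantum processors on top of this arithmetic.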