Intelligent embedded network agents and sophisticated software heuristics provide key insights into information and performance patterns for predictable data consumption, but interpreting these still requires talent.
Humans remain the most valuable troubleshooting tool in the IT arsenal.
Having worked in infrastructure since the ‘90s, I’ve done my fair share of troubleshooting vampire taps, thick-LAN, and eventually thin-LAN (and those finicky terminations), so I can say we’ve come a long way. Granted, at its most basic, we’re still troubleshooting low-voltage electrical wires in most wired infrastructure. But sophisticated tools embedded in many switching platforms can now immediately detect a link loss, determine whether the fault is a damaged cable or connector, and correlate alerts from multiple devices to pinpoint the exact location of a ‘noisy’ device polluting the network. Advances such as these have increased efficiency, reduced trouble-ticket resolution times, and freed up valuable resources to work on more complex challenges.

With wireless access becoming the norm as more and more client devices go solely mobile, troubleshooting tools have generally kept pace, and network management systems have slowly grown more capable and feature-rich. As cloud adoption rates increase and systems grow more diverse, though, the tools are likely to suffer a setback, with many disparate elements, both physical and virtual, contributing to a single application connection. Troubleshooting these will once again require significant technician involvement to determine root cause during an outage (and no, rebooting your client isn’t the answer, Mr. Helpdesk).

Physical and virtual agents must be deployed to collect statistics in real time and aggregate those bits into a collective perspective on the health of the network. Whether this is done with one of the extensible “framework” NMS systems or via vendor element management systems does not matter; the heart of the matter is that enterprises need to embrace a more sophisticated management model than they have in the past. Continue reading “IT Pains Evolving: Where’s Holmes & Watson?”→
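The aggregation idea above — many physical and virtual agents each reporting link state and error counts, collapsed into one view of network health — can be sketched in a few lines. This is a minimal illustration, not any particular NMS product's API; the agent names, fields, and thresholds are all hypothetical.

```python
# Hypothetical sketch: collapse per-agent link metrics into a single
# coarse network-health verdict, the way a "framework" NMS might.
from dataclasses import dataclass
from statistics import mean

@dataclass
class AgentReport:
    agent_id: str      # a physical or virtual probe (names are illustrative)
    link_up: bool      # did the agent see its monitored link as up?
    error_rate: float  # errored frames / total frames on that link

def network_health(reports):
    """Aggregate many agent reports into one health summary string."""
    if not reports:
        return "unknown"
    down = [r for r in reports if not r.link_up]
    if down:
        return f"degraded: {len(down)} link(s) down"
    # Arbitrary 1% threshold, purely for illustration.
    if mean(r.error_rate for r in reports) > 0.01:
        return "degraded: high error rate"
    return "healthy"

reports = [
    AgentReport("edge-sw-1", True, 0.001),
    AgentReport("hypervisor-7", True, 0.002),
    AgentReport("wan-probe-3", False, 0.0),
]
print(network_health(reports))  # → degraded: 1 link(s) down
```

The point of the sketch is the shape of the problem: no single agent sees the whole path, so root cause only emerges once disparate reports are correlated in one place.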
Be ready to exploit a round of telecom industry consolidation.
Potential for merging telco, ITSP, OTT and media assets and abilities.
Telecom operator mergers and acquisitions have been sparse during the current economic downturn, though there have been some interesting ones, e.g., Verizon buying Terremark and CloudSwitch, NTT buying Dimension Data, Level 3 acquiring Global Crossing, and CenturyLink buying Savvis and Qwest. The lack of more merger and acquisition activity is surprising given the depressed state of some telcos’ and ITSPs’ share prices. Now, in 2012, there have been two attempts to acquire Cable & Wireless, one from Vodafone and one from Tata Communications; Vodafone remains in the running with a GBP 1 billion bid. More recently, Carlos Slim of America Movil in Mexico offered US$3.8 billion to raise his stake in Netherlands-based KPN (to 28%, from under 5%). In keeping with speculation that expansion in Europe is of interest, Slim is also reported to be ‘eyeing’ Telekom Austria. Continue reading “Fantasy Telco League – Building Dream Service Providers”→
Massive, multi-billion dollar growth projections and continuous vendor and service provider marketing keep a white-hot spotlight on cloud computing.
Yet, as rich as the potential benefits of on-demand computing are, enterprises are approaching the cloud with caution as well as some very fundamental questions which, left unanswered, will keep the model from becoming anything more than a tactical tool.
If you believe everything you read, the cloud is the cure for all that ails the enterprise. Flexible, cost-effective, and scalable, the cloud holds a lot of promise for organizations struggling with severe budget limitations. Every self-respecting vendor and managed IT services provider needs a coherent and innovative cloud strategy or risks looking backward in a fast-moving space. Market projections seem to come out every week, talking up the cloud’s potential value in astronomical multi-billion dollar terms. The last one I saw projected cloud sales to top the $39 billion mark… this year. Continue reading “Cloud Computing: A Strategic Engine of Change or Just Another Tool in the Kit?”→
The ‘Flame’ advanced persistent threat (APT) is invisible to commercial AV defences and may lie dormant for years.
Combating APTs may create a new role for the ITU and further international anti-malware efforts.
The latest news on the (often purported to be state-sponsored) APT front is a massive piece of spy software, dubbed ‘Flame,’ which seems to have been around for many years – at least since 2010. The worm was discovered by accident when security vendor Kaspersky was looking for another mystery APT, dubbed ‘Wiper,’ which has been deleting files on servers in the Middle East for some time. Much like earlier APTs such as ‘Stuxnet’ and ‘Duqu,’ Flame exploits software and hardware vulnerabilities that evade all known AV defences and infects desktops and servers in multiple ways (USB, LAN, drive-by, etc.); like those other APTs, it appears to harm or spy very selectively, so it may reside dormant on a large number of Windows PCs. Flame is different in that its remote controllers can install different modules on infected machines (e.g., taking control of the PC’s microphone to record conversations) depending on what kind of information they want to steal. So, the net-net is that we do not know whether our desktops or data centres are infected, and consequently whether they are actively or passively spying on us and stealing our data. We might seek some comfort in the belief that this malicious (often Middle Eastern) activity is politically rather than commercially motivated, but state-sponsored industrial espionage is an obvious use as well. Continue reading “APT Threats Today Need a Different Kind of Response”→