One-fifth of North American businesses surveyed in Current Analysis’ 2011 study on cloud adoption think the cloud will not change IT’s role in the company
The reality is that because the cloud will alter the way businesses consume technology, the IT department can expect profound change
Say what you will about the cloud and all the noise surrounding the concept: it is time to believe the hype. Sure, it is easy to shrug off the buzz around the on-demand delivery model, but virtualized, pooled delivery of technology services is no longer a niche idea; it is a pragmatic method for supplying IT. Providers are applying virtualization technology to bend their delivery models to meet corporate demand for more flexible and cost-effective IT services. And though there have been more than a few false starts, the early results are promising, with some genuine successes in improved efficiencies, real expense reductions and even more agile working models. Continue reading “Role Reversal: Will the Cloud Transform the IT Department Right Out of a Job?”→
Provide simple security commandments to follow under pain of dismissal
The most compelling briefings at this year’s RSA Security Conference in London focused on how companies can move from a governance, risk and compliance process – and the security policy it produces – to making that policy actually work throughout the enterprise, where getting people aligned with security is a real sticking point. It’s not that employees actually want to spill company secrets – mostly, they just want to be helpful to people they perceive as colleagues. How many times do we actually read error messages or listen to security warnings? How often do we reflect on the veracity of a caller who seems really nice and obviously knows a lot about the company? Continue reading “Social Engineering – Industrialized Exploitation of Human Helpfulness”→
Initiating a remote agent program is an effective way to get started in mobilizing your contact center.
The economic and human resource benefits of a remote agent program – to both the enterprise and the agents – are too substantial to ignore in contact center planning efforts.
The mobility revolution is affecting every aspect of the business world today: workers are expected to be available from anywhere at any time, and customers demand 24×7 corporate access from the device of their choice. In the long term, the shift to mobility will affect all stakeholders in the customer care environment, including the agents, the contact center managers and the customers. In this initial blog on the topic, I will focus on the agents.
A few big enterprises approach the penny-a-minute event horizon for switched voice.
Detecting and disabling other revenue siphons is generally more productive.
Voice services pricing collapsed years ago. Gone are the days when businesses sent call detail records to departments to audit line by line. When they do review records now, it is more about catching rogue usage patterns and checking that employees aren’t gabbing with friends all day than about reining in “daytime dial” calling. Consumers who choose metered packages can get nickel-a-minute domestic calling rates; sizable businesses can shave another penny, maybe two, off metered plans.
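The diminishing returns described above are easy to see with a back-of-the-envelope calculation. The sketch below is purely illustrative – the per-minute rates echo the “nickel-a-minute” and penny-or-two figures in the text, but the monthly minute volume is an assumption, not survey data:

```python
# Illustrative only: the minute volume is an assumed figure, not a quoted one.
def monthly_voice_spend(minutes_per_month, rate_per_minute):
    """Metered voice cost for one month."""
    return minutes_per_month * rate_per_minute

minutes = 200_000  # assumed monthly domestic minutes for a sizable business

consumer_rate = 0.05   # "nickel-a-minute" metered consumer rate
business_rate = 0.03   # the same rate, less the penny or two a big buyer can shave

savings = (monthly_voice_spend(minutes, consumer_rate)
           - monthly_voice_spend(minutes, business_rate))
print(f"Monthly savings from rate negotiation: ${savings:,.2f}")
# → Monthly savings from rate negotiation: $4,000.00
```

At these rates, even a hard-fought discount yields savings that are modest next to what catching a single sustained pattern of rogue usage can recover – which is the article’s point about where audit effort now pays off.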
Public cloud services break the typical 18-month product revision cycle into smaller, more rapid releases – a practice that varies widely among vendors in frequency, focus (bug fixes vs. new features) and flexibility.
To avoid heavy deployment, training and support issues stemming from quick revision cycles, customers must demand options typically found in on-premises software.
Last week brought an animated and often heated blow-up between Oracle’s Larry Ellison and Salesforce.com’s Marc Benioff over the best way to deliver a public cloud offering. The argument, which played out publicly across keynotes given by both men during Oracle’s OpenWorld 2011 conference in San Francisco, centered on whether Salesforce.com’s cloud was truly open and whether Oracle’s newly launched Public Cloud platform was in fact a public cloud. Such a public debate can ultimately only make things easier for enterprise IT departments by exposing many of the often overlooked issues associated with cloud-borne software, such as partial multitenancy or API-induced data siloing. But to this analyst’s mind, the debate missed the biggest hidden ‘gotcha’ – the breakneck speed at which cloud-centric vendors revise their software.
Steve Jobs’ passing sparks reflection on where technology has been and where it’s going
For the IT sector a focus on the individual, not the impact of consumerization, is hopefully the lasting legacy
It’s not often that a forum dedicated to business IT issues demands reflection on the contributions of an individual, but Steve Jobs’ impact was so extraordinary that failing to do so would seem bizarre. The irony, of course, is that until very recently Jobs and Apple were irrelevant in IT terms. Businesses generally didn’t like Apple’s stuff, and the feeling seemed to be mutual. But as we all know, the more recent jaw-dropping innovations from the company—specifically the iPhone and the iPad—shoved the issue of consumerization to the fore in the minds of IT managers.
These new devices simply had to be accommodated, for it was easy for any end-user to prove that they were as powerful as they were fun. The consumerization trend does not begin and end with the Jobs-inspired Apple devices—social networking and the use of video are other notable trends. But the Apple gadgets were a catalyst, and their impact in terms of management and application delivery will be an IT issue for years to come – one that many IT managers are more than happy to deal with, being themselves fans of the technology.
The consumerization of IT, however, is ultimately just a technology issue. Steve Jobs’ massive contributions over the past decades were not just about marrying technology and aesthetics – that central activity was itself the manifestation of a focus on the individual. Jobs’ mission with Apple was to make technology about the people using it, not the machines. Grandiose? Perhaps, but we’re entering an era in business IT that is about empowering individual innovation. To be “insanely great” you have to “think different.” This is as true for businesses as it is for artists and poets.
Security teams should educate themselves on the options available specifically to protect virtual servers and desktops
Security teams should seek to get involved in virtualization projects early in the planning process
Chief information security officers (CISOs) and security teams should educate themselves on the growing array of threat management products aimed specifically at securing virtual server and/or virtual desktop environments. Why? Because traditional security methods neither scale to, nor match the flexibility required in, virtualized environments, and they do not directly protect the hypervisor from breaches. At the same time, the first generation of virtualized endpoint protection, firewalls and other threat management products imposes too much overhead, which greatly diminishes the benefits of virtualization that most organizations are seeking. These products require an instance of their scanner in every VM, consuming critical CPU and memory, and when multiple signature scanners on a single physical host all update signatures at the same time, the result is a “scan storm” that can bring a server to its knees. Continue reading “The Virtualization Bulldozer and Security: Time to Get Your Head Out of the Sand”→
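One mitigation for the scan-storm problem described above is to stagger each per-VM agent’s update window rather than letting every scanner fire at once. The sketch below is a hypothetical illustration of that scheduling idea – the VM names and the one-hour window are assumptions, not any vendor’s actual product behavior:

```python
# Hypothetical sketch: spread per-VM signature updates across a window so
# all agents on one host don't scan simultaneously (a "scan storm").
# VM names and the 60-minute window are illustrative assumptions.
def staggered_offsets(vm_ids, window_minutes=60):
    """Assign each VM an evenly spaced start offset (in minutes) within the window."""
    step = window_minutes / max(len(vm_ids), 1)
    return {vm: round(i * step, 1) for i, vm in enumerate(vm_ids)}

vms = [f"vm-{n:02d}" for n in range(6)]  # six guests on one physical host
for vm, offset in staggered_offsets(vms).items():
    print(f"{vm}: signature update at T+{offset} min")
```

Staggering only spreads the load, of course; the purpose-built products the article points to go further by moving scanning out of the guests entirely, so the per-VM CPU and memory cost disappears rather than being rescheduled.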
M2M is starting to provide many benefits to enterprises across diverse industries
Challenges remain that require solid operational, financial, and resource planning
What is so exciting about M2M technology is how rapidly the use cases are expanding. Enterprises start out with one application – using low-speed, sporadic data connections – and then think of a half-dozen other aspects of their business that can benefit. The most tangible benefits are the productivity gains from automating processes previously done manually. By collecting data on the performance and status of remote assets, such as industrial equipment, vehicles, people, inventory, containers, and cargo, businesses can prevent problems and save substantially on onsite service calls. By optimizing routes, companies can save on gas, mechanical wear and tear, and the time it takes for a technician to get to and complete a call. By remotely checking the status of equipment and setting alerts if a device is out of compliance, companies see reductions in problems, service calls, and the need for routine maintenance, as well as less customer frustration due to out-of-order equipment or low inventory. Benefits relating to productivity gains, resource optimization, and problem and cost avoidance are often the starting point for M2M deployments, as they represent almost guaranteed and rapid ROI. Continue reading “Benefits (and Challenges) of M2M Deployments”→
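The “alert if a device is out of compliance” pattern mentioned above reduces, at its core, to a threshold check over remote telemetry. The sketch below is illustrative only – the asset names, the temperature readings, and the compliance range are invented for the example:

```python
# Illustrative sketch of the M2M alerting pattern: flag remote assets whose
# latest reading falls outside a compliance range. Asset names, readings,
# and thresholds are assumptions for the example.
def out_of_compliance(readings, low, high):
    """Return the IDs of assets whose latest reading is outside [low, high]."""
    return [asset for asset, value in readings.items()
            if not (low <= value <= high)]

# e.g. remote refrigeration units reporting temperature in degrees C
telemetry = {"unit-A": 3.8, "unit-B": 9.2, "unit-C": 4.1}
alerts = out_of_compliance(telemetry, low=2.0, high=6.0)
print(alerts)  # → ['unit-B']
```

In practice this check would run against data arriving over those low-speed, sporadic connections, triggering a service call only for the flagged unit – which is exactly where the savings on routine maintenance and truck rolls come from.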
Data center networking technologies are moving at a pace that few enterprises can keep up with
The networking provider of choice will impact cloud deployment plans and virtualization scale – so choose wisely
Few would dispute that the enterprise data center has seen more networking change in the last 24 months than the enterprise campus has in the last ten years. This doesn’t diminish the value of the campus, but rather highlights the explosion of standards and technology inside the data center. Topics of debate and battlegrounds for vendor differentiation range from port speed and scale (1G to 100G) to protocol support and network virtualization. The standards themselves remain in motion, and a few in particular will have significant impact on an enterprise’s network architecture of choice. These include SPB and TRILL (competing standards that address spanning tree limitations), FCoE and DCB (Fibre Channel over Ethernet, plus enhancements that also benefit iSCSI and other storage-over-Ethernet traffic), and of course visibility into and management of virtual switches. As several of these standards are not yet ratified, vendor support can only be gauged by stated intent rather than actual implementation. Continue reading “Data Center Fabrics: Enterprises Often Need a Networking Tailor”→
Large providers should think small and think hosted
The public sector is a prime market for UC
The market for unified communications (UC) is an uncertain field. Advanced solutions are available from multiple vendors, but developing a business proposition that wins over customers remains an arcane art. This should hardly be surprising; even fundamental services such as IP telephony are still far from universally adopted, although take-up is now fairly high. Part of the problem is convincing a CFO that the OpEx outlay on a UC solution (and often a small upfront CapEx spend) will deliver savings that more than cover its cost. Services such as collaboration tools and presence can seem like ‘gimmicks.’ It is up to the provider to educate decision-makers on the cost-saving benefits of features such as click-to-call IPT from an IM-style contact list window, or on how adding shared document collaboration tools into the mix can increase productivity and reduce the need for further calls and possibly travel. Continue reading “SMEs and the Public Sector Point the Way to Selling UC, as Vodafone UK Demonstrates”→