With the annual Black Hat event in Las Vegas, the global Internet community celebrates its felons.
Like physical combat, Internet security requires a good understanding of enemy black hat strategies.
Last week saw Las Vegas hosting the 15th annual Black Hat event. From its inception in 1997, Black Hat has grown from a single annual conference in Las Vegas (still the main event with the highest stakes) to a global conference series with annual events in Abu Dhabi, Barcelona, Las Vegas and Washington, DC. From its nefarious roots, it airs uncomfortable truths about the insecurities we face every day as global net workers. It’s difficult to find any other industry where crime and passion are so closely aligned and where ‘respect’ and ‘respectable’ are terms so far apart. Cyber-warfare for profit and power lacks any basic ‘Geneva Convention’ that could specify global rules of conduct and the means to prosecute felons. Continue reading “Black Hat Roundup: Keeping Tabs on the Ones That Got Away”→
Permission by presence status fits some corporate cultures but clashes with others.
Customers and providers of UC services cite low adoption and usage by end users as a challenge. Both buyers and providers of UC services have a stake in encouraging end users to adopt UC services; once demanding UC projects have been rolled out, finance directors are keen to see some sort of return on investment. Some UC features fare better than others (telephony typically gets heavy use), and usage varies from user to user, but the power of ‘presence status’ to grant contact permission can both deter and appeal to users. Continue reading “UC Hanging On Users’ Permission”→
An unfortunate series of big-impact cloud outages, including Windows Azure, Salesforce.com, Amazon Web Services, and Twitter, has users fuming and organizations rethinking their on-demand strategies.
Are cloud providers doing enough to address the underlying issues and reassure enterprises? The answer, at least for now, seems to be no.
Summer malaise has hit the cloud in a big way, with a series of service interruptions knocking some of the most popular services offline temporarily and sending some corporate users into a tailspin. Though the root causes of the failures may differ, providers often issue ineffectual mea culpas, citing avoidable issues like Twitter’s “double system failure” or Salesforce.com’s power failure rather than more unpredictable causes like a natural disaster or an unanticipated massive influx of traffic flooding their service. The result leaves already-leery enterprises even more on edge about making the shift to the cloud anytime soon. Continue reading “Service Instability Underscores Serious Cloud Issues – and the Need for Better SLAs”→
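When weighing SLA terms against outages like these, the arithmetic is worth making explicit: even an impressive-sounding uptime percentage permits a surprising amount of annual downtime. The sketch below is a simple illustration of that conversion (the function name and the sample percentages are our own, not drawn from any provider’s actual contract):

```python
# Convert an SLA availability percentage into the maximum downtime it
# permits per year. Pure arithmetic; not based on any specific SLA.

HOURS_PER_YEAR = 365 * 24  # 8,760 hours in a non-leap year

def allowed_downtime_hours(availability_pct: float) -> float:
    """Maximum yearly downtime (in hours) an SLA at this percentage allows."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

# Common SLA tiers and the downtime each one still tolerates:
for sla in (99.0, 99.9, 99.99):
    print(f"{sla}% uptime allows up to "
          f"{allowed_downtime_hours(sla):.2f} hours of downtime per year")
```

A “three nines” (99.9%) guarantee, for instance, still leaves room for nearly nine hours of outage a year, which is one reason enterprises are pushing for tighter terms and meaningful remedies rather than percentages alone.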
A successful mobility campaign for collaboration players requires that attention be paid to document synchronization, editing, and sharing.
These documents must follow users across multiple platforms and devices, not just in number but in kind.
Many of us in the analyst industry have watched momentum in the battle for the desktop swing back and forth between various operating systems, favoring from time to time brands such as Microsoft, Apple, and even Linux. However, at all times in this ever-evolving battlefield, Microsoft has held the one key necessary to unlock (read, dominate) the enterprise. That key, which has remained tucked up securely in the pocket of one Mr. Bill Gates from Redmond, Washington, is Microsoft Office. Continue reading “Documents, Not Just Operating Systems, Are Key to Mobility”→
It takes only minutes for a sophisticated attacker to breach an enterprise network, but it can take months to uncover their presence.
Reducing that time to discovery can minimize the damage done, but there are multiple ways to try to achieve faster detection. Which route should you choose?
I had an interesting conversation the other day with a company in the still fairly small market niche called incident response, and it got me thinking about the evolution of the threat landscape and the time that it takes enterprises to respond to new market conditions – especially in the security market. I think by now most large enterprise security administrators and CISOs understand that it is not a matter of if, but when their organization will experience a breach – one that could potentially be very painful for the whole organization. But recognizing that sad fact does not help those administrators and executives understand the most effective way to tackle the new challenge presented by more sophisticated, stealthy, multi-stage attacks. Exacerbating their dilemma is an increasingly porous enterprise perimeter, where computing workloads are shifted outside the traditional DMZ and end users are allowed (or go around policies that prohibit) access to corporate data from their own smartphones, tablets and even laptops. Continue reading “Okay, Breaches Are Inevitable: So Now What Do We Do?”→
For years, operators have been trying to crack the code on how to offer cost-effective global M2M services that can span multiple network footprints.
Two alliances – one from Jasper Wireless-enabled operators and one from standards bodies across the world – point to new models that may accelerate the growth of global M2M.
Over the last couple of weeks, there have been some unusual partnership announcements from the M2M ecosystem that may solve a set of problems which have thwarted the widespread growth of global M2M deals. Last week, KPN, NTT DoCoMo, Rogers Communications, SingTel, Telefónica, Telstra, and Vimpelcom agreed to form an alliance to support a single, global platform that multinational businesses will leverage to enable connected devices to span multiple countries cost-effectively with a uniform SIM. Since these operators all use the Jasper Wireless service delivery platform (as does AT&T, which was not part of the partnership), they can also service customers consistently; the Jasper platform provides a uniform portal for service activation, SIM management, troubleshooting, and subscription/rate management. While a number of operators such as Orange, Vodafone, and AT&T have already developed global SIMs that can be used in multiple countries to simplify inventory management and related expenses, there has still been a lot of work to do to make global M2M seamless and cost-effective. Some operators have also been working together to offer global connectivity at prices below the traditional expensive commercial roaming rates, but in many cases, enterprises have either had to set up relationships with multiple carriers or gone to aggregators to try to bridge together a global network with a single point of contact and single contract. Continue reading “Global M2M Partnerships Heat Up, but Will They Succeed?”→
802.11n, which topped out at roughly 500 Mbps in ideal cases, never filled the 1 Gbps links to which many access points were connected, avoiding bottlenecks at the access port itself (though potentially congesting aggregation links).
802.11ac, with its initial specification release capably supporting 1.3 Gbps throughput on a single AP, may force a ‘re-think’ on access point attachment and how traffic will be routed onto the physical infrastructure and ultimately back to the data center or services location.
Wireless enterprise networks are a must today for both efficiency and convenience; more frankly, they are necessary to be competitive. The market gets this, as indicated by the continued healthy growth of WLAN as a segment. Originally, 100 Mbps links often connected 802.11a/b/g APs, and given that actual throughput was often well below 802.11g’s 54 Mbps signaling rate, no bottlenecks were encountered. Then came 802.11n; in many cases, it was either preceded by or coupled with a Gigabit network upgrade, sufficient to support the initial 150/300 Mbps rates and the scaling to 600 Mbps (in a perfect world), as well as multiple radio technologies. This is still well below the 1 Gbps links that in some cases supply both connectivity and power (PoE) to the 802.11n access points. However, with the next-generation 802.11ac specification nearing completion and its initial release providing up to 1.3 Gbps of connectivity, we reach the first throughput bottleneck from the AP to the wired environment. No debate has yet surfaced in public forums over how one would wire and architect an 11ac network, but it is certain to become an issue in the coming quarters as commercial products become available. There is currently no specification for 10GBASE-T PoE, few (if any) access points in the past have had multiple Ethernet ports to connect to the network, and the current link technology employed (1GbE) will be oversubscribed. Continue reading “Wireless: 802.11ac May Break Your Wired Network”→
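The generational progression above can be reduced to a back-of-the-envelope ratio: peak radio rate divided by wired uplink capacity. This sketch (our own simplification — it compares raw signaling rates, whereas real-world throughput runs well below the PHY rate) shows why wave-1 802.11ac is the first generation to oversubscribe a 1 GbE uplink:

```python
# Back-of-the-envelope check of whether an AP's peak radio rate can
# oversubscribe its wired uplink. Uses nominal PHY rates, which overstate
# real-world throughput; the point is the crossover, not the exact figures.

def oversubscription_ratio(phy_rate_mbps: float, uplink_mbps: float) -> float:
    """Ratio of the AP's peak radio rate to its wired uplink capacity."""
    return phy_rate_mbps / uplink_mbps

# 802.11g on a 100 Mbps link; 802.11n and wave-1 802.11ac on 1 GbE:
generations = [
    ("802.11g",        54,   100),
    ("802.11n",        600,  1000),
    ("802.11ac wave 1", 1300, 1000),
]

for name, phy, uplink in generations:
    ratio = oversubscription_ratio(phy, uplink)
    flag = " <- uplink is the bottleneck" if ratio > 1 else ""
    print(f"{name}: {ratio:.2f}x of uplink capacity{flag}")
```

Only the 802.11ac row exceeds 1.0, which is the crux of the re-architecture question: short of 10GBASE-T with PoE or link aggregation across multiple ports, the access link itself becomes the constraint.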
Automation jeopardizes the flexibility clients need when deploying new technology.
Smaller services deals expose buyer shortcomings that demand early due diligence and lifelong flexibility from providers.
The cloud brings exciting innovations that increase the potential of, and customer choices for, unified communications and workspace services. With that fresh potential comes the possibility of clients making more errors in buying decisions and specifications. Service providers and third parties are also likely to make genuine mistakes when advising clients on strategies that exploit new technologies. Continue reading “Keep Flexible to Keep Customers”→
Often criticized for failing to exploit partner relationships to expand its cloud position, Microsoft introduced new Office 365 programs that should stimulate sales through hosting partners.
At the same time, Microsoft previewed new technologies that provide key elements hosting partners will need to offer hybrid solutions.
Microsoft has long withered under accusations that it was failing to address the threat the cloud poses to its desktop dominance. Criticized first for failing to move quickly enough into the cloud and then later for not capitalizing well enough on third-party partners to extend sales of its cloud-based Office 365 solution, Microsoft was often called out for clinging too long to conventional licensing models even as enterprise clients urged the company to embrace a subscription-based delivery model. However, a series of new partner programs and some associated technology reveal a company that is more than ready to take on any rivals in the cloud. Continue reading “Microsoft Makes a Serious Cloud Play”→