Edge computing solutions will become more diverse and sophisticated in 2020, giving enterprise buyers greater choice but also complicating procurement decisions.
Hyperscale clouds will deepen their engagement with emerging edge computing ecosystems in 2020, expanding relationships with existing partners and forging new ones.
Edge computing commanded increased attention in 2019, amid growing recognition that IT architectures need to evolve to support new low-latency, data-rich digital services and applications. Edge computing involves the use of computer processing, data storage, and analytics capabilities close to the places where data is collected and where digital content and applications are consumed. Benefits include the higher performance that can be achieved by powering applications closer to points of consumption, as well as the ability to make faster decisions about data collected from Internet-connected sensors on factory floors, on transportation networks, and in retail outlets and other locations.
• Kubernetes, service mesh, and low-code will drive DevOps innovations
• API security will enhance API gateway offerings
Kubernetes to Consolidate OpenStack:
In 2020 Kubernetes will bridge technologies between the OpenStack and platform providers’ market segments, highlighting the growing role of traditional infrastructure vendors in app modernization. Kubernetes has become a rallying point, helping infrastructure and cloud providers alike build out their systems and solutions in ways that take advantage of the benefits of cloud services. Cloud providers are preparing cloud management tools to enable better collaboration under a DevOps model and to help enterprises through the entire process, from building containerized apps to managing Kubernetes clusters to providing fixes and debugging potential problems. Continued emphasis will be placed on new and enhanced OSS technologies in ongoing efforts to improve Kubernetes.
• Looking to build good artificial intelligence (AI)? Don’t let the speed and availability of open source frameworks, modules, libraries, and languages lull you into a false sense of confidence.
• Good AI needs to start with good data, and good data needs to be ingested, registered, described, validated, and processed well before it reaches the ready hands of AI practitioners.
These are heady times. Enterprises have at their disposal both the raw materials and the necessary tools to achieve great things with AI, be that something as grandiose as self-driving cars or as unassuming as a fraud detection algorithm. The trouble with an abundance of materials (e.g., data) and tools (e.g., open source machine learning models), however, is speed. Speed kills, as they say.
For AI practitioners, this means learning to run before learning to walk by hastily automating decisions via AI models built on unsound data. With a few simple open source frameworks, modules, libraries, and languages, seemingly useful but ultimately erroneous predictions and conclusions can be drawn from any old data set in very short order. What’s the answer? More or better tools? No. As with most human problems, good old human know-how and understanding are necessary. And that begins with data.
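To make the "start with good data" point concrete, here is a minimal sketch of a validate-before-you-model gate. It is purely illustrative and not from the original post; the field names (amount, merchant_id, is_fraud) and rules echo the fraud-detection example above but are entirely hypothetical.

```python
# Illustrative sketch: reject training data that fails basic quality checks
# before any model ever sees it. Schema and rules below are hypothetical.
def validate_training_rows(rows):
    """Return a list of data-quality problems; an empty list means the data passes."""
    problems = []
    required = {"amount", "merchant_id", "is_fraud"}  # hypothetical schema
    for i, row in enumerate(rows):
        missing = required - row.keys()
        if missing:
            problems.append(f"row {i}: missing fields {sorted(missing)}")
            continue  # other checks are meaningless if fields are absent
        if row["amount"] is None or row["amount"] < 0:
            problems.append(f"row {i}: invalid amount {row['amount']!r}")
        if row["is_fraud"] not in (0, 1):
            problems.append(f"row {i}: non-binary label {row['is_fraud']!r}")
    return problems
```

The design choice is deliberate: the gate returns a full list of problems rather than raising on the first one, so practitioners can see the overall shape of what is wrong with a data set before deciding whether to fix or discard it.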
Malaysia is a step closer to seeing commercial 5G, following various live trials and commercial partnerships involving the regulator and telcos.
5G could drive the current fixed-mobile convergence trend in the country. However, spectrum availability remains unclear and could delay the commercial availability of the technology.
The 5G Race Is Heating Up
The regulator, MCMC, has been proactive in driving 5G in Malaysia. It has created – and leads – the 5G working group, which facilitates collaborations to roll out 55 5G use cases across 32 cities, and it is looking to bring forward spectrum allocation for the technology from 2021 to the second half of 2020. Apart from the various initiatives by the regulator, the telcos are beefing up their preparations for the technology. The mobile market leader, Maxis, signed a partnership with Huawei in October to deploy 5G and co-develop use cases. Celcom is collaborating with Ericsson and UTM KL (a university in Kuala Lumpur) to launch a live trial on the university campus and drive the research and development of 5G use cases. Digi is partnering with Cyberview (a technology hub enabler) to launch 5G OpenLab to co-create 5G applications in the smart city vertical. TM, the fixed-line incumbent, announced its partnership with Huawei on 5G commercialization, and it plans to be the first telco to deploy a 5G standalone (SA) network. The telcos also recently kicked off their respective live 5G trials as part of the MCMC initiative, e.g., Celcom’s 360-degree 4K surveillance in Langkawi, Maxis’ AR experience at Aquaria KLCC, and TM’s 8K VR and Smart Tourism trials.
Verizon’s annual Payment Security Report captures a snapshot of organizations struggling to sustain successful controls and best practices over time.
The evidence shows that those who do sustain them are rewarded with a better-fortified defense against breaches.
Fifteen years after the payment card industry settled on a single data security standard with PCI DSS, there are indications that too many organizations’ security practices haven’t reached the level of maturity that would be expected by this point. In Verizon’s annual survey of payment card industry security practices, only 37% of the 302 surveyed enterprises sustain full compliance with the 12 specifications outlined in PCI DSS over time. Effectively, most organizations are focusing on meeting the basic requirements rather than developing consistent and effective security practices – not unlike a procrastinating student who is just looking to pass the test. Just 18% check whether they are meeting PCI DSS specifications more often than the standard mandates.
• Hardware and software are not an either/or proposition, but a balance that requires investment, time, and planning.
• Silicon One, a cloud-friendly IOS XR7, and heavy silicon photonics investments give Cisco a new lease on the Internet for the Future.
There have been a lot of changes and announcements at Cisco recently, some of them surprising. This raises the question: who is Cisco really today? Some would tell you that Cisco is one of the old guard, a legacy IT vendor desperate to keep its market dominance in the face of younger, smaller, and more agile competitors. However, a close examination of the evidence reveals something else: not a hidebound legacy vendor, but a survivor changing to match the market.
On December 11th, Cisco had its “Internet for the Future” launch event in San Francisco. The event was unusual because, for the first time in a long time, it featured at its core a new from-the-ground-up chip family, called Cisco Silicon One. It also featured Cisco, for the first time, touting the possibility of becoming a silicon supplier, outlining the option to sell silicon from the Cisco Silicon One family to anyone, including competitors. Add in the silicon photonics investments (pluggable optics for Cisco and third parties) and an open, cloud-friendly network operating system (IOS XR7) capable of running on white box or Cisco hardware, and you have some built-in contradictions to what would be considered traditional Cisco.
So, what does this all mean for Cisco? Is it turning its back on all of its proclamations of being a software-focused company? Critics would say that this announcement proves that Cisco is a hardware company, period. That is a simplistic and reductive argument that fails to consider any nuance or the reality of producing infrastructure. Instead of viewing software and hardware as rivals for the spotlight, they should be viewed as climbers helping each other reach the summit, each trading the lead. Cisco’s investment in silicon shouldn’t be viewed as a return to hardware centricity, but as an investment in an area where it can provide extra value for partners and customers. Software is supreme right now, but things are cyclical, and it is very possible to be software focused and still produce advantages in hardware, especially when you consider physical realities like power consumption, heat, bandwidth, and chip architecture. None of this diminishes Cisco’s commitment to software, nor does it signal a Cisco or market move to hardware-first. In fact, whole business units outside of networking, namely security and collaboration, have moved rapidly and wholesale to software-as-a-service as a business model.
According to a new forecast from GlobalData, the market for enterprise mobility management (EMM) software reached $13.3 billion in 2019, a year in which mobile application management caught up to mobile device management in terms of revenues.
Does this growth imply higher revenue potential for mobile operators in selling managed mobility services?
According to a new forecast from GlobalData, the market for EMM software reached $13.3 billion in 2019, a year in which mobile application management caught up to mobile device management in terms of revenues. In fact, the five-year CAGR for mobile application management, at 27%, is significantly higher than for the other three software capabilities in the forecast, with mobile device management at 18.8%, telecom expense management at 10.3%, and mobile content management at 13.6%.
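For readers comparing those growth figures, a compound annual growth rate compounds multiplicatively rather than adding up year over year. A quick arithmetic sketch (the $1.0 billion base revenue below is hypothetical, not a figure from the forecast; only the 27% CAGR is from the post):

```python
# Illustrative arithmetic: projecting a revenue figure forward at a given
# compound annual growth rate (CAGR).
def project(base, cagr, years):
    """Compound `base` at `cagr` (e.g. 0.27 for 27%) over `years` periods."""
    return base * (1 + cagr) ** years

# Hypothetical $1.0B starting revenue, compounded at the 27% five-year CAGR
# cited for mobile application management.
five_year_revenue = project(1.0, 0.27, 5)
```

At a 27% CAGR, revenue roughly triples over five years, which is why the gap versus segments growing at 10–19% widens quickly even from similar starting points.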
New partnership initiatives involving AT&T, Microsoft, Verizon, and Amazon Web Services highlight the extent to which 5G and edge computing innovations are starting to take shape.
Initiatives aim to combine cloud resources with 5G network infrastructure in physical locations close to where low-latency and high-performance apps will be developed and consumed.
Two important announcements from the past couple of weeks illustrate how quickly 5G and edge computing may be becoming a reality.
First, AT&T and Microsoft announced an initiative that will see Microsoft make its Azure-branded cloud services available within so-called ‘edge locations’ on AT&T’s newly deployed 5G network. This will ensure that Microsoft’s cloud infrastructure can be used to support the development and delivery of new digital services at locations that are geographically closer to consumer and business devices, including Internet of Things (IoT) endpoints. Traditionally that infrastructure had to be accessed from one of Microsoft’s regionally distributed cloud data centers. However, making Microsoft’s cloud infrastructure available at the edge of 5G networks means that data generated by IoT sensors can be processed at higher speeds and new services like autonomous cars, virtual reality (VR) and augmented reality (AR)-enabled immersive experiences, and cloud-based gaming can be offered with higher levels of performance.