• Snowflake’s annual summit sees a slew of platform, data management, and partnership announcements.
• These enhancements bolster its competitive position, but the market remains crowded.
At its annual summit in early June, Snowflake announced several enhancements to its data cloud platform, services, and partnership program. These strengthen its competitive position, making the platform more appealing to a wider range of user roles (such as developers) and industries.
Several announcements improve support for developers, including Java and Scala support in the Snowpark framework, which lets developers build queries in their preferred programming languages and execute them on Snowflake’s cloud. Snowflake also announced support for user-defined functions (UDFs) in Java, greater ability to handle unstructured data, and support for SQL APIs.
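The heart of the Java UDF announcement is that a handler is just an ordinary class with a public static method, which Snowflake wires to a SQL function name at registration time. The class name, method, and masking logic below are hypothetical examples invented for illustration, not code from the announcement:

```java
// Illustrative sketch only: a plain Java class of the shape Snowflake's
// Java UDF support expects -- a public static method acting as the handler.
// The function, class, and masking rule here are hypothetical.
public class EmailMasker {
    // Handler method: masks the local part of an e-mail address,
    // keeping the first character and the full domain.
    public static String mask(String addr) {
        if (addr == null || !addr.contains("@")) {
            return addr; // pass through values that are not e-mail shaped
        }
        int at = addr.indexOf('@');
        String local = addr.substring(0, at);
        String masked = local.isEmpty() ? local : local.charAt(0) + "***";
        return masked + addr.substring(at);
    }

    public static void main(String[] args) {
        System.out.println(mask("jane.doe@example.com")); // j***@example.com
    }
}
```

In Snowflake, a handler like this would be registered with a statement along the lines of `CREATE FUNCTION mask_email(addr VARCHAR) RETURNS VARCHAR LANGUAGE JAVA HANDLER = 'EmailMasker.mask'` and then called from SQL like any built-in function.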
• The IBM Cloud Satellite on AT&T MEC announcement is the latest sign that the mobile edge compute market is heating up.
• Telco edge offers advantages in speed, latency, and security; operators should build out both use cases and partnerships to add the most value for enterprises.
Earlier this week, IBM and AT&T announced a new partnership that will see IBM develop a set of computing use cases leveraging its newest product for on-premises and edge cloud environments, IBM Cloud Satellite, which will run on AT&T’s Mobile Edge Compute (MEC) network. IBM environments will run on AT&T edge servers with private 5G connectivity to customer sites. The partnership will also involve Red Hat, acquired by IBM in 2019, which will focus on developing containers for the 5G MEC applications, enabling them to be readily moved from cloud to edge to on-premises environments. Red Hat has a long history of developing open software for telcos across use cases like OSS/BSS and network virtualization, and is a strength for IBM in the telco edge.

Initially, the project will launch at IBM’s Thomas J. Watson Research Center, where IBM and AT&T hope to collaborate on a set of 5G edge compute use cases aligned to key vertical industries such as healthcare, manufacturing, construction, energy, and utilities. The aim is to develop solutions that apply technologies like AI and analytics to mission-critical applications such as worker health and safety, industrial automation, and remote control of machines and networks.

This IBM partnership is one of many for AT&T’s MEC solution. The carrier is also working with HPE, Google Cloud Platform, and Microsoft Azure to develop a set of cloud computing solutions that run on the telco edge. Further, AT&T is far from the only carrier in the race to develop enterprise edge use cases alongside the big cloud players. SKT, Verizon, Orange, Telefonica, and more all have partnerships with cloud providers like AWS, Azure, IBM, Google, and HPE, with the aim of developing a new market for telco edge compute. Continue reading “Telco Edge: The Next Big Battleground in Cloud Computing”→
Robert Stoneman, Principal Analyst for local government at GlobalData, analyzes the key messages from the second day of techUK’s “Building the Smarter State” conference.
Day two of techUK’s flagship conference for 2020 took on a decidedly local theme compared to the day before, with the pandemic response of local public sector organizations firmly in the spotlight. With close to 130 attendees, and despite moving online in the current circumstances, the conference continued to be an industry-leading forum, with a range of high-caliber speakers detailing the latest developments in public service ICT.
Kris Burtwistle, Head of UK Local Government at conference sponsor AWS, kicked day two off by summarizing some of AWS’s work with local authorities and introducing many of the key themes of the day. These included the rapidity with which councils embraced home working during lockdown; he cited how the London Borough of Waltham Forest moved many of its contact center staff to home working in just a matter of days. Continue reading “Building a Local Smarter State in the UK, Post-Pandemic”→
• Hardware and software are not an either/or proposition, but a balance that requires investment, time, and planning.
• Silicon One, a cloud-friendly IOS XR7, and heavy silicon photonics investments give Cisco a new lease on the Internet for the Future.
There have been a lot of changes and announcements at Cisco recently, some of them surprising. This raises the question: who is Cisco today? Some would tell you that Cisco is one of the old guard, a legacy IT vendor desperate to keep its market dominance in the face of younger, smaller, and more agile competitors. However, a close examination of the evidence reveals something else: not a hidebound legacy vendor, but a survivor changing to match the market.
On December 11th, Cisco held its “Internet for the Future” launch event in San Francisco. The event was unusual because, for the first time in a long time, it featured at its core a new, from-the-ground-up chip family: Cisco Silicon One. It also featured Cisco, for the first time, touting the possibility of becoming a silicon supplier, outlining the option to sell silicon from the Silicon One family to anyone, including competitors. Add in the silicon photonics push (pluggable optics for Cisco and third parties) and an open, cloud-friendly network operating system (IOS XR7) capable of running on white box or Cisco hardware, and you have some built-in contradictions to what would be considered traditional Cisco.
So, what does all this mean for Cisco? Is it turning its back on its proclamations of being a software-focused company? Critics would say this announcement proves that Cisco is a hardware company, period. That is a simplistic and reductive argument that fails to consider nuance or the reality of producing infrastructure. Instead of viewing software and hardware as rivals for the spotlight, they should be viewed as climbers helping each other reach the summit, each trading the lead. Cisco’s investment in silicon shouldn’t be viewed as a return to hardware centricity, but as an investment in an area where it can provide extra value for partners and customers. Software is supreme right now, but things are cyclical, and it is entirely possible to be software focused and still produce advantages in hardware, especially when you consider physical realities like power consumption, heat, bandwidth, and chip architecture. None of this diminishes Cisco’s commitment to software, nor does it signal a Cisco or market move to hardware-first. In fact, whole business units outside of networking, namely security and collaboration, have moved rapidly and wholesale to software-as-a-service as a business model.
• Ultra broadband access will drive enterprise digital transformation, forcing requirements for more agile telco network services, including cloud access and multi-cloud connectivity.
• The key factor is the decoupling of service and network management through the use of overlay/underlay networks, resulting in more flexible solutions that can be deployed quickly.
Huawei hosted UBBF 2018 in Geneva last week, bringing telcos, enterprises, and analysts up to date on its efforts in ultra broadband access (i.e., technology capable of 500 Mbps to 1 Gbps bandwidth). Curiously, the program also included significant time and content dedicated to B2B services, with a focus on cloud-network synergy and the benefits to service providers and users. At times, the message wasn’t completely clear on which clouds Huawei was including in its vision (e.g., telco network clouds, telco public clouds, OTT clouds), but eventually, the vendor’s ideas for using a cloud management system to offer enterprises a one-stop shop for network and cloud services, using SD-WAN for multi-cloud connectivity, came through in several proposed use cases. (For more detail on this topic, see the full advisory report, “UBBF 2018: Cloud-Network Synergy High on Agenda at Huawei’s Ultra Broadband Forum,” published by GlobalData on September 17, 2018.) Continue reading “Huawei UBBF: Cloud-Network Synergy Can Drive Managed Cloud Services for Telcos”→
• U.S. communications service providers are racing to launch 5G services this year.
• What we really expect are 2019 deployments, as standards finalize and devices are commercialized.
U.S. Providers’ 5G Rollout Plans
In the U.S., 5G rollouts are planned for 2018 by all four major wireless operators. However, the launch dates, use cases, and underlying technologies all differ somewhat. While the other three operators are planning mobile rollouts from the beginning, Verizon is sticking with fixed broadband for now. And while AT&T, Sprint, and T-Mobile claim mobile launches in 2018, standardized 5G, along with devices that can run on it, is not expected until 2019.
Meanwhile, the use cases for 5G in the enterprise are still TBD. Aside from faster, lower-latency services and the futuristic advent of driverless cars and surgeon-free operations, 5G allows for more granular pricing and use-case types via its “network slicing” capability. This lets network operators choose the characteristics they need per slice, such as the level of latency, throughput, and the number and type of devices to be supported; these in turn affect the pricing model.
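To make the slicing idea concrete, the per-slice characteristics described above can be pictured as a small configuration object. This is purely an illustrative sketch: the class, field names, and the two example profiles are invented for this post and do not reflect any operator’s actual API or slice catalog:

```java
import java.util.List;

// Hypothetical model of the per-slice characteristics an operator
// might expose: latency, throughput, and device count per slice.
public class SliceProfile {
    final String name;
    final int maxLatencyMs;      // target end-to-end latency for the slice
    final int minThroughputMbps; // guaranteed downlink throughput
    final int maxDevices;        // number of devices supported on the slice

    SliceProfile(String name, int maxLatencyMs, int minThroughputMbps, int maxDevices) {
        this.name = name;
        this.maxLatencyMs = maxLatencyMs;
        this.minThroughputMbps = minThroughputMbps;
        this.maxDevices = maxDevices;
    }

    public static void main(String[] args) {
        // Two illustrative slices: one tuned for latency-critical control,
        // one for massive, low-bandwidth sensor deployments.
        List<SliceProfile> slices = List.of(
            new SliceProfile("industrial-control", 5, 100, 1_000),
            new SliceProfile("massive-iot", 200, 1, 1_000_000));
        for (SliceProfile s : slices) {
            System.out.printf("%s: <=%d ms, >=%d Mbps, up to %d devices%n",
                s.name, s.maxLatencyMs, s.minThroughputMbps, s.maxDevices);
        }
    }
}
```

A pricing model could then key off these parameters, for example charging a premium for the low-latency, high-throughput profile while pricing the massive sensor slice per device.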
Benefits for Consumers and Business Users
According to 5G technology suppliers, the benefits of 5G to consumers will include higher quality, faster speeds, wider coverage (indoors and out), and lower latency (down by 10x). This translates to better support for applications that use streaming video or target interactive gaming. 5G will also support the growing market for applications that use augmented and virtual reality technologies.
In the enterprise, suppliers note that massive communications traffic is expected from sensors embedded in roads, railways, and vehicles, which will not only send information to the cloud or to edge processing devices for analysis but will also send data to each other. 5G also aims to leverage its inherent reliability and low latency to control critical services and infrastructure for public safety, government organizations, and utilities. Real-time video streaming, support for IoT applications such as autonomous vehicles, and advanced use of robotics in manufacturing are other likely use cases in the not-too-distant future.
While service providers have not yet set prices, a major objective for 5G is to lower data transmission costs compared to 4G LTE by making bandwidth utilization more efficient and leveraging new higher-band spectrum. However, operators tend to charge what they can get companies and consumers to pay; they are not certain to pass these economies of scale and technology down to the end customer, especially for such a premium service.
But there remain skeptics about the use cases for 5G: will they be different enough from 4G to allow operators to recoup their investments? Are 2018 launches meaningful when devices won’t be ready until 2019? And as far as the race to launch services is concerned – does it really matter which operator gets there first? Should enterprises wait to deploy fixed or mobile broadband or IoT services until they have 5G available? Probably not.
• With big data and analytics, older ideas like predictive analytics and AI are coming together to solve long-standing problems, most notably data quality.
• Sisense is adding another twist by taking advanced design and visualization concepts and putting those to work at the very beginning of the analytics lifecycle.
Invention invariably involves theft. Each generation of inventors stands on the shoulders of its predecessors, borrowing freely from their available pool of knowledge. Ideas are deconstructed, mixed up, and reapplied in new ways and within unexpected contexts to form, well, something new. Sometimes these new inventions are simply the opportunistic reinterpretation of an existing idea, taking something unique but impractical and turning it into something incredibly useful. That’s the way it was with the invention of the automobile, the light bulb, and the radio. And that’s how it is with big data and analytics, where older ideas are only now coming together to solve long-standing problems. Continue reading “To Improve Data Quality, Sometimes the Best Place to Start is at the Very End”→
• Collaboration vendors’ vague industry jargon tricks people into believing something important lies behind the technology they represent, rather than describing how the technology can be applied to solve business problems.
• Vendors should instead use plain, instructive language to explain how their technology can be a strategic asset that helps organizations meet their business objectives.
Every industry has its own unique jargon and buzzwords. Sometimes it’s useful, serving as a shortcut to ‘make sure we are all on the same page’; however, I have sat through far too many empty, jargon-laden vendor presentations and grown annoyed at how ambiguous jargon inhibits effective vendor communication. Continue reading “The Bad Habits of Using Business Buzzwords”→