Rena is a Director of Custom Research at Current Analysis, specializing in Delivery Management. She is responsible for ensuring the delivery of actionable recommendations and guidance to clients to assist them in formulating their market development and execution strategies. Her expertise is in telecommunications and IT services, including business networking, communications, data center, security, and business continuity services.
• Capgemini rolled out a suite of six services to help companies leverage the benefits of edge computing and 5G.
• Success will depend on Capgemini’s ability to demonstrate its industry expertise and experience in managing complex ecosystems, and customers’ appetite for new digital transformation projects during the pandemic.
Edge computing and 5G networking are hot technologies that promise to propel businesses into the future by supporting intelligent industry and digital transformation. There is no doubt that the proliferation of connected devices holds the potential to transform how businesses operate, leading to greater efficiency, increased automation, and more insightful and timely data-driven decision-making. Edge computing provides processing power closer to the source of data collection, improving latency and addressing privacy-related concerns. 5G networking enables businesses to transmit large volumes of data to an edge device quickly, reducing latency and costs associated with sending data to the cloud, and facilitating a near-real-time experience. Despite the benefits these technologies promise to deliver, identifying appropriate use cases and deploying the solutions can be a challenge for many organizations. Continue reading “Capgemini’s New Suite of Services Helps Customers Leverage Edge and 5G Technologies”→
A substantial portion of survey respondents indicated that IT budgets will remain relatively unchanged, and an encouraging 20% even reported budget increases due to COVID-19.
Communication and collaboration, cloud services, security, networking, and mobility spending will be most resilient; at least 30% of respondents expect spending increases for these technologies.
Although infectious disease experts had warned of the potential for a pandemic for years, COVID-19 took most organizations by surprise. The majority of large enterprises had extensive business continuity and resiliency plans in place, but most were still scrambling to keep employees securely connected when governments mandated that all but the most essential workers stay home. This sudden change in normal business operations, combined with the economic impact of the global lockdown, has (not surprisingly) influenced IT spending plans for the remainder of the year – and most likely into 2021. Continue reading “COVID-19: GlobalData Survey Reveals Bright Spots for IT Spending”→
• AI was featured prominently during Microsoft Build 2020, with a tagline of ‘Putting AI Into Action’ and a goal of bringing state-of-the-art AI to all developers.
• Microsoft made several announcements that supported this vision, including updates on the Microsoft Project Turing model, investments in infrastructure for AI processing, and the preview of Project Bonsai.
Microsoft’s annual developer conference, Microsoft Build 2020, was held as a virtual event on May 19 and 20, with more than 200,000 people registered. When kicking off Microsoft Build 2020, CEO Satya Nadella noted that the technology industry is being called upon to address the world’s most acute needs, and that developers are now more important than ever. He pointed out that organizations need the ability to remote everything at a moment’s notice, and to simulate and automate everywhere to enable more agile responses. Continue reading “Build 2020: Microsoft ‘Goes Big’ on AI and Demonstrates Thought Leadership”→
In mid-May, AWS highlighted its portfolio of AI tools and solutions during its AWS Summit Online for the Americas region and announced the general availability of Amazon Kendra for enterprises.
Tools that support AI model development and management and pre-built solutions that can be easily deployed by developers who aren’t AI experts help streamline AI adoption.
AWS understands the challenges enterprises face when building their own machine learning models. The company notes that when scaling AI adoption, enterprises face wide-ranging complexities that can start as early as the data collection stage and continue throughout the model management lifecycle. At the beginning of a project, organizations face challenges related to data identification, storage, and curation as they pull together disparate data sources. Later, while building and training models, they need to manage numerous other complexities, such as sharing notebooks and pre-trained models. They need to ensure effective collaboration among what can be a growing number of individuals or teams, each with their own specializations. And, since machine learning models aren’t usually perfect the first time, team members need to communicate during the process of model tuning and optimization. They need to manage multiple versions of models, run experimental models in real time, and compare results. Even after deployment, machine learning algorithms need to be managed and monitored for concerns such as data drift, with newer versions deployed as additional data is collected or the factors that impact model results change. Managing these tasks can be challenging, and as AWS rightly points out, tools that help manage the complexities do much to streamline and speed AI deployments. Continue reading “AWS Aims to Make AI More Accessible for Both AI Specialists and Non-AI Experts”→
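The drift monitoring described above can be illustrated with a minimal sketch. To be clear, this is not AWS’s implementation or any managed monitoring service — it is a generic, stdlib-only illustration of one simple drift signal (a shift in a single feature’s mean, measured in baseline standard deviations); production monitors use far richer statistics across many features.

```python
import statistics

def detect_mean_drift(baseline, live, threshold=2.0):
    """Flag drift when the live feature mean moves more than
    `threshold` baseline standard deviations away from the
    baseline mean. A crude but common first-pass drift signal."""
    mean_b = statistics.mean(baseline)
    std_b = statistics.stdev(baseline)
    shift = abs(statistics.mean(live) - mean_b) / std_b
    return shift > threshold

# Feature values observed during training (the baseline)
baseline = [10.0, 10.5, 9.8, 10.2, 10.1, 9.9, 10.3, 10.0]
# Two batches of live data: one stable, one that has drifted
stable = [10.1, 9.9, 10.2, 10.0]
shifted = [14.8, 15.1, 15.3, 14.9]

print(detect_mean_drift(baseline, stable))   # False
print(detect_mean_drift(baseline, shifted))  # True
```

When a check like this fires, the response the paragraph describes kicks in: retrain on fresh data and deploy a new model version.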
The current pandemic has provided the opportunity to broaden the definition of corporate social responsibility (CSR) and for technology providers to take even greater action.
A key question is whether these initiatives will pay off in the long run; the answer is likely ‘yes,’ but much will depend on a company’s track record prior to COVID-19.
The move toward greater corporate social responsibility (CSR) has been taking place for quite some time. Traditional activities that fall into this category are broad, including educational programs, investments in local or underprivileged communities, increased hiring of minorities, and initiatives to reduce carbon footprints or to support environmentally friendly projects. And the list goes on. The current pandemic has provided the opportunity for technology providers to take even greater action and possibly broaden what should be included in the definition of CSR. Continue reading “COVID-19: Tech Providers Demonstrate Corporate Social Responsibility Leadership Now More Than Ever”→
• Overall, the response by IT services providers (ITSPs) to COVID-19 has been muted, though this is starting to change.
• Now is the time to promote digital transformation initiatives related to workplace virtualization, cloud migration, hybrid or multi-cloud management, IoT, adoption of advanced analytics and RPA, and cybersecurity.
Many data analytics companies are beginning to embed artificial intelligence (AI) capabilities directly into their software, allowing users to reap the benefits of AI-driven insights without developing the machine learning algorithms themselves.
Domo is taking a different approach from some of its peers by working closely with AWS to add automatic machine learning, recommended actions, and drag-and-drop predictive model deployment in the Domo Business Cloud.
Many businesses are eager to reap the productivity and efficiency-enhancing benefits of AI. They have collected and stored vast amounts of data but face challenges when it comes to uncovering the nuggets of insight that can improve operations, enhance customer service, and enable faster and more informed decision-making. One of the biggest hurdles to AI adoption is a lack of resources. Building, training, tuning, and deploying machine learning models is a lengthy process that requires the expertise of expensive data scientists and AI experts. Many businesses don’t have these resources readily available; nor do they have the time or money to invest in acquiring them. Continue reading “Domo Partners with AI Leader to Help Customers Gain Greater Insights from Their Data”→
• AWS launched three initiatives designed to help customers explore quantum processing as well as develop quantum expertise and identify quantum applications.
• Amazon Braket gives users the opportunity to experiment with quantum algorithms; the AWS Center for Quantum Computing promotes collaborative research and development; and the Quantum Solutions Lab will help identify use cases for the technology.
Although quantum computing is still in the early stages, and practical applications for it still need to be developed, there is no doubt that the technology’s impressive processing power holds the potential to have a major impact across vertical industries. It’s only a matter of time before the ability to create and deploy quantum solutions becomes a competitive differentiator, allowing some companies to better leverage the wealth of data they have collected to uncover new insights. Building proficiency in new technologies takes years and can be expensive, but many experts argue that the time is right to start developing internal quantum expertise. With a technology that is just emerging, how and where should enterprises start?
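As a concrete, low-cost starting point, the kind of introductory circuit one would first run on a service like Amazon Braket — a two-qubit Bell pair — can be simulated in a few lines of plain Python. This is a toy state-vector simulation for illustration only, not the Braket SDK; the gate matrices and basis ordering are standard textbook conventions.

```python
import math

# State vector for two qubits, basis order |00>, |01>, |10>, |11>
state = [1 + 0j, 0j, 0j, 0j]

def apply(gate, state):
    """Multiply a 4x4 gate matrix into the 4-element state vector."""
    return [sum(gate[r][c] * state[c] for c in range(4)) for r in range(4)]

h = 1 / math.sqrt(2)
# Hadamard on qubit 0 (left qubit), identity on qubit 1
H0 = [[h, 0, h, 0], [0, h, 0, h], [h, 0, -h, 0], [0, h, 0, -h]]
# CNOT with qubit 0 as control, qubit 1 as target
CNOT = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]]

state = apply(CNOT, apply(H0, state))
probs = [abs(amp) ** 2 for amp in state]
# Entangled Bell pair: only |00> and |11> are ever measured
print([round(p, 3) for p in probs])  # [0.5, 0.0, 0.0, 0.5]
```

Experimenting with circuits like this locally, then submitting the same logic to managed simulators or quantum hardware, is precisely the on-ramp that offerings such as Amazon Braket are designed to provide.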
The ‘democratization of analytics,’ essentially getting analytics tools and insights into the hands of the masses, is the next step forward in a world eager to leverage greater business intelligence.
Tableau is taking on the challenge by providing tools such as Explain Data and Ask Data, which are designed to make it easier for line-of-business users to extract insights from their data visualizations.
There is no doubt that the vast amounts of data being generated today contain a wealth of valuable information. But unlocking the strategic insights contained within this treasure trove of material remains elusive for many. Sure, data scientists and data programmers have the tools to perform the analysis at their fingertips, but their techniques remain out of reach for many line-of-business users. Extracting insights from data and getting them into the hands of those outside of the IT department is a challenge. The ‘democratization of analytics,’ essentially getting analytics tools and insights into the hands of the masses, is the next step forward in a world eager to leverage greater business intelligence. Continue reading “Tableau Tackles the Challenge of ‘Democratizing Analytics’ by Offering New Tools”→
• Competition in the AI chipset space is heating up; new players are looking to join the fray and they are raising impressive amounts of capital.
• New vendors face stiff competition from tech heavyweights such as Nvidia, hyper-scale cloud providers such as Google and Amazon, and well-funded Chinese organizations.
Just as the market for AI platforms is heating up, so is competition in the AI chipset space. And it isn’t only the large, well-established competitors such as Nvidia, Google, and Huawei vying for market share. New players are looking to join the fray as well, and they are raising impressive amounts of capital. Untether, a Toronto-based chip manufacturing start-up, announced in early November that it had raised $20 million in series A funding. The one-year-old company plans to release a chip designed for AI inference using near-memory design, reducing the distance data must travel and thereby moving data to processors at 2.5 petabits per second, which improves overall processing efficiency. Across the pond, Graphcore, a UK-based organization, has raised a substantial $200 million to develop its Intelligence Processing Units (IPUs), parallel processors designed for machine learning.