Enterprise big data and analytics cuts through the hype to make sense of data collection, storage, management, dissemination and discovery technologies, all employed collectively as a means of realizing corporate efficiencies and uncovering business opportunities.
• Looking to build good artificial intelligence (AI)? Don’t let the speed and availability of open source frameworks, modules, libraries, and languages lull you into a false sense of confidence.
• Good AI needs to start with good data, and good data needs to be ingested, registered, described, validated, and processed well before it reaches the ready hands of AI practitioners.
These are heady times. Enterprises have at their disposal both the raw materials and the necessary tools to achieve great things with AI, be that something as grandiose as self-driving cars or as unassuming as a fraud-detection algorithm. The trouble with an abundance of materials (e.g., data) and tools (e.g., open source machine learning models), however, is speed. Speed kills, as they say.
For AI practitioners, this means learning to run before learning to walk by hastily automating decisions via AI models built on unsound data. With a few simple open source frameworks, modules, libraries, and languages, seemingly useful but ultimately erroneous predictions and conclusions can be drawn from any old data set in very short order. What’s the answer? More or better tools? No. As with most human problems, good old human know-how and understanding are necessary. And that begins with data.
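The point about validating data before it reaches a model can be made concrete with a few lines of Python. This is a minimal sketch; the record schema, field names, and rules below are hypothetical, purely for illustration:

```python
# Minimal illustration: screen raw records before they ever reach a model.
# The schema and validation rules here are invented, not from any real pipeline.

def validate_record(record, required_fields=("id", "amount", "timestamp")):
    """Return a list of problems found in a raw record; empty means clean."""
    problems = []
    for field in required_fields:
        if record.get(field) is None:
            problems.append(f"missing field: {field}")
    amount = record.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        problems.append("negative amount")
    return problems

raw = [
    {"id": 1, "amount": 42.0, "timestamp": "2019-11-01T09:00:00"},
    {"id": 2, "amount": -5.0, "timestamp": "2019-11-01T09:05:00"},
    {"id": 3, "amount": 17.5, "timestamp": None},
]

# Only records that pass every check are allowed into the training set.
clean = [r for r in raw if not validate_record(r)]
print(len(clean))
```

Even a simple gate like this separates the "any old data set" from data an AI practitioner can reasonably build on; in practice the rules would come from the registration and description steps noted above.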
The ‘democratization of analytics,’ essentially getting analytics tools and insights into the hands of the masses, is the next step forward in a world eager to leverage greater business intelligence.
Tableau is taking on the challenge by providing tools such as Explain Data and Ask Data, which are designed to make it easier for line-of-business users to extract insights from their data visualizations.
There is no doubt that the vast amounts of data being generated today contain a wealth of valuable information. But unlocking the strategic insights contained within this treasure trove remains elusive to many. Sure, data scientists and data programmers have the tools to perform the analysis at their fingertips, but their techniques remain out of reach to many line-of-business users. Extracting insights from data and getting them into the hands of those outside of the IT department is a challenge.
At its annual user conference, Tableau rolled out several data prep and management capabilities, highlighted by the ability for Tableau Prep Builder users to write out to third-party data stores, such as the popular Snowflake solution.
This, along with several ongoing cloud and database initiatives, marks a significant philosophical shift for the vendor away from pure analytics and toward a more complete solution to help buyers establish a company-wide data culture.
For a company powered by analytics, Tableau put very few of its own on display during its annual user conference in Las Vegas last week. Certainly, there were numerous stats to be found, particularly relating to the adoption of Tableau Online, which now counts 15,000+ active customer accounts. What’s more, Tableau Online is maintaining 100% YoY growth, as reported by CEO Adam Selipsky – a fitting fact given that Tableau and its new, cloud-first parent company, Salesforce, are now free to talk integration and rationalization.
• Competition in the AI chipset space is heating up; new players are looking to join the fray and they are raising impressive amounts of capital.
• New vendors face stiff competition from tech heavyweights such as Nvidia, hyper-scale cloud providers such as Google and Amazon, and well-funded Chinese organizations.
Just as the market for AI platforms is heating up, so is competition in the AI chipset space. And it isn’t only the large, well-established competitors such as Nvidia, Google, and Huawei vying for market share. New players are looking to join the fray as well, and they are raising impressive amounts of capital. Untether, a Toronto-based chip manufacturing start-up, announced in early November that it had raised $20 million in series A funding. The one-year-old company plans to release a chip designed for AI inference that uses a near-memory design: by reducing the distance data must travel, it can move data to processors at 2.5 petabits per second, improving overall processing efficiency. Across the pond, Graphcore, a UK-based organization, has raised a substantial $200 million to develop its Intelligence Processing Units (IPUs), parallel processors designed for machine learning.
Just one short year after an internal reorganization to more fully meld artificial intelligence (AI) with data and analytics, IBM is back with a new, more accessible vision for IBM Watson.
This time around, the company isn’t focused on game shows or scientific discovery but instead on solving very basic, often human-centric challenges.
When it comes to chasing the market’s heated but somewhat unrequited love affair with AI, IBM has certainly done its part in terms of generating hype for its multi-billion-dollar investment in IBM Watson. That hype, which has taken aim at some rather lofty goals such as identifying and diagnosing cancer, has not fully panned out, with some early adopters scaling back or halting operations altogether due to concerns over cost and efficacy.
• IBM is poised to grow its cloud services business by helping customers to accelerate their migration of mission-critical applications to the cloud.
• IBM brings a lot of value by helping customers remove complexities in their cloud migration and avoid vendor lock-in through an open source, hybrid-cloud, and multi-cloud strategy.
IBM held its annual analyst event – IBM Asia-Pacific 2019 Analyst Insights – in August 2019. The event was held in Singapore and coincided with THINK Singapore 2019, which was the first THINK event in the country. The THINK event helped IBM showcase its capabilities to customers in the country as it starts to ramp up customer engagement. Over the past few years, companies have experimented with AI and moved non-critical workloads to the cloud. IBM now advocates moving from experimentation to more substantial transformation to gain speed and scale. This will involve moving mission-critical applications (80% are still kept on-premises) to the cloud for scalability and agility. While this is ideal, to benefit from cloud-native features, enterprises need to deal with many layers of complexity ranging from regulations and compliance through to re-architecting legacy systems, security considerations, underlying infrastructure, and change management (people and processes).
• HPE announced plans to acquire MapR, augmenting its data analytics portfolio with proprietary file system technology.
• HPE’s purchase reinforces the message that to derive true value from an artificial intelligence (AI) implementation, enterprises need to master the basics of data management.
Life isn’t always as it seems, and the same can be said of AI. Sure, the sexy parts of AI are the platforms, the algorithms, the APIs, and the use cases. We are enamored with the natural language processing capabilities, the predictive maintenance, the improved decision making, and the ability to provide a more personalized customer experience. But there is also the intrigue. The seedy underbelly of AI comprises the ethical concerns that reveal the potential dark sides of the technology. What if models result in unfair bias against a specific gender or race? What about privacy concerns? What if it’s used for destructive rather than constructive purposes?
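The bias question raised above need not stay abstract: one common first check is to compare a model’s positive-prediction rates across demographic groups (often called demographic parity). The sketch below uses invented group labels and predictions, purely for illustration:

```python
from collections import defaultdict

# Toy fairness check: compare positive-prediction rates per group.
# Group names and model outputs are invented, purely illustrative.
predictions = [
    ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
    ("group_b", 0), ("group_b", 0), ("group_b", 1), ("group_b", 0),
]

totals = defaultdict(int)
positives = defaultdict(int)
for group, label in predictions:
    totals[group] += 1
    positives[group] += label

# group_a receives a positive outcome 75% of the time, group_b only 25% --
# a gap worth investigating before any such model reaches production.
rates = {g: positives[g] / totals[g] for g in totals}
print(rates)
```

A rate gap like this does not by itself prove unfair bias, but it is exactly the kind of simple, data-level audit that surfaces the dark sides of a model before deployment rather than after.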