Enterprise big data and analytics cuts through the hype to make sense of data collection, storage, management, dissemination and discovery technologies, all employed collectively as a means of realizing corporate efficiencies and uncovering business opportunities.
Internet of Things (IoT) adoption is certainly being driven by the promise of real-time analytics and AI at scale, but its ultimate feasibility still depends on something much more mundane, namely how efficiently it can move data between connected devices and backend systems.
And yet, according to a recent GlobalData study, IoT practitioners haven’t yet learned that lesson, relying not on fit-for-purpose protocols like MQTT but instead on the ubiquitous, now-aging web standard HTTP.
At Mobile World Congress this week, networking giant Cisco rolled out a new networking and device management platform for IoT practitioners that promises to enable extremely large-scale deployments without breaking the bank. IoT at scale is a no-brainer. More devices equal more data. More data equals deeper business insights. But IoT at scale can be expensive, both in delivering basic device interconnectivity and in managing the devices themselves. Continue reading “The Internet of Things Isn’t Driven by Devices as Much as by the Internet Itself”→
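To make the MQTT-versus-HTTP contrast above concrete, here’s a minimal sketch of device-to-backend telemetry using the open-source mqtt.js client – the broker address, topic, and payload are placeholders of my own, not any vendor’s. Where HTTP pays for fresh headers and a new request cycle on every reading, MQTT pushes small packets over a single long-lived connection:

```typescript
// Hypothetical sketch: publish one sensor reading over MQTT (mqtt.js).
import mqtt from "mqtt";

// Placeholder broker address -- swap in a real endpoint.
const client = mqtt.connect("mqtt://broker.example.com:1883");

client.on("connect", () => {
  client.publish(
    "sensors/device-42/temperature",    // hypothetical topic
    JSON.stringify({ celsius: 21.4 }),  // small, self-describing payload
    { qos: 1 },                         // at-least-once delivery
    () => client.end()                  // close cleanly after publishing
  );
});
```

MQTT’s fixed header is a mere two bytes, which is precisely the kind of efficiency that matters when millions of constrained devices each report many times a minute.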
• Many organizations are unsure how best to incorporate AI to meet their industry-specific challenges – often because the use case options are so vast and so varied.
• Organizations – particularly mid-sized businesses, companies starting out on their analytics journeys, or those rolling out IoT solutions – should explore the services available from their telecom providers, many of whom have built out their professional services capabilities around digital transformation.
• With big data and analytics, older ideas like predictive analytics and AI are coming together to solve long-standing problems, most notably data quality.
• Sisense is adding another twist by taking advanced design and visualization concepts and putting those to work at the very beginning of the analytics lifecycle.
Invention invariably involves theft. Each generation of inventors stands on the shoulders of its predecessors, borrowing freely from their available pool of knowledge. Ideas are deconstructed, mixed up, and reapplied in new ways and within unexpected contexts to form, well, something new. Sometimes these new inventions are simply the opportunistic reinterpretation of an existing idea, taking something unique but impractical and turning it into something incredibly useful. That’s the way it was with the invention of the automobile, the light bulb and the radio. And that’s how it is with big data and analytics, where older ideas are only now coming together to solve long-standing problems. Continue reading “To Improve Data Quality, Sometimes the Best Place to Start is at the Very End”→
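To make the data quality angle concrete, here’s a toy sketch (my illustration, not any vendor’s method) of the predictive idea: score incoming records against what history says is normal, and flag the outliers for review before they pollute downstream analytics. A simple z-score stands in for a real trained model:

```typescript
// Toy sketch: flag statistically suspect records before they land in the
// warehouse. A z-score against historical values stands in for a model.
function mean(xs: number[]): number {
  return xs.reduce((a, b) => a + b, 0) / xs.length;
}

function stdDev(xs: number[]): number {
  const m = mean(xs);
  return Math.sqrt(mean(xs.map((x) => (x - m) ** 2)));
}

// Return records deviating from the historical norm by more than
// `threshold` standard deviations -- candidates for human review.
function flagSuspectRecords(
  history: number[],
  incoming: { id: string; value: number }[],
  threshold = 3
): { id: string; value: number }[] {
  const m = mean(history);
  const s = stdDev(history);
  return incoming.filter((r) => Math.abs(r.value - m) / s > threshold);
}

// Example: one incoming reading is off by an order of magnitude.
const history = [98, 101, 99, 102, 100, 97, 103];
const incoming = [
  { id: "r1", value: 99 },
  { id: "r2", value: 1004 }, // likely a misplaced decimal point
];
console.log(flagSuspectRecords(history, incoming)); // flags "r2" only
```

The same pattern scales up: swap the z-score for a trained anomaly detection model, and bad data gets caught at the point of entry rather than discovered downstream.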
Recent survey results reveal that companies have high expectations when it comes to artificial intelligence; close to half expect the technology to bring new capabilities to their organization.
Vendors are bringing new solutions to market to help companies implement their AI vision, offering solutions that speed and ease adoption of artificial intelligence, machine learning, and deep learning in particular.
Businesses are looking at artificial intelligence (AI) as a truly disruptive technology, with the potential to change the way they run their organizations. Unlike many other new solutions, which are often adopted because of their promises to cut costs, companies are embracing AI so they can bring new capabilities to their teams or improve the support they provide to their customers and partners. Continue reading “New Capabilities, Not Cost Savings, Are Biggest Driver of AI Adoption”→
Enterprise buyers looking to ease their data integration woes through centralization are missing the value inherent in diversity.
Database diversity (indeed, diversity across all workloads) should not only be welcomed but actively sought after as a means of blending opportunity with capability.
Back in the ‘90s, the typical enterprise maintained not one, not two, but seven databases on average: one for transactional information, one for data mining cubes, one for server logs, and so on. Today, that number has grown dramatically thanks to the proliferation of NoSQL-style databases built to handle unstructured, semi-structured and polymorphic data. Add to this the ever-expanding list of data storage options across public cloud data platforms, and you’ve got an honest-to-goodness embarrassment of riches. Continue reading “Let’s Drain the Database Swamp! (Okay, Just Kidding)”→
• In 2017 the enterprise data and analytics vendor community emphasized opportunity in the cloud and the democratization of data. What will 2018 bring?
• We expect to see a shift in focus toward quality: solving problems such as data governance and putting AI to work within tactical business workflows.
What does the coming year have in store for the enterprise data and analytics marketplace? Sometimes, the best way to predict the future is to look at the past. To that end, here’s what we predicted for 2017 back in December of last year.
• IoT success will ride on pre-built data models and packaged software
• Smaller players will drive cognitive software innovation
• Vendors will prioritize self-service data integration, prep and management
• Vertical markets and specific use cases will fuel data-as-a-service adoption
• How did a computer algorithm like Google’s AlphaZero manage to learn, master and then dominate the game of chess in just four hours?
• AlphaZero’s mastery of chess stemmed from the sheer, brute force of Google’s AI-specific tensor processing units (TPUs) – 5,000 of them, to be exact.
“How about a nice game of chess?” With that iconic line of dialog from one of my favorite films, the 1983 Cold War sci-fi thriller WarGames, nuclear war was narrowly averted by a machine (named Joshua) capable of teaching itself how to play a game. This week another machine, one of Google’s DeepMind AI offspring, AlphaZero, did something similar: it took just four hours to teach itself how to play chess and then proceeded to demolish the best, highest-rated chess computer, Stockfish. After 100 games, AlphaZero had racked up 28 wins and zero losses. So much for more than a millennium of accumulated human chess knowledge. But how was this possible? Was this a fair match? How did a computer algorithm like AlphaZero manage to learn, master and then dominate the game of chess in just four hours? Continue reading “The Chess Dominance of Google’s AlphaZero Teaches Us More About Chips Than About Brains”→
Digital home assistants like Google Home Mini and Amazon Echo owe users much more than privacy; if they are to be truly trusted, they must also explain how they think and how they make decisions.
Fortunately, regulations such as the General Data Protection Regulation (GDPR) will begin demanding answers to such questions. The only problem is that artificial intelligence (AI) may not be able to provide any.
Google was quick to lay blame for its recent eavesdropping Home Mini fiasco on a ‘hardware bug,’ rolling out a quick update that purportedly prevents devices from inadvertently recording and reporting on overheard conversations should their owners accidentally press the wrong button. From now on, Google Home Mini will only record what you say after you capture its attention via “Hey Google” or “Okay Google.”
Machine learning (ML) algorithms are incredibly powerful, and companies like Google, Microsoft, Amazon, and Salesforce.com realize that – hence their intense interest in operationalizing ML and deep learning (DL) tooling.
But, those algorithms alone are no guarantee of value. Whether you’re predicting the weather or optimizing a delivery route, AI lives or dies according to the humans within whose care it finds itself.
Can we truly know whether or not we’re living out our lives as part of a simulated, holographic model of the universe, as proposed by mega-entrepreneur Elon Musk? Should we even care about such things? If you’re at all concerned about the weather – about the expected path a hurricane will take, let’s say – then the answer is a resounding ‘yes.’ I would argue in fact that we are living out our lives based upon countless simulations. Continue reading “Without People, There Would Be No Artificial Intelligence”→
• At its 10th annual user conference, modern BI leader Tableau unveiled a means by which customers can embed business processes within the Tableau interface, effectively upending commonly accepted ideas about the role of analytics in business.
• With Tableau’s new Extensions API, companies can start to think about analytics, not as a passive, informational adjunct to business processes, but instead as an active participant in the business itself.
These days APIs are a dime a dozen. Every vendor has one (or two), supporting basic routines like software automation or enabling more elaborate objectives like application embedding. The driving factor powering the proliferation of APIs is simple. They grant both interoperability and extensibility, two traits that are crucial to success – particularly within the enterprise data and analytics marketplace where heterogeneity reigns supreme.
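To ground that in Tableau’s announcement: a dashboard extension is essentially a web page that converses with its hosting dashboard through the Extensions API’s JavaScript library. The sketch below is a hedged illustration – the worksheet name and the downstream action are hypothetical – of how an extension might read the data a user is looking at and hand it to a business process, making the dashboard an active participant rather than a passive display:

```typescript
// Minimal sketch of a Tableau dashboard extension. Assumes the Extensions
// API library (e.g., tableau.extensions.1.latest.js) is loaded on the
// page, exposing the global `tableau` object.
declare const tableau: any; // injected by the Extensions API script

async function actOnDashboardData(): Promise<void> {
  // Register this extension with its hosting dashboard.
  await tableau.extensions.initializeAsync();

  const dashboard = tableau.extensions.dashboardContent.dashboard;
  const worksheet = dashboard.worksheets.find(
    (ws: any) => ws.name === "Inventory" // hypothetical sheet name
  );
  if (!worksheet) return;

  // Pull the summary data the user currently sees.
  const table = await worksheet.getSummaryDataAsync();

  // From here the data can drive a business process directly --
  // say, posting low-stock rows to a (hypothetical) reorder service.
  for (const row of table.data) {
    // ...business logic against row values goes here...
  }
}

actOnDashboardData();
```

The point is less the particulars than the direction of travel: the analytics surface stops being a read-only window and starts initiating work.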