As Principal Analyst for Collaboration and Conferencing at Current Analysis, Brad analyzes the rapidly expanding use of collaboration software and services as a means of improving business agility, boosting employee productivity, and driving new business opportunities.
• There’s a race right now in high tech to build the first general-purpose quantum computer, with industry leaders IBM, Google, D-Wave Systems, and Intel each building out very different implementations of a single revolutionary idea: the use of qubits instead of plain old bits.
• But unlike most races, this one has no clear finish line, as we’re still figuring out the best way to build quantum computers, let alone the software that will run on them. Enter IT services powerhouse Atos, which is backing a pure, though for now simulated, vision of quantum computing in an effort to win what matters most: the hearts and minds of future quantum developers.
There’s an awful lot of noise in the technology industry right now regarding the promise of quantum computing. A sizable number of dissimilar technology and platform players, ranging from Intel to Google to Atom Computing (a 2018 startup), are all busy building increasingly capable computers that push and pull qubits rather than bits. And as you might expect from such a diverse cast, there are a lot of differing views on how to build such a beast and how best to put it to use. Continue reading “Atos Has a Secret Weapon, and It Rhymes with Awesome Computing”→
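To see what makes a qubit so different from a plain old bit, it helps to remember that small quantum systems can be simulated on classical hardware, which is precisely the territory Atos is staking out. Here’s a minimal sketch, using nothing but NumPy, of a single simulated qubit pushed into superposition and then measured; it illustrates the concept only and represents no vendor’s product:

```python
import numpy as np

# A classical bit is 0 or 1. A (simulated) qubit is a complex 2-vector
# of amplitudes; measurement probabilities are the squared magnitudes.
zero = np.array([1, 0], dtype=complex)            # the |0> state

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ zero
probs = np.abs(state) ** 2                        # -> [0.5, 0.5]

# "Measuring" collapses the superposition back to a classical bit.
rng = np.random.default_rng()
print(probs, rng.choice([0, 1], size=10, p=probs))
```

The catch, and the reason the hardware race still matters, is that every added qubit doubles the size of that state vector, so pure simulation runs out of memory somewhere around 40-odd qubits.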
• When it comes to swapping ones and zeros, quantum computing promises to outpace traditional processors in pure scale.
• Yet its true promise will play out when we learn how to invoke quantum phenomena in order to speed up artificial intelligence (AI).
At last week’s IBM Think conference in Las Vegas, Big Blue and AI chip manufacturer NVIDIA talked up the importance of hardware in resolving AI performance bottlenecks. As it turns out, building a smart AI system demands not only copious amounts of data but also the ability to rapidly run machine learning (ML) and deep learning (DL) algorithms against that data. The trouble is that quite often hardware gets in the way. Continue reading “This is Your Brain on Quantum Computing”→
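To make the bottleneck concrete: most deep learning training boils down to dense linear algebra, so even a back-of-the-envelope timing of matrix multiplies shows how quickly raw compute becomes the limiting factor. The sketch below is an illustrative micro-benchmark, not a rigorous measurement:

```python
import time
import numpy as np

# Deep learning reduces largely to dense matrix multiplies; compute cost
# grows roughly with n^3 while data moved grows with n^2, which is why
# GPUs and other accelerators matter so much for ML and DL workloads.
for n in (512, 1024, 2048, 4096):
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)
    t0 = time.perf_counter()
    c = a @ b
    dt = time.perf_counter() - t0
    gflops = 2 * n**3 / dt / 1e9    # ~2n^3 floating-point ops per matmul
    print(f"n={n}: {dt:.3f}s ({gflops:.1f} GFLOP/s)")
```

Run that on a laptop CPU and then on an accelerator-backed library, and the gap explains NVIDIA’s starring role at Think.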
• Domo remains as flamboyant as ever, both in how it goes to market and in how it approaches BI as a business operating system.
• Yet a surprising new go-to-market message hints at a newfound maturity, underscoring the company’s desire to play a crucial, central role in the success of its customers.
To say that the corporate culture at Domo is merely unique is to do a serious disservice to all Domo employees, or ‘Domosapiens,’ as they like to call themselves. Domo’s corporate culture is not your typical corporate attempt to feign a sense of style; the company is downright wacky under the leadership of its enigmatic founder and CEO, Josh James. Case in point: at this year’s Domopalooza conference in Salt Lake City, Mr. James made a rather interesting entrance during the keynote. Not content simply to follow the opening entertainment act, put on by the KinJaz dance group, the Domo CEO danced a full routine with the group himself. Continue reading “Take Two Domo and Call Me in the Morning”→
• The Internet of Things is not only changing how consumers interact with the world around them; it is also driving a tectonic shift in how companies process and analyze device data.
• Traditional best practices for gathering and analyzing data, where information is stored and processed centrally, are no longer relevant. Forget big data warehouses. IoT customers are looking to analyze data as close to the source as possible, at the edge of the network.
• Internet of Things (IoT) adoption is certainly being driven by the promise of real-time analytics and AI at scale, but its ultimate feasibility still depends on something much more mundane, namely how efficiently it can move data between connected devices and backend systems.
• And yet, according to a recent GlobalData study, IoT practitioners haven’t yet learned that lesson, relying not on fit-for-purpose protocols like MQTT but on the ubiquitous, now-aging web standard HTTP (a minimal comparison sketch follows below).
At Mobile World Congress this week, networking giant Cisco rolled out a new networking and device management platform for IoT practitioners that promises to enable extremely large-scale deployments without breaking the bank. IoT at scale is a no-brainer: more devices equal more data, and more data equals deeper business insights. But IoT at scale can also be expensive, both in basic device interconnectivity and in ongoing management costs. Continue reading “The Internet of Things Isn’t Driven by Devices as Much as by the Internet Itself”→
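To put the MQTT-versus-HTTP point in concrete terms, here’s a minimal publish sketch using the open-source paho-mqtt client (the broker hostname and topic below are hypothetical). Where an HTTP POST would carry hundreds of bytes of headers per reading over a fresh connection, MQTT holds one persistent connection and adds just a few bytes of fixed header per message:

```python
# pip install paho-mqtt
import json
import paho.mqtt.client as mqtt

# One lightweight, persistent connection per device; each publish adds
# only a small fixed header, unlike a fresh HTTP POST per reading.
client = mqtt.Client(client_id="sensor-042")  # paho-mqtt 1.x constructor;
                                              # 2.x also wants a CallbackAPIVersion
client.connect("broker.example.com", 1883)    # hypothetical broker host

reading = {"device": "sensor-042", "temp_c": 21.7}
client.publish("plant/floor1/temperature", json.dumps(reading), qos=1)
client.disconnect()
```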
• Enterprise buyers looking to simplify their data integration woes through centralization are missing the value inherent in diversity.
• Database diversity (indeed, diversity across all workloads) should not only be welcomed but actively sought after as a means of blending opportunity with capability.
Back in the ‘90s, the typical enterprise maintained not one, not two, but seven databases on average: one for transactional information, one for data mining cubes, one for server logs, and so on. Today, that number has grown dramatically thanks to the proliferation of NoSQL-style databases built to handle unstructured, semi-structured, and polymorphic data. Add to this the ever-expanding list of data storage options across public cloud data platforms, and you’ve got an honest-to-goodness embarrassment of riches. Continue reading “Let’s Drain the Database Swamp! (Okay, Just Kidding)”→
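A quick illustration of why that proliferation happened: much of today’s data is polymorphic, meaning records describing related events arrive in different shapes. Document stores absorb this natively, while schema-rigid relational tables do not. The records below are invented purely for illustration:

```python
import json

# Three records from "the same" event stream, each shaped differently:
# exactly the polymorphism that NoSQL document stores handle natively.
events = [
    {"type": "order", "id": 1, "total": 19.99, "items": ["SKU-1", "SKU-7"]},
    {"type": "clickstream", "id": 2, "path": "/cart", "ms_on_page": 3100},
    {"type": "sensor", "id": 3, "temp_c": 21.7, "tags": {"line": "A"}},
]

# A document database indexes and queries these as-is; a relational
# design would need a table per shape or a lossy one-size-fits-all schema.
for doc in events:
    print(doc["type"], json.dumps(doc))
```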
• In 2017 the enterprise data and analytics vendor community emphasized opportunity in the cloud and the democratization of data. What will 2018 bring?
• We expect to see a shift in focus toward quality: solving problems such as data governance and putting AI to work within tactical business workflows.
What does the coming year have in store for the enterprise data and analytics marketplace? Sometimes, the best way to predict the future is to look at the past. To that end, here’s what we predicted for 2017 back in December of last year.
• IoT success will ride on pre-built data models and packaged software
• Smaller players will drive cognitive software innovation
• Vendors will prioritize self-service data integration, prep and management
• Vertical markets and specific use cases will fuel data-as-a-service adoption
• How did a computer algorithm like Google’s AlphaZero manage to learn, master and then dominate the game of chess in just four hours?
• AlphaZero’s mastery of chess stemmed from the sheer brute force of Google’s AI-specific tensor processing units (TPUs), 5,000 of them to be exact.
“How about a nice game of chess?” With that iconic line of dialog from one of my favorite films, the 1983 cold-war sci-fi thriller WarGames, nuclear war was narrowly averted by a machine (named Joshua) capable of teaching itself how to play a game. This week another machine, one of Google’s DeepMind AI offspring called AlphaZero, did something similar: it took four hours to teach itself how to play chess and then proceeded to demolish the highest-rated chess engine, Stockfish. After 100 games, AlphaZero had racked up 28 wins and zero losses, with the remaining 72 games drawn. So much for more than a millennium of human effort in teaching a computer how to play chess. But how was this possible? Was this a fair match? How did a computer algorithm like AlphaZero manage to learn, master, and then dominate the game of chess in just four hours? Continue reading “The Chess Dominance of Google’s AlphaZero Teaches Us More About Chips Than About Brains”→
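For the curious, the training recipe behind that feat, self-play reinforcement learning, can be demonstrated at toy scale. The sketch below is emphatically not AlphaZero (no neural network, no Monte Carlo tree search, and tic-tac-toe rather than chess), but it captures the core idea: a program that starts with zero knowledge and improves purely by playing against itself:

```python
import random

# Tabular self-play learner for tic-tac-toe: no human games, no opening
# book; state values are nudged toward observed self-play outcomes.
WINS = [(0,1,2),(3,4,5),(6,7,8),(0,3,6),(1,4,7),(2,5,8),(0,4,8),(2,4,6)]
ALPHA, EPSILON = 0.2, 0.1            # learning rate, exploration rate
value = {}                           # board string -> value from X's view

def val(state):
    return value.get(state, 0.0)

def winner(board):
    for a, b, c in WINS:
        if board[a] != "." and board[a] == board[b] == board[c]:
            return board[a]
    return None

def choose(board, player):
    moves = [i for i, s in enumerate(board) if s == "."]
    if random.random() < EPSILON:    # explore occasionally
        return random.choice(moves)
    best = max if player == "X" else min   # X maximizes value, O minimizes
    return best(moves, key=lambda m: val("".join(board[:m] + [player] + board[m+1:])))

def play_one_game():
    board, player, history = ["."] * 9, "X", []
    while True:
        board[choose(board, player)] = player
        history.append("".join(board))
        w = winner(board)
        if w or "." not in board:
            return history, 1.0 if w == "X" else (-1.0 if w == "O" else 0.0)
        player = "O" if player == "X" else "X"

for _ in range(20_000):              # self-play is the only "teacher"
    history, outcome = play_one_game()
    for state in history:            # nudge every visited state toward the result
        value[state] = val(state) + ALPHA * (outcome - val(state))

print(f"learned values for {len(value):,} positions")
```

AlphaZero swaps the lookup table for a deep network and the greedy move picker for Monte Carlo tree search, then runs that loop across thousands of TPUs, which is exactly why the chips matter more than the ‘brain.’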
Digital home assistants like Google Home Mini and Amazon Echo owe users much more than privacy; if they are to be truly trusted, they must also explain how they think and how they make decisions.
• Fortunately, regulations such as the General Data Protection Regulation (GDPR) will begin asking such questions. The only problem is that artificial intelligence (AI) may not be able to provide any answers.
Google was quick to lay the blame for its recent eavesdropping Home Mini fiasco on a ‘hardware bug,’ rolling out a quick update that purportedly prevents devices from inadvertently recording and reporting on overheard conversations should their owners accidentally press the wrong button. From now on, Google Home Mini will record what you say only after you capture its attention with “Hey Google” or “Okay Google.”
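Conceptually, that hotword gate is easy to sketch. In the toy version below, audio frames sit in a short on-device ring buffer and are forwarded only after a wake phrase fires; `detect_hotword` and `send_upstream` are hypothetical stand-ins, not Google’s actual implementation:

```python
from collections import deque

BUFFER_FRAMES = 50                   # roughly 1 second of 20 ms frames
ring = deque(maxlen=BUFFER_FRAMES)   # old frames fall off automatically

def on_audio_frame(frame, detect_hotword, send_upstream):
    ring.append(frame)               # audio stays on the device...
    if detect_hotword(ring):         # ...until the wake phrase fires,
        send_upstream(list(ring))    # and only then leaves for the cloud
        ring.clear()

# Toy usage: the "hotword" here is the literal frame "hey".
on_audio_frame("hey", lambda buf: "hey" in buf, print)
```

The trust question the bullets raise sits one layer up: even if the gate works, the device still can’t explain why its recognizer decided you said the magic words.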
• Machine learning (ML) algorithms are incredibly powerful, and companies like Google, Microsoft, Amazon, and Salesforce.com realize that; hence their intense interest in operationalizing ML and deep learning (DL) tooling.
• But those algorithms alone are no guarantee of value. Whether you’re predicting the weather or optimizing a delivery route, AI lives or dies by the humans in whose care it finds itself.
Can we truly know whether or not we’re living out our lives as part of a simulated, holographic model of the universe, as proposed by mega-entrepreneur Elon Musk? Should we even care about such things? If you’re at all concerned about the weather (about the expected path a hurricane will take, let’s say), then the answer is a resounding ‘yes.’ I would argue, in fact, that we already live our lives according to countless simulations. Continue reading “Without People, There Would Be No Artificial Intelligence”→
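The hurricane example is apt because forecasters never trust a single model run; they run ensembles of perturbed simulations and act on the spread, the familiar forecast ‘cone.’ Here’s a toy Monte Carlo version of that idea, with a random-walk ‘storm’ that is purely illustrative:

```python
import numpy as np

# Run the same crude "storm" model many times with perturbed inputs,
# then summarize the spread; the actionable forecast is a distribution
# over simulations, not a single track.
rng = np.random.default_rng(42)
N_RUNS, STEPS = 1000, 48                  # 1,000 members, 48 hourly steps

# Each member drifts roughly north-west with random hourly perturbations.
steps = rng.normal(loc=[-0.08, 0.06], scale=0.05, size=(N_RUNS, STEPS, 2))
tracks = np.cumsum(steps, axis=1)         # cumulative (lon, lat) offsets

final = tracks[:, -1, :]
print("mean 48h displacement (deg):", final.mean(axis=0))
print("90th-pct spread (deg):",
      np.percentile(np.abs(final - final.mean(axis=0)), 90, axis=0))
```

Deciding how to perturb the inputs, which members to trust, and what to do with the resulting cone remains human work, which is rather the point of the post.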