As Principal Analyst for Collaboration and Conferencing at Current Analysis, Brad analyzes the rapidly expanding use of collaboration software and services as a means of improving business agility, fostering employee optimization and driving business opportunities.
At Google I/O this week, Sundar Pichai walked attendees through a number of impressive implementations of AI, one of which showed how Google Assistant could book a haircut and make a dinner reservation via an unnervingly convincing conversation between human and machine.
What happens, then, if that assistant eventually learns how to pass itself off as you?
You know it’s spring when the cherry blossoms appear in force, the birds start singing in unison, and Google CEO Sundar Pichai takes the stage at Google I/O and nonchalantly demonstrates some new bit of technology that simultaneously manages to amaze and terrify. I’m talking about Google Duplex, an interesting blend of natural language understanding (NLU), deep learning (DL), and text-to-speech technology designed to do one thing: use AI to emulate at least one half of an actual human conversation. Continue reading “Google I/O 2018: Did Google AI Just Pass the Turing Test?”→
There are many AI-savvy chipsets on the market right now, each fine-tuned to support specific AI workloads, development frameworks, or vendor platforms.
But, what if developers could flexibly combine AI-specific hardware resource pools on the fly, on-premises as well as online?
There’s certainly enough buzz in the industry right now about artificial intelligence (AI). If you look beyond the doomsday predictions of a machine uprising, the prevailing view is that AI is a veritable Swiss Army knife, able to cut through any and all problems and ready to assemble opportunity out of nothing more than data. It seems that every vendor has one or two machine learning (ML) and deep learning (DL) frameworks lying about. It’s no wonder: there’s TensorFlow, Caffe, Theano, Torch, and many, many more to choose from, most of which are open source and quite accessible to the broader developer community. Continue reading “It’s Time to Orchestrate AI Hardware for Maximum Effect”→
• There’s a race right now in high tech to build the first general-purpose quantum computer, with industry leaders IBM, Google, D-Wave Systems, and Intel each building out very different implementations of a single, revolutionary idea — the use of qubits instead of plain old bits.
• But unlike most races, this one has no clear finish line, as we’re still figuring out the best approach to building quantum computers and the software that runs on them. Enter IT services powerhouse Atos, which is backing a pure but, as yet, simulated approach to quantum computing in an effort to capture what matters most: the hearts and minds of future quantum developers.
There’s an awful lot of noise in the technology industry right now regarding the promise of quantum computing. A sizable number of dissimilar technology and platform players, ranging from Intel to Google to Atom Computing (a 2018 startup), are all busy building increasingly capable computers that push and pull qubits rather than bits. And as you might expect from such a diverse cast, there are a lot of differing views on how to build such a beast and how best to put it to use. Continue reading “Atos Has a Secret Weapon, and It Rhymes with Awesome Computing”→
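Atos’s bet on simulation is easier to grasp with a concrete picture of what “simulating” quantum computing actually involves. The toy sketch below is my own illustration, not Atos’s actual simulator: it represents a single qubit as a pair of complex amplitudes, applies a Hadamard gate as a 2x2 matrix multiplication, and samples measurements from the squared amplitudes.

```python
import math
import random

# Toy single-qubit simulator. A qubit is a pair of complex amplitudes;
# a gate is a 2x2 complex matrix; measurement samples 0 or 1 from the
# squared magnitudes of the amplitudes.
def apply(gate, state):
    (a, b), (c, d) = gate
    return (a * state[0] + b * state[1], c * state[0] + d * state[1])

def measure(state):
    p0 = abs(state[0]) ** 2  # probability of observing 0
    return 0 if random.random() < p0 else 1

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = ((1 / math.sqrt(2), 1 / math.sqrt(2)),
     (1 / math.sqrt(2), -1 / math.sqrt(2)))

qubit = (1 + 0j, 0j)       # start in |0>
qubit = apply(H, qubit)    # now in superposition: 50/50 on measurement
samples = [measure(qubit) for _ in range(10000)]
print(sum(samples) / len(samples))  # hovers around 0.5
```

The catch, and the reason simulated quantum computing has practical limits, is that n entangled qubits require tracking 2^n amplitudes, so classical simulation runs out of memory at a few dozen qubits.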
• When it comes to swapping ones and zeros, quantum computing promises to outpace traditional processors in pure scale.
• Yet its true promise will play out when we learn how to invoke quantum phenomena in order to speed up artificial intelligence (AI).
At last week’s IBM Think conference in Las Vegas, Big Blue and AI chip manufacturer NVIDIA talked up the importance of hardware in resolving AI performance bottlenecks. As it turns out, building a smart AI system demands not only copious amounts of data but also the ability to rapidly run machine learning (ML) and deep learning (DL) algorithms against that data. The trouble is that quite often hardware gets in the way. Continue reading “This is Your Brain on Quantum Computing”→
Domo remains as flamboyant as ever both in how it goes to market and in how it approaches BI as a business operating system.
Yet, a surprising new go-to-market message hints at a newfound maturity that underscores the company’s desire to play a crucial, central role in the success of its customers.
To say that the corporate culture at Domo is unique is to do a serious disservice to all Domo employees, or ‘Domosapiens,’ as they like to call themselves. Domo’s culture is not your typical corporate attempt to feign a sense of style. Domo is downright wacky under the leadership of its enigmatic founder and CEO, Josh James. Case in point: at this year’s Domopalooza conference in Salt Lake City, Mr. James made a rather interesting entrance during the keynote. Not content to simply follow the opening entertainment act, put on by the KinJaz dance group, the Domo CEO actually danced a full routine with the group. Continue reading “Take Two Domo and Call Me in the Morning”→
The Internet of Things is not only changing how consumers interact with the world around them; it is also driving a tectonic shift in how companies process and analyze device data.
Traditional best practices for gathering and analyzing data, where information is stored and processed centrally, are no longer relevant. Forget big data warehouses. IoT customers are looking to analyze data as close to the source as possible, at the edge of the network.
Internet of Things (IoT) adoption is certainly being driven by the promise of real-time analytics and AI at scale, but its ultimate feasibility still depends on something much more mundane, namely how efficiently it can move data between connected devices and backend systems.
And yet, according to a recent GlobalData study, IoT practitioners haven’t yet learned that lesson, relying not on fit-for-purpose protocols like MQTT, but instead on the ubiquitous, now aging web standard, HTTP.
At Mobile World Congress this week, networking giant Cisco rolled out a new networking and device management platform for IoT practitioners that promises to enable extremely large-scale deployments without breaking the bank. IoT at scale is a no-brainer: more devices equal more data, and more data equals deeper business insights. But IoT at scale can also be expensive, both in providing basic device interconnectivity and in managing those devices. Continue reading “The Internet of Things Isn’t Driven by Devices as Much as by the Internet Itself”→
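The overhead gap between HTTP and a fit-for-purpose protocol like MQTT is easy to see on the wire. The sketch below hand-assembles a minimal MQTT 3.1.1 PUBLISH packet (QoS 0) and a minimal equivalent HTTP/1.1 POST request; the topic, host, and payload are made-up examples, and this is an illustration of framing overhead, not a full protocol implementation.

```python
payload = b'{"temp": 21.5}'        # 14-byte sensor reading (made-up example)
topic = b"sensors/room1/temp"

# MQTT 3.1.1 PUBLISH, QoS 0: 1 control byte + 1 remaining-length byte
# (for packets under 128 bytes) + 2-byte topic length + topic + payload.
remaining = 2 + len(topic) + len(payload)
mqtt_packet = (bytes([0x30, remaining])
               + len(topic).to_bytes(2, "big")
               + topic + payload)

# A minimal equivalent HTTP/1.1 POST: the headers alone dwarf the payload.
http_request = (
    b"POST /sensors/room1/temp HTTP/1.1\r\n"
    b"Host: iot.example.com\r\n"
    b"Content-Type: application/json\r\n"
    b"Content-Length: 14\r\n"
    b"\r\n" + payload
)

print(len(mqtt_packet), len(http_request))  # → 36 126
```

For a 14-byte reading, the HTTP request is roughly 3.5 times the size of the MQTT packet, and that ratio only worsens once per-request TCP connection setup is factored in, which is why chatty, battery-powered devices favor MQTT’s persistent, lightweight framing.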
Enterprise buyers looking to simplify their data integration woes through centralization are missing the value inherent in diversity.
Database diversity (indeed, diversity across all workloads) should not only be welcomed but actively sought after as a means of blending opportunity with capability.
Back in the ’90s, the average enterprise maintained not one, not two, but seven databases: one for transactional information, one for data mining cubes, one for server logs, etc. Today, that number has grown dramatically thanks to the proliferation of NoSQL-style databases built to handle unstructured, semi-structured, and polymorphic data. Add to this the ever-expanding list of data storage options across public cloud data platforms, and you’ve got an honest-to-goodness embarrassment of riches. Continue reading “Let’s Drain the Database Swamp! (Okay, Just Kidding)”→
• In 2017 the enterprise data and analytics vendor community emphasized opportunity in the cloud and the democratization of data. What will 2018 bring?
• We expect to see a shift in focus towards quality, to solving problems such as data governance, and putting AI to work within tactical business workflows.
What does the coming year have in store for the enterprise data and analytics marketplace? Sometimes, the best way to predict the future is to look at the past. To that end, here’s what we predicted for 2017 back in December of last year.
• IoT success will ride on pre-built data models and packaged software
• Smaller players will drive cognitive software innovation
• Vendors will prioritize self-service data integration, prep and management
• Vertical markets and specific use cases will fuel data-as-a-service adoption
• How did a computer algorithm like Google’s AlphaZero manage to learn, master and then dominate the game of chess in just four hours?
• AlphaZero’s mastery of chess stemmed from the sheer brute force of Google’s AI-specific tensor processing units (TPUs) – 5,000 of them, to be exact.
“How about a nice game of chess?” With that iconic line of dialog from one of my favorite films, the 1983 Cold War sci-fi thriller WarGames, nuclear war was narrowly averted by a machine (named Joshua) capable of teaching itself how to play a game. This week another machine, one of Google’s DeepMind AI offspring, AlphaZero, did something similar: it took four hours to teach itself how to play chess and then proceeded to demolish the best, highest-rated chess computer, Stockfish. After 100 games, AlphaZero had racked up 28 wins and zero losses. So much for more than a millennium of human effort in teaching a computer how to play chess. But how was this possible? Was this a fair match? How did a computer algorithm like AlphaZero manage to learn, master, and then dominate the game of chess in just four hours? Continue reading “The Chess Dominance of Google’s AlphaZero Teaches Us More About Chips Than About Brains”→
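The “teach itself” part is less mysterious than it sounds. AlphaZero pairs deep neural networks and Monte Carlo tree search running on thousands of TPUs, but the underlying self-play idea can be shrunk down to a toy you can run on a laptop. The sketch below (my own illustration, not DeepMind’s method) uses tabular value learning on a trivial Nim-style game: players alternately take 1 or 2 stones from a pile, and whoever takes the last stone wins.

```python
import random

# Self-play value learning on a toy game. V[n] estimates how good it is
# to be the player facing a pile of n stones (+1 = winning, -1 = losing).
PILE, EPISODES, ALPHA = 10, 20000, 0.1
V = {n: 0.0 for n in range(PILE + 1)}

for _ in range(EPISODES):
    n = PILE
    trajectory = []  # positions faced, alternating between the two players
    while n > 0:
        moves = [m for m in (1, 2) if m <= n]
        if random.random() < 0.2:           # explore occasionally
            m = random.choice(moves)
        else:                               # otherwise leave the opponent
            m = min(moves, key=lambda m: V[n - m])  # the worst position
        trajectory.append(n)
        n -= m
    # The player who took the last stone won; walking the trajectory
    # backwards, the game result flips sign with each alternating turn.
    result = 1.0
    for pos in reversed(trajectory):
        V[pos] += ALPHA * (result - V[pos])
        result = -result

# Game theory says piles that are multiples of 3 are lost for the
# player to move; self-play alone should discover this.
print(sorted(n for n in range(1, PILE + 1) if V[n] < 0))
```

AlphaZero’s trick is doing the same loop where the “table” is a deep network, the move selection is tree search, and the game is chess, which is precisely why 5,000 TPUs’ worth of self-play games matter more than any hand-coded chess knowledge.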