Salesforce’s new platform features improve application building, packaging, and versioning.
Salesforce is well positioned to leverage its low-code strengths to enhance core platforms, namely DX.
During the technology keynote on Day 2 of the 2018 Salesforce TrailheaDX conference in San Francisco, the company’s co-founder, Parker Harris, admitted he was short on Trailhead training and certification badges. He quickly handed the mic over to his product execs to elaborate on key initiatives and new product features in Salesforce DX, Heroku, and Einstein that the company is preparing for release in the second half of the year. No major product announcements were made on Day 2, as Salesforce chose to save its big releases for this fall’s Dreamforce conference. However, a few important new platform features were revealed.
• Integration Cloud will be based on the MuleSoft Anypoint Platform.
• Benioff says trust is the highest value when dealing with next-gen technology, social media, and mobile; he wants to move into blockchain and cryptocurrency.
Salesforce kicked off its third-annual TrailheaDX developer conference this week in San Francisco with 10,000 attendees, largely made up of the “Trailblazers” trained on the company’s platforms and solutions. These individuals, who range in expertise from non-coders to savvy developers, make up its Trailhead community, now 5 million strong. They are trained across a spectrum of tools and software, from the no-code App Builder to sophisticated Heroku platform services. Attendees seemed to enjoy the intimacy of this week’s conference – especially compared to the unwieldy 171,000 attending last fall’s Dreamforce conference – eagerly sidling up to technology demos and chatting with Salesforce’s experts.
• When it comes to swapping ones and zeros, quantum computing promises to outpace traditional processors in pure scale.
• Yet its true promise will play out when we learn how to invoke quantum phenomena in order to speed up artificial intelligence (AI).
At last week’s IBM Think conference in Las Vegas, Big Blue and AI chip manufacturer NVIDIA talked up the importance of hardware in resolving AI performance bottlenecks. As it turns out, building a smart AI system demands not only copious amounts of data but also the ability to rapidly run machine learning (ML) and deep learning (DL) algorithms against that data. The trouble is that quite often hardware gets in the way.
• U.S. communications service providers are racing to launch 5G services this year.
• What we really expect are 2019 deployments, as standards finalize and devices are commercialized.
U.S. Providers 5G Rollout Plans
In the U.S., all four major wireless operators plan 5G rollouts for 2018. However, the launch dates, use cases, and underlying technologies all differ a bit. While the other three operators are planning mobile rollouts from the beginning, Verizon is sticking with fixed broadband for now. And while AT&T, Sprint, and T-Mobile claim mobile launches in 2018, standardized 5G, along with devices that can run on it, is not expected until 2019.
Meanwhile, the use cases for 5G in the enterprise are still TBD. Aside from faster, lower-latency services and the futuristic advent of driverless cars and surgeon-free operations, 5G allows for more granular pricing and use case types via its “network slicing” capability. This lets network operators choose the characteristics they need per slice – such as the level of latency, the throughput, and the number and type of devices to be supported – and these in turn affect the pricing model.
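Network slicing can be pictured as a set of per-slice service parameters that feed a pricing model. The sketch below is purely illustrative – the class, fields, and toy pricing formula are my own assumptions, not any operator’s actual API:

```python
from dataclasses import dataclass

@dataclass
class NetworkSlice:
    name: str
    max_latency_ms: float    # end-to-end latency bound for the slice
    throughput_mbps: float   # throughput guaranteed per device
    max_devices: int         # number of devices the slice must support
    base_rate: float         # hypothetical base price per device per month

def monthly_price(s: NetworkSlice) -> float:
    # Toy model: tighter latency bounds and higher throughput command a premium.
    latency_premium = 10.0 / max(s.max_latency_ms, 1.0)
    return round(s.max_devices * s.base_rate
                 * (1 + latency_premium)
                 * (s.throughput_mbps / 100), 2)

# Two slices on the same physical network with very different characteristics:
iot_slice = NetworkSlice("metering", max_latency_ms=100, throughput_mbps=1,
                         max_devices=50_000, base_rate=0.05)
ar_slice = NetworkSlice("ar-gaming", max_latency_ms=5, throughput_mbps=200,
                        max_devices=1_000, base_rate=2.00)
```

The point of the sketch is that the same operator infrastructure can price a massive fleet of low-throughput meters entirely differently from a small pool of latency-sensitive gaming devices.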
Benefits for Consumers and Business Users
According to 5G technology suppliers, the benefits of 5G to consumers will include higher quality, faster speeds, wider coverage (indoors and out), and lower latency (down by 10x) – this translates to better support for applications that use streaming video or are aimed at the interactive gaming user base. 5G will also support the growing market for applications that use augmented and virtual reality technologies.
In the enterprise, suppliers note that massive communications traffic is expected from sensors embedded in roads, railways, and vehicles that not only send information to the cloud or to edge processing devices for analysis, but also send data to each other. 5G also aims to leverage its inherent reliability and low latency to control critical services and infrastructure for public safety, government organizations, and utilities. Real-time video streaming, support for IoT applications such as autonomous vehicles, and advanced use of robotics in manufacturing are other likely use cases in the not-too-distant future.
While service providers have not yet set prices, a major objective for 5G is to lower data transmission costs compared to 4G LTE, by making bandwidth utilization more efficient and leveraging new higher-band spectrum. However, operators tend to charge what they can get companies and consumers to pay. They are not certain to pass these economies of scale and technology down to the end-customer, especially for such a premium service.
But there remain skeptics about the use cases for 5G: will they be different enough from 4G to allow operators to recoup their investments? Are 2018 launches meaningful when devices won’t be ready until 2019? And as far as the race to launch services is concerned – does it really matter which operator gets there first? Should enterprises wait to deploy fixed or mobile broadband or IoT services until they have 5G available? Probably not.
SD-WAN products and technology offer distinctly different features and benefits compared to branch routers. SD-WAN won’t augment routers but will replace them in the branch.
Vendors making branch devices like routers and firewalls should be very concerned about being replaced with SD-WAN hardware and software.
I make no secret that I think SD-WAN is the cat’s meow. It really is transformative technology that, in most cases, can deliver an equally robust or better WAN overlay, obviating the need for a complex routed WAN architecture and the skills needed to maintain it. If an enterprise wants to relegate its WAN to just pipes, it can overlay an SD-WAN on top of the WAN and manage the overlay itself. If the enterprise wants an SD-WAN and WAN service with integration from provisioning through ongoing management, it can get a combined service – or soon will be able to – from any number of managed service providers. In either case, gone is the complex routed WAN, which is brittle and slow to respond to problems. Whether the enterprise router jockeys will want to give up their beautifully crafted BGP is another matter, but the potential exists for most companies.
Domo remains as flamboyant as ever both in how it goes to market and in how it approaches BI as a business operating system.
Yet, a surprising new go-to-market message hints at a newfound maturity that underscores the company’s desire to play a crucial, central role in the success of its customers.
To call the corporate culture at Domo merely unique is to do a serious disservice to all Domo employees, or ‘Domosapiens,’ as they like to call themselves. Domo’s corporate culture is not your typical corporate attempt to feign a sense of style. Domo is downright wacky under the leadership of its enigmatic founder and CEO, Josh James. Case in point: at this year’s Domopalooza conference in Salt Lake City, Mr. James made a rather interesting entrance during the keynote. Not content to follow the opening entertainment act, put on by the KinJaz dance group, the Domo CEO actually danced a full routine with the group.
A new wave of innovative low-code tools is being integrated into popular cloud offerings to provide developer access to high-value cognitive and IoT services.
New low-code platforms are being integrated with operational tools to automate workflows and other application lifecycles.
Cloud providers are finding new opportunities in offering low-code platforms that address the labor-intensive requirements involved in the development of web, mobile, and IoT apps. Modern apps are being developed through visual UI tools and frameworks, which engage customers through access to high-value services including analytics, IoT, and big data.
Businesses looking to adopt AI must not only evaluate the technology’s implications on job displacement and data security, but also consider that algorithms may unintentionally undermine the organization’s ethical standards.
Customers are quick to pass judgement; if unintentional biases become public, a company’s brand reputation may suffer significantly.
Much has been written about ethics and artificial intelligence (AI), and rightly so. With many organizations looking to adopt some form of AI technology in 2018, business leaders are wise to stay on top of emerging ethical concerns.
Job displacement is still a key consideration, as is safeguarding data. In a recent GlobalData survey, 23% of organizations indicated they had cut or not replaced employees because of AI; 57% indicated security as a top concern.
However, looking ahead, the question of ethics is the real challenge the AI community will need to tackle. And it is a challenge that is far more controversial than security or privacy. What happens when a self-driving car needs to decide between hitting a child that has run into the road, or swerving and risking injury to its passenger? How proactive should a personal assistant be when it detects wrongdoing? Should it alert the authorities if it believes a user’s usage pattern points to a serious offense?
Probably more relevant to business leaders is the concern that they may not know whether an AI-infused application will perform up to their organization’s ethical standards. It may contain unintentional bias – say, a financial algorithm that discriminates against a specific race, or an application that favors one gender over another. What should be done when a phrase that is acceptable when said by one demographic is completely unacceptable when uttered by another – can an algorithm be trained to reliably make this distinction? Maybe, but what happens when it makes a mistake?
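One common, if crude, way to surface this kind of unintentional bias is to compare a model’s selection rates across demographic groups. The sketch below is a hypothetical audit – the data, function names, and the “four-fifths” threshold applied here are illustrative assumptions, not part of any standard library:

```python
# Minimal bias-audit sketch: compare approval rates across groups using
# the disparate-impact ratio (min group rate / max group rate).

def selection_rates(decisions):
    # decisions: list of (group, approved) pairs
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact(decisions):
    rates = selection_rates(decisions)
    return min(rates.values()) / max(rates.values())

# Hypothetical loan decisions produced by some model under audit:
decisions = ([("A", True)] * 80 + [("A", False)] * 20
             + [("B", True)] * 50 + [("B", False)] * 50)

ratio = disparate_impact(decisions)
flagged = ratio < 0.8   # "four-fifths" rule of thumb; threshold is a convention
```

An audit like this does not prove an algorithm is fair – it only flags disparities worth investigating before customers or regulators find them first.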
On the one hand, unintentional results are not the fault of the organization using the AI solution. The responsibility may lie in the data used to train the underlying machine learning model. However, customers are quick to pass judgement. If and when these unintentional biases become public, customers will quickly assign blame to the company using the application, potentially with enormous impact to a brand’s reputation.
Just as CEOs may take the blame for customer data breaches, and as a result may lose their jobs, senior leaders are also at risk of taking the fall when an AI solution implemented by their organization crosses an ethical line. It’s in their best interest to ensure that doesn’t happen – their reputation depends on it.
SREs will play a key role in determining the shape of the DevOps pipeline.
A lack of qualified SREs has kept some containerized apps from moving into production.
The evolution of the DevOps pipeline highlights the importance of the site reliability engineer (SRE), as is increasingly evident amid the growing complexity of the application lifecycle. The topic came up during last week’s Container World conference in Silicon Valley in reference to container management and orchestration. Enterprises need to invest in SREs whose operational expertise can take DevOps to the next level and support the new services that empower knowledge workers.