Sponsored content
The Future of Computing at the Edge: Intel’s Tom Lantzsch talks about how algorithms are the new critical asset
An interview with Tom Lantzsch, Senior Vice President and General Manager of the Internet of Things Group (IoT) at Intel Corporation

Edge computing has been on the rise over the last 18 months – and it accelerated amid the need for new applications to solve challenges created by the Covid-19 pandemic. Tom Lantzsch, Senior Vice President and General Manager of the Internet of Things Group (IoT) at Intel Corp., thinks there are more innovations to come – and wants technology leaders to treat the algorithms they train, not just their data, as critical differentiators.

In his role at Intel, Lantzsch leads the worldwide group of solutions architects across IoT market segments, including retail, banking, hospitality, education, industrial, transportation, smart cities and healthcare. And he's seen first-hand how artificial intelligence running at the edge can have a big impact on customers' success.

Protocol sat down with Lantzsch to talk about the challenges faced by companies seeking to move from the cloud to the edge, some of the surprising ways Intel has found to help customers, and the next big breakthrough in this space.

What are the biggest trends you are seeing with edge computing and IoT?

A few years ago, there was a notion that the edge was going to be a simplistic model, where we were going to have everything connected up into the cloud and all the compute was going to happen in the cloud. At Intel, we had a bit of a contrarian view. We thought much of the interesting compute was going to happen closer to where data was created. And we believed, at that time, that camera technology was going to be the driving force – that the sheer amount of content being created would be overwhelming to ship to the cloud – so we'd have to do compute at the edge. A few years later, that hypothesis is in action and we're seeing edge compute happen in a big way.

The last 18 months have been a wild time to be in technology. We've seen edge compute come to life to help businesses adapt during the pandemic. At the same time, we are also seeing how 5G is going to drive up interest, especially from a networking perspective rather than a use-case perspective. We also see a lot of people who are focused on their data, but the algorithms that companies train with that data are going to be their critical assets. It doesn't matter what company or what industry; this will really become the differentiator for many companies.

And particularly amid the backdrop of the pandemic, we've seen how companies are using technology to bring workers back in safely, serve their customers in new ways and educate children in new ways – all things that have accelerated the digital transformation that had been underway. I recently read a McKinsey survey that reported on the "speedup in creating digital or digitally enhanced offerings. Across regions, the results suggest a seven-year increase, on average, in the rate at which companies are developing these products and services."

And based on what we're seeing at Intel, those are lifesaving and industry-saving investments happening today – completely redesigning the way that patients are treated and businesses operate.

What are you particularly expecting to see this year?

What I find most interesting is the work that combines IoT with networking capabilities – and I think this is going to be a year of expansive exploration and working with our customers to put these two things together to solve real business challenges.

When we first started the integration of networking technology and operational technology capabilities and layered that across the industry verticals, we could count the number of interested customers and opportunities on one hand. Last year, it was five times that many, and this year we see that number increasing significantly already. So, we're excited to see the industry continue to build excitement to scale those types of deployments.

What role does AI play at the edge and in what way is Intel involved?

An AI use case recently grabbed my attention: employees in a restaurant being screened by a device that checks their temperature while they wash their hands – and provides a determination on whether they'd done it adequately before they report to work. I have seen other applications using similar technology in the construction industry, where an AI algorithm scans to see if employees have their helmet, goggles, vest and other safety equipment on properly to determine if they are ready to work. And the best part is that the employer can see aggregate data on these interactions to determine if more training on handwashing or helmet wearing is needed.

At Intel, in addition to providing the fundamental base technology to enable these applications, we work with a lot of third-party developers to create these applications. And we help the developers scale them across multiple industries with our sales force. So, we may not make the scan technology that determines if you have the right gear on, but we orchestrate that coordination across our ecosystem with all of our partners – effectively putting it into a catalog so that if customers are interested in that sort of application, we can provide them with different parties to make that happen.

Is there a problem with fragmentation in the market and how can that be solved?

It's very fragmented. I gave just two examples of totally different workers coming to work in different industries, and I can give you five more that are different again. Although the base technology that Intel creates to enable this type of innovation is very horizontal in nature, the reality is that bringing those solutions to life must be very "vertically centered" and very "use-case centered" – and must take geography into account. The two employee use cases I cited look very different if you are talking about employees in North America, India or Germany. So, all in all, this is a holistic ecosystem challenge – and an ecosystem solution.

At Intel, we're in a unique position to orchestrate these solutions.

People tend to think of Intel as providing the technical footprint that enables edge computing – but we also have the developer reach – and we can use our ecosystem and scale to help our end customers get access to the best solutions.

What sort of challenges do customers face when they're attempting to adopt edge computing?

There are two common challenges. One is the technical question of 'Who can I work with?' A customer may be focused on the chip, but it's actually a more complex question than the chip alone. We like that complexity, because we can be an adviser to them and bring to the table partners they can work with to solve that complexity, leveraging our technical knowledge and ecosystem network.

The second thing companies struggle with is how to fund getting into this space – even if the business case warrants the investment. The world has changed. Companies don't want to write a big capital check for these types of investments – they want a pay-as-you-go model, like you see with the cloud. We've been working with customers to find a way to make that happen – and I think that is going to be a bigger part of the conversation moving forward.

You can see it across almost every aspect of technology now. We rent compute more than we buy compute. As evidenced by the success of AWS, companies can rent compute from Amazon instead of building their own data center – and there are many benefits to that approach. In the old world, even for the video conference we're having today, we would have installed capital equipment to do this. Now it's just a service. And I think that edge as a service is right around the corner.

Do you have an example of that?

A customer in Mexico wanted to deploy outdoor WiFi and add security features into it. So it was an edge-based computing issue that was partly about connectivity. But there was actually far more to it than just installing it. We not only helped solve the technological challenge creatively and coordinated the ecosystem of providers, but we also found a way for the customer to get into this market with a service-based model without needing to outlay a lot of capital.

Other semiconductor companies would have a difficult time partnering on both sides of that challenge, but Intel can do it because we have the scale.

How do businesses identify which functions they want to perform at the edge versus in the cloud?

It really comes down to a question of what you're trying to do and how you do it at the lowest cost, the highest quality and the right standard of compute capability.

If you have an application and it can perform what you need it to do in the cloud, you'll probably run it there. The cloud is a great thing – there's infinite compute and lots of choice in a well-understood developer environment.

But there are many things you can't or shouldn't do in the cloud. Take the camera example from earlier. Say you have eight high-density cameras and you want to act on their output immediately. You may not be able to afford the latency that comes with going up into the cloud. Or it may be a financial challenge – it's actually more expensive to send that data to the cloud over and over again, and it makes sense to invest in compute at the edge. There are nuances in the decision-making. It really comes down to what you want the application to do, how quickly it needs to happen and how much it should cost.
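To make the cost side of that trade-off concrete, here is a rough back-of-envelope sketch. None of these figures come from the interview; the bitrate, cloud price and edge hardware cost are all illustrative assumptions, but they show why continuously streaming several cameras to the cloud can quickly outweigh a one-time edge investment.

```python
# Illustrative back-of-envelope only: every number below is an assumption,
# not an Intel figure, chosen to show the streaming-vs-edge cost trade-off.

NUM_CAMERAS = 8
MBPS_PER_CAMERA = 8           # assumed bitrate of one high-density camera stream
HOURS_PER_DAY = 24
DAYS_PER_MONTH = 30
CLOUD_COST_PER_GB = 0.05      # assumed transfer + processing cost, USD per GB
EDGE_BOX_COST = 2000          # assumed one-time cost of an edge inference box, USD

# Monthly data volume if every frame is shipped to the cloud.
mb_per_second = NUM_CAMERAS * MBPS_PER_CAMERA / 8
gb_per_month = mb_per_second * 3600 * HOURS_PER_DAY * DAYS_PER_MONTH / 1000

cloud_monthly_cost = gb_per_month * CLOUD_COST_PER_GB
payback_months = EDGE_BOX_COST / cloud_monthly_cost

print(f"Data shipped to cloud: {gb_per_month:,.0f} GB/month")
print(f"Cloud cost:            ${cloud_monthly_cost:,.0f}/month")
print(f"Edge box pays for itself in about {payback_months:.1f} months")
```

Under these assumptions the eight cameras generate roughly 20,000 GB a month, and the hypothetical edge appliance pays for itself in about two months – with the latency benefit coming on top of that.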

How does Intel help developers to learn, to build and then test their solutions at the edge?

Enabling what we know to be interesting and challenging use cases at the edge required us to change our focus on who these developers are and what they care about. We learned that they really don't care about what hardware they're running on. Modern developers are fairly well abstracted from the hardware – they just assume the compute is going to be available for them to do their work.

To make sure we can deliver on that, we typically target developers by specific capability. The most advanced case that we've been studying at the edge recently is focused on video – specifically, video inferencing and AI inferencing. We created a tool called OpenVINO, which was developed so that developers don't have to care about the hardware it runs on. It's a model optimization technology that enables them to easily scale out to different hardware platforms. That's proven to be an interesting value proposition to these developers.

Our goal with OpenVINO is to democratize AI activity and inference at the edge. We want to make it simple – to put these tools in the hands of non-data scientists and make them work. As a part of that process, we're also creating an environment where developers can build anytime, anywhere they want to be. We're so proud of our OpenVINO community and are constantly working to grow it and release new features and capabilities to edge developers.
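As a rough illustration of the hardware abstraction Lantzsch describes, a minimal OpenVINO inference sketch might look like the one below. The model path, input shape and device name are placeholder assumptions; the point is that switching from CPU to a GPU or another accelerator only changes the device string.

```python
import numpy as np
from openvino.runtime import Core

# Load a model in OpenVINO IR format (the path is a placeholder for this sketch).
core = Core()
model = core.read_model("model.xml")

# The device string is the only hardware-specific detail: "CPU", "GPU", etc.
compiled_model = core.compile_model(model, device_name="CPU")

# Run inference on a dummy input; the 1x3x224x224 image shape is an assumption.
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
result = compiled_model([dummy_input])[compiled_model.output(0)]
print(result.shape)
```

The same script runs unchanged on different hardware targets, which is the "don't care about the hardware" experience described above.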

What's the possible next breakthrough that you see for Intel in edge computing and the IoT space?

I'm looking forward to the big breakthrough when we show integrated 5G, virtual private networks and edge workloads all in one unit. More to come.