News Analysis

Balancing the Cost-Performance-Energy Equation

Living on the edge (or, more precisely, at the edge, via fog or edge computing) is becoming an increasingly popular business model. Edge computing, which embeds computing infrastructure near end devices along the "edges" of a network, can accelerate time-to-insight and reduce costs, meeting a real need for IoT (Internet of Things) applications that require very fast response times. It also creates opportunities for autonomous edge operation, which benefits from ultra-fast AI (artificial intelligence) data processing. But is AI too power-hungry and expensive to operate at the edge?

A new company aims to make it more cost-efficient to deploy high-performance AI hardware at the edge, which could be a game changer for companies that want to manufacture AI or IoT devices at scale—especially small form factor devices and those with a low price point. GTI (Gyrfalcon Technology Inc.) emerged from stealth mode in September with the goal of providing the industry with low-cost, low-power, high-performance AI processors, thereby expanding the power of cloud AI to local devices. By offering better performance and greater efficiency, GTI is essentially looking to make AI “productization” possible.

The company's edge-first approach has appealed to early customers like LG, Fujitsu, and Samsung, among a handful of others in development. Unlike cloud-based AI solutions, which must relay data to and from the cloud, GTI's solution performs core AI processing in the edge equipment itself, eliminating that round-trip latency and delivering actionable insight more quickly. It also reduces data-transmission costs, and it may improve data privacy, since the data never leaves the edge.
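The latency tradeoff described above can be made concrete with a toy model. All figures below are hypothetical illustrations, not GTI measurements: the point is that even a slower on-device accelerator can beat a faster cloud back end once the network round trip is counted.

```python
# Toy latency model: cloud inference pays a network round trip on every
# request; edge inference runs locally and pays only on-device compute time.
# All numbers are hypothetical, chosen only to illustrate the comparison.

def cloud_latency_ms(network_rtt_ms: float, cloud_infer_ms: float) -> float:
    """Total time for one request routed through the cloud."""
    return network_rtt_ms + cloud_infer_ms

def edge_latency_ms(device_infer_ms: float) -> float:
    """Total time for one request served on the edge device itself."""
    return device_infer_ms

# Hypothetical figures: 80 ms network round trip, 10 ms cloud inference,
# 25 ms on a slower edge accelerator.
cloud = cloud_latency_ms(network_rtt_ms=80.0, cloud_infer_ms=10.0)
edge = edge_latency_ms(device_infer_ms=25.0)
print(f"cloud: {cloud:.0f} ms, edge: {edge:.0f} ms")  # prints "cloud: 90 ms, edge: 25 ms"
```

Under these assumed numbers, the edge device responds more than three times faster despite having the slower processor, which is the core of the edge-first argument for latency-sensitive IoT applications.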

GTI’s Lightspeeur 2801S AI Accelerator is at the heart of its solution, along with its patented MPE (Matrix Processing Engine), which enables AI processing for IoT and other edge equipment within a small energy envelope. The technology opens the door for a wide range of use cases and equipment designs at a much lower cost—up to ten times lower than competing hardware, according to GTI. With these savings, companies can better balance the cost-performance-energy equation and potentially achieve better profit margins.

Traditionally, manufacturers looking to bring small, inexpensive IoT and AI-enabled devices to market at scale have had to make tradeoffs, compromising on cost, performance, energy use, or some combination of the three. With innovations like GTI's AI Accelerator, it becomes more feasible for more companies to bring AI-enabled equipment to more people and more businesses, paving the way for the next generation of AI products to come to market.

While cloud-based solutions offer plenty of benefits, the industry is trending toward blended cloud-and-edge architectures that maximize the strengths of both. Edge-first companies like GTI will become more commonplace, because the benefits of pushing high-performance, low-power processing to the edge are hard to deny. Still, most in the industry agree there is no need to pick one architecture over the other; rather, companies should look at how cloud and edge computing can complement each other to accomplish their particular business goals.


October 17, 2018
