AI On The Edge: New Flex Logix X1 Inference AI Chip For Fanless Designs

https://www.anandtech.com/show/14189/flex-logix-inferx-x1-inference-ai-chip

A large number of inference demonstrations published by the big chip manufacturers revolve around processing large batch sizes of images through trained networks. In reality, video is inferenced frame by frame – an effective batch size of one. The large chips on the market aren’t optimized for a batch size of one and consume far more power than necessary to handle it. Flex Logix believes it has the answer with its new InferX X1 chip design and IP, aimed squarely at fanless edge devices that process at a batch size of one.
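
To make the batch-size-of-one point concrete, here is a minimal Python sketch of frame-by-frame video inference. The `load_model` placeholder and the synthetic frame source are assumptions for illustration only, not part of the InferX X1 toolchain or any vendor API; the point is simply that each frame is wrapped in a batch of one and latency per frame, not throughput, is what matters.

```python
# Minimal sketch of batch-size-one inference on a video stream.
# load_model() and frames() are hypothetical placeholders, not vendor APIs.
import numpy as np

def load_model():
    # Stand-in for any trained network (e.g. an object detector).
    def model(batch: np.ndarray) -> np.ndarray:
        assert batch.shape[0] == 1, "edge video inference runs one frame at a time"
        return np.zeros((batch.shape[0], 10))  # dummy per-frame predictions
    return model

def frames(num_frames: int = 5, height: int = 224, width: int = 224):
    # Stand-in for a camera or video decoder yielding one frame at a time.
    for _ in range(num_frames):
        yield np.random.rand(height, width, 3).astype(np.float32)

model = load_model()
for frame in frames():
    batch = frame[np.newaxis, ...]   # batch dimension is always 1
    predictions = model(batch)       # per-frame latency is the metric that counts
    print(predictions.shape)         # (1, 10)
```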

via AnandTech http://bit.ly/phao0v

April 10, 2019 at 07:35AM
