Intel Corporation
Methods and apparatus to implement multiple inference compute engines
Last updated:
Abstract:
Methods and apparatus to implement multiple inference compute engines are disclosed herein. A disclosed example apparatus includes a first inference compute engine, a second inference compute engine, and an accelerator on coherent fabric that couples the first and second inference compute engines to a converged coherency fabric of a system-on-chip. The accelerator on coherent fabric arbitrates requests from the first and second inference compute engines to utilize a single in-die interconnect port.
Status:
Grant
Type:
Utility
Filing date:
15 Aug 2019
Issue date:
19 Oct 2021