6 episodes

EdgeCortix subject matter experts discuss edge AI processors, AI software frameworks, and AI industry trends.

The EdgeCortix Podcast
EdgeCortix Inc.

    • Technology

    AI Drives the Software-Defined Heterogeneous Computing Era

    By Dr. Sakyasingha Dasgupta.

    The rapid development of artificial intelligence (AI) applications has created enormous demand for high-performance and energy-efficient computing systems. However, traditional homogeneous architectures based on Von Neumann processors face challenges in meeting the requirements of AI workloads, which often involve massive parallelism, large data volumes, and complex computations.

    Heterogeneous computing architectures integrate different processing units with specialized capabilities and features and have emerged as promising solutions for AI applications. In our view, AI is driving the next era of software-defined heterogeneous computing, enabling better solutions for complex problems.

    Read the full blog here

    • 8 min
    Multimodal Generative AI on Energy-Efficient Edge Processors

    By Dr. Sakyasingha Dasgupta. 

    Edge computing will grow exponentially in the next few years, as more and more devices and applications demand low-latency, high-performance, and privacy-preserving computation at the edge of the network. 
    However, one of the biggest challenges facing edge computing is handling the increasing complexity and diversity of data sources and modalities, such as images, videos, audio, text, speech, and sensors. This challenge is where multimodal generative artificial intelligence (AI) comes into play.

    Read the full blog here

    • 6 min
    Efficient Edge AI Chips with Reconfigurable Accelerators

    By Nikolay Nez. 

    DNA IP and MERA are a potent combination, reducing the hardware-specific knowledge OEMs would otherwise need to power AI applications efficiently. 
    Much of the AI inference conversation focuses on delivering as many operations as quickly as possible. Massive GPU-based implementations can find homes in data centers with practically unlimited power and cooling. However, add the embedded system constraints found outside the data center, and more efficient edge AI chips, scaled for size, weight, power, and utilization, become essential.

    Read the full blog here

    • 7 min
    Efficient Edge AI Chips with Reconfigurable Accelerators (Japanese-language episode)

    The powerful combination of DNA IP and MERA reduces the need for users to acquire the hardware-specific knowledge otherwise required to run AI applications efficiently.
    Much of the AI inference conversation focuses on delivering as many operations as quickly as possible. Facilities such as data centers, with practically unlimited power and cooling, can host large GPU-based systems. However, once the constraints of embedded systems outside the data center (broadly, "the edge") come into play, more efficient edge AI chips scaled for size, weight, power, and utilization become essential. The EdgeCortix Dynamic Neural Accelerator (DNA) processor architecture provides accelerated AI inference solutions for many custom ASIC and FPGA-based applications.

    Learn more about the technology at edgecortix.com.

    • 10 min
    Connecting Edge AI Software with PyTorch, TensorFlow Lite, and ONNX Models

    By Antonio Nevado. 

    Data scientists and others may have concerns about moving PyTorch, TensorFlow, and ONNX models to edge AI software applications – MERA makes it easy and is model-agnostic.

    PyTorch, TensorFlow, and ONNX are familiar tools for many data scientists and AI software developers. These frameworks run models natively on a CPU or accelerated on a GPU, requiring little hardware knowledge. But ask those same folks to move their applications to edge devices, and suddenly knowing more about AI acceleration hardware becomes essential – and perhaps a bit intimidating for the uninitiated.
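    The framework-native workflow described above can be sketched as follows. This is a minimal illustration only: the tiny model, tensor shapes, and output file name are invented for the example, and the exported ONNX file simply stands in for what an edge compiler such as MERA would then consume.

    ```python
    # Sketch: define a model in PyTorch, run it natively on the CPU with no
    # hardware knowledge, then export to ONNX for a downstream edge toolchain.
    import torch
    import torch.nn as nn

    class TinyClassifier(nn.Module):
        """Illustrative two-class model; not from the episode."""
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))

        def forward(self, x):
            return self.net(x)

    model = TinyClassifier().eval()
    example = torch.randn(1, 8)

    # Native CPU inference, exactly as data scientists run it today.
    with torch.no_grad():
        logits = model(example)

    # Export to ONNX, the interchange format edge toolchains typically ingest.
    torch.onnx.export(model, example, "tiny_classifier.onnx",
                      input_names=["input"], output_names=["logits"])
    ```

    The same exported file works whether the model originated in PyTorch or was converted from another framework, which is the sense in which a model-agnostic compiler can sit downstream of all of them.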

    Read the full blog here

    • 6 min
    What is edge AI inference doing for more devices?

    By Jeffrey Grosman. 

    AI inference is a common term – but what is edge AI inference? EdgeCortix provides an answer in terms of workloads, efficiency, and applications.
    Artificial intelligence (AI) is changing the rules for many applications. Teams train AI models to recognize objects or patterns, then run AI inference using those models against incoming data streams. When size, weight, power, and time are of little concern, data center or cloud-based AI inference may do. But in resource-constrained edge devices, different technology is needed. What is edge AI inference doing for more devices? Let’s look at differences in AI inference for the edge and how intellectual property (IP) addresses them.
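    The train-then-infer pattern described above can be mirrored in a toy, plain-Python sketch: fit a "model" on labeled samples, then apply it to an incoming stream of readings. Everything here (the data, the trivial threshold classifier) is invented for illustration; real edge AI inference runs trained neural networks on accelerator IP, and this only shows the workflow, not the technology.

    ```python
    # Toy train/infer workflow: training produces a model artifact,
    # inference applies that artifact to new data as it arrives.

    def train_threshold(samples):
        """'Train': place a decision threshold midway between the class means."""
        anomalies = [v for v, label in samples if label == "anomaly"]
        normals = [v for v, label in samples if label == "normal"]
        return (sum(anomalies) / len(anomalies) + sum(normals) / len(normals)) / 2

    def infer(threshold, stream):
        """'Inference': classify each incoming reading with the trained model."""
        return ["anomaly" if v > threshold else "normal" for v in stream]

    training_data = [(0.9, "anomaly"), (1.1, "anomaly"),
                     (0.1, "normal"), (0.2, "normal")]
    model = train_threshold(training_data)          # threshold = 0.575
    results = infer(model, [0.05, 0.95, 0.3])       # classify a live stream
    print(results)  # ['normal', 'anomaly', 'normal']
    ```

    The size, weight, and power constraints the episode discusses apply to the inference step, which must run continuously on-device against the data stream, while training can happen once, offline.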

    Read the full blog here

    • 7 min
