
Real-Time Object Detection With Coral TPU

v20260421
yolo-detection-2026-coral-tpu-macos
This skill provides high-performance, real-time object detection utilizing the Google Coral Edge TPU accelerator. It processes live camera frames to identify up to 80 COCO classes (e.g., person, car, dog). Optimized for native macOS and Linux deployment, it achieves extremely low latency (e.g., ~4ms at 320x320), making it ideal for embedded vision systems, surveillance, and low-latency monitoring applications.
Overview

Coral TPU Object Detection

Real-time object detection natively utilizing the Google Coral Edge TPU accelerator on your local hardware. Detects 80 COCO classes (person, car, dog, cat, etc.) with ~4ms inference on 320x320 input.

Requirements

  • Python 3.9–3.13

How It Works

┌─────────────────────────────────────────────────────┐
│ Host (Aegis-AI)                                     │
│   frame.jpg → /tmp/aegis_detection/                 │
│   stdin  ──→ ┌──────────────────────────────┐       │
│              │ Native Python Environment    │       │
│              │   detect.py                  │       │
│              │   ├─ loads _edgetpu.tflite   │       │
│              │   ├─ reads frame from disk   │       │
│              │   └─ runs inference on TPU   │       │
│   stdout ←── │   → JSONL detections         │       │
│              └──────────────────────────────┘       │
│   USB ──→ Native System USB / edgetpu drivers       │
└─────────────────────────────────────────────────────┘
  1. Aegis writes the camera frame as a JPEG to the shared /tmp/aegis_detection/ workspace
  2. Aegis sends a frame event via stdin JSONL to the local Python instance
  3. detect.py invokes PyCoral and runs inference natively on the attached USB Edge TPU
  4. detect.py returns a detections event via stdout JSONL
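The frame-event schema sent on stdin in step 2 is not spelled out above; a minimal sketch of the host side of the exchange, assuming a hypothetical "frame" event that mirrors the frame_id/camera_id field names of the detections response, might look like:

```python
import json

FRAME_DIR = "/tmp/aegis_detection"  # shared workspace from step 1

def frame_event(frame_id, camera_id, filename):
    """Build the stdin JSONL line announcing a new frame.
    Hypothetical schema: field names assumed from the detections event."""
    return json.dumps({
        "event": "frame",
        "frame_id": frame_id,
        "camera_id": camera_id,
        "path": f"{FRAME_DIR}/{filename}",
    })

def parse_detections(line):
    """Parse a detections line from detect.py's stdout (schema as shown
    in the Protocol section). Returns (frame_id, objects) or None."""
    msg = json.loads(line)
    if msg.get("event") != "detections":
        return None
    return msg["frame_id"], msg["objects"]
```

In practice the host writes `frame_event(...)` lines to the skill process's stdin and feeds each stdout line through `parse_detections`.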

Platform Setup

Linux

# Installs the official google-coral apt packages
./deploy.sh

macOS

# Downloads and installs the libedgetpu runtime inline
./deploy.sh

Important Deployment Notice: The updated deploy.sh script halts and prompts for your sudo password in order to register the libedgetpu USB driver system-wide. If you decline the prompt, it prints the exact terminal commands so you can configure the driver manually.

Performance

| Input Size | Inference | On-chip | Notes                                |
|------------|-----------|---------|--------------------------------------|
| 320x320    | ~4ms      | 100%    | Fully on TPU, best for real-time     |
| 640x640    | ~20ms     | Partial | Some layers on CPU (model segmented) |

Cooling: The USB Accelerator's aluminum case acts as a heatsink. If it becomes too hot to touch during continuous inference, the TPU will thermal-throttle. Consider active cooling, or set clock_speed: standard to run at the reduced clock.

Protocol

Same JSONL as yolo-detection-2026:

Skill → Aegis (stdout)

{"event": "ready", "model": "yolo26n_edgetpu", "device": "coral", "format": "edgetpu_tflite", "tpu_count": 1, "classes": 80}
{"event": "detections", "frame_id": 42, "camera_id": "front_door", "objects": [{"class": "person", "confidence": 0.85, "bbox": [100, 50, 300, 400]}]}
{"event": "perf_stats", "total_frames": 50, "timings_ms": {"inference": {"avg": 4.1, "p50": 3.9, "p95": 5.2}}}

Bounding Box Format

[x_min, y_min, x_max, y_max] — pixel coordinates (xyxy).
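For downstream consumers, converting an xyxy box to a width/height pair or a center point is a one-liner; a small sketch:

```python
def bbox_size(bbox):
    """Width and height of an [x_min, y_min, x_max, y_max] box."""
    x_min, y_min, x_max, y_max = bbox
    return x_max - x_min, y_max - y_min

def bbox_center(bbox):
    """Center point of an xyxy box, as floats."""
    x_min, y_min, x_max, y_max = bbox
    return (x_min + x_max) / 2, (y_min + y_max) / 2
```

For the example detection above, [100, 50, 300, 400] is a 200x350-pixel box centered at (200.0, 225.0).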

Installation

Linux / macOS

./deploy.sh

The deployer builds the local Python virtual environment and installs the Edge TPU runtime. No Docker required.

Info
Name yolo-detection-2026-coral-tpu-macos
Version v20260421
Size 10.58MB
Updated At 2026-04-28