
Coral TPU Real-Time Object Detection

v20260421
yolo-detection-2026-coral-tpu-macos
This skill provides high-performance real-time object detection accelerated by the Google Coral Edge TPU. It processes live camera frames and recognizes up to 80 COCO classes (person, car, dog, etc.). The system is optimized for native deployment on macOS and Linux, achieving very low latency (~4ms at 320x320 input), making it well suited to embedded vision systems and real-time monitoring.
Overview

Coral TPU Object Detection

Real-time object detection natively utilizing the Google Coral Edge TPU accelerator on your local hardware. Detects 80 COCO classes (person, car, dog, cat, etc.) with ~4ms inference on 320x320 input.

Requirements

  • Python 3.9–3.13

How It Works

┌─────────────────────────────────────────────────────┐
│ Host (Aegis-AI)                                     │
│   frame.jpg → /tmp/aegis_detection/                 │
│   stdin  ──→ ┌──────────────────────────────┐       │
│              │ Native Python Environment    │       │
│              │   detect.py                  │       │
│              │   ├─ loads _edgetpu.tflite   │       │
│              │   ├─ reads frame from disk   │       │
│              │   └─ runs inference on TPU   │       │
│   stdout ←── │   → JSONL detections         │       │
│              └──────────────────────────────┘       │
│   USB ──→ Native System USB / edgetpu drivers       │
└─────────────────────────────────────────────────────┘
  1. Aegis writes camera frame JPEG to shared /tmp/aegis_detection/ workspace
  2. Sends frame event via stdin JSONL to the local Python instance
  3. detect.py invokes PyCoral and executes natively on the mapped USB Edge TPU
  4. Returns detections event via stdout JSONL
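Steps 1, 2, and 4 above can be sketched from the host side. This is a hypothetical sketch, not the Aegis implementation: the frame filename pattern and the exact fields of the stdin `frame` event beyond those implied by the protocol section are assumptions.

```python
# Hypothetical host-side sketch of the frame handoff (steps 1-2).
# The workspace path comes from the docs; the filename pattern and the
# "frame" event schema are assumptions for illustration.
import json
from pathlib import Path

WORKSPACE = Path("/tmp/aegis_detection")

def frame_event(frame_id: int, camera_id: str, jpeg_bytes: bytes) -> str:
    """Write the JPEG into the shared workspace and return the stdin JSONL line."""
    WORKSPACE.mkdir(parents=True, exist_ok=True)
    frame_path = WORKSPACE / f"{camera_id}_{frame_id}.jpg"   # assumed naming
    frame_path.write_bytes(jpeg_bytes)
    return json.dumps({"event": "frame", "frame_id": frame_id,
                       "camera_id": camera_id, "path": str(frame_path)})
```

The returned line would be written to the skill's stdin, and detections read back line-by-line from its stdout (step 4).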

Platform Setup

Linux

# Uses the official apt-get google-coral packages natively
./deploy.sh

macOS

# Downloads and installs the libedgetpu OS payload framework inline
./deploy.sh

Important Deployment Notice: deploy.sh will halt and prompt for your sudo password so it can register the USB drivers (libedgetpu) system-wide. If you decline the prompt, it prints the exact terminal commands for configuring them manually.

Performance

Input Size   Inference   On-chip   Notes
320x320      ~4ms        100%      Fully on TPU, best for real-time
640x640      ~20ms       Partial   Some layers on CPU (model segmented)

Cooling: The USB Accelerator's aluminum case acts as a heatsink. If it becomes too hot to touch during continuous inference, the device will thermal-throttle. Consider active cooling or clock_speed: standard.
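Throttling shows up as rising inference times, so it can be watched via the perf_stats events described under Protocol. A minimal sketch, assuming a budget a little above the expected ~4ms; the threshold is illustrative, not part of the protocol:

```python
# Heuristic throttle check against a perf_stats JSONL line.
# The 6.0 ms budget is an illustrative assumption, not a documented limit.
import json

def possibly_throttled(perf_line: str, budget_ms: float = 6.0) -> bool:
    """Return True when average inference time exceeds the budget."""
    stats = json.loads(perf_line)
    return stats["timings_ms"]["inference"]["avg"] > budget_ms
```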

Protocol

Same JSONL as yolo-detection-2026:

Skill → Aegis (stdout)

{"event": "ready", "model": "yolo26n_edgetpu", "device": "coral", "format": "edgetpu_tflite", "tpu_count": 1, "classes": 80}
{"event": "detections", "frame_id": 42, "camera_id": "front_door", "objects": [{"class": "person", "confidence": 0.85, "bbox": [100, 50, 300, 400]}]}
{"event": "perf_stats", "total_frames": 50, "timings_ms": {"inference": {"avg": 4.1, "p50": 3.9, "p95": 5.2}}}

Bounding Box Format

[x_min, y_min, x_max, y_max] — pixel coordinates (xyxy).
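If a downstream consumer expects the common [x, y, width, height] (xywh) layout instead, the conversion is a two-subtraction affair:

```python
# Convert the skill's xyxy pixel boxes to xywh.
def xyxy_to_xywh(bbox):
    """[x_min, y_min, x_max, y_max] -> [x_min, y_min, width, height]."""
    x_min, y_min, x_max, y_max = bbox
    return [x_min, y_min, x_max - x_min, y_max - y_min]
```

For the example detection above, [100, 50, 300, 400] becomes [100, 50, 200, 350].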

Installation

Linux / macOS

./deploy.sh

The deployer builds the local Python virtual environment and installs the Edge TPU runtime. No Docker required.

Info
Category  AI
Name      yolo-detection-2026-coral-tpu-macos
Version   v20260421
Size      10.58MB
Updated   2026-04-28