ultralytics/yolo-ios-app

🚀 Ultralytics YOLO for iOS: App and Swift Package

Welcome to the Ultralytics YOLO iOS App GitHub repository! 📖 This project transforms your iOS device into a powerful real-time inference tool for object detection, segmentation, classification, pose estimation, and oriented bounding box detection. The SDK supports both legacy YOLO11 (with Core ML NMS) and state-of-the-art YOLO26 models (NMS-free, with Swift-side postprocessing). Download the app directly from the App Store or explore our guide to integrate YOLO capabilities into your own Swift applications.

✨ Why Choose YOLO iOS Native?

Supported tasks on iOS:
  • Detection
  • Classification
  • Segmentation
  • Pose Estimation
  • OBB Detection

  • Native iOS Performance — Maximum speed with Swift & Core ML
  • Optimized for Apple Silicon — Leverages the Neural Engine & GPU
  • Real-time Detection — 60+ FPS on recent iPhones
  • Low Latency — Direct hardware access without framework overhead
  • iOS-First Design — Native UI/UX following Apple guidelines
  • Core ML Integration — Apple's optimized ML framework
  • Broad Model Support — Runs both NMS-free YOLO26 and YOLO11 models

📂 Repository Content

This repository provides a comprehensive solution for running YOLO models on Apple platforms, enabling powerful Edge AI capabilities:

The primary iOS application allows easy real-time object detection using your device's camera or image library. Official Ultralytics models are downloaded on demand, and you can also test your custom Core ML models by adding them to the app project.

A lightweight Swift package designed for iOS and iPadOS. It simplifies the integration and usage of YOLO-based models like YOLO26 within your own applications. Integrate YOLO models effortlessly with minimal code using SwiftUI:

// Perform inference on a UIImage
let result = model(uiImage)

// Use the built-in camera view for real-time detection
var body: some View {
    YOLOCamera(
        url: URL(string: "https://github.com/ultralytics/yolo-ios-app/releases/download/v8.3.0/yolo26n-seg.mlpackage.zip")!,
        task: .segment,
        cameraPosition: .back
    )
    .ignoresSafeArea()
}

🛠️ Quickstart Guide

New to YOLO on mobile or want to quickly test your custom model? Start with the main YOLOiOSApp. Official models are downloaded on demand.

Ready to integrate YOLO into your own project? Explore the Swift Package and example applications.

🧪 Testing Procedures

This repository includes comprehensive unit tests for both the YOLO Swift Package and the example applications, ensuring code reliability and stability.

Running Tests

Tests require Core ML model files (.mlpackage), which are not included in the repository due to their size. To run the tests with model validation:

  1. Set SKIP_MODEL_TESTS = false in the relevant test files.
  2. Download the required models from the Ultralytics releases or train your own using tools like Ultralytics Platform.
  3. Convert the models to Core ML format using the Ultralytics Python library's export function.
  4. Add the exported .mlpackage files to your Xcode project, ensuring they are included in the test targets.
  5. Run the tests using Xcode's Test Navigator (Cmd+U).

If you don't have the model files, you can still run tests by keeping SKIP_MODEL_TESTS = true. This will skip tests that require loading and running a model.

Test Coverage

  • YOLO Swift Package: Includes tests for core functionalities like model loading, preprocessing, inference, and postprocessing across different tasks.
  • Example Apps: Contains tests verifying UI components, model integration, and real-time inference performance within the sample applications.

Test Documentation

Each test directory (e.g., Tests/YOLOTests) may include a README.md with specific instructions for testing that component, covering:

  • Required model files and where to obtain them.
  • Steps for model conversion and setup.
  • Overview of the testing strategy.
  • Explanation of key test cases.

💡 Contribute

We warmly welcome contributions to our open-source projects! Your support helps us push the boundaries of Artificial Intelligence (AI). Get involved by reviewing our Contributing Guide and sharing your feedback through our Survey. Thank you 🙏 to all our contributors!

📄 License

Ultralytics provides two licensing options to accommodate diverse use cases:

  • AGPL-3.0 License: An OSI-approved open-source license ideal for academic research, personal projects, and experimentation. It promotes open collaboration and knowledge sharing. See the LICENSE file and the full AGPL-3.0 license text for details.
  • Enterprise License: Tailored for commercial applications, this license allows the integration of Ultralytics software and AI models into commercial products and services without the open-source requirements of AGPL-3.0. If your scenario involves commercial use, please contact us via Ultralytics Licensing.

🤝 Contact

  • For bug reports and feature requests related to this iOS project, please use GitHub Issues.
  • For questions, discussions, and support regarding Ultralytics technologies, join our active Discord community!
