Welcome to the Ultralytics YOLO iOS App GitHub repository! 📖 This project transforms your iOS device into a powerful real-time inference tool for object detection, segmentation, classification, pose estimation, and oriented bounding box detection. The SDK supports both legacy YOLO11 (with Core ML NMS) and state-of-the-art YOLO26 models (NMS-free, with Swift-side postprocessing). Download the app directly from the App Store or explore our guide to integrate YOLO capabilities into your own Swift applications.
| Feature | iOS |
|---|---|
| Detection | ✅ |
| Classification | ✅ |
| Segmentation | ✅ |
| Pose Estimation | ✅ |
| OBB Detection | ✅ |
- Native iOS Performance — Maximum speed with Swift & Core ML
- Optimized for Apple Silicon — Leverages the Neural Engine & GPU
- Real-time Detection — 60+ FPS on recent iPhones
- Low Latency — Direct hardware access without framework overhead
- iOS-First Design — Native UI/UX following Apple guidelines
- Core ML Integration — Apple's optimized ML framework
- YOLO26 (NMS-free) and YOLO11 models both supported
This repository provides a comprehensive solution for running YOLO models on Apple platforms, enabling powerful Edge AI capabilities:
The primary iOS application allows easy real-time object detection using your device's camera or image library. Official Ultralytics models are downloaded on demand, and you can also test your custom Core ML models by adding them to the app project.
A lightweight Swift package designed for iOS and iPadOS. It simplifies the integration and usage of YOLO-based models like YOLO26 within your own applications. Integrate YOLO models effortlessly with minimal code using SwiftUI:
```swift
// Perform inference on a UIImage
let result = model(uiImage)
```

```swift
// Use the built-in camera view for real-time detection
var body: some View {
    YOLOCamera(
        url: URL(string: "https://github.com/ultralytics/yolo-ios-app/releases/download/v8.3.0/yolo26n-seg.mlpackage.zip")!,
        task: .segment,
        cameraPosition: .back
    )
    .ignoresSafeArea()
}
```

New to YOLO on mobile, or want to quickly test your custom model? Start with the main YOLOiOSApp. Official models are downloaded on demand.
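The `YOLOCamera` snippet shown earlier can be wrapped into a complete SwiftUI view. Everything outside the snippet (the imports and the `View` struct) is standard SwiftUI boilerplate; `ContentView` is just an illustrative name:

```swift
import SwiftUI
import YOLO

struct ContentView: View {
    var body: some View {
        // Streams the rear camera through a YOLO26 segmentation model;
        // the .mlpackage is fetched from the release URL on first use.
        YOLOCamera(
            url: URL(string: "https://github.com/ultralytics/yolo-ios-app/releases/download/v8.3.0/yolo26n-seg.mlpackage.zip")!,
            task: .segment,
            cameraPosition: .back
        )
        .ignoresSafeArea()
    }
}
```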
- Ultralytics YOLO iOS App (Main App): The easiest way to experience YOLO detection on iOS.
Ready to integrate YOLO into your own project? Explore the Swift Package and example applications.
- Swift Package (YOLO Library): Integrate YOLO capabilities into your Swift app.
- Example Apps: See practical implementations using the YOLO Swift Package.
- Real-Time Inference: Achieve high-speed, high-accuracy object detection on iPhones and iPads using optimized Core ML models, potentially enhanced through techniques like model quantization.
- Apple Mobile Support: The Swift Package targets iOS and iPadOS with native Core ML integration.
- Flexible Tasks: Supports object detection, segmentation, classification, pose estimation, and oriented bounding box (OBB) detection.
This repository includes comprehensive unit tests for both the YOLO Swift Package and the example applications, ensuring code reliability and stability.
Tests require Core ML model files (.mlpackage), which are not included in the repository due to their size. To run the tests with model validation:
1. Set `SKIP_MODEL_TESTS = false` in the relevant test files.
2. Download the required models from the Ultralytics releases, or train your own using tools like Ultralytics Platform.
3. Convert the models to Core ML format using the Ultralytics Python library's export function.
4. Add the exported `.mlpackage` files to your Xcode project, ensuring they are included in the test targets.
5. Run the tests from Xcode's Test Navigator (Cmd+U).
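The Core ML conversion can be done with the Ultralytics Python package's `export()` call. This is a minimal sketch: the checkpoint name `yolo11n.pt` is only an example, and `nms=True` applies to YOLO11-style models, baking NMS into the exported Core ML graph (running it requires the `ultralytics` package installed):

```python
from ultralytics import YOLO

# Load an official (or your own custom-trained) checkpoint
model = YOLO("yolo11n.pt")

# Export to Core ML; writes yolo11n.mlpackage alongside the weights
model.export(format="coreml", nms=True)
```

The equivalent `yolo` CLI form is `yolo export model=yolo11n.pt format=coreml nms=True`.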
If you don't have the model files, you can still run the tests by keeping `SKIP_MODEL_TESTS = true`; tests that require loading and running a model will be skipped.
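A guarded test of this kind might look like the following sketch; `XCTSkipIf` is standard XCTest, while the test class and the flag wiring here are illustrative assumptions, not the repository's actual test code:

```swift
import XCTest

// Flip to false once the .mlpackage files are added to the test target.
let SKIP_MODEL_TESTS = true

final class ModelLoadingTests: XCTestCase {
    func testModelLoads() throws {
        // Skips (rather than fails) when model files are absent.
        try XCTSkipIf(SKIP_MODEL_TESTS, "Model files not present")
        // Load the .mlpackage and assert on inference output here.
    }
}
```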
- YOLO Swift Package: Includes tests for core functionalities like model loading, preprocessing, inference, and postprocessing across different tasks.
- Example Apps: Contains tests verifying UI components, model integration, and real-time inference performance within the sample applications.
Each test directory (e.g., Tests/YOLOTests) may include a README.md with specific instructions for testing that component, covering:
- Required model files and where to obtain them.
- Steps for model conversion and setup.
- Overview of the testing strategy.
- Explanation of key test cases.
We warmly welcome contributions to our open-source projects! Your support helps us push the boundaries of Artificial Intelligence (AI). Get involved by reviewing our Contributing Guide and sharing your feedback through our Survey. Thank you 🙏 to all our contributors!
Ultralytics provides two licensing options to accommodate diverse use cases:
- AGPL-3.0 License: An OSI-approved open-source license ideal for academic research, personal projects, and experimentation. It promotes open collaboration and knowledge sharing. See the LICENSE file and the full AGPL-3.0 license text for details.
- Enterprise License: Tailored for commercial applications, this license allows the integration of Ultralytics software and AI models into commercial products and services without the open-source requirements of AGPL-3.0. If your scenario involves commercial use, please contact us via Ultralytics Licensing.
- For bug reports and feature requests related to this iOS project, please use GitHub Issues.
- For questions, discussions, and support regarding Ultralytics technologies, join our active Discord community!