OpenCV iOS - Video Processing {#tutorial_video_processing}
=============================

This tutorial explains how to process video frames using the iPhone's camera and OpenCV.

Prerequisites:
--------------

-   Xcode 4.3 or higher
-   Basic knowledge of iOS programming (Objective-C, Interface Builder)

Including OpenCV library in your iOS project
--------------------------------------------

The OpenCV library comes as a so-called framework, which you can directly drag and drop into your
Xcode project. Download the latest binary from
<http://sourceforge.net/projects/opencvlibrary/files/opencv-ios/>. Alternatively, follow the
guide @ref tutorial_ios_install to compile the framework manually. Once you have the framework,
just drag and drop it into Xcode:

![](images/xcode_hello_ios_framework_drag_and_drop.png)

You also have to locate the prefix header that is used for all source files in the project. The
file is typically located at "ProjectName/Supporting Files/ProjectName-Prefix.pch". There, you
have to add an import statement for the OpenCV library. Make sure you import OpenCV before UIKit
and Foundation; otherwise you will get compile errors complaining that macros such as min and max
are defined multiple times. For example, the prefix header could look like the following:
@code{.objc}
//
// Prefix header for all source files of the 'VideoFilters' target in the 'VideoFilters' project
//

#import <Availability.h>

#ifndef __IPHONE_4_0
#warning "This project uses features only available in iOS SDK 4.0 and later."
#endif

#ifdef __cplusplus
#import <opencv2/opencv.hpp>
#endif

#ifdef __OBJC__
    #import <UIKit/UIKit.h>
    #import <Foundation/Foundation.h>
#endif
@endcode
### Example video frame processing project

#### User Interface

First, we create a simple iOS project, for example a Single View Application. Then, we add a
UIImageView to display the video frames and a UIButton to start the camera. The storyboard could
look like this:

![](images/xcode_hello_ios_viewcontroller_layout.png)

Make sure to add and connect the IBOutlets and IBActions to the corresponding ViewController:
@code{.objc}
@interface ViewController : UIViewController
{
    IBOutlet UIImageView* imageView;
    IBOutlet UIButton* button;
}

- (IBAction)actionStart:(id)sender;

@end
@endcode
#### Adding the Camera

We add a camera controller to the view controller and initialize it when the view has loaded:
@code{.objc}
#import <opencv2/videoio/cap_ios.h>
using namespace cv;

@interface ViewController : UIViewController
{
    ...
    CvVideoCamera* videoCamera;
}
...
@property (nonatomic, retain) CvVideoCamera* videoCamera;

@end
@endcode
@code{.objc}
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    self.videoCamera = [[CvVideoCamera alloc] initWithParentView:imageView];
    self.videoCamera.defaultAVCaptureDevicePosition = AVCaptureDevicePositionFront;
    self.videoCamera.defaultAVCaptureSessionPreset = AVCaptureSessionPreset352x288;
    self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
    self.videoCamera.defaultFPS = 30;
    self.videoCamera.grayscale = NO;
}
@endcode
In this case, we initialize the camera and provide the imageView as a target for rendering each
frame. CvVideoCamera is basically a wrapper around AVFoundation, so we expose some of the
AVFoundation camera options as properties. For example, we want to use the front camera, set the
video size to 352x288, and fix the video orientation (the camera normally outputs in landscape
mode, which results in transposed data when you design a portrait application).

The property defaultFPS sets the FPS of the camera. If the processing is slower than the desired
FPS, frames are automatically dropped.

Setting the property grayscale = YES results in a different colorspace, namely "YUV (YpCbCr 4:2:0)",
while grayscale = NO will output 32-bit BGRA.

Additionally, we have to manually add the framework dependencies of the OpenCV framework. Finally,
you should have at least the following frameworks in your project:

-   opencv2
-   Accelerate
-   AssetsLibrary
-   AVFoundation
-   CoreGraphics
-   CoreImage
-   CoreMedia
-   CoreVideo
-   QuartzCore
-   UIKit
-   Foundation

    ![](images/xcode_hello_ios_frameworks_add_dependencies.png)

#### Processing frames

We follow the delegation pattern, which is very common in iOS, to provide access to each camera
frame. Basically, the View Controller has to implement the CvVideoCameraDelegate protocol and has
to be set as the delegate of the video camera:
@code{.objc}
@interface ViewController : UIViewController<CvVideoCameraDelegate>
@endcode
@code{.objc}
- (void)viewDidLoad
{
    ...
    self.videoCamera = [[CvVideoCamera alloc] initWithParentView:imageView];
    self.videoCamera.delegate = self;
    ...
}
@endcode
@code{.objc}
#pragma mark - Protocol CvVideoCameraDelegate

#ifdef __cplusplus
- (void)processImage:(Mat&)image
{
    // Do some OpenCV stuff with the image
}
#endif
@endcode
Note that we are using C++ here (cv::Mat). Important: you have to rename the view controller's
file extension from .m to .mm, so that the compiler compiles it as Objective-C++ (Objective-C and
C++ mixed). Then __cplusplus is defined while the compiler processes the file, which is why we put
our C++ code inside a block guarded by __cplusplus.

#### Basic video processing

From here you can start processing video frames. For example, the following snippet converts the
image to grayscale, inverts it, and converts it back to BGRA:
@code{.objc}
- (void)processImage:(Mat&)image
{
    // convert the incoming BGRA frame to grayscale
    Mat image_copy;
    cvtColor(image, image_copy, COLOR_BGR2GRAY);

    // invert the image
    bitwise_not(image_copy, image_copy);

    // convert back to four channels: grayscale to BGR, then BGR to BGRA
    Mat bgr;
    cvtColor(image_copy, bgr, COLOR_GRAY2BGR);
    cvtColor(bgr, image, COLOR_BGR2BGRA);
}
@endcode
#### Start!

Finally, we have to tell the camera to actually start/stop working. The following code will start
the camera when you press the button, assuming you connected the UI properly:
@code{.objc}
#pragma mark - UI Actions

- (IBAction)actionStart:(id)sender
{
    [self.videoCamera start];
}
@endcode
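To stop the camera again, call its stop method. A minimal sketch, assuming you wire a second
button to a hypothetical actionStop: action:
@code{.objc}
- (IBAction)actionStop:(id)sender
{
    // Stop the capture session that actionStart: started.
    [self.videoCamera stop];
}
@endcode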
#### Hints

Try to avoid costly matrix copy operations as much as you can, especially if you are aiming for
real-time processing. As the image data is passed by reference, work in place if possible, as in
the sketch below.
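For example, inverting the BGRA frame can be done directly in the buffer that is rendered
afterwards; a minimal sketch of in-place processing:
@code{.objc}
- (void)processImage:(Mat&)image
{
    // bitwise_not can write into its own input, so no temporary Mat is allocated
    bitwise_not(image, image);
}
@endcode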
When you are working on grayscale data, set grayscale = YES, as the YUV colorspace gives you
direct access to the luminance plane.
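A minimal sketch of that setup; with grayscale = YES the delegate should receive a single-channel
(CV_8UC1) frame, and the histogram equalization below is just an illustrative operation:
@code{.objc}
// in viewDidLoad, before starting the camera:
self.videoCamera.grayscale = YES;
@endcode
@code{.objc}
#ifdef __cplusplus
- (void)processImage:(Mat&)image
{
    // the frame is already the luminance plane -- no cvtColor round-trip needed
    equalizeHist(image, image);
}
#endif
@endcode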
The Accelerate framework provides CPU-accelerated DSP routines, which can come in handy here.
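For instance, a vDSP routine can scale pixel values in place. A hedged sketch, assuming the frame
has first been converted to a continuous CV_32F Mat (scaleInPlace is a hypothetical helper;
vDSP_vsmul is part of Accelerate):
@code{.objc}
#import <Accelerate/Accelerate.h>

static void scaleInPlace(cv::Mat& img, float factor)
{
    // vDSP operates on flat float buffers, so the Mat must be continuous and CV_32F
    CV_Assert(img.isContinuous() && img.type() == CV_32F);
    vDSP_vsmul((const float*)img.data, 1, &factor,
               (float*)img.data, 1, (vDSP_Length)img.total());
}
@endcode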