source/_posts/2019-04-08-testing-the-camera-on-the-simulator.markdown
On iOS, to use the camera, one has to use the machinery that comes with `AVFoundation`.
<!--more-->
Although you can use `protocols` to generalize the real objects, at some point you are going to stumble upon a dilemma: the simulator doesn't have a camera, and you can't instantiate the framework classes, making the tests (almost) impossible.
#### What are you talking about?
Let's start with a very simple program that captures QR Codes (I'm skipping lots of boilerplate, but if you are looking for a more thorough example, [here](https://www.hackingwithswift.com/example-code/media/how-to-scan-a-qr-code) you have a great article).
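To give a rough idea of the boilerplate being skipped, a minimal setup might look like this (a sketch only; the `QRScanner` type and its wiring are illustrative, not code from this post):

```swift
import AVFoundation

// Illustrative sketch of a minimal QR capture setup.
final class QRScanner: NSObject {
    let session = AVCaptureSession()

    func start(delegate: AVCaptureMetadataOutputObjectsDelegate) {
        // 1. Plug the camera in as an input.
        if let camera = AVCaptureDevice.default(for: .video),
           let input = try? AVCaptureDeviceInput(device: camera),
           session.canAddInput(input) {
            session.addInput(input)
        }

        // 2. Add a metadata output restricted to QR codes,
        //    pointed at the delegate that handles detections.
        let output = AVCaptureMetadataOutput()
        if session.canAddOutput(output) {
            session.addOutput(output)
            output.setMetadataObjectsDelegate(delegate, queue: .main)
            output.metadataObjectTypes = [.qr]
        }

        session.startRunning()
    }
}
```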
When the detection happens, you can compute from the framework-provided values by implementing the following method from [`AVCaptureMetadataOutputObjectsDelegate`](https://developer.apple.com/documentation/avfoundation/avcapturemetadataoutputobjectsdelegate/1389481-metadataoutput). Say we want to exercise our program in a way that ensures the `CameraOutputDelegate` methods are properly called, given what `AVFoundation` provides.
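The shape of that delegate method is fixed by `AVFoundation`; the body below is a hypothetical sketch (the `Camera` type and the `qrCode(read:)` forwarding call are assumptions, not the post's actual implementation):

```swift
import AVFoundation

extension Camera: AVCaptureMetadataOutputObjectsDelegate {
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        // Keep only machine-readable QR objects that carry a decoded payload.
        guard let code = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
              code.type == .qr,
              let value = code.stringValue else { return }
        // Forward the decoded string to the CameraOutputDelegate
        // (`qrCode(read:)` is a hypothetical method name).
        delegate?.qrCode(read: value)
    }
}
```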
```swift
final class CameraOutputSpy: CameraOutputDelegate {
    // …
}

// …

camera.metadataOutput(
    // …
)
```
Waat!?
The problem here is that all of these classes are concrete, so we can't abstract them behind an interface. Also, they are supposed to be created and populated by the framework at runtime, hence you can't `init` them.
#### 🍸 `Swizzle` to the rescue
One possible solution for this kind of scenario (since the framework is all `Objective-C`... for now, at least) is to use [`Objective-C` runtime shenanigans](https://nshipster.com/method-swizzling/) to "fill this gap".
I'm not going to lay down the nitty-gritty details about how it works, but the m…
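As a hedged sketch of the mechanics (the property being swizzled and the fabricated payload are illustrative; the swizzling target used here may differ from the post's): the idea is to exchange a framework getter's implementation for one the test controls, so the "impossible to create" value can be fabricated.

```swift
import AVFoundation
import ObjectiveC

extension AVMetadataMachineReadableCodeObject {
    // Replacement getter whose return value the test controls.
    @objc private func swizzled_stringValue() -> String? {
        "fake-qr-payload"
    }

    // Exchange the implementations of `stringValue` and our replacement.
    static func swizzleStringValue() {
        guard
            let original = class_getInstanceMethod(
                self,
                #selector(getter: AVMetadataMachineReadableCodeObject.stringValue)),
            let swizzled = class_getInstanceMethod(
                self,
                #selector(AVMetadataMachineReadableCodeObject.swizzled_stringValue))
        else { return }
        method_exchangeImplementations(original, swizzled)
    }
}
```

After calling `swizzleStringValue()` in a test's setup, any instance the runtime hands back will report the fabricated payload, letting the spy assertions run on the simulator.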