Commit 7acdf95

fix links and typos
1 parent 120dede commit 7acdf95

1 file changed: _mobile/ios.md (+20 -18 lines)

@@ -12,11 +12,11 @@ published: true

To get started with PyTorch on iOS, we recommend exploring the following [HelloWorld](https://github.com/pytorch/ios-demo-app/tree/master/HelloWorld).

-## Quickstart with a Hello World example
+## Quickstart with a Hello World Example

HelloWorld is a simple image classification application that demonstrates how to use PyTorch C++ libraries on iOS. The code is written in Swift and uses Objective-C as a bridge.

-### Model preparation
+### Model Preparation

Let's start with model preparation. If you are familiar with PyTorch, you probably should already know how to train and save your model. In case you don't, we are going to use a pre-trained image classification model - Resnet18, which is already packaged in [TorchVision](https://pytorch.org/docs/stable/torchvision/index.html). To install it, run the command below.

@@ -34,7 +34,7 @@ python trace_model.py

If everything works well, we should have our model - `model.pt` generated in the `HelloWorld` folder. Now copy the model file to our application folder `HelloWorld/model`.

-> To find out more details about TorchScript, please visit [tutorials on pytorch.org](https://pytorch.org/docs/stable/jit.html)
+> To find out more details about TorchScript, please visit [tutorials on pytorch.org](https://pytorch.org/tutorials/advanced/cpp_export.html)

### Install PyTorch C++ libraries via Cocoapods
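For context, the tracing step that produces `model.pt` can be sketched in a few lines of Python. This is only an illustrative sketch of what a script like `trace_model.py` typically contains (the pre-trained ResNet18 from TorchVision, traced with a dummy input); the actual script in the demo repo may differ in its details.

```python
# Illustrative sketch of a tracing script; the demo's trace_model.py
# may differ. Produces the model.pt consumed by the iOS app.
import torch
import torchvision

model = torchvision.models.resnet18(pretrained=True)
model.eval()  # inference mode

# Trace with an example input of the expected 1 x 3 x 224 x 224 shape,
# then serialize the resulting TorchScript module.
example = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(model, example)
traced.save("model.pt")
```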

@@ -46,11 +46,13 @@ pod install

Now it's time to open the `HelloWorld.xcworkspace` in XCode, select an iOS simulator and launch it (cmd + R). If everything works well, we should see a wolf picture on the simulator screen along with the prediction result.

+![](https://github.com/pytorch/ios-demo-app/blob/master/HelloWorld/HelloWorld/HelloWorld/image.jpg?raw=true)
+
### Code Walkthrough

In this part, we are going to walk through the code step by step.

-#### Image loading
+#### Image Loading

Let's begin with image loading.

@@ -76,9 +78,9 @@ for i in 0 ..< w * h {
}
```

-The code might look weird at first glance, but it’ll make sense once we understand our model. The input data of our model is a 3-channel RGB image of shape (3 x H x W), where H and W are expected to be at least 224. The image have to be loaded in to a range of [0, 1] and then normalized using mean = [0.485, 0.456, 0.406] and std = [0.229, 0.224, 0.225].
+The code might look weird at first glance, but it’ll make sense once we understand our model. The input data is a 3-channel RGB image of shape (3 x H x W), where H and W are expected to be at least 224. The image has to be loaded in to a range of [0, 1] and then normalized using mean = [0.485, 0.456, 0.406] and std = [0.229, 0.224, 0.225].

-#### TorchScript module
+#### TorchScript Module

Now that we have preprocessed our input data and we have a pre-trained TorchScript model, the next step is to use them to run predication. To do that, we'll first load our model into the application.
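For reference, the preprocessing described above (scale pixel values into [0, 1], then normalize per channel with the given mean and std) can be written compactly with torchvision transforms. The snippet below is only a Python reference for the same normalization, not part of the iOS code, and the image file name is illustrative.

```python
# Reference-only sketch of the normalization described above,
# expressed with torchvision transforms (not iOS app code).
from PIL import Image
import torchvision.transforms as T

preprocess = T.Compose([
    T.Resize((224, 224)),                    # model expects H, W >= 224
    T.ToTensor(),                            # loads pixels into [0, 1]
    T.Normalize(mean=[0.485, 0.456, 0.406],  # per-channel mean
                std=[0.229, 0.224, 0.225]),  # per-channel std
])

img = Image.open("image.jpg")                # illustrative file name
input_tensor = preprocess(img).unsqueeze(0)  # shape: 1 x 3 x 224 x 224
```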

@@ -97,7 +99,7 @@ Note that the `TorchModule` Class is an Objective-C wrapper of `torch::jit::scri
```cpp
torch::jit::script::Module module = torch::jit::load(filePath.UTF8String);
```
-Since Swift can not talk to C++ directly, we have to either use an Objective-C class as a bride, or create a C wrapper for the C++ library. For the demo purpose, we're going to wrap everything in this Objective-C class, but we are working on bringing the Swift wrapper to PyTorch.
+Since Swift can not talk to C++ directly, we have to either use an Objective-C class as a bridge, or create a C wrapper for the C++ library. For demo purpose, we're going to wrap everything in this Objective-C class, but we are working on bringing the Swift wrapper to PyTorch.

#### Run Inference

@@ -117,21 +119,21 @@ at::AutoNonVariableTypeMode non_var_type_mode(true);
auto outputTensor = _impl.forward({tensor}).toTensor();
void* tensorBuffer = outputTensor.storage().data();
```
-The C++ function `torch::from_blob` will create an input tensor from the pixel buffer. Note that the shpae of the tensor is `{1,3,224,224}` which represents `NxCxWxH` as we discuessed in above section.
+The C++ function `torch::from_blob` will create an input tensor from the pixel buffer. Note that the shape of the tensor is `{1,3,224,224}` which represents `NxCxWxH` as we discussed in above section.

```cpp
torch::autograd::AutoGradMode guard(false);
at::AutoNonVariableTypeMode non_var_type_mode(true);
```
-The above two lines tells the PyTorch engine to do inference only. This is beacuse By default, PyTorch has built-in support for doing auto-differentiation, which is also known as autograd. Since we don't do training on mobile, we can just disable the autograd mode.
+The above two lines tells the PyTorch engine to do inference only. This is because by default, PyTorch has built-in support for doing auto-differentiation, which is also known as autograd. Since we don't do training on mobile, we can just disable the autograd mode.

Finally, we can call this `forward` function to get the output tensor as the results.

```cpp
auto outputTensor = _impl.forward({tensor}).toTensor();
```

-### Collect results
+### Collect Results

The output tensor is a one-dimensional float array of shape 1x1000, where each value represents the confidence that a label is predicted from the image. The code below sorts the array and retrieves the top three results.
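As an offline check, the same pipeline (load the traced module, run it with autograd disabled, take the top three scores) can be exercised from Python before wiring it into the app. The sketch below assumes the `model.pt` produced by the tracing step; the random input is a stand-in for a real preprocessed image.

```python
# Sanity-check sketch; assumes model.pt from the tracing step above.
import torch

module = torch.jit.load("model.pt")

# Stand-in input; replace with a real 1 x 3 x 224 x 224 normalized image.
input_tensor = torch.rand(1, 3, 224, 224)

with torch.no_grad():              # inference only, autograd disabled
    output = module(input_tensor)  # shape: 1 x 1000

# Pick the three most confident class indices and their scores,
# mirroring the Swift code shown below.
scores, indices = torch.topk(output, k=3, dim=1)
print(indices[0].tolist(), scores[0].tolist())
```
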
@@ -140,28 +142,28 @@ let zippedResults = zip(labels.indices, outputs)
let sortedResults = zippedResults.sorted { $0.1.floatValue > $1.1.floatValue }.prefix(3)
```

-### PyTorch demo app
+### PyTorch Demo App

-For more complex use cases, we recommend to check out the PyTorch demo application. The demo app contains two showcases. A camera app that runs a quantized model to predict the images coming from device’s rear-facing camera in real time. And a text-based app that uses a text classififcation model to predict the topic from the input string.
+For more complex use cases, we recommend to check out the [PyTorch demo application](https://github.com/pytorch/ios-demo-app). The demo app contains two showcases. A camera app that runs a quantized model to predict the images coming from device’s rear-facing camera in real time. And a text-based app that uses a text classififcation model to predict the topic from the input string.

-## Build PyTorch iOS libraries from source
+## Build PyTorch iOS Libraries from Source

To track the latest progress on mobile, we can always build the PyTorch iOS libraries from the source. Follow the steps below.

-### Setup local Python development environment
+### Setup Local Python Development Environment

Follow the PyTorch Github page to set up the Python environment. Make sure you have `cmake` and Python installed correctly on your local machine.

-### Build LibTorch.a for iOS simulator
+### Build LibTorch for iOS Simulators

Open terminal and navigate to the PyTorch root directory. Run the following command

```
BUILD_PYTORCH_MOBILE=1 IOS_PLATFORM=SIMULATOR ./scripts/build_ios.sh
```
-After the build succeed, all static libraries and header files will be generated under `build_ios/install`
+After the build succeeds, all static libraries and header files will be generated under `build_ios/install`

-### Build LibTorch.a for arm64 devices
+### Build LibTorch for arm64 Devices

Open terminal and navigate to the PyTorch root directory. Run the following command

@@ -170,7 +172,7 @@ BUILD_PYTORCH_MOBILE=1 IOS_ARCH=arm64 ./scripts/build_ios.sh
```
After the build succeed, all static libraries and header files will be generated under `build_ios/install`

-### XCode setup
+### XCode Setup

Open your project in XCode, copy all the static libraries as well as header files to your project. Navigate to the project settings, set the value **Header Search Paths** to the path of header files you just copied in the first step.
