Tutorial – Image handling in DeepLearningKit

Image handling, i.e. converting from UIImage (on tvOS/iOS; NSImage on OS X) to the DeepLearningKit representation and vice versa, is one of the core pieces of functionality needed. I found a nice iOS (Swift) project on Github called ImagePixelFun (by kNeerajPro) that allows setting and getting RGB(A) pixels directly on a UIImage through an extension. The only thing I needed to do was minor updates of ImagePixelFun to Swift 2.x; it then worked nicely on both tvOS and iOS, and I integrated the extension into DeepLearningKit. A minimal sketch of such a pixel accessor is shown below. (Note: for OS X I still haven't solved this, since the NSImage API is slightly different from UIImage.)
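To give an idea of what the extension does, here is a minimal sketch in the spirit of ImagePixelFun's getPixelColorAtLocation. It is not the exact extension code (the method name pixelColor(at:) is illustrative, and it is written in current Swift syntax rather than the Swift 2.x of the examples): render the image once into a CoreGraphics RGBA bitmap and index into the raw bytes.

import UIKit

// Illustrative sketch of a pixel accessor in the spirit of ImagePixelFun's
// getPixelColorAtLocation; the actual extension in DeepLearningKit may differ.
extension UIImage {
    func pixelColor(at point: CGPoint) -> UIColor? {
        guard let cgImage = self.cgImage else { return nil }
        let width = cgImage.width
        let height = cgImage.height
        guard point.x >= 0, point.y >= 0,
              Int(point.x) < width, Int(point.y) < height else { return nil }

        // Let CoreGraphics own an RGBA buffer (4 bytes per pixel) and
        // render the image into it once
        guard let context = CGContext(data: nil,
                                      width: width, height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: width * 4,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return nil }
        context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
        guard let data = context.data else { return nil }

        // Interleaved R,G,B,A bytes; note alpha is premultiplied in this layout
        let raw = data.assumingMemoryBound(to: UInt8.self)
        let offset = (Int(point.y) * width + Int(point.x)) * 4
        return UIColor(red: CGFloat(raw[offset]) / 255.0,
                       green: CGFloat(raw[offset + 1]) / 255.0,
                       blue: CGFloat(raw[offset + 2]) / 255.0,
                       alpha: CGFloat(raw[offset + 3]) / 255.0)
    }
}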

1. CIFAR-10 Image Handling (ref. Deep Learning model used in app examples)
CIFAR-10 images are small 32×32 pixel images with 3 one-byte channels (RGB). In the tvOS and iOS app examples this is stored as a single array of length 3072 (i.e. width*height*#channels = 32x32x3) in the “input” field of the file conv1.json. The Swift code below shows how to convert from the internal DeepLearningKit format (i.e. Caffe converted to JSON with caffemodel2json) to a UIImage. The main method is setPixelColorAtPoint(CGPoint(x: j, y: i), color: UIImage.RawColorType(r, g, b, 255))!. Note that the reverse method (sketched above), getPixelColorAtLocation(CGPoint(x: i, y: j)), can be used to get RGB(A) values from an existing UIImage (e.g. an image taken with the camera and shown inside the app).
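This is a sketch of that conversion, with some assumptions: the ImagePixelFun-derived UIImage extension in DeepLearningKit (setPixelColorAtPoint and RawColorType) is in scope, the array uses the standard planar CIFAR-10 layout (1024 red values, then 1024 green, then 1024 blue, row-major within each plane), and values are raw 0-255 bytes stored as Floats. The helper name cifar10ArrayToUIImage is illustrative; adjust the indexing and scaling if your JSON differs.

import UIKit

// Sketch: build a 32x32 UIImage from the 3072-element "input" array in
// conv1.json. Assumes the setPixelColorAtPoint/RawColorType extension from
// DeepLearningKit is in scope, a planar CIFAR-10 channel layout, and raw
// 0-255 pixel values stored as Floats.
func cifar10ArrayToUIImage(_ input: [Float]) -> UIImage? {
    let width = 32, height = 32, plane = width * height
    guard input.count == 3 * plane else { return nil }

    // Start from a blank 32x32 canvas the extension can write into
    UIGraphicsBeginImageContext(CGSize(width: width, height: height))
    guard var image = UIGraphicsGetImageFromCurrentImageContext() else {
        UIGraphicsEndImageContext()
        return nil
    }
    UIGraphicsEndImageContext()

    for i in 0..<height {
        for j in 0..<width {
            let offset = i * width + j
            let r = UInt8(input[offset])               // red plane
            let g = UInt8(input[plane + offset])       // green plane
            let b = UInt8(input[2 * plane + offset])   // blue plane
            // setPixelColorAtPoint returns a new, updated UIImage
            image = image.setPixelColorAtPoint(CGPoint(x: j, y: i),
                                               color: UIImage.RawColorType(r, g, b, 255))!
        }
    }
    return image
}

Note that setting one pixel at a time re-creates the image on every call in a sketch like this, which is part of why the imageToMatrix approach in the next section is likely faster for getting pixels out of a UIImage.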

2. imageToMatrix(image: UIImage)
A contributor kindly added a function to convert a UIImage to a tuple of RGB(A) vectors (and I provided a small fix to it). This is another, and likely faster, approach to getting pixels from a UIImage than the one above. It has been added to the iOS and tvOS example apps (in the ImageUtilityFunctions.swift file for each of the examples).

Code signature:
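func imageToMatrix(image: UIImage) -> ([Float], [Float], [Float], [Float])?

(The four Float vectors and the optional return are assumptions based on the description above of “a tuple of RGB(A) vectors”; check ImageUtilityFunctions.swift in the example apps for the exact declaration.) A hedged sketch of an implementation in that shape: a single CoreGraphics draw into an RGBA buffer, then one pass to split the interleaved bytes into per-channel vectors. The single draw is what makes this faster than per-pixel accessors.

import UIKit

// Hedged reconstruction of imageToMatrix; the actual code lives in
// ImageUtilityFunctions.swift in the example apps and may differ.
func imageToMatrix(image: UIImage) -> ([Float], [Float], [Float], [Float])? {
    guard let cgImage = image.cgImage else { return nil }
    let width = cgImage.width
    let height = cgImage.height
    let count = width * height

    // Render the whole image into an RGBA byte buffer in one draw call
    guard let context = CGContext(data: nil,
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: width * 4,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
    else { return nil }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    guard let data = context.data else { return nil }
    let raw = data.assumingMemoryBound(to: UInt8.self)

    // One Float vector per channel, pixels in row-major order
    var r = [Float](), g = [Float](), b = [Float](), a = [Float]()
    r.reserveCapacity(count); g.reserveCapacity(count)
    b.reserveCapacity(count); a.reserveCapacity(count)
    for pixel in 0..<count {
        let offset = pixel * 4
        r.append(Float(raw[offset]))
        g.append(Float(raw[offset + 1]))
        b.append(Float(raw[offset + 2]))
        a.append(Float(raw[offset + 3]))
    }
    return (r, g, b, a)
}

Usage is then e.g. if let (r, g, b, _) = imageToMatrix(image: photo) { … } to get the channel vectors out of an image.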

Conclusion
The extensions for setting and getting RGBA values on a UIImage, together with the imageToMatrix method, have been added to the DeepLearningKit tvOS and iOS app examples; this should make it fairly easy to do whatever image conversion you want.

Best regards,
Amund Tveit