Recent Posts

Metal Camera Tutorial Part 2: Converting sample buffer to a Metal texture

4 minute read

In the first part of the Metal Camera Tutorial series we managed to fire up a session that continuously sends us frames from the device's camera via a delegate callback. Now, this is already pretty exciting, but we need to get hold of actual textures to do something useful with those frames, and we are going to use Metal for that.
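For a flavour of what the post covers, here is a minimal sketch of that conversion, assuming a video data output delegate and a `CVMetalTextureCache` created beforehand; the names are illustrative rather than the post's actual code.

```swift
import AVFoundation
import CoreVideo
import Metal

// Assumed to exist: a texture cache created earlier with CVMetalTextureCacheCreate.
var textureCache: CVMetalTextureCache!

// Wrap the pixel buffer behind a CMSampleBuffer in an MTLTexture, without copying pixels.
func texture(from sampleBuffer: CMSampleBuffer) -> MTLTexture? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)

    var cvTexture: CVMetalTexture?
    let status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                           textureCache,
                                                           pixelBuffer,
                                                           nil,
                                                           .bgra8Unorm,
                                                           width,
                                                           height,
                                                           0,
                                                           &cvTexture)
    guard status == kCVReturnSuccess, let metalTexture = cvTexture else { return nil }
    // The resulting MTLTexture is backed directly by the camera frame's memory.
    return CVMetalTextureGetTexture(metalTexture)
}
```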

Running TensorFlow with Jupyter Notebook on AWS

6 minute read

Google’s open-source TensorFlow is one of the most promising machine learning frameworks nowadays. Even though Google is said to use a slightly different version internally, and the current version of TensorFlow is somewhat behind its competitors performance-wise, one can hardly deny that it has a lot of potential.

Metal Camera Tutorial Part 1: Getting raw camera data

6 minute read

A lot of apps nowadays use the iPhone and iPad cameras. Some even do pretty badass things with them (performance-wise), like running each frame through a neural network or applying a realtime filter. Either way, you may want to interact with the device hardware at as low a level as possible, be it getting data from the camera sensor or running computations on the GPU, since you still want to minimise the impact on the device's limited computational resources.

Unit tests for Touch ID

2 minute read

Writing unit tests for iOS apps used to be challenging, mainly due to a lack of solid and stable testing capabilities out of the box in Xcode. However, with Apple's XCTest framework things have improved greatly: you no longer have the excuse of needing third-party frameworks to test your code properly.
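As a small taste of the approach, here is a minimal XCTest sketch, assuming a hypothetical wrapper around `LAContext` so the Touch ID check can be stubbed; the types here are made up for illustration, not taken from the post.

```swift
import XCTest
import LocalAuthentication

// Hypothetical abstraction over LAContext so tests do not need real biometric hardware.
protocol AuthenticationContext {
    func canEvaluatePolicy(_ policy: LAPolicy, error: NSErrorPointer) -> Bool
}

extension LAContext: AuthenticationContext {}

// Hypothetical object under test: reports whether Touch ID is available.
final class BiometricsChecker {
    private let context: AuthenticationContext
    init(context: AuthenticationContext = LAContext()) {
        self.context = context
    }
    var isTouchIDAvailable: Bool {
        return context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: nil)
    }
}

// A stub context lets the test control the policy result deterministically.
final class StubContext: AuthenticationContext {
    var canEvaluate = false
    func canEvaluatePolicy(_ policy: LAPolicy, error: NSErrorPointer) -> Bool {
        return canEvaluate
    }
}

final class BiometricsCheckerTests: XCTestCase {
    func testTouchIDReportedUnavailableWhenPolicyFails() {
        let stub = StubContext()
        stub.canEvaluate = false
        let checker = BiometricsChecker(context: stub)
        XCTAssertFalse(checker.isTouchIDAvailable)
    }
}
```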

Motion Sensors in iOS

1 minute read

Apple mobile devices have so many capabilities nowadays that it is not always obvious where this or that functionality comes from. Have you ever thought about how Google Cardboard VR apps work? The answer is that they all use device motion sensors, be it an Android or an iOS device.
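For context, this is roughly the kind of data those apps rely on; a minimal CoreMotion sketch, not the post's actual code, reading device attitude at 60 Hz:

```swift
import CoreMotion

let motionManager = CMMotionManager()

// Subscribe to device motion updates and print the current orientation angles.
if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        print("roll: \(attitude.roll), pitch: \(attitude.pitch), yaw: \(attitude.yaw)")
    }
}
```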