iOS Development

File system permissions and paths in iOS

less than 1 minute read

Although Juno makes coding on iPad a breeze, there are still a few tricks you need to know, and one of them is working with the file system and handling paths. For example, when your code needs to read a file's contents or write data to a file, how do you specify the file's location in iOS?
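The core idea is that every iOS app, Juno included, reads and writes inside its own sandbox, so paths are resolved relative to well-known sandbox directories rather than hard-coded absolute locations. As a rough illustration of that idea (shown here in Swift rather than the Python you would write in Juno, and with a hypothetical file name), resolving a location in the app's Documents directory looks something like this:

```swift
import Foundation

// A minimal sketch, not the post's exact code: paths on iOS are resolved
// relative to the app's sandbox. "notes.txt" is a hypothetical file name.
let documents = FileManager.default.urls(for: .documentDirectory,
                                         in: .userDomainMask)[0]
let fileURL = documents.appendingPathComponent("notes.txt")

do {
    try "Hello from the sandbox".write(to: fileURL, atomically: true, encoding: .utf8)
    print(try String(contentsOf: fileURL, encoding: .utf8))
} catch {
    print("File I/O failed: \(error)")
}
```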

Jupyter client for iPad

4 minute read

I have been a huge fan of Jupyter for a while now, and most of all of the flexibility it offers: I strongly believe that the fact that you only need a screen and a network connection to get access to pretty much unlimited computational resources has enormous potential.

Metal Camera Tutorial Bonus: Running Metal project in iOS Simulator

2 minute read

In the Metal Camera Tutorial series we have created a simple app that renders camera frames on screen in real time. However, this app uses the Metal framework, which is not available in the iOS Simulator. In fact, your app won't even build if you select the simulator as a build destination, which is a shame if, for example, you want to add unit tests and be able to run them without an actual device connected to your machine.
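The general workaround, sketched below with a hypothetical Renderer type and today's targetEnvironment(simulator) condition rather than whatever check the original post used, is to fence Metal-specific code off at compile time so the rest of the target, including its unit tests, still builds for the simulator:

```swift
import Foundation
#if !targetEnvironment(simulator)
import Metal
#endif

// A minimal sketch, not necessarily the post's exact workaround: keep
// Metal code out of simulator builds so the target still compiles there.
final class Renderer {
    #if targetEnvironment(simulator)
    // Metal is unavailable here, so provide a stub that tests can still call.
    func render() {
        print("Metal rendering is not available in the Simulator")
    }
    #else
    private let device = MTLCreateSystemDefaultDevice()

    func render() {
        guard let device = device else { return }
        print("Rendering with Metal device: \(device.name)")
    }
    #endif
}
```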

Swift: Type of a class conforming to protocol

1 minute read

Although protocols are not by any means a new thing, Swift specifically encourages developers to use them over inheritance. Not that Objective-C didn't make use of protocols, but due to the dynamic nature of the Objective-C runtime one would often be tempted to put chunks of common declarations in a superclass instead.
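As a small, hypothetical illustration of the problem in current Swift syntax (not necessarily what the original post settles on), "an instance of a class that also conforms to a protocol" can be spelled as a protocol composition type:

```swift
import UIKit

// Reloadable and ChartView are hypothetical names, purely for illustration.
protocol Reloadable {
    func reload()
}

final class ChartView: UIView, Reloadable {
    func reload() {
        print("Reloading chart")
    }
}

// "A UIView subclass that also conforms to Reloadable" is expressed
// with a protocol composition type: UIView & Reloadable.
func refresh(_ view: UIView & Reloadable) {
    view.reload()
    view.setNeedsLayout()
}

refresh(ChartView())
```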

Metal Camera Tutorial Part 2: Converting sample buffer to a Metal texture

4 minute read

In the first part of the Metal Camera Tutorial series we managed to fire up a session that continuously sends us frames from the device's camera via a delegate callback. This is already pretty exciting, but we need to get hold of actual textures to do something useful with them, and we are going to use Metal for that.
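The go-to tool for this is a CVMetalTextureCache, which maps a frame's pixel buffer into an MTLTexture without copying. A minimal sketch of that technique follows; the TextureConverter class name is hypothetical, and the post's actual implementation may differ in details such as the pixel format:

```swift
import AVFoundation
import CoreVideo
import Metal

// A sketch of the general technique: turn a CMSampleBuffer's pixel buffer
// into an MTLTexture using a CVMetalTextureCache.
final class TextureConverter {
    private let device: MTLDevice
    private var textureCache: CVMetalTextureCache?

    init?(device: MTLDevice) {
        self.device = device
        guard CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil,
                                        &textureCache) == kCVReturnSuccess else {
            return nil
        }
    }

    func texture(from sampleBuffer: CMSampleBuffer) -> MTLTexture? {
        guard let cache = textureCache,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return nil
        }
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)

        var cvTexture: CVMetalTexture?
        // Assumes the capture output delivers BGRA frames.
        CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache,
                                                  pixelBuffer, nil, .bgra8Unorm,
                                                  width, height, 0, &cvTexture)
        return cvTexture.flatMap(CVMetalTextureGetTexture)
    }
}
```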

Metal Camera Tutorial Part 1: Getting raw camera data

6 minute read

A lot of apps nowadays use the iPhone and iPad cameras. Some even do pretty badass things with them (performance-wise), like running each frame through a neural network or applying a real-time filter. Either way, you may want to interact with the device hardware at as low a level as possible, be it getting data from the camera sensor or running computations on the GPU, while still minimising the impact on the device's limited computational resources.
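For camera frames, the lowest-level route AVFoundation offers is an AVCaptureSession with an AVCaptureVideoDataOutput, which hands every frame to a delegate as a CMSampleBuffer. A rough sketch of that setup (the CameraStream class is hypothetical and error handling is trimmed):

```swift
import AVFoundation

// A minimal sketch, not the post's exact implementation: an AVCaptureSession
// with an AVCaptureVideoDataOutput delivers raw frames to a delegate.
final class CameraStream: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    // Called for every captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Hand the sample buffer off for processing (e.g. to Metal).
    }
}
```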

Unit tests for Touch ID

2 minute read

Writing unit tests for iOS apps had been challenging for a while, mainly due to a lack of solid and stable testing capabilities out of the box in Xcode. However, with Apple's XCTest framework things have improved greatly: you no longer have the excuse of needing third-party frameworks to test your code properly.
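Touch ID itself cannot be triggered from a unit test, so the usual trick is to hide LAContext behind a small abstraction and substitute a test double. A hypothetical XCTest sketch of that approach (the protocol and type names are illustrative, not taken from the post):

```swift
import LocalAuthentication
import XCTest

// Illustrative names only: wrap LAContext behind a protocol so a stub
// can stand in for the real biometric check in unit tests.
protocol AuthenticationContext {
    func canEvaluatePolicy(_ policy: LAPolicy, error: NSErrorPointer) -> Bool
}

extension LAContext: AuthenticationContext {}

struct BiometricGate {
    let context: AuthenticationContext

    var isTouchIDAvailable: Bool {
        return context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                         error: nil)
    }
}

final class BiometricGateTests: XCTestCase {
    private struct StubContext: AuthenticationContext {
        let available: Bool
        func canEvaluatePolicy(_ policy: LAPolicy, error: NSErrorPointer) -> Bool {
            return available
        }
    }

    func testTouchIDReportedAsAvailable() {
        let gate = BiometricGate(context: StubContext(available: true))
        XCTAssertTrue(gate.isTouchIDAvailable)
    }
}
```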

Motion Sensors in iOS

1 minute read

Apple mobile devices have so many capabilities nowadays that it is not always obvious where this or that functionality comes from. Have you ever wondered how Google Cardboard VR apps work? The answer is that they all use the device's motion sensors, whether it is an Android or iOS device.
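On iOS the sensors are exposed through Core Motion. As a quick, illustrative sketch (not the post's exact code), a CMMotionManager can stream fused attitude data that a VR-style app would map onto its camera:

```swift
import CoreMotion

// A minimal sketch: CMMotionManager delivers fused device-motion updates
// (attitude, rotation rate, acceleration) at a chosen interval.
let motionManager = CMMotionManager()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let attitude = motion?.attitude else { return }
        // Roll, pitch and yaw in radians, e.g. for driving a VR camera.
        print(attitude.roll, attitude.pitch, attitude.yaw)
    }
}
```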

Mobile app navigation: designing a questionnaire

5 minute read

There are quite a few potential scenarios where you may want your user to go through a set of questions, take a test or simply provide feedback. I hope this post will give you a useful example of interacting with the user on a mobile device, and will inspire you to design something straightforward and clear next time you face a similar challenge.

Transitions with CoreAnimation

5 minute read

I have come across an interesting UX use case on medium.com recently: a concept for a mobile banking app. Not only does this concept look impressive when it comes to usability compared with pretty much every existing mobile banking app, it also has a couple of neat and engaging UI design tricks that really catch your eye.