I have been a huge fan of Jupyter for a while now, and most importantly of the flexibility it offers: I strongly believe that the fact that you only need a screen and a network connection to get access to pretty much unlimited computational resources has enormous potential.
In the Metal Camera Tutorial series we created a simple app that renders camera frames on screen in real time. However, this app uses the Metal framework, which is not available in the iOS Simulator: your app won't even build if you select the simulator as the build destination. That is a shame if, for example, you want to add unit tests and run them without an actual device connected to your machine.
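One workaround, not covered in the series itself but a common pattern, is to fence the Metal-dependent code with conditional compilation, so the target still builds for the simulator. Here is a minimal sketch with a hypothetical view controller; note that `targetEnvironment(simulator)` needs Swift 4.1+, and older projects used architecture checks like `#if !(arch(i386) || arch(x86_64))` instead:

```swift
import UIKit
#if !targetEnvironment(simulator)
import Metal
#endif

final class CameraViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        #if targetEnvironment(simulator)
        // No Metal on the simulator (at the time of writing): fall back to a
        // placeholder so the target still builds and unit tests can run.
        view.backgroundColor = .darkGray
        #else
        guard let device = MTLCreateSystemDefaultDevice() else { return }
        // Set up the Metal layer, command queue and pipeline with `device`...
        _ = device
        #endif
    }
}
```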
In the second part of the Metal Camera Tutorial series we managed to convert frame data to a Metal texture. Now we are going to render it on screen with the help of a very simple Metal shader.
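To give a taste of what "very simple" means here, below is a sketch of such a pass-through shader pair. The Metal Shading Language source is embedded in a Swift string and compiled at runtime with `makeLibrary(source:options:)` purely to keep the snippet self-contained; in a real project the shaders would live in a `.metal` file, and the function names are illustrative:

```swift
import Metal

let shaderSource = """
#include <metal_stdlib>
using namespace metal;

struct MappedVertex {
    float4 position [[position]];
    float2 textureCoordinate;
};

// Full-screen quad as a 4-vertex triangle strip with hard-coded coordinates,
// so no vertex buffer is needed.
vertex MappedVertex mapTexture(unsigned int vertexId [[vertex_id]]) {
    float4x4 positions = float4x4(float4(-1.0, -1.0, 0.0, 1.0),
                                  float4( 1.0, -1.0, 0.0, 1.0),
                                  float4(-1.0,  1.0, 0.0, 1.0),
                                  float4( 1.0,  1.0, 0.0, 1.0));
    float4x2 textureCoordinates = float4x2(float2(0.0, 1.0),
                                           float2(1.0, 1.0),
                                           float2(0.0, 0.0),
                                           float2(1.0, 0.0));
    MappedVertex v;
    v.position = positions[vertexId];
    v.textureCoordinate = textureCoordinates[vertexId];
    return v;
}

// Samples the camera texture bound at index 0.
fragment half4 displayTexture(MappedVertex in [[stage_in]],
                              texture2d<float, access::sample> texture [[texture(0)]]) {
    constexpr sampler s(address::clamp_to_edge, filter::linear);
    return half4(texture.sample(s, in.textureCoordinate));
}
"""

// Compiling at runtime keeps the example self-contained; in production you
// would use the default library built from your .metal files.
let device = MTLCreateSystemDefaultDevice()!
let library = try! device.makeLibrary(source: shaderSource, options: nil)
let vertexFunction = library.makeFunction(name: "mapTexture")
let fragmentFunction = library.makeFunction(name: "displayTexture")
```

With a quad like this the draw call is a four-vertex triangle strip, and the only per-frame input is the texture itself.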
Although protocols are not by any means a new thing, Swift specifically encourages developers to use them over inheritance. Not that Objective-C didn't make use of protocols, but due to the dynamic nature of the Objective-C runtime one would often be tempted to put chunks of common declarations in a superclass instead.
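As a quick illustration of the protocol-oriented alternative (with made-up types), shared behaviour can live in a protocol extension instead of a superclass, so even value types pick it up:

```swift
protocol Describable {
    var name: String { get }
    func describe() -> String
}

extension Describable {
    // Default implementation: conformers get this "for free", much like
    // inherited behaviour, but without committing to a class hierarchy.
    func describe() -> String {
        return "This is \(name)"
    }
}

// A struct can't inherit from a class, but it can conform to a protocol.
struct Sensor: Describable {
    let name: String
}

print(Sensor(name: "gyroscope").describe()) // "This is gyroscope"
```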
In the first part of the Metal Camera Tutorial series we managed to fire up a session that continuously sends us frames from the device's camera via a delegate callback. Now, this is already pretty exciting, but we need to get hold of actual textures to do something useful with those frames, and we are going to use Metal for that.
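The standard route for this is `CVMetalTextureCache` from CoreVideo. Here is a hedged sketch assuming a BGRA-formatted capture output; the details of the series' own pipeline may differ:

```swift
import AVFoundation
import CoreVideo
import Metal

/// Wraps the pixel buffer of a camera sample buffer in a Metal texture,
/// without copying the underlying data.
func makeTexture(from sampleBuffer: CMSampleBuffer,
                 textureCache: CVMetalTextureCache) -> MTLTexture? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)

    var cvTexture: CVMetalTexture?
    let status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                           textureCache,
                                                           imageBuffer,
                                                           nil,
                                                           .bgra8Unorm,
                                                           width,
                                                           height,
                                                           0,          // plane index
                                                           &cvTexture)
    guard status == kCVReturnSuccess, let unwrapped = cvTexture else { return nil }
    return CVMetalTextureGetTexture(unwrapped)
}
```

The texture cache itself is created once with `CVMetalTextureCacheCreate`, passing your `MTLDevice`, and then reused across frames.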
A lot of apps nowadays use iPhone and iPad cameras. Some even do pretty badass things with them (performance-wise), like running each frame through a neural network or applying a realtime filter. Either way, you may want to interact with the device hardware at as low a level as possible, be it getting data from the camera sensor or running computations on the GPU, since you want to minimise the impact on the device's limited computational resources.
Writing unit tests is like having sex in high school: everybody talks about it, although very few are actually doing it. In the iOS world this had a couple of additional roadblocks for a while, due to the lack of solid and stable testing capabilities in Xcode out of the box. However, with Apple's XCTest framework things have improved greatly: you no longer have the excuse of needing third-party frameworks to test your code properly.
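For the uninitiated, a minimal XCTest case looks like the sketch below; `Calculator` is a made-up type, and any method of an `XCTestCase` subclass whose name starts with `test` is picked up by the runner (⌘U in Xcode):

```swift
import XCTest

// The type under test; in a real project this would live in the app target.
struct Calculator {
    func add(_ a: Int, _ b: Int) -> Int { return a + b }
}

final class CalculatorTests: XCTestCase {

    func testAddition() {
        let calculator = Calculator()
        XCTAssertEqual(calculator.add(2, 3), 5, "2 + 3 should equal 5")
    }
}
```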
Apple mobile devices have so many capabilities nowadays that it is not always obvious where a particular piece of functionality comes from. Have you ever thought about how the Google Cardboard VR apps work? The answer is that they all use device motion sensors, be it an Android or an iOS device.
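On iOS, reading those sensors goes through CoreMotion; here is a minimal sketch, with the update interval and queue chosen arbitrarily:

```swift
import CoreMotion

// Keep a strong reference to the manager for as long as you need updates.
let motionManager = CMMotionManager()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let attitude = motion?.attitude else { return }
        // Roll, pitch and yaw describe how the device is oriented in space,
        // which is enough to drive a simple head-tracked camera.
        print("roll: \(attitude.roll), pitch: \(attitude.pitch), yaw: \(attitude.yaw)")
    }
    // Call motionManager.stopDeviceMotionUpdates() when you are done.
}
```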
There are quite a few scenarios where you may want your user to go through a set of questions, take a test or simply provide feedback. I hope this post gives you a useful example of interacting with the user on a mobile device, and inspires you to design something straightforward and clear the next time you face a similar challenge.
I came across an interesting UX use case on medium.com recently: a concept for a mobile banking app. Not only does this concept look impressive in terms of usability compared with pretty much every existing mobile banking app, it also has a couple of neat, engaging UI design tricks that really catch your eye.