Archive for Apps

iOS Localization, some reflection and a hack

As an app developer, it can be quite effective to localize your apps (in this article we focus on language), especially if you speak both English and your native tongue. At Japps, this is exactly what we do: localize at least to English and Dutch. Both iOS and Android provide means for this, although localizing the text on UI components is, frankly, a bit easier on Android. On iOS you are basically forced to duplicate your entire UI, which makes maintenance tricky (it basically violates the DRY principle).

Apple’s solution (prior to iOS 6)

Sadly, there are no real solutions available yet, although there are tools that speed up the process. The steps boil down to: duplicate the Storyboard / nib files, extract the strings from one language, translate them into another and put them back (both using a command line tool). Any UI change requires a repetition of these steps, although you can take a shortcut by reusing most of the earlier translations. The upside of duplicating is that you can adjust your UI (sizes) to the different word lengths per language, but this is not always a big issue.

Alternative solution

Another solution that has been proposed is moving the translation process entirely to code by creating an outlet for each control that has text on it (see this tutorial). The downside is that it requires quite some code for every new view. Furthermore, the texts in IB / storyboard become totally meaningless, which may be confusing. To prevent this, they could be filled in just for clarity, but they would still need to be defined in the strings file, which would again be a non-DRY solution, and extra work.

Base Internationalization (iOS 6)

Luckily, Base Internationalization is coming, and it seems this will provide a KISS solution to localization on iOS. Auto Layout can then ensure UI elements adapt their size and location to the length of the strings in the current language. This won't work on iOS 5 though, so we'll have to wait a bit before it is 'acceptable' to stop supporting iOS 5 users.

Our solution (for the time being)

In the meantime, we crafted a bit of code that recursively loops over all views and checks their strings for square brackets. If anything like '[Blah]' is found, it is replaced by the translation of 'Blah', or by plain 'Blah' if no translation was found. The latter means that for the 'original' language (i.e. the language used inside IB), no or only a few entries have to be added to Localizable.strings. For other (newly added) languages, everything can be put into a single Localizable.strings file. On top of that, the UI in IB remains fairly understandable, since all texts are still relevant, albeit wrapped in square brackets.

This was implemented as 'categories' on NSString and UIView. In the latter case, subviews are checked recursively, and for different types of view (label, button, etc.) slightly different steps are taken. The only thing that needs to be done for every view loaded from IB (in the view controller's viewDidLoad, or after manually loading the view from a nib) is to include lngfkt.h and call [theTopView lngfk]. In views that are instantiated from code, NSLocalizedString can be used as 'normal'.
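The actual implementation consists of Objective-C categories, but the core idea — strip the brackets, look the key up, fall back to the key itself, and recurse over the subviews — is small enough to sketch. Purely as an illustration, here is a plain-Java model (all names are hypothetical, and a HashMap stands in for the Localizable.strings table):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal stand-in for a view hierarchy: each node may carry a text
// and has a list of subviews.
class View {
    String text;                        // null for container views
    final List<View> subviews = new ArrayList<>();
}

class Lngfk {
    // Stand-in for the current language's Localizable.strings.
    static final Map<String, String> TABLE = new HashMap<>();

    // Translate a single string: "[Blah]" becomes the translation of
    // "Blah", or plain "Blah" when no entry exists. Strings without
    // brackets are left untouched.
    static String lngfk(String s) {
        if (s == null || s.length() < 2
                || !s.startsWith("[") || !s.endsWith("]")) {
            return s;
        }
        String key = s.substring(1, s.length() - 1);
        return TABLE.getOrDefault(key, key);
    }

    // Recursively translate a view and all of its subviews.
    static void lngfk(View v) {
        v.text = lngfk(v.text);
        for (View sub : v.subviews) {
            lngfk(sub);
        }
    }
}
```

Calling lngfk on the top view then localizes the whole hierarchy in one go; the real category additionally branches on the concrete UIView subclass (UILabel, UIButton, and so on) to reach the right text property.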

The code



There are a fair number of edge cases that are not covered by this approach. Some just haven't popped up yet (the above series of if statements could probably cover more UIView subclasses), others need some more trickery.

For navigation bars, the lngfk method has to be called separately: [self.navigationController.navigationBar lngfk]. For static table headers and cell views, a bit of code is required in the class that implements the UITableViewDelegate protocol (the UITableViewController, for instance):

The pitfalls

It is a hack. It doesn't even really adhere to any naming conventions. We used it and found it a fair alternative compared to the other options. The code is posted just in case others like this idea as well; please use it at your own risk. Also, this happens at runtime, which costs time. Not noticeably in our case, but it could.


Why I think Android developers should rotate their phone often

In a well-written app, a lot happens in the background when you rotate your phone, but the user should not be aware of it. As a developer, however, you should be aware of it, you should adapt to it, and you can use it to your advantage. Here's why I think Android developers should rotate their phone often. Note: I'm linking to Stack Overflow questions here and there because the corresponding answers contain very practical and minimal code examples.

Adaptive layout

When a phone is rotated, the screen dimensions change. This may have a detrimental effect on the appearance of your app. It is very tempting as a developer to have only one testing device and adjust everything to that. Rotating your phone forces you to get your landscape view looking good too, and it also makes you aware of how important it is to expect the unexpected when it comes to screen sizes and resolutions, especially on Android, which runs on a wide range of devices (the HTC ChaCha, for instance, has a screen that is in landscape by default). You could decide to 'lock' your app to portrait, but when you've finished reading this blog post, you might decide to do so only at the very last moment.

Activity restarting

When the phone rotates, the Android OS restarts the current activity. Something needs to be done to hide this event from the user! In a very simple app whose views all have an id assigned, this is done for you, but in many cases you will need to add code to make it work. Although you could catch the orientation event and prevent the restart of your activity altogether, the preferred method is to store the state of your activity in onSaveInstanceState. In that case, all important information about the current state of your activity needs to be stored as key / value pairs (the keys being strings). Stuff that does not fit the Bundle object used to be stored via onRetainNonConfigurationInstance, but this has been deprecated and fragments should be used instead.
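The save/restore round trip is easy to picture if you treat the Bundle as a string-keyed map. As a plain-Java sketch of the cycle (a Map stands in for android.os.Bundle, and the class and field names are made up; on a real device, saveInstanceState corresponds to onSaveInstanceState and restoreInstanceState to the restore branch in onCreate):

```java
import java.util.HashMap;
import java.util.Map;

// Plain-Java model of Android's instance-state round trip. The OS
// calls the save method on the old activity instance before it is
// destroyed, then hands the same "bundle" to the fresh instance.
class SearchActivity {
    String currentQuery = "";   // the state worth surviving a restart

    // Mirrors onSaveInstanceState(Bundle outState): write all
    // important state as key / value pairs.
    void saveInstanceState(Map<String, Object> outState) {
        outState.put("query", currentQuery);
    }

    // Mirrors the restore branch in onCreate(Bundle savedInstanceState):
    // the bundle is null on a completely fresh start.
    void restoreInstanceState(Map<String, Object> savedState) {
        if (savedState != null) {
            currentQuery = (String) savedState.get("query");
        }
    }
}
```

On rotation, the OS performs exactly this dance: save on the old instance, destroy it, create a new instance, restore; which is why rotating is such a cheap way to exercise this code path.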

Use it to your advantage

If you lock your app into portrait mode, you don't need to deal with the aforementioned issues. Well, at least not on rotation. But your activity might be destroyed at any time, especially when the user goes away to do something else but hopes to find your app in the same state when he / she gets back to it. This is one of the, in my opinion, nifty parts of Android's 'multitasking': stuff is kept in memory if possible, but destroyed and restored (in the original state, if the developer did his / her job) in case of low memory. How do you test for such events? That's quite tricky, because it might take some effort to force the system to really destroy the activity. That's where rotating your screen comes in handy: during this event, as far as saving and restoring instance state are concerned, the same happens as when your activity gets destroyed.

Closing remarks

I'm currently using a Hacker News reader that did not implement the above: every time I rotate my screen, the app needs to reload its content from the internet. When reviewing a developer, this is a simple way of testing whether he / she has at least some grasp of Android. If you care about your users and their experience, rotate your screen on a regular basis.

Hands on{x}

A Microsoft team from Israel released on{x} yesterday (as a beta version). This team researches location and activity awareness for Bing. On{x} enables the user to trigger tasks (reminders, opening a website, sending an SMS) on events such as reaching a location, changing the mode of movement (driving, walking, running) or simply at a given time. For now, on{x} consists of an Android application with a website back end, connected by a Facebook login. Let's dive in.

Two layers of complexity

On{x} basically has two layers of complexity: 'recipes' enable non-technical users to quickly configure rules, while technical users can show off their JavaScript skills. The interesting part is not yet up and running: having the latter group of users create 'recipes' that the first group can use.

Rules = Scripts

Basically, on{x} is a bunch of scripts, called 'rules', that can be turned on or off. Such a rule is simply JavaScript code that gets executed when the rule gets (re)loaded. JavaScript event handlers can be defined that are triggered when an event takes place somewhere in the future. An API (with documentation) is available for a range of triggers and a range of actions. Scripts are written in the browser (in the Cloud9 editor) and pushed to the phone upon save. There's a logging system, which is also accessible through the browser.

Play a song when reaching home

I decided to write a rule that starts playing a song when I get home. Pointless, but technically interesting. One way to do this is geo-fencing: you define an area, and triggers can be set to go off whenever the phone leaves or enters it. This, however, requires GPS to be on, which is a potential energy hog.

Trigger on Wifi

Instead, I decided to try and detect when my home Wifi network (SSID) comes into range. Here the API documentation is a bit vague, but with the help of the forums and some debugging I found:

This event is triggered when the phone does a Wifi scan. It is also possible to force a scan yourself:

The major deviation from the documentation here is that to get at the scanResults in JavaScript, toArray() has to be called: scanResults is basically a (wrapper around a) Java List object.

Play a song

Playing a song from the SD card was also not very obvious. The good thing is that on{x} is only a very thin layer on top of Android, and I finally found this to work:

Saving state

The last challenge was how to 'remember', when we find the SSID of interest to be in range, that it wasn't in range before. For this, I used localStorage to store the state. The state starts uninitialized; as soon as a scan has been conducted, it changes to 'in-range' or 'out-of-range'. Only when switching from the latter to the former is the song played.
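The transition logic itself is tiny. Sketched here in plain Java rather than the rule's JavaScript (the class and method names are made up, and a plain field stands in for the value kept in localStorage):

```java
// Model of the in-range / out-of-range state machine described above.
// The song should start only on the out-of-range -> in-range
// transition, so activating the rule while already at home (the
// uninitialized -> in-range transition) does not trigger it.
class WifiStateMachine {
    static final String UNINITIALIZED = "uninitialized";
    static final String IN_RANGE = "in-range";
    static final String OUT_OF_RANGE = "out-of-range";

    // In the actual rule this value lives in localStorage so it
    // survives rule reloads.
    String state = UNINITIALIZED;

    // Called after every Wifi scan; returns true when the song
    // should start playing.
    boolean onScan(boolean homeSsidInRange) {
        String previous = state;
        state = homeSsidInRange ? IN_RANGE : OUT_OF_RANGE;
        return previous.equals(OUT_OF_RANGE) && state.equals(IN_RANGE);
    }
}
```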

Closing thoughts

Before I paste my entire script (which I will submit for review by on{x}) below, first the final verdict on on{x}. Overall, it is a true beta: the documentation is immature, the app kept crashing on me, logging has quite a lag, there are very few 'user-friendly' recipes at the moment, a lot of 'actions' are missing (changing the volume, turning Wifi on / off) and, finally, the required Facebook login seems to disappoint quite some users. However, there is already a lot of activity on the forums, and every web developer who knows his / her JavaScript can get started in no time. The recipe layer and the option for coders to publish their rules have a lot of potential as well.

At first it seems surprising that Microsoft published this for Android, but in the end this platform is the most flexible. The availability of an existing open-source JavaScript engine for Java (Rhino) may also have helped. Given that on{x} is written so close to the Android architecture (Wifi scan results, for instance, are Java objects in a JavaScript jacket, device.applications.launchViewer is a very thin wrapper around Android's VIEW intent, etc.), I truly doubt this will be available for other platforms any time soon, at least not without a lot of effort and abstraction in the API.

If a small startup had come up with this beta, they would have gotten my approval, but from Microsoft I would have expected more. They should have started with a closed beta: now the Play market is flooded with 1 and 2 star reviews, and they have already used up most of the attention wave.

If you're not a hacker, I'd stay away from on{x} for now. If you are, you might want to try it out and hang around to see if it does become mature at some point, so you can quickly jump in to seize whatever opportunity there may be.

Final script

As promised, the entire script on a platter: