Back in April 2008, on 25 April to be precise, I wrote a blog entry that asked the question, “Did Apple Make A Mistake Choosing Objective-C For The iPhone SDK?”
It was a really popular entry on the blog, with some great discussion about the pros and cons of the choice of Objective-C. Despite us unfortunately having to nuke the old blog and start again, that entry still shows up in Google auto-suggest and is referenced on the Wikipedia page about the iOS SDK (someone might want to update that reference now that it’s broken, by the way).
Anyway, to recap – the reason for the question was straightforward. Back in 2008, Objective-C (the language used for developing for the Mac) had close to zero adoption among developers. Java, on the other hand, had many millions of developers. If you were going to release a phone SDK, Java seemed like a language you’d at least have considered supporting. Indeed, when the Android SDK was first released just a few months later, in September 2008, Java (smile, everyone) turned out to be the chosen language.
Now, in 2012, we have a definitive answer to my question. The answer is – Apple definitely did not make a mistake.
Let’s have a look at the popularity trajectories of Java and Objective-C (based on data from the TIOBE index), starting in 2007.
Driven by the huge success of two products that are genuinely revolutionizing personal computing (iPhone and iPad), Objective-C has seen massive adoption in the last few years. So much so that Objective-C and Java are now clearly on a collision course in terms of popularity among developers.
Java is still a hugely popular language. What’s happened since 2008 is that Objective-C’s rise in popularity has been meteoric. Looking at the trends, Objective-C and Java may reach parity in popularity before the end of 2013.
It’s really an amazing story.