By now, iPhone apps are so ubiquitous that it’s actually hard not to find an app for a particular problem. Case in point: I had to use a calling card for a conference call, which meant entering about 30 digits (including the actual phone number), but most of these digits never change (calling card number, PIN, conference code, conference ID). In other words, there ought to be an app that dials the number(s) and enters the correct PIN at the appropriate time. Turns out there are several apps that do just that.
But the most amazing apps, in my opinion, are the photo effect apps – it’s truly mind-bending what level of creativity one can achieve with them. They add a whole new dimension to photography – even with the somewhat limited camera of the iPhone.
To take this concept further, I started modifying photos on the iPhone that I had taken with a professional DSLR, using an Eye-Fi SD memory card that transfers a just-taken picture to my iPhone via WiFi. This solution works (at times) but is clearly a hack. The bigger question for me is why camera makers have not taken advantage of smartphones’ functionality, precisely in image manipulation and distribution. It’s just crazy that the iPhone is now the most widely used camera per Flickr’s stats – while Canon, Nikon, Panasonic, Olympus and Pentax continue to sit this game out. Which brings me to the other question: what would prevent Apple from making a camera that leverages all the functionality the iPhone offers? By that I mean of course the apps, but also the compass and the GPS.
To explore that, we can look at a precedent for such a scenario – it’s not apples to apples (pun intended), but it should be good enough to illustrate how one can leverage existing functionality from other devices:
Apple’s LaserWriter (ca. 1985) was a computer in itself. It had the same CPU as the Macintosh of that time, plus its own RAM, firmware and a software renderer for PostScript. Back then, that meant it also cost more than a computer.
When Steve Jobs started NeXT, he used the same Adobe PostScript to generate the UI and called the resulting standard Display PostScript. This had the big advantage that everything you saw, you could print, because it was already PostScript. Hence, NeXT’s laser printer did not need its own brain; it was essentially just a mechanical extension of the NeXT computer, leveraging the NeXT’s CPU, RAM and Display PostScript facility to control the print cycle. The drawback was that the NeXT laser printer would only work with a NeXT – but it cost only a fraction of a full-fledged laser printer.
So what if Apple produced a semi-pro mirrorless camera that tied neatly into the iPhone’s ecosystem?
I don't think Apple would want to do that, for several reasons. For starters, it lacks the expertise (not necessarily an issue, except that Steve Jobs is no longer around and I'm not sure the new crew has the necessary vision for something like this), but more importantly, it might not fit into Apple's overall simplicity model. Having two devices that need to communicate just because one is much better at something than the other is in itself a verdict against the design. So, no - it probably won't happen. But keep in mind, Apple made cameras at some point (in the early '90s) - and it will certainly continue to invest in the cameras built into the various iDevices.
Realizing this leaves a gap for others to fill - but it also opens up a bigger playing field, to be addressed by the question of inter-device connectivity and communication. We have not one but in fact three wireless standards in phones: WiFi, Bluetooth and NFC. Of these, only WiFi has sufficient bandwidth to deal with the large amount of data that a camera and an iDevice would have to exchange, since a RAW picture can easily be 12-18 MB (depending on your camera) - but WiFi is also very power hungry and would certainly make a dent in the battery life of your iDevice (or camera).
Enter the freshly announced world of WiGig. Operating in the largely unused 60 GHz band, it only transmits up to a couple of meters - in other words, its range sits between Bluetooth and NFC. What it lacks in distance, however, it makes up for in bandwidth: multiple gigabits per second. It's the Thunderbolt of wireless transmission. Further, it's very energy efficient and would address the unnecessary battery drain that my Eye-Fi card is so famous for.
Another emerging standard is Bluetooth 4.0 - currently only available in the iPhone 4S. It transmits up to 100 m, but at only about 200 kbit/s. Similarly to WiGig, though, it is optimized for mobile devices and uses very little energy. The Bluetooth consortium sees its application in standalone devices that only need a battery change every couple of years: smoke detectors, car door locks etc. But of course, it would be perfect for transmitting coordinates from the iPhone's GPS or direction changes based on the compass reading.
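A quick back-of-envelope calculation shows why the choice of radio matters so much here. The sketch below uses the 18 MB RAW figure from above; the per-link throughput numbers are my own rough assumptions about effective rates (not spec maximums), except for the ~200 kbit/s Bluetooth 4.0 figure mentioned earlier:

```python
# Back-of-envelope: how long does one 18 MB RAW file take over each link?
# Throughput values are illustrative assumptions of effective rates.

RAW_MB = 18  # a typical RAW file, per the figures above

links_mbit_s = {
    "WiFi (802.11n, effective)": 100,    # assumed effective rate
    "WiGig (60 GHz, effective)": 2000,   # "multiple gigabits"
    "Bluetooth 4.0 LE": 0.2,             # ~200 kbit/s
}

for name, mbit in links_mbit_s.items():
    seconds = RAW_MB * 8 / mbit  # MB -> Mbit, then divide by rate
    print(f"{name}: {seconds:.1f} s per RAW file")
```

Under these assumptions, WiFi needs over a second per shot, WiGig well under a tenth of a second, and Bluetooth 4.0 on the order of ten minutes - which is why Bluetooth is a fit for GPS coordinates and compass readings, but not for the pictures themselves.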
It will take another 2-3 rounds of CES until all these pieces come together with the necessary software stacks - but when they do, I can finally use my iPhone (version 6 or 7) to do heavy-duty image manipulation on the fly.
Update: I should note that since I wrote this article, Polaroid has introduced a point-and-shoot camera at CES that runs Android. One can only hope that Canon and Nikon follow that principle - but given that the Japanese camera makers were never strong in software (and still see it as something for softies), it probably remains just a hope. WiGig to the rescue, please!