Like most of you, no doubt, I'm years into taking pictures of just about everything at this point. Not just of my family, or what I'm doing, or what I'm eating, or what I'm drinking, but literally everything. Receipts. Where I parked. What I just bought. What I'm thinking about buying. A movie I want to remember to watch. Something I need to remember to buy at the store. Everything.
I've written before about the concept of taking pictures of everything and letting algorithms sort out what we need to parse in the images, but I've been thinking about something far simpler recently.
What I really want in a mobile OS is the ability to fire up the camera, take a picture, and launch apps from there based on that picture. A good example: I've been using an app called Vivino to track the wine we're drinking and/or buying. Why on Earth do I need to load Vivino, then hit the camera button inside the Vivino app to take the photo? Shouldn't I just be able to take the photo with my camera and launch the app right from there, wrapping it around that image?
Better yet, given enough smarts, the camera should know I just took a photo of a wine label, and know that I have the Vivino app installed, so it should automatically pop up the option to load the app. Or not even load the app. Just do some basic app-like things right there, inside the camera app, powered by Vivino. A micro app, of sorts, with the camera as the springboard.
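The flow described above boils down to a simple lookup: classify the photo, see which installed apps have registered interest in that category, and surface them. Here's a minimal sketch of that dispatch logic; the registry, category names, and app names are all invented for illustration, not any real OS API:

```python
# Hypothetical registry: apps declare which photo categories they handle.
# None of these names correspond to a real OS mechanism.
APP_REGISTRY = {
    "wine_label": ["Vivino"],
    "receipt": ["Expensify"],
    "barcode": ["Amazon"],
}

def suggest_apps(photo_category, installed_apps):
    """Return the installed apps that registered for this photo category."""
    candidates = APP_REGISTRY.get(photo_category, [])
    return [app for app in candidates if app in installed_apps]

# Example: a wine-label photo on a phone that has Vivino installed.
print(suggest_apps("wine_label", {"Vivino", "Expensify"}))  # ['Vivino']
```

The interesting design question is who runs the classifier: if the OS does it once, centrally, every app gets the "micro app" treatment for free rather than each shipping its own in-camera recognition.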
I can think of at least a dozen other examples like this. This is so clearly how phone cameras should work in 2019 that I’m honestly surprised they don’t. My caveat here is that I’m an iOS user. It’s entirely possible there’s some fork/feature of Android that will let you do this.
And I know Google Lens is taking us steps closer to some of what I want from a smartphone camera. But this should clearly be a feature of iOS. And it could even serve as a new platform/app discovery mechanism within the camera.