I have seen the future of food transparency, and it is optical. Also, it fits in your smartphone.
Imagine a scanner the size of a grain of rice, built into your phone. You go to the grocery store and point it at something you want to buy. If it’s an apple, the scanner will tell you what variety it is, how much vitamin C it has and how long it has been in cold storage. If it’s a fish, you’ll learn whether it’s really orange roughy or just tilapia being passed off as something more expensive. If it’s a muffin, the device will tell you whether there’s gluten in it.
Although you won’t be able to do it tomorrow, this isn’t some kind of distant Jetsonian vision of the future. I’ve held the rice-size scanner in my hand; it was built for only a few dollars. I’ve seen bigger, more robust versions of the scanner do the things that your smartphone will be able to do, probably during the administration of the president we’re deciding on right now.
As cutting-edge as the applications are, the technology dates to Isaac Newton, who first separated light into its constituent wavelengths with a prism back in the 1600s. Fast-forward to when you went to high school, and Mrs. Weiss (or whoever your chemistry teacher was) had you identify a mystery chemical based on the light that reflected off it.
Every substance reflects (and absorbs) light in a different way, and the graph of that reflected light is a kind of optical fingerprint. Except it’s better. Although the whorls and lines in our fingertips don’t say anything inherent about their owner (See that swirl? Doesn’t mean you’re smart.), the peaks and valleys of the optical fingerprint do. That peak there is vitamin C. That other one is sugar. This pattern means gluten.
Identifying a food and its characteristics based on the scan is a twofold job: First, you simply match the optical reading to a library of known objects; second, you read the topography of the graph to zero in on specific characteristics. The two together can tell you an awful lot about what you’re scanning. The Mrs. Weisses of the world are rejoicing.
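For the programming-inclined, that two-step job can be sketched in a few lines. Everything below is illustrative: the spectra are made-up numbers, and the "library" holds just three entries, but the shape of the logic — match the whole fingerprint first, then check an individual peak — is the one described above.

```python
# A minimal sketch of the two-step identification described above.
# All spectra here are invented illustrative numbers, not real measurements.
import math

# Step 1's library: each known food maps to a reference "optical fingerprint"
# (reflectance sampled at a handful of wavelengths).
LIBRARY = {
    "fuji apple":    [0.82, 0.61, 0.40, 0.75, 0.30],
    "orange roughy": [0.30, 0.55, 0.70, 0.20, 0.65],
    "tilapia":       [0.33, 0.50, 0.72, 0.25, 0.60],
}

def cosine_similarity(a, b):
    # How closely two fingerprints' shapes agree, on a 0-to-1 scale.
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norms

def identify(scan):
    """Step 1: match the scan against the library of known fingerprints."""
    return max(LIBRARY, key=lambda name: cosine_similarity(scan, LIBRARY[name]))

def has_peak(scan, band, threshold=0.7):
    """Step 2: read the graph's topography -- here, simply check whether a
    hypothetical marker band (say, one tied to vitamin C) stands out."""
    return scan[band] >= threshold

reading = [0.80, 0.63, 0.41, 0.73, 0.31]  # a fresh scan of a mystery item
print(identify(reading))                   # -> fuji apple
print(has_peak(reading, band=0))           # -> True
```

A real system would compare thousands of wavelengths against millions of reference scans, but the principle is the same: the closest fingerprint names the food, and the individual peaks name its contents.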
As am I. Because the implications of this technology are enormous.
I checked in with three companies now working on bringing optical scanning technology to the food supply. Two of them, TellSpec and SCiO, are working on handheld scanners designed for consumer use. The third, Target, is already starting to implement optical scanning in its supply chain.
Target, one of the nation’s largest retailers, is collaborating with MIT and business design firm Ideo in a venture called Food + Future coLab, based in Cambridge, Mass., which has the broad mission of helping consumers better understand their food. Greg Shewmaker, a Target entrepreneur in residence (yes, that’s a title!), leads the lab and took me on a tour.
Exhibit A was Brent Overcash, whose job is to investigate interesting technologies that might have a food-related application. The interesting technology he’s focused on is optical scanning, and he showed me how they’re doing it.
Scanning something is the easy part, as easy as taking a fingerprint. What’s tough is figuring out what that something is. The library the researchers need is huge. It’s not like they can take an apple, scan it and just file it under A. They have to scan Fujis and Honeycrisps and Jonagolds, and know what their scans look like when the apples are just-picked and when they’ve been sitting in a warehouse for a year. (Some of the nutrients deteriorate.) They have to know that the side of the apple that didn’t get sun will have less vitamin C than the side that did. They have to know that Grower A’s Fuji will look a little different from Grower B’s Fuji, but they’re both Fujis.