“The human eye has to be one of the cruelest tricks nature ever pulled.” These were Abi Ramanan’s opening words in a talk called “The Future of the Food Chain,” presented at Singularity University’s Global Summit in San Francisco last week.
Ramanan explained that all the human eye can see is a tiny cone of light directly in front of our faces, restricted to a very narrow band of the electromagnetic spectrum.
“We can’t see around walls, we can’t see electricity or radio signals, we can’t see out of this bit,” she said, pointing to the corner of her eye. “And we have wound up with the utterly mad and often fatal delusion that if we can’t see something, it doesn’t exist.”
Ramanan is the CEO and co-founder of Impact Vision, a company that’s using hyperspectral imaging technology to provide fast, noninvasive information about foods and their quality attributes in the supply chain.
“You can think of it as a type of fingerprinting technology,” she said. “It allows you to understand what the constituents of food products are, how much of them there are, and in fact, whether they should be there at all.”
Ramanan pointed out that a third of the food produced worldwide gets thrown out, and modern agriculture is one of the biggest drivers of climate change. In addition, she said, food fraud costs the global economy another $40 billion a year.
There’s got to be an efficient way to improve these dismal statistics. Hyperspectral imaging is one promising solution.
The Old Ways
For food producers and retailers, current supply chain mechanisms are far outdated. “They’re visual, destructive, and sample-based, or they’re time-consuming lab tests,” Ramanan said. “And this is one of the causes of huge amounts of supply chain inefficiency, loss and waste.”
Some of the key measurements that happen in the food supply chain include:
- pH: A measure of freshness that’s taken via a pH meter, which consists of a probe that’s inserted into meat to get a reading. It would be too time-consuming to test every product using this method, so pH is measured for just a small sample size of total product.
- Color: Measured by visual comparison, usually by a supply chain operator, who holds up a color chart and compares it to cuts of meat. There’s a wide margin for human error here.
- Tenderness: Assessed using an instrument that cuts through a piece of meat and measures the force required, in newtons.
The New Way
A single hyperspectral image can provide information about pH, color, tenderness, lean fat analysis, and protein content. The technology combines digital imaging, or computer vision, with spectroscopy, the technique of acquiring chemical information from the light of a single pixel. A sensor processes light and measures how it’s reflected across hundreds of continuous wavelengths.
“What this means in practice is you can take an image of a food product and understand the nutritional content, the freshness levels, and how much protein, fat, or moisture it contains,” Ramanan said.
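As a rough illustration of the data involved, the sketch below treats a hyperspectral capture as a three-dimensional array (two spatial dimensions plus one spectral dimension) and reads out the reflectance spectrum of a single pixel. The 224 bands match the figure quoted later in the piece; the spatial size, the random values, and the "index" are placeholders, not Impact Vision's actual data format.

```python
import numpy as np

# Hypothetical hyperspectral "cube": two spatial dimensions, one spectral.
# Spatial size and values are made up; 224 bands is the figure from the talk.
height, width, bands = 160, 160, 224
cube = np.random.default_rng(0).random((height, width, bands))

# Spectroscopy "from the light of a single pixel": a pixel's spectrum is
# simply its vector of reflectance values across all bands.
row, col = 80, 120
pixel_spectrum = cube[row, col, :]  # shape: (224,)

# An illustrative (not real) quality index: mean reflectance over a band window.
index = pixel_spectrum[100:140].mean()
print(f"Pixel ({row}, {col}) spectrum shape: {pixel_spectrum.shape}, index: {index:.3f}")
```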
Hyperspectral imaging was developed a few decades ago by NASA for use in space. Until now, though, it hasn’t had practical applications in other industries. Thanks to the “better, faster, cheaper” trend that’s swept many of the technological components involved, that’s changing: the size and cost of sensors have decreased, as has the cost of computation, while image processing power has gone up.
Impact Vision’s system analyzes hyperspectral images of food using chemometric models and machine learning. The software then turns this data into actionable insights: having more information about the pH of meat, the ripeness of fruits, or the freshness of fish can help food companies make real-time decisions earlier, which in turn reduces waste and fraud while maximizing yield and consistency.
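The talk doesn’t specify which models Impact Vision uses, but a common chemometric approach is partial least squares (PLS) regression, which maps high-dimensional spectra onto a single chemical property such as pH. The minimal sketch below uses scikit-learn’s PLSRegression on entirely synthetic spectra and pH labels, purely as an illustration of the idea.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: one 224-band mean spectrum per meat sample,
# paired with a lab-measured pH value (all values are made up).
rng = np.random.default_rng(0)
spectra = rng.random((200, 224))
ph_values = 5.4 + 0.8 * rng.random(200)

X_train, X_test, y_train, y_test = train_test_split(
    spectra, ph_values, test_size=0.25, random_state=0
)

# PLS regression is a standard chemometric model for relating
# high-dimensional spectra to a chemical property.
model = PLSRegression(n_components=10)
model.fit(X_train, y_train)

predicted_ph = model.predict(X_test)  # shape: (n_test, 1)
print("Predicted pH of first test sample:", round(float(predicted_ph[0, 0]), 2))
```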
Ramanan mentioned that the fish industry particularly suffers from fraud, such as frozen fish being defrosted and sold as fresh, or cheaper tilapia being sold as red snapper. “Fresh fillets reflect more light, so we can detect the difference,” she said.
Another application of the technology is foreign object detection—things that aren’t supposed to be in our food, but somehow find their way there: plastic, paper clips, metals, etc. The software is able to group pixels with similar spectral profiles together to detect that there are foreign objects under the surface of, say, a batch of sugar.
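One simple way to “group pixels with similar spectral profiles together” is unsupervised clustering; the talk doesn’t say which algorithm Impact Vision actually uses, so the sketch below uses k-means on synthetic data as a stand-in. A small patch of pixels with a different spectral signature plays the role of a buried foreign object.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic "batch of sugar" cube: a nearly uniform material with a small
# patch whose spectra differ, standing in for a buried foreign object.
height, width, bands = 64, 64, 224
rng = np.random.default_rng(1)
cube = rng.normal(loc=0.8, scale=0.02, size=(height, width, bands))
cube[30:34, 30:34, :] = rng.normal(loc=0.3, scale=0.02, size=(4, 4, bands))

# Flatten to one spectrum per pixel, then group pixels by spectral similarity.
pixels = cube.reshape(-1, bands)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)

# The much smaller cluster is the spectral outlier: a candidate foreign object.
counts = np.bincount(labels)
mask = (labels == np.argmin(counts)).reshape(height, width)
print("Suspected foreign-object pixels:", int(mask.sum()))  # expect 16
```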
Most traditional machine vision systems need to be fed hundreds or thousands of images in order to ‘learn’ to classify things. Hyperspectral images are different in that they contain two data sets: spatial and spectral.
Each image is divided into about 25,000 pixels, and each pixel in turn has 224 spectral values. One image is 7,000 megabytes, so it carries an enormous amount of information. This means the system can be trained with very few images to group pixels with similar profiles together.
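A quick back-of-the-envelope calculation with the figures above shows why so few images are needed: every pixel contributes its own spectrum, so a single image already provides tens of thousands of training examples.

```python
# Figures quoted in the talk.
pixels_per_image = 25_000
bands_per_pixel = 224

spectra_per_image = pixels_per_image               # one spectrum per pixel
values_per_image = pixels_per_image * bands_per_pixel

print(f"Training spectra per image: {spectra_per_image:,}")
print(f"Spectral values per image:  {values_per_image:,}")  # 5,600,000
```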
Balance of Power
“The core of what this allows is a shift from subjective supply chain measurements to much more objective data,” Ramanan said. Retailers hold most of the power in the modern supply chain, while more vulnerable players, such as farmers and other food producers, have no way to provide objective measurements that guarantee the quality of their food. Developing tight product specifications that can be measured objectively and accurately will help change this, alongside the equally important goal of reducing waste.
For example, Impact Vision is currently working with a company that sources 30–40 million avocados a year, a quarter of which are wasted because they arrive over-ripe or under-ripe. Together they’re developing a model that classifies ripeness non-invasively, so avocados can be ripened in select bands and sent to restaurants more closely in line with their specifications.
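The ripeness model itself isn’t described in the talk, but the downstream step of sorting fruit into ripening bands could look something like the hypothetical sketch below, with purely illustrative score thresholds.

```python
# Hypothetical post-processing: bin a model's predicted ripeness score
# (0 = unripe, 1 = fully ripe) into ripening bands. Thresholds are made up.
def ripeness_band(score: float) -> str:
    if score < 0.35:
        return "hold and ripen"
    if score < 0.75:
        return "ship soon"
    return "sell immediately"

for score in (0.12, 0.48, 0.81):
    print(f"score={score:.2f} -> {ripeness_band(score)}")
```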
Ramanan concluded by reminding the audience that the food system and supply chain are highly complex, and there’s no individual technology that can solve all the issues they face. “There will need to be consumer-led awareness campaigns,” she said. “Supermarkets need to relax their superficial cosmetic standards. We as consumers also need to change our preferences, and change our attitudes.”