Turning mobile consumers into food safety inspectors, clinical diagnosticians and more
The latest whiz-bang application for mobile users, CellScope, comes to us from researchers at the University of California, Berkeley, whose tool enables cell phone cameras to be used as fluorescent microscopes. This means that mobile devices with cameras, like the iPhone, can be adapted to collect and transmit images from blood and sputum (the mucus coughed up from the airways) to diagnose the presence of malaria parasites and tuberculosis.
While this fluorescent scope currently requires a plug-in component, we're not far from software or camera upgrades that would make this adaptation accessible to any iPhone or Android user. Similar applications could easily include the detection of E. coli or other bacteria in food. Imagine that your ground beef smells a little off. Take a snap with your cell phone, and learn whether you've got a contaminated Big Mac. How about H1N1 (a.k.a. swine flu)? Sneeze, snap a picture, and a diagnosis with links to related information, services or therapeutic products could be delivered in real time.
The possibilities these types of mobile applications afford, informing decision-making and influencing behavior at the point of consumption, are endless. Are we ready for mass access to and control over food quality testing or disease diagnostics, like the examples above? It won't matter whether we're ready; it's coming. I've written before about the demise of the PC being driven by mobile applications; this is just the latest nail in that coffin, extending mobile's reach well beyond where anyone could have predicted.
Just think about it: what won't the iPhone do in the future? Consider the components beyond basic telecommunications that enable us to send and receive audio, video and text content. The implications are staggering. Here are just a few:
- The iPhone and similar smart devices have a camera that could be used to scan bar codes to compare or validate products, as well as to share snapshots from our vacations. That camera could also be adapted for use as a spectrograph, analyzing chemical signatures to rate wines or detect gas leaks. Imagine any optical device application, in fact; we'll soon see software bringing it right to your handheld.
- The iPhone also has a microphone. Apps like Shazam identify song titles and corresponding data. How long before stress analyzers used in lie detection or other audio-based applications follow? Can you imagine, for example, saying to your car salesperson, "Are you sure that's the best price you can give me on this deal? My phone suggests you might be lying." That day may come faster than you think.
- The iPhone's accelerometer detects motion, which could be applied in ways ranging from a pedometer to a seismograph. Along with integrated location-finding technologies, proximity sensors and touch screens, these smart device features are already generating a huge range of mash-up applications that will alter the way we interact and consume.
These types of applications are being developed by advocacy and other special interest groups, as well as product marketers, all seeking to influence consumers' choices at the very moments and places where decisions get made.
Think about how and where your customers and other critical stakeholders make their decisions about you and your products, services or related issues. How will you enable access to accurate and effective information at these points of consumption or points of decision?
If you're still focused on that corporate blog or launching your Twitter feed, you're likely already the cyber equivalent of fossilized matter just awaiting further confirmation of your extinction with each quarterly earnings report. It's time to get moving and get mobile.
To read more from Jay Byrne, visit his official blog at jaybyrne.com.