http://money.cnn.com/gallery/technology/innovation/2012/12/17/ibm-5-in-5-computers-senses/5.html

Ok, I am all for advancement in technology, but something about this just seems a little off to me. I am all for my computer helping me determine whether I have a cold or the flu (I'm constantly trying to figure out which one I have, or if it's just allergies…), but when they talk about my computer 'seeing' me and such, it shows how dependent on technology we as a people have become. Our kids are so engrossed with texting and facebooking and tweeting that they will text the person sitting right next to them! This just seems like we're continuing further down that pathway. Instead of having parental instincts (as a mother, I can tell when my child needs a diaper changed, or is hungry, or is teething, or just wants to be held), we'll depend on a computer to let us know our own children are hungry. Instead of feeling their forehead to see if they have a temperature, we'll hold them up to a computer to see if they need some Tylenol… I'm sure that all these advancements are well-intentioned, but are we just setting ourselves up for failure in the long run? What happens if the power goes out and we no longer have the computer to depend on? Just a thought…

 

Feel it before you buy it on your phone

Some day soon, you’ll be able to order a wedding dress on your tablet and feel the fabric and the veil just by touching the screen.

When you feel an object, your brain registers the series of vibrations on your skin as being smooth, rough, sharp, etc. Computer sensors are becoming sophisticated enough to do that too.

Within the next five years, the vibration motors within smartphones will be precise enough that they can be designed to mimic the vibrations you experience when your fingers touch a particular surface. Even though you'll just be touching glass, it will feel like you're touching whatever object is displayed on the screen.
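The idea can be sketched in a few lines of code: a surface texture is approximated as a repeating pattern of vibration pulses, where pulse length and strength vary with roughness. The texture profiles and numbers below are invented for illustration only, and the output format simply mirrors the (timings, amplitudes) arrays used by haptics APIs such as Android's `VibrationEffect.createWaveform`; this is not IBM's actual technology.

```python
# Hypothetical texture profiles: (pulse_ms, gap_ms, amplitude 0-255).
# All values are invented placeholders, not measured surface data.
TEXTURE_PROFILES = {
    "silk":   (5, 20, 40),    # short, weak pulses -> smooth feel
    "denim":  (15, 10, 120),  # medium pulses -> coarse weave
    "gravel": (30, 5, 255),   # long, strong pulses -> rough surface
}

def waveform(texture, pulses=4):
    """Build parallel timing/amplitude lists, in the style of a
    haptics API's waveform call (amplitude 0 = motor off)."""
    pulse_ms, gap_ms, amp = TEXTURE_PROFILES[texture]
    timings, amplitudes = [], []
    for _ in range(pulses):
        timings += [pulse_ms, gap_ms]
        amplitudes += [amp, 0]
    return timings, amplitudes

timings, amps = waveform("denim")
print(timings)  # [15, 10, 15, 10, 15, 10, 15, 10]
print(amps)     # [120, 0, 120, 0, 120, 0, 120, 0]
```

On a real phone these arrays would be handed to the device's vibration motor; the point is just that "feeling" a fabric on screen reduces to choosing the right pulse pattern.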

“We’re not talking about fuzzy screens,” said Bernie Meyerson, IBM’s vice president of innovation. “You’re not going to have to dry clean your Samsung.”

In some ways, computers are already simulating touch — albeit in a crude form. When you’re driving a car in a video game, the controller vibrates when the car starts to veer off the road. It may not feel precisely like a steering wheel’s vibrations when you hit gravel, but within five years, that technology is expected to become even more lifelike.

IBM’s researchers are working on just that — creating applications for the retail and healthcare sectors that use haptic, infrared or pressure-sensitive technologies to simulate touch.

——————————————————————————————

See

Today’s computers are very good at capturing and displaying images, but despite advances in image recognition software, computers are still pretty lousy at understanding what they’re “looking” at. Humans are still needed to tag friends, label photos and identify diseases.

In five years, all that will change, IBM says. Computers will be able to interpret images better than we can, analyzing colors and texture patterns and gaining insights from other visual media. They will even surpass doctors’ abilities to read medical imagery, including MRIs, CT scans, X-rays and ultrasounds.

Computers of the not-too-distant future will be able to see subtleties in images that can be invisible to the human eye. For instance, computers will be able to quickly differentiate healthy from diseased tissue on an MRI and cross-reference the image with a patient’s medical history and scientific literature to make a diagnosis.

——————————————————————————————

Hear

Imagine holding a smartphone up to your infant when she’s making a sound, and the app displaying a message: “I’m hungry.” That’s not as far off as you might think.

In five years, computers will be able to detect elements of sounds that humans can hear but aren’t able to understand. As every parent knows, the difference between normal babbling and a message that something is wrong can be extremely subtle. Computers of the near-future will not only be able to detect whether a baby is upset, they’ll be able to determine if the child is hungry, tired, hot or in pain.

By interpreting different sound pressures, vibrations and waves, computers will be able to predict when trees are about to fall, when landslides are imminent, or when cars are about to collide before humans can.

Computers are already starting to do this: In Galway Bay, Ireland, IBM researchers are capturing underwater noise levels to understand the impact that different sounds have on sea life.

——————————————————————————————

Taste

Within the next five years, a computer will help you make the perfect recipe — not too sweet, not too salty, not too crunchy, but just the way you like it.

By breaking down foods to the molecular level, computers will be able to use complex algorithms to determine what flavor combinations are the most appealing. They could then develop recipes that provide the ideal flavor and texture of food. Think of it as the Watson of Top Chef.
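One common intuition behind computational flavor pairing is that ingredients sharing many aroma compounds tend to combine well. The toy sketch below scores ingredient pairs by compound overlap; the compound lists are invented placeholders, not real chemistry data, and this is a simplified illustration rather than IBM's actual algorithm.

```python
# Hypothetical aroma-compound data -- placeholder names, not real chemistry.
FLAVOR_COMPOUNDS = {
    "strawberry": {"furaneol", "linalool", "hexanal"},
    "chocolate":  {"furaneol", "pyrazine", "vanillin"},
    "parmesan":   {"butyric_acid", "hexanal"},
}

def pairing_score(a, b):
    """Score two ingredients by how many aroma compounds they share --
    the core intuition of compound-overlap flavor pairing."""
    return len(FLAVOR_COMPOUNDS[a] & FLAVOR_COMPOUNDS[b])

# Rank all ingredient pairs by shared compounds.
pairs = [(a, b) for a in FLAVOR_COMPOUNDS for b in FLAVOR_COMPOUNDS if a < b]
for a, b in sorted(pairs, key=lambda p: pairing_score(*p), reverse=True):
    print(a, "+", b, "->", pairing_score(a, b))
```

A real system would draw on measured compound databases and model texture and nutrition as well, but the ranking step looks much like this set intersection.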

The technology could be used to help people eat better, IBM says. By making healthy foods taste better, people might crave vegetable dishes instead of sugary and fatty junk foods.

Though computers aren’t quite there yet, they are “tasting” things today. Specially designed microchips are being used in chemical and power plants to sense biohazards in the air. IBM researchers are working to adapt that technology to analyze the chemical structures in food.

——————————————————————————————

Smell

Do you think you’re coming down with a cold? In five years, you’ll be able to breathe into your smartphone to find out.

IBM researchers are developing technology to analyze odors in people’s breath that can identify ailments, including liver and kidney disorders, asthma, diabetes and epilepsy. By determining which odors and molecules in a person’s breath are associated with each disease, computers of the future will be able to make an instant analysis of problems that today could be misdiagnosed or go undetected by a doctor.

Computers will also be able to detect harmful bacteria that cause Staph infections in hospitals just by smelling the surroundings.

In a more rudimentary form, computers are smelling things now: Agricultural sensors smell soil to determine crop conditions, sensors in museums determine which gas levels are ideal to preserve paintings, and city sanitation departments use computers that can smell garbage and pollution to alert workers when conditions are getting dangerous.

OTHER BLOGS/CONTACT: 

dsjpurdy99.wordpress.com

dailyinspirationsforjpurdy99.blogspot.com

twitter.com/jpurdy99

jpurdy99@gmail.com
