Computer Senses

A place to discuss the science of computers and programs, from algorithms to computability.

Formal proofs preferred.

Moderators: phlip, Moderators General, Prelates

jewish_scientist
Posts: 1036
Joined: Fri Feb 07, 2014 3:15 pm UTC

Computer Senses

Postby jewish_scientist » Sun Aug 04, 2019 6:40 pm UTC

I was reading this article and it got me thinking about how well computers can handle normal human senses besides vision and hearing.

My guess is that they would be absolute garbage at interpreting smells. Vision relies on 5 inputs (2 for location and 3 for the color channels), while we have ~400 types of smell receptors.

I am not sure how well they could handle touch, though. Touch is based on 5 inputs (2 for location, plus temperature, "pain", and pressure). Although that is the same number as vision, I think the main problem would be making hardware that can actually detect "pain". In terms of the sense of touch, "pain" means the detection of damage to skin cells. How can you tell a computer that skin cells have been damaged when there are no skin cells to be damaged?

I think that taste would be a fairly easy one for computers to handle, however. There are 6 types of taste inputs (sweet, bitter, sour, salty, umami, and temperature). Although that is one more than vision, I think it would be easier for a computer to process taste data because there is no spatial component.

What do you think?
"You are not running off with Cow-Skull Man Dracula Skeletor!"
-Socrates

Tub
Posts: 475
Joined: Wed Jul 27, 2011 3:13 pm UTC

Re: Computer Senses

Postby Tub » Mon Aug 05, 2019 1:11 pm UTC

What makes you believe that counting inputs will tell you anything valuable about the quality of a trained machine learning algorithm?

Xanthir
My HERO!!!
Posts: 5426
Joined: Tue Feb 20, 2007 12:49 am UTC
Location: The Googleplex

Re: Computer Senses

Postby Xanthir » Mon Aug 05, 2019 4:54 pm UTC

Yeah, that's not a valid metric. Vision isn't "five inputs", it's millions of inputs, from each of our retinal cells. So are smell, taste, and touch.

The hard part of vision isn't receiving inputs, it's *interpreting* them; most visual phenomena are complex and based on many non-trivial relationships between cell inputs, ranging from things like edge detection, to object integration, to motion across "frames", etc.

Smell and taste might have their own complexities that we're not currently aware of, but at least on the surface they look to be much simpler. Combinations of smells are important to us, but we don't specially interpret positions of smell sensations in our nose, or track them carefully across time, etc. It's probably generally just "X% activation of smell-sensor A, Y% activation of smell-sensor B, that should correspond to high-level smell 'hamburger'". There are lots of things to categorize, but categorization is something computers and ML are pretty good at. (Until you throw adversarial smells at them and convince them that something high in hydrogen sulfide is actually a wonderful potpourri, killing everyone in the room.)
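To make that concrete: the "X% activation of sensor A, Y% of sensor B → 'hamburger'" idea is basically nearest-profile matching. A toy sketch in Python, with completely made-up receptor activations and smell labels:

```python
import math

# Hypothetical reference profiles: fraction of each chemoreceptor type
# firing for a known smell. Positions of individual sensors don't matter,
# only the per-type activation levels.
KNOWN_SMELLS = {
    "hamburger": [0.9, 0.1, 0.4],
    "coffee":    [0.2, 0.8, 0.3],
    "wet dog":   [0.5, 0.5, 0.9],
}

def classify(activations):
    """Return the known smell whose activation profile is closest (Euclidean)."""
    return min(KNOWN_SMELLS,
               key=lambda name: math.dist(activations, KNOWN_SMELLS[name]))

print(classify([0.85, 0.15, 0.35]))  # closest to "hamburger"
```

Real systems would use a trained classifier over ~400 receptor dimensions rather than hand-written profiles, but the shape of the problem is the same: map an activation vector to a label.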

Touch is pretty complex, but still likely simpler than vision. Probably harder to train for, tho. There's nothing particularly special about pain vs skin cells, tho. That's still nerve activation. Figuring out how to create a sensor that registers pain in all the same situations we do might be hard in the corner cases, but for all the important cases pain shouldn't be too complex.
(defun fibs (n &optional (a 1) (b 1)) (take n (unfold '+ a b)))

elasto
Posts: 3778
Joined: Mon May 10, 2010 1:53 am UTC

Re: Computer Senses

Postby elasto » Fri Aug 09, 2019 9:25 am UTC

When they said 'inputs' I think they meant something more like 'dimensions'. So they say vision has five dimensions (data can vary across 2 dimensions for position and 3 for color) but smell has hundreds of dimensions.

I think they misunderstand how, say, computer vision works though, with neural nets decomposing images into hundreds if not thousands of dimensions internally, all of which it is capable of deriving for itself. These dimensions are at least as complex as 'input dimensions' - somewhat equivalent to when clusters of biological neurons detect edges, shadows, shapes and so on.

This vid is a pretty fun exploration of one AI approach to decomposing and composing faces - watch at least the first few minutes to gain some insight.

Xanthir
My HERO!!!
Posts: 5426
Joined: Tue Feb 20, 2007 12:49 am UTC
Location: The Googleplex

Re: Computer Senses

Postby Xanthir » Fri Aug 09, 2019 6:08 pm UTC

Yeah, I took their meaning correctly, it's just, as you and I say, that vision involves millions of inputs/dimensions. Every retinal cell contributes 3 dimensions: its activation level, and its 2d position on the retina. All three of those data pieces are independent of each other and of the similar data from other retinal cells, and all three are vital to distinguish if you want to see things.

You *also* have millions of smell sensors, but their positions and individual activation levels don't matter as much, so you can probably usefully abstract them into groups based on chemoreceptor, with each group contributing one dimension, their total activation. It's like if your vision was only good for detecting the average color of your visual field - a much simpler model to work with than actual vision.
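The abstraction described above - collapsing millions of individual sensors into one total activation per chemoreceptor type - is just a group-by sum. A toy sketch in Python (sensor readings entirely invented):

```python
# Each reading is a (receptor_type, activation) pair from one physical
# sensor. Sensor positions are discarded; only per-type totals survive.
readings = [
    ("type_A", 0.5), ("type_B", 0.5), ("type_A", 0.25),
    ("type_B", 0.25), ("type_B", 0.25),
]

totals = {}
for receptor_type, activation in readings:
    totals[receptor_type] = totals.get(receptor_type, 0.0) + activation

print(totals)  # {'type_A': 0.75, 'type_B': 1.0}
```

With ~400 receptor types, that turns millions of raw inputs into a ~400-dimensional vector - the "average color of your visual field" simplification in code form.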
(defun fibs (n &optional (a 1) (b 1)) (take n (unfold '+ a b)))

Pfhorrest
Posts: 5478
Joined: Fri Oct 30, 2009 6:11 am UTC

Re: Computer Senses

Postby Pfhorrest » Sun Aug 11, 2019 5:08 pm UTC

I recall reading something years ago about someone building a computer smelling/tasting machine that output a text description of what it tasted/smelled. IIRC it concluded that a human hand stuck into its sensors was pork or ham or something.
Forrest Cameranesi, Geek of All Trades
"I am Sam. Sam I am. I do not like trolls, flames, or spam."
The Codex Quaerendae (my philosophy) - The Chronicles of Quelouva (my fiction)

jewish_scientist
Posts: 1036
Joined: Fri Feb 07, 2014 3:15 pm UTC

Re: Computer Senses

Postby jewish_scientist » Thu Aug 15, 2019 3:22 am UTC

Xanthir wrote:There's nothing particularly special about pain vs skin cells, tho. That's still nerve activation. Figuring out how to create a sensor that registers pain in all the same situations we do might be hard in the corner cases, but for all the important cases pain shouldn't be too complex.

The problem is that the nerves that detect pain fire whenever skin cells die. They make no distinction as to the cause. That means that a single dimension (thanks elasto for the correction) would have to correlate with extreme heat, cold, pressure, force, pH, etc. Having multiple sensors correspond to one dimension just sounds like a recipe for disaster.
"You are not running off with Cow-Skull Man Dracula Skeletor!"
-Socrates

Xanthir
My HERO!!!
Posts: 5426
Joined: Tue Feb 20, 2007 12:49 am UTC
Location: The Googleplex

Re: Computer Senses

Postby Xanthir » Thu Aug 15, 2019 4:09 pm UTC

I'm not sure why you think that would be difficult. Once you calibrate each sensor to output a reasonable "activation" message, treating all of them the same is trivial.
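For what it's worth, the "calibrate each sensor to output a reasonable activation" step might look something like this - map every raw reading onto a common 0-to-1 scale so downstream code treats all damage sensors alike. Ranges are invented for illustration:

```python
# Hypothetical damage thresholds per sensor: (harmless, fully damaging).
SENSOR_RANGES = {
    "thermal":  (20.0, 60.0),   # degrees C
    "pressure": (0.0, 500.0),   # kPa
    "ph":       (7.0, 1.0),     # neutral .. strongly acidic (inverted range)
}

def activation(sensor, raw):
    """Normalize a raw reading to a 0-1 'pain' activation level."""
    lo, hi = SENSOR_RANGES[sensor]
    level = (raw - lo) / (hi - lo)    # works for inverted ranges too
    return max(0.0, min(1.0, level))  # clamp to [0, 1]

print(activation("thermal", 50.0))  # 0.75
print(activation("ph", 4.0))        # 0.5
```

Once every sensor speaks the same "activation" language, combining extreme heat, pressure, pH, etc. into one pain dimension is just a max or weighted sum over these values.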
(defun fibs (n &optional (a 1) (b 1)) (take n (unfold '+ a b)))

