Information selection in a multisensory manner

I’ve been thinking about touch and
what extra it gives us humans. Do we need touch at all? If yes, and most probably we do, why is it so? My reasoning
goes like this. We do need tactile input;
it gives an extra dimension to the flow of information we receive. I am not qualified to talk about
the psychology of touch or the biochemical processes involved, but one thing I know for certain is that
looking at a photo of a family member, or viewing your lover through a computer screen, is nothing like actually
hugging them, holding hands, or even just shaking hands with somebody you respect. I could probably scribble
together a paragraph on how oxytocin
is released when we touch, but for the moment let’s just assert that we require and benefit from tactile
information. It is often easier to grab things and rearrange them physically than to do it virtually with the help
of a cursor or similar. It often proves more helpful to explain a topological concept using a 3D shape
rather than a 2D drawing.

If touch has its own specific features we take advantage of, surely all our senses can provide something
unique: some sort of information, a set of knowledge other senses fail to deliver as quickly, in as much detail, as
memorably, or to deliver at all. Take, for instance, taste. Even though a dish
straight out of the fridge might look perfectly fine, tasting it might rapidly reveal that the lunch is no longer
as wholesome as initially anticipated. This is probably where taste excels the
most and has the greatest comparative advantage with respect to the other senses.

As another example, take hearing. A few years ago I was
watching a movie with my family at Christmas in
the living room. While enjoying the film, we noticed a crackling
noise from the kitchen. It was the Advent wreath, which had
caught fire from one of the four candles. There was no way we could
have noticed it by any means other than hearing, since a partial
wall blocked our view of the Christmas decoration. So hearing scores
a point there. I could also mention the example of a fire alarm
going off during the night, when people are likely to be asleep and not
keeping an eye on lamps or similar visual warnings.

Let’s shift to smell. Imagine you are listening to music
while cooking and doing some of the washing up. If the oven were
just opposite the sink, as it is in my flat at the moment,
you would neither see nor hear anything if the dinner you are
preparing in the oven decided to burn. However, you
might be able to smell the smoke and act accordingly. At the same
time, we could probably list similar benefits for smell as I did
for touch and taste. It feels nice to sniff a pleasing
perfume, just as an odour can alert us to something unhealthy.

Finally, the grand master, vision, is back. Yet again, vision seems
to be the ultimate sense: the one we use the most, and the one we
use most efficiently. However, as described above, sight can fail
sometimes. Hence I came to the conclusion that we need to segment
information into subsets and assign each subset to the modality
through which it is best conveyed and processed. Let’s stick to the example of a
scientific graph: for instance, a Gaussian distribution of data points
with some scatter and noise. Now let us exclude the most
conventional way of interpreting graphs and diagrams, i.e. through
visual perception. Instead, I propose looking at the
remaining information modalities. I must admit it is difficult
even for me to imagine tasting or smelling a graph, but touching, hearing
and verbalising should ring a bell. Some people would ask: why
don’t we convert all the information into sounds? Some would ask:
why don’t we turn everything into tactile formats? Yet another
group of people might think: why don’t we describe the graph in words?

My view is: why not do all of that at the same time? A description could give a nice overview
of what can be seen, but nothing guarantees that the description accounts for every feature a user might want to know
about. So we should give users the opportunity to explore. The general trend of the Gaussian curve could be well
demonstrated using a tactile curve in addition to tactile axes. For outliers, however, sonification
would most certainly be more informative. Similarly, extracting text labels and feeding them to text-to-speech
should deliver information faster than braille cells, for example. Thus, splitting a coherent set
of information into subsets and processing each of these bits individually via the appropriate output
modality is an idea we might need to consider seriously when trying to overcome a sensory loss.
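The splitting idea above can be sketched in a few lines of code. This is only an illustrative sketch, not a real accessibility pipeline: the function name, the modality labels, the moving-average "trend", and the simple z-score rule for outliers are all my own assumptions, chosen to show how one data series might be divided into subsets for tactile, auditory, and spoken output.

```python
import statistics


def split_by_modality(points, labels, z_threshold=2.5):
    """Split a noisy data series into modality-specific subsets.

    Returns a dict with:
      - "tactile": a smoothed trend, to be embossed as a raised curve
      - "audio":   (index, value) outliers, to be sonified (index -> time,
                   value -> pitch)
      - "speech":  text labels, to be read aloud via text-to-speech
    All names and thresholds here are illustrative assumptions.
    """
    mean = statistics.fmean(points)
    stdev = statistics.stdev(points)

    # Outliers: points more than z_threshold standard deviations from the mean.
    outliers = [(i, y) for i, y in enumerate(points)
                if abs(y - mean) > z_threshold * stdev]

    # Crude centred moving average as the "general trend" for tactile rendering.
    window = 3
    trend = [statistics.fmean(points[max(0, i - window):i + window + 1])
             for i in range(len(points))]

    return {"tactile": trend, "audio": outliers, "speech": labels}
```

Each subset would then be handed to its own renderer: an embosser or refreshable tactile display for the trend, a sonification routine for the outliers, and a speech synthesiser for the labels.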