
"Purple air sounds like rage." Is grammatically correct and passes a wide range of linguistic filters, yet for some reason it has zero meaning. Still read it a few times, it feels like there should be some meaning there. This get's at a core problem with language. Language is based around the assumption that something meaningful is trying to be communicated so we rarely bother with much precision. You can refer to ideas as physical objects but parsing into something meaningful requires a lot of context.

Redness seems to refer to some sort of external stimulus, but both redness and red are really internal qualifications of some external stimulus. So, I can point you to a red filter that looks at color codes and compares them to FF0000. If I then run that filter on FF0000, I get an internal state of 100%. But that does not create something with emotions, and you're linking the two in that sentence, so what's wrong? I would suggest the problem is not that the code fails to produce emotion; it's your assumption that anything that understands redness must include an emotional context. Still, if you want both a test for redness and an emotional response, you could code a simple AI with competing goals that responds emotionally to increasing amounts of red in its environment.
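To make that concrete, here's a minimal sketch of the kind of thing I mean: a "red filter" that scores a hex color code against FF0000, plus a toy agent whose internal state shifts with the redness of what it observes. All the names (`redness`, `ToyAgent`, `arousal`) are my own hypothetical illustrations, not any real API:

```python
def redness(hex_code: str) -> float:
    """Return a 0.0-1.0 'redness' score: how close a color is to FF0000."""
    r = int(hex_code[0:2], 16)
    g = int(hex_code[2:4], 16)
    b = int(hex_code[4:6], 16)
    # Euclidean distance from pure red (255, 0, 0), normalized to [0, 1].
    max_dist = (3 * 255**2) ** 0.5
    dist = ((r - 255) ** 2 + g**2 + b**2) ** 0.5
    return 1.0 - dist / max_dist

class ToyAgent:
    """A toy agent whose internal 'arousal' state rises with observed redness."""
    def __init__(self):
        self.arousal = 0.0
    def observe(self, hex_code: str) -> None:
        # Exponential moving average: sustained red drives arousal up.
        self.arousal = 0.8 * self.arousal + 0.2 * redness(hex_code)

print(redness("FF0000"))  # -> 1.0, the filter's "internal state of 100%"
```

Running the filter on FF0000 gives an internal state of 1.0 exactly as described; whether that, or the agent's drifting `arousal` value, counts as "understanding redness" is of course the point under dispute.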

Granted, you could then argue that such a simplistic representation of internal state is too simplistic a response to cover the full range of emotions. But at that point you're not talking about the 'redness of red' in the abstract; you're getting into mimicking your personal emotional responses. Which could be done to some reasonable level of accuracy.

PS: I could also have completely missed what you were trying to communicate, but I hope that line of reasoning demonstrated what I was talking about. Namely, that in the context of AI you can translate abstract philosophical ideas into a meaningful context, but they stop being interesting somewhere in the process.



> "Purple air sounds like rage." Is grammatically correct and passes a wide range of linguistic filters, yet for some reason it has zero meaning. Still read it a few times, it feels like there should be some meaning there.

It does have some meaning, in that it implies that the one who wrote it has synesthesia.


The problem of qualia[1] is about the relation between knowledge of a phenomenon and experiences of that phenomenon. 'Emotion' has nothing to do with it, unless you consider the mental state that is the result of 'the perception of something blue' to be a distinct emotional state, in which case your usage of the word 'emotion' deviates from the common way[2] it is used.

I'm sorry if this seems unsubstantiated and unsatisfactory, but I'm afraid that you just do not yet understand the profoundness of the problem 'derrida' refers to. Your proposed way of attacking the problem is too naive.

[1] http://plato.stanford.edu/entries/qualia/ [2] http://plato.stanford.edu/entries/emotion/


Human emotion is a complex internal biochemical reaction. Suggesting it has some deeper meaning beyond what is actually going on is naive.

Suppose you walk into a room whose walls are covered in fresh blood. Now suppose instead you walked into an identical room, except the blood was green. What separates those two rooms is, at some level, a simple color filter, but your internal responses are going to be very different. If you could slow down time and look at the individual responses to each room, you could watch as the red or green information propagated around your individual neurons.

But at what point does 'redness' apply? The individual rods and cones independently respond to photons, but you have to go into post-processing before the concept of red is separate from white or black. If you look at the way neurons work, there is some computation involved and comparison between individual sensory neurons, but at some point a neuron fires, and that's what redness means: the internal state of one or more neurons looking at those signals.

Now you can ask about memory, but it turns out that's a recording of internal states, which is not really a copy of some RGB value but a copy of some of your internal state while the event was going on. And when it comes to language, you are communicating internal physical states. You can describe your dopamine levels in flowery language, but there are underlying physical processes which you are describing.
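The firing-unit picture above can be sketched in a few lines. This is my own toy illustration, not a biological model: a single unit compares the red signal against the green and blue signals (a crude opponent-process comparison) and "redness" is just whether it fires past a threshold.

```python
def red_neuron(r: float, g: float, b: float, threshold: float = 0.3) -> bool:
    """Fire when the red signal sufficiently exceeds the green and blue signals.

    Inputs are normalized channel intensities in [0, 1]. The unit's
    activation is red minus the average of the other channels, so it
    stays silent on white or gray, where red does not dominate.
    """
    activation = r - 0.5 * (g + b)
    return activation > threshold

print(red_neuron(1.0, 0.1, 0.1))  # red stimulus: the unit fires
print(red_neuron(1.0, 1.0, 1.0))  # white stimulus: red doesn't dominate, no firing
```

Note that the white case is exactly why raw sensor responses aren't enough: each channel is maximally stimulated, yet nothing "red" is signaled until some unit compares them.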

Having said all that, you could talk about the platonic ideal form of redness, but just because what he said sounds like it has meaning does not mean it does. He did not understand what was going on, so he built up complex ideas that don't apply in the abstract; really, he is describing your internal classification of things. When I look at this picture my internal chair classifier goes off, etc.

PS: I am of course greatly simplifying my description of how the brain works, but from a philosophical perspective the details are not very important.


"Purple air sounds like rage."

Take some acid, and that sentence might make perfect sense.


If someone like John Lennon had uttered that phrase, people would do their utmost to find some meaning in it.



