
> Just because we rely on vision to interface with computer software doesn't mean it's optimal for AI models

This is true but AGI means "Artificial General Intelligence". Perhaps it would be even more efficient with certain interfaces, but to be general it would have to at least work with the same ones as humans.

Here are some things that I think a true AGI would need to be able to do:

* Control a general purpose robot and use vision to do housework, gardening etc.

* Be able to drive a car - equivalent interfaces to humans might be servo-motor-controlled inputs.

* Use standard computer inputs to do standard computer tasks

And this list could easily be extended.

If we have to be very specific in the choice of interfaces and tasks that we give it, it's not a general AI.

At the same time, we have to be careful about moving the goalposts too much. But current AIs are limited to what can be exchanged through a small number of interfaces (prompt with text/image/video & return text/image/video data). This is amazing, and they can sound very intelligent while doing so. But it's important not to lose sight of what they still can't do well, which is basically everything else.

Outside of this area, when you do hear of an AI doing something well (self driving, for example) it's usually a separate specialized model rather than a contribution towards AGI.



By this logic disabled people would not class as "Generally Intelligent" because they might have physical "interface" limitations.

Similarly I wouldn't be "Generally Intelligent" by this definition if you sat me at a Cyrillic or Chinese keyboard. For this reason, I see human-centric interface arguments as a red herring.

I think a better candidate definition might be about learning and adapting to new environments (learning from mistakes and predicting outcomes), assuming reasonable interface aids.


> Similarly I wouldn't be "Generally Intelligent" by this definition if you sat me at a Cyrillic or Chinese keyboard

Would you be able to be taught to use those keyboards? Then you're generally intelligent. If you could not learn, then maybe you're not generally intelligent?

Regarding disabled people, this is an interesting point. Assuming that we're talking about physical disabilities only, disabled people are capable of learning how to use any standard human inputs. It's just the physical controls that are problematic.

For an AI, the physical input is not the problem. We can just put servo motors on the car controls (steering wheel, brakes, gas) and give it a camera feed from the car. Given those inputs, can the AI learn to control the car as a generally intelligent person could, given the ability to use the same controls?


If all we needed was general intelligence, we would be hiring octopuses. Human skills, like fluency in specific languages, are implicit in our concept of AGI.


So I am a blind human. I cannot drive a car or use a camera/robot to do housework (I need my hands to see!) Am I not a general intelligence?


I replied this to another comment, but I'll put it here: your limitation is physical. You have standard human intelligence, but you're lacking a certain physical input (vision). As a generally intelligent being, you will compensate for the lack of vision by using other senses.

That's different to AIs, which we can hook up to all kinds of inputs: cameras, radar, lidar, car controls, etc. For the AI the lack of input is not the limitation. It's whether they can do anything with an arbitrary input/control, like a servo motor controlling a steering wheel, for example.

To look at it another way: if an AI can operate a robot body by vision, and we suddenly removed the vision input and replaced it with a sense of touch and hearing, would the AI be able to compensate? If it's an AGI, then it should be able to. A human can.

On the other hand, I wonder if we humans are really as "generally intelligent" as we like to think. Humans struggle to learn new languages as adults, for example (something I can personally attest to, having moved to Asia as an adult). So, really, are human beings a good standard by which to judge an AI as AGI?



