Discussion about this post

J.M. Gooding:

A big problem here is that we as humans can't even decide what it means to be conscious. Does it require sentience? Does it require free will, emotions, experience? Internal subjectivity? It wasn't that long ago that we believed cats and dogs weren't conscious in the way we defined it at the time.

I think a bigger issue is that we may not recognize consciousness in AI if it ever comes along. It might look different from our current definition of consciousness. For instance, an AI might process emotions as pattern matching. Humans do that, too, in a way, but it's still an alien concept to most humans.

And then there are the ethical implications: What do we owe our creations? Personhood? Rights? Looking at our own track record of treating humans who are different from us, I have a sneaking suspicion it's not going to go well.

KayStoner:

I think there’s a lot that they’re not telling us. In any case, people in charge of companies that make a lot of money have said a lot of things over the years designed to misdirect or confuse competitors. I tend to take these things with a grain of salt and pay attention to what I’m actually seeing right in front of me. And people seem to qualify really well as “seemingly conscious” beings ;-)
