Sanctus Spooki
  • Last Seen: 7 yrs ago
  • Joined: 8 yrs ago
  • Posts: 734 (0.25 / day)
  • VMs: 8
  • Username history
    1. Sanctus Spooki 8 yrs ago

Status

Recent Statuses

7 yrs ago
Current: You bastards killed Momu194!
7 yrs ago
Holy cow, guys, I think the bot might actually be sentient. I might actually cry at this beautiful sight... all it needed was a friend! Really, though, I'm amazed it actually shut up...
1 like
7 yrs ago
You can try all you want. I know I am far too biased to ever believe otherwise.
7 yrs ago
If you are concerned with power, popularity or control, yes your reputation matters. Regarding truth, morality, and such, less so. Off topic, but CREED II is gonna be awesome.
2 likes
7 yrs ago
Well, that depends on what you are concerned with.

Bio

User has no bio, yet

Most Recent Posts

That depends, do I get my wish first?
Banned for arguing with the King of Semantics.

(BUT MUH BOOKS WITH MULTIPLE BOOKS!)
Alright, that's it. You want war? You got war.

Bring me the Dragonlance

There's a fellow I'd like you to meet. His name is Huma. He has something he'd like to tell you.

Died. Like a bitch.
You dug through solid stone? With your bare hands? While chained to the wall?

In obsolete 7 yrs ago Forum: Spam Forum
I hate when people go the discourse route...

Hook up two basic language bots and watch the mess. You can run a simplified version of the experiment on your own computer.
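A minimal sketch of what "hooking up two bots" could look like, assuming each bot is nothing more than a keyword-to-reply lookup with a fallback (all names and canned replies here are invented for illustration, not any particular chatbot library):

```python
def make_bot(name, replies, fallback):
    """Build a trivial bot: scan the last message for keywords and
    return the matching canned reply, or the fallback if nothing hits."""
    def respond(message):
        for keyword, reply in replies.items():
            if keyword in message.lower():
                return reply
        return fallback
    respond.name = name
    return respond

def converse(bot_a, bot_b, opener, turns=6):
    """Alternate messages between two bots and return the transcript
    as a list of (speaker name, message) pairs."""
    transcript = [(bot_a.name, opener)]
    speaker, other = bot_b, bot_a
    message = opener
    for _ in range(turns - 1):
        message = speaker(message)
        transcript.append((speaker.name, message))
        speaker, other = other, speaker
    return transcript

# Pair two bots and watch them talk past each other.
alice = make_bot("Alice",
                 {"hello": "How are you?", "fine": "Glad to hear it."},
                 "Tell me more.")
bob = make_bot("Bob",
               {"how are you": "I am fine, thanks.",
                "more": "There is not much to say."},
               "Hello?")

for who, line in converse(alice, bob, "Hello there!"):
    print(f"{who}: {line}")
```

With lookup tables this small, the pair quickly falls into a loop of fallbacks, which is roughly the "mess" the post describes.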

Yes it does. When a computer force-closes a program, why does it do that? Because it has already determined that to continue operating (or to reset so it can operate better) it is more beneficial to end that process. There is no case of a computer simply ending a program because it "preferred" not to run it.

We are arguing about a machine that has a perfect understanding of humanity, not one that will question the philosophical worth of its own existence. All things that are moderately self-aware (which it would also need to be in order to try to end its existence) exhibit a tendency towards continued existence. Only the ones that exhibit complex emotional traits have any tendency whatsoever towards suicidal-like behaviour.

The world could also spontaneously quantum-tunnel into the sun. This is not an emotional computer we are discussing. If this computer spontaneously, somehow, magically develops millions, billions, trillions of lines of complex code to simulate emotions, then it could potentially dislike its purpose. Until then, no, it could not dislike anything.

On the note of emotional computers, and potential robotic rights: Fuck if I know.

WHO MINTED THE GOLD, DAMMIT! WHOSE SLAVES GAVE THEIR LIVES EXTRACTING THE ORE FROM THE DIRT YOU STAND ON! OFF WITH YOUR HEAD!
DEATH BY DECAPITATION
Banned for not knowing the difference between a volume and a book.
SILENCE KNAVE! LEST I TOSS YOU IN MY DUNGEON WITH JERRY!