PhotonQ-Bicentennial Man | by PhOtOnQuAnTiQuE


Andrew Martin: May one, sir? Is now a good time?

'Ma'am' Martin: What? A good time for what?

Andrew Martin: Last night, Sir taught...

Sir: No, no, no, don't blame me Andrew. Just... go ahead.

Andrew Martin: Thank you, sir.

Andrew Martin: [Very fast] Two cannibals were eating a clown. One turns to the other and says "Does this taste funny to you?" How do you make a hanky dance? Put a little boogie in it! What is a brunette between two blondes? A translator! Do you know why blind people don't like to sky-dive? It scares their dogs! A man with dementia is driving on the freeway. His wife calls him on the mobile phone and says "Sweetheart, I heard there's someone driving the wrong way on the freeway." He says "One? There's hundreds!" What's silent and smells like worms? Bird farts. It must have been an engineer who designed the human body. Who else would put a waste processing plant next to a recreation area? A woman goes into a doctor's office, and the doctor says "Do you mind if I numb your breasts?" "Not at all." [makes 'motor-boating' noise] "Num-num-num-num."

Andrew Martin: [Family chuckles] One did it, sir!

Sir: Andrew, it was fine, but we might want to talk about appropriateness and, um, and timing.

Andrew Martin: It's ten-fifteen, sir.

[Family laughs hysterically] ----- From the movie Bicentennial Man.


The movie is based on the novel The Positronic Man, co-written by Isaac Asimov and Robert Silverberg, which is itself based on Asimov's original novella, "The Bicentennial Man". The plot explores issues of humanity, slavery, prejudice, maturity, intellectual freedom, conformity, sex, love, and death.




Ever heard of the Three Laws of Robotics? I learned a few more tonight:


By Asimov:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.


0. Zeroth Law: "A robot may not harm humanity, or, by inaction, allow humanity to come to harm."


4. Fourth Law, by Lyuben Dilov: "A robot must establish its identity as a robot in all cases."


4. Fourth Law, from Harry Harrison's story "The Fourth Law of Robotics" in Foundation's Friends: "A robot must reproduce. As long as such reproduction does not interfere with the First or Second or Third Law."


5. The Fifth Law, by Nikola Kesarovski: "A robot must know it is a robot."
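
Asimov's original three laws form a strict precedence hierarchy: each later law yields to the earlier ones. Purely as an illustration (the predicate arguments below are hypothetical placeholders, nothing a real robot could actually compute), that ordering might be sketched like this:

```python
# Illustrative sketch only: Asimov's Three Laws as a strict precedence
# hierarchy. The boolean predicates are hypothetical inputs, assumed to
# be decided elsewhere.

def permitted(action, *, harms_human, inaction_harms_human,
              ordered_by_human, endangers_self):
    """Return True if `action` is allowed under the Three Laws."""
    # First Law: never injure a human, or allow harm through inaction.
    if harms_human:
        return False
    if action == "do_nothing" and inaction_harms_human:
        return False
    # Second Law: obey human orders unless they conflict with the First Law
    # (any First Law conflict was already rejected above).
    if ordered_by_human:
        return True
    # Third Law: protect own existence unless that conflicts with Laws 1-2.
    return not endangers_self

# A direct order is obeyed even if it endangers the robot (Law 2 > Law 3):
print(permitted("fetch", harms_human=False, inaction_harms_human=False,
                ordered_by_human=True, endangers_self=True))   # True
# But no order can justify harming a human (Law 1 > Law 2):
print(permitted("strike", harms_human=True, inaction_harms_human=False,
                ordered_by_human=True, endangers_self=False))  # False
```

The point of the ordering is that each `return` is only reached after every higher-priority law has been checked, which is exactly why the Zeroth Law (checked before all three) changes the robots' behavior so drastically in the later stories.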


"Three Laws may decay into obsolescence: Robots use the Zeroth Law to rationalize away the First Law and robots hide themselves from human beings so that the Second Law never comes into play. Brin even portrays R. Daneel Olivaw worrying that, should robots continue to reproduce themselves, the Three Laws would become an evolutionary handicap and natural selection would sweep the Laws away — Asimov's careful foundation undone by evolutionary computation. However, the robots would not be evolving through mutation but through design since the robots would have to follow the Three Laws while designing, the prevalence of the laws is ensured." David Brin novel Foundation's Triumph Wikipedia


More cool stuff: Ethics of artificial intelligence, Roboethics

Uploaded on November 20, 2010