posted on Mar 19, 2008 @ 09:32 AM
Uh oh, looks like someone didn't obey the rules!
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
This robot should be sent to the robopsychologist.
I am just joking.
Seriously though, I guess he wanted suicide without the baggage of actually topping himself (even though technically he did). At 81 years of age, I
figure it's more a case of self-euthanasia, driven by not wanting to be reliant on others, than an actual suicide stemming from psychological
problems. He was basically at the end of his life and wanted to die with dignity (and probably fame), not in a nursing home.