Atlas Shrugs Off Abuse — Parallax Forums


erco Posts: 20,256
edited 2016-02-24 10:55 in Robotics
But robots never forget. Sleep tight, bearded middle-aged Tech #42861.

http://techcrunch.com/2016/02/23/i-know-i-shouldnt-feel-bad-for-a-robot-but-i-do-anyway

Comments

  • Spooky!
  • When the robot revolution comes, I am running to the sporting goods store.
  • If he ran away because of abuse, I will take him in and raise him as my own.
  • Seeing videos like this reminds me how important it is for Engineers to think about how their customers might eventually use the products they invent. How many rights abuses and big lies have we caught them in during just the past year alone?

    Sigh

    I distrust AI much less than the type of people that want back doors in my phone.

    That robot is a passive teddy bear because it was programmed to be. I picture these things being programmed to enforce curfews in the next 100 years.

    So the big question is, who is going to help me build Galt? He can avoid walls so far, but does have trouble with chair legs. Who's with me?
  • Whit Posts: 4,191
    edited 2016-02-24 19:21
    Pushing this Atlas down with a stick would look and "feel" very different to us viewers, if it stood up, turned around and took the stick from the human being - and it is just a matter of programming.

    Like most everything - the morality of the robot is neutral - It is what we do with the robot that is good or evil.
  • Heater. Posts: 21,230
    Everybody get ready for ED209

    Seems this Atlas can already outperform ED209 on the stairs and getting up from the ground test.
  • Whit wrote: »
    Like most everything - the morality of the robot is neutral - It is what we do with it that is good or evil.

    We do have the three laws. :)
  • Heater. Posts: 21,230
    edited 2016-02-24 16:39
    We humans have had 10 laws for 3000 years or so. Does not seem to have prevented the endless war, slaughter and atrocities since then.

    Robots may be just another tool to carry on in that tradition, like swords, cannons, guns, dynamite, nukes and the rest.

    Ah well.
  • I question whether the people who design robots like this, clearly for future government use, are naive or corrupt.

    It has to be "they pay me so much" or "they'll use it only to help and protect us" and I don't know which it is.

  • The next robot that covets my ox is getting stoned!!
  • Heater. wrote: »
    We humans have had 10 laws for 3000 years or so. Does not seem to have prevented the endless war, slaughter and atrocities since then.

    Robots may be just another tool to carry on in that tradition, like swords, cannons, guns, dynamite, nukes and the rest.

    Ah well.

    Of course they will. But they will also be "just another tool" for saving lives, exploration, and the rest.

    But... there is a big difference between what Atlas is demonstrating and the sort of robot that the Three Rules (or Ten Rules) would apply to, namely that the robot still lacks something we would define with terms like "self-awareness", "moral compass", "sentience", etc.

    The RoboCop clip you happen to show proves the point. Yes, the ED209 can be viewed as being in violation of the Three Rules. But it was still just a tool that lacked sentience, and therefore not something that the Three Rules applied to. Its operators, on the other hand, were clearly in violation of the Ten Rules.

    It makes me think that, in the long run, even if there is a point where robots gain true sentience, most users will want something less-than-sentient. After all, who wants slaves that think for themselves!?! In that case, the Three Rules still won't apply to the robots. But the Ten Rules will still apply to the robot owners.
  • kwinn Posts: 8,697
    Publison wrote: »
    Whit wrote: »
    Like most everything - the morality of the robot is neutral - It is what we do with it that is good or evil.

    We do have the three laws. :)

    Do we? Or do we have to depend on politicians for them to be included in the programming?
  • We may be getting off track.

    I believe the upright BigDog is a great stride beyond the four-legged version. With Google's money, it should go far. Maybe put one driving a Google car? :)
  • Heater. Posts: 21,230
    edited 2016-02-24 19:07
    Seairth,

    Can you define "self-awareness", "moral compass", "sentience" ? In such a way that we can measure it and talk about it in a logical, scientific manner?

    Besides, Asimov's whole point about the 3 laws was to demonstrate how they break down. Much like the modern day laws of humans lead to contradictions and paradoxes, undecidable issues, and unintended consequences.

    Is it even possible to have a "sentient" robot if it has such rules hardwired into its very being?

    Now, in terms of the philosophical debate about "sentience", "self awareness" or "consciousness" I'm still on the ground floor. For example:

    If I stick my hand in a fire I feel something real bad. It's pain. It's an experience.

    But, if you stick your hand in the same fire. I feel nothing.

    So here is the question: how can I tell you have any of that "self awareness" or "consciousness" stuff? How do I know you feel anything?

    Well, you scream and shout, the same way I did. So I kind of infer that there is something going on with you that is like my experience.

    It's far from proof that you are self aware. Presumably a robot could do all that screaming and shouting as well. Traditionally animals did all that screaming and shouting, but we humans never ascribed any self awareness or consciousness to them.

    Bottom line is that the only self aware being I confidently know exists is me!
  • Heater. wrote: »
    We humans have had 10 laws for 3000 years or so.

    I think more like 613.

    Anywho, BD seems to revel in these videos that show abuse of their robots. I'm guessing it's part of their viral marketing approach. Had the robot been shown already prone on the ground, and getting up to full posture (demonstrating the same programming either way), I bet there wouldn't be quite so many embeds and shares.
  • Whit Posts: 4,191
    My point is that robots like this will be used for great good and great evil - we need not fear them - just each other!

    The technology is neither good nor evil. We are the "creative" ones...and not always constructive.
  • Heater. Posts: 21,230
    @Whit,

    I'm totally with you. Humans learn stuff, make new things. I'm not about to suggest we stop doing that because it may get used in bad ways.

    @Gordon,
    I think more like 613.
    I don't understand. 613 years ago was only 1403 or so. I think we have evidence enough that the Moses story is much older.

    Also I'm not sure what you mean by "abuse of their robots". The task is to create a machine that can balance and walk, and do that in the face of strange terrain and weird stuff happening. At the end of the day it's software, and they are trying to break it with odd inputs. Same as we do for every program we have ever written. In this case it involves poking at it with a stick or moving its box. Basically the same as hacking at my latest web site with some unexpected input: does it fall over or not?

    Now, I kind of agree with you. We have been seeing staged "intelligent" robot presentations since the 1950's. How much of this Boston Dynamics stuff is realistic?
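    In software terms, "trying to break it with odd inputs" is basically fuzz testing. Here is a minimal illustrative sketch of the idea; parse_command is a made-up stand-in for whatever program is under test:

```python
import random
import string

def parse_command(text):
    """Toy program under test: expects input shaped like 'NAME=VALUE'."""
    name, value = text.split("=", 1)  # raises ValueError if no '=' is present
    return name.strip(), value.strip()

def random_input(max_len=20):
    """Generate an arbitrary string of printable characters."""
    length = random.randint(0, max_len)
    return "".join(random.choice(string.printable) for _ in range(length))

random.seed(42)  # fixed seed so the run is reproducible
crashes = 0
for _ in range(1000):
    try:
        parse_command(random_input())
    except ValueError:
        crashes += 1  # the program "fell over" on this input

print(f"{crashes} of 1000 random inputs made the parser fall over")
```

    Real fuzzers are far smarter about generating inputs, but the principle is the same: throw unexpected data at the thing and see whether it stays standing.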

  • Whit Posts: 4,191
    edited 2016-02-24 20:44
    @Heater - Gordon is referring to the 613 rules that the Jewish people developed in order to fully keep the 10 Commandments.

    See https://en.wikipedia.org/wiki/613_commandments
  • Heater. Posts: 21,230
    Whit,

    Wow, after decades of hanging around with Jewish guys never have they mentioned the 613 rules.

    I think I'm in tune with some of it:

    204. Not to swear needlessly

    348. Not to tattoo the body like the idolaters

    609. To have a place outside the camp for sanitary purposes

    On the other hand:

    153. Not to eat things that swarm in the water

    Is a bit of a bugger (Oops, violation of 204) because if I had my way I would live off of prawns and such.

    How come they have been holding out on me all this time?

    Seriously, I'm not convinced you can make "good" people out of such rules. Never mind robots.

  • Publison wrote: »
    We may be getting off track.

    I believe the upright BigDog is a great stride from the four legged version. With Google's money, it should go far. Maybe put one driving a Google car? :)
    That robot will also insert a charge adapter into the car's power port and be programmed to react to anyone reaching for its power pack...
  • Whit wrote: »
    My point is that robots like this will be used for great good and great evil - we need not fear them - just each other!

    The technology is neither good nor evil. We are the "creative" ones...and not always constructive.

    Yes, yes, and yes.
    Whit wrote: »
    @Heater - Gordon is referring to the 613 rules that the Jewish people developed in order to fully keep the 10 Commandments.

    Whit, somehow I knew you'd get the reference!

    Heater, BD has typically shown humans intentionally trying to push, kick, or shove their robots to show their ability to re-balance, yet there are many ways of demonstrating this. It ALWAYS creates controversy, as it did with BigDog. Humans tend to anthropomorphize machines when they are in living-creature shapes, and there's an emotional reaction to the visuals of aggression. BD knows this, and uses it as part of their marketing. I'm not saying it's right or wrong of them to do so; it's just a remark about the power of social media. A video made to generate controversy will get shared much more.

    We already know that the US military is not interested in these robots unless they are silent; the Army has already said thanks, but no. But a completely quiet robotic soldier is going to take a lot more development, which is why you're now seeing these demonstrations in warehouse settings. I just thought the usual "now let's kick it over!" scene was a bit out of place in the rest of the video.
  • Heater. Posts: 21,230
    Gordon,

    OK. I can well believe that any company making whatever tries to exploit whatever media is available. Which nowadays is "social media".

    I did read recently that this whole idea was declared not useful to the military. It makes noise, it needs fuel, squaddies can carry all that stuff just as well or better, more reliably, with less support and cheaper.

    On the other hand, has anyone else demoed such robotics feats? The closest I have seen is the Sony one, whatever it is called.

    It's impressive, assuming not too much of it is orchestrated.

    Mind you, their headless dog machine thing running around was somehow very disturbing.

  • BD's products are all quite disturbing. But if I qualified for a job there I would move to Boston without any useless waiting.

    They have made major progress in motion control, durability, moving over different terrain, moving fast, and now even walking on two legs.

    Sure, the main goal here is to produce something to go into war. But I see no difference between working on that and working on a flight controller for a military airplane.

    It's just more fun at Boston Dynamics, I guess. Sadly there is no need for gray haired COBOL programmers down there at BD.

    @erco might have a chance there (flame throwers and such), but me? Nope.

    That 'headless dog' was a test for a transport unit, carrying tools/packs/weapons behind (or in front of?) the ground troops, so they do not need to carry it all themselves and can be more mobile in action.

    There is also a video out on YouTube where the 'headless dog' does have a head and is throwing concrete blocks quite precisely.

    As with the extremely quiet non-nuclear (small?) submarines that Germany has sold worldwide for decades, the noise problem might be solvable.

    Just think about them as ground-based drones and you might be shivering about the implications they bring, but they will be developed. Not just here in the US but generally. And clones of them sold on eBay and Alibaba, like realistic sex dolls now, just as bouncers for your club, bar, golf course or so. I fear the private use more than the military one, but basically I fear both coming to reality pretty soon.

    Sadly,

    Mike
  • erco Posts: 20,256
    Heater. wrote: »
    We humans have had 10 laws for 3000 years or so. Does not seem to have prevented the endless war, slaughter and atrocities since then.

    THAT good sir, is a meaningful and memorable quote!

  • msrobots wrote: »
    Sure the main goal here is to produce something to go into war. But I see no difference in working on that then working on a flight controller for a military airplane.

    Or not. BD has been largely funded by DARPA, and there are existing contracts to fulfill. But Google's stance on continuing with military robots has been more nebulous. I think they'd like us to believe it's a way to make money while learning how to design productive commercial robots of the future. Of course, what Google has said/hinted at may not be the same as what they really intend to do. It's a matter of wait and see.
  • Keith Young Posts: 569
    edited 2016-02-26 18:11


    Pretty funny. Though I can see this being a thing in 50-100 years.
  • erco Posts: 20,256
    When Atlas can do THIS, call me. Some incredible acrobatics in there.

  • Seeing videos like this reminds me how important it is for Engineers to think about how their customers might eventually use the products they invent. How many rights abuses and big lies have we caught them in during just the past year alone?

    Sigh

    I distrust AI much less than the type of people that want back doors in my phone.

    That robot is a passive teddy bear because it was programmed to be. I picture these things being programmed to enforce curfews in the next 100 years.

    So the big question is, who is going to help me build Galt? He can avoid walls so far, but does have trouble with chair legs. Who's with me?

    Did you mean Gort? At least I think that was his name; one of the movies that made the deepest impression on me as a kid.
  • John Galt is a key character in "Atlas Shrugged".
    Jim