Joshua Lieberman wrote: (01)
> There is an amusing / alarming story from the filming of Lord of the
> Rings concerning the use of a massive autonomous agent animation
> program for the larger battles. It seems the first attempts at
> defining the parameters of warrior behavior resulted in most of the
> agents fleeing the battle scene. Only after concern for personal
> safety was reduced to an appalling level could the battle animations
> actually proceed in a "realistic" fashion. (02)
This comes as no surprise. For thousands of years, soldiering
has been motivated by one or more of:
(1) protection of home and family;
(2) belief in a higher good that transcends self-preservation;
(3) lust for battle and loot, something to make life worth
living in a world where life was drab and short;
(4) belief that your army is invincible and therefore that you
are unlikely to take damage, or at least that soldiering is no
more dangerous than many other jobs;
(5) force and fear: soldiers were serfs and criminals serving
time, and while the enemy might kill you if you charge, your own
MPs will surely kill you if you run;
(6) social pressure: soldiering is a requirement of manhood, a
sacred duty, and cowardice makes you a worthless outcast. (03)
And in practice, it is the inverse of (4) -- the perception that
your army is going to lose -- that makes self-preservation a
higher priority than the other motivations involved. If you
lose, the other objectives/consequences will not be fulfilled.
And (6) regularly trumps even that -- Leonidas' 300 Spartans, the
captain who goes down with his ship, the 12-year-old drummer
beating rally-round-the-flag for a routed regiment, and so on. (04)
You don't have to program in sheer recklessness, although it is
definitely a factor; you just have to model the operating value
systems correctly. (05)
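Here is a minimal sketch of that idea in Python, assuming a toy
weighted-utility model. The names (ValueWeights, decide) and the
specific weights are my own illustrative inventions, not anything
from the actual crowd-simulation software used on the films; the
point is only to show how the fight/flee outcome falls out of the
value weights rather than from a "recklessness" knob.

    import random
    from dataclasses import dataclass

    @dataclass
    class ValueWeights:
        """Relative weights an agent places on the motivations (1)-(6)."""
        self_preservation: float
        home_and_family: float     # (1)
        higher_good: float         # (2)
        battle_lust: float         # (3)
        belief_in_victory: float   # (4)
        fear_of_punishment: float  # (5)
        social_duty: float         # (6)

    def decide(weights: ValueWeights, perceived_odds: float) -> str:
        """Return 'fight' or 'flee' by comparing weighted pressures.

        perceived_odds is the agent's estimate that its side wins (0..1);
        per the point about the inverse of (4), low odds amplify the
        pull of self-preservation.
        """
        flee_pressure = weights.self_preservation * (1.0 - perceived_odds)
        fight_pressure = (
            weights.home_and_family
            + weights.higher_good
            + weights.battle_lust
            + weights.belief_in_victory * perceived_odds
            + weights.fear_of_punishment   # MPs behind you, enemy in front
            + weights.social_duty          # (6) can trump everything else
        )
        return "fight" if fight_pressure >= flee_pressure else "flee"

    if __name__ == "__main__":
        # Agents whose only strong value is staying alive mostly run,
        # reproducing the "everyone flees the battle scene" behaviour.
        timid = ValueWeights(5.0, 0.2, 0.1, 0.1, 0.3, 0.2, 0.3)
        # Agents with the full mix of motivations hold the line even
        # when the odds look bad -- no recklessness dial required.
        motivated = ValueWeights(1.0, 0.8, 0.6, 0.5, 0.7, 0.9, 1.5)
        for name, w in [("timid", timid), ("motivated", motivated)]:
            choices = [decide(w, perceived_odds=random.uniform(0.2, 0.5))
                       for _ in range(1000)]
            print(name, choices.count("flee") / len(choices), "flee rate")

In other words, you tune the operating value system, not a
"bravery" parameter, and the battle behavior follows.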
But managing conflicting directives and values is a problem for
many humans. So we shouldn't be surprised that it is a problem
for artificial intelligences. And it is a favorite topic for
study in perhaps the only part of philosophy this forum hasn't
yet talked much about -- ethics. (06)
-Ed (07)
P.S. My son observed: Players of the Lord of the Rings card game
all know that the Retreat and Cancel a Skirmish cards are among
the most effective for advancing your Fellowship. Running away
is a great tactic if you can make it work to your advantage. (08)
--
Edward J. Barkmeyer Email: edbark@xxxxxxxx
National Institute of Standards & Technology
Manufacturing Systems Integration Division
100 Bureau Drive, Stop 8263 Tel: +1 301-975-3528
Gaithersburg, MD 20899-8263 FAX: +1 301-975-4694 (09)
"The opinions expressed above do not reflect consensus of NIST,
and have not been reviewed by any Government authority." (010)