Learning Sensory Motor Coordination for Grasping by a Humanoid Robot

Mark Edward Cambron and Richard Alan Peters II

Vanderbilt University School of Engineering
Station B, Box 131
Nashville, TN 37235


Abstract: This paper proposes a method for the learning of Sensory-Motor Coordination (SMC) through the teleoperation of a humanoid robot designed for human-robot interaction. It is argued that SMC in a complex environment must be learned rather than programmed. Schema theory is reviewed as a tool for the description of animal behavior at the level of functional modules and higher. SMC is shown to be necessary for the formation of schema assemblages for the control of behavior. Examples are given of four behavior-based robot control architectures that implicitly use SMC-driven schema control. It is shown that while the robots are capable of learning, they all rely, to a certain extent, on ad hoc choices of SMC by the designers. A description of the humanoid robot and the sensors it uses for reaching and grasping is given. The proposed method of learning via teleoperation is described. Sensory data acquired through grasping is presented.

Fig. 1. ISAC, Vanderbilt's humanoid robot.

I. Introduction

RESEARCHERS at the Intelligent Robotics Laboratory at Vanderbilt University have been developing a humanoid robot, ISAC, over the past several years (Fig. 1). The robot was designed expressly for research in human-robot interaction [1], [2]. ISAC's control architecture is an agent-based, hybrid deliberative-reactive system. Like a behavior-based robot, ISAC's complex behaviors result from the interaction of independent computational modules that operate asynchronously in parallel [3], [4].

To interact naturally with people in a human-centered environment, a robot must be able to coordinate sensing with action. That is, it must have Sensory-Motor Coordination (SMC). It is possible to program a certain degree of SMC into a robot prior to its deployment. But it is impossible for a programmer to anticipate every physical contingency that may arise in a robot's interactions with people. This is due to the intrinsic complexity of a human-centered environment. Only animals (including people) have SMC that permits them to work effectively in a complex natural world. If SMC in animals were well understood, that is, if the structures and functions of the systems that manifest it were known, then analogous systems could be implemented in robots.¹ SMC in animals is not completely understood, but research in it has recently advanced to the point where plausible mechanisms for it have been described. Evidence from studies in neurophysiology [5], ontogenesis [6], [7], and cognitive science [8] suggests that to interact effectively and efficiently with its environment, an animal must learn through its own experiences the reciprocal causative relationships between sensing and action that foster its success or survival (cf. Sec. II). That is, SMC must be learned, or at least refined, through an animal's direct experience with acting in the world.

Schema theory [8] can be used to describe the functional aspects of an animal's behavior without exact specification of the biological systems that support it. Schemas exist at a frame of reference higher than that of the individual computational elements (neurons in the case of animals). A schema description of the behavior of an animal is inherently modular. It provides a framework for the description of behaviors in terms of the interactions of modules that control motion, process sensory information, create and recall memories, etc. In animals, the modules may more or less directly correspond to specific networks of neurons. But this separation of function from structure affords the possibility of realizing the behavior of an animal in a robot by substituting computers and electro-mechanical devices for neuron networks and bio-mechanical subsystems. Behavior-based robots (BBR) [9], [10] are particularly amenable to this. BBRs act through the combination of basic behaviors, which are motor actions tightly coupled to sensory stimuli both external to the robot and internal (i.e., proprioceptive).

¹ Implementation is possible if the functionality of the biological systems can be reproduced in electro-mechanical systems. Schema theory suggests that it can. (cf. Sec. III.)

This paper proposes a method for the learning of sensory-motor coordination through the teleoperation of a behavior-based robot. The goal of the work is to enable a robot to learn SMC by finding the correlations between sensory events and motor control events that co-occur during task execution. The robot is guided by a human operator through repeated trials of a specific task while recording all its incoming sensory data. The motor and sensory data gathered throughout the trials will be analyzed to find representative couplings between sensory stimuli and motor actions. If successful, this will not only permit the robot to perform the task autonomously, but also (with an appropriate control system) enable the robot to adapt to variations in the task or in the environment.

II. Sensory-Motor Coordination

Sensory-Motor Coordination underlies the physical behavior of an animal in response to its environment. More than a response, SMC is a feedback loop that changes both the animal and the environment. An animal's motions are caused by muscle contractions. These contractions are elicited by electrochemical signals that are generated by circuits of motor neurons. When the animal moves, it causes a relative shift in the environment. As the environment shifts, energy patterns sweep across the animal's sensory organs. Sensory organs are transducers that, in effect, transform external, spatio-temporally dynamic energy fields into electrochemical signals carried by circuits of sensory neurons internal to the animal. These sensory signals (more or less directly) modulate the signals in the original motor circuits. Learning occurs in the mapping from sensory response signal to motor control signal. Thus, an animal senses the environment and acts. The action changes the environment relative to the animal, which senses those changes and acts accordingly.

SMC is likewise needed by a sensory-guided robot. The basic behaviors of a BBR are independent units of SMC. They include what are commonly called reflex actions. When a basic behavior is enabled² and the stimuli associated with it occur, the action is performed without resort to modeling or deliberation. Basic behaviors are canonical in the sense that all actions exhibited by the robot are generated through the cooperation and competition of basic behaviors operating concurrently or in sequence. At any given point in time, some of the basic behaviors will be enabled and others suppressed depending on the task and environmental context of the robot. Since a BBR exhibits any and all its behaviors through the combination and sequencing of basic behaviors, a BBR is wholly dependent on, and to a large extent defined by, sensory-motor coordination.

Sensory-motor coordination is fundamental for another compelling reason. It forms a foundation for higher-level learning and perception. In particular, the categorization of sensory stimuli can be accomplished through SMC [11]. A mobile agent can learn the sensory patterns that correspond to an obstacle by associating stimuli with its motor responses, as when a characteristic stimulus pattern routinely accompanies the sudden inability to move. Similarly, as Pfeifer has demonstrated, an agent can learn to distinguish between objects that it can manipulate and those which it cannot [12]. If the internal sensation of a need (a drive or a goal) having been satisfied accompanies a set of actions performed in the presence of specific stimuli, those stimuli can be recognized as being beneficial to the agent (e.g., an energy source such as food). Recent experiments by Pfeifer and others have demonstrated that such SMC events can be used to learn classifications of objects and events in the environment more easily and more accurately than can traditional machine sensing strategies such as model-based vision [13], [14].

² A BBR typically has a suite of basic behaviors, not all of which are operational at the same time. Depending on the task and environmental contexts, various basic behaviors will be enabled or disabled. If a behavior is enabled (made operational) it will remain quiescent until its triggering sensory stimuli are present.
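To make concrete the notion of a basic behavior as an independent unit of SMC (an enabled unit stays quiescent until its triggering stimulus arrives), the following Python sketch is offered as an illustration only; the class, the signal names, and the threshold are invented for the example and are not part of ISAC's implementation.

# Illustrative sketch of a basic behavior as a unit of sensory-motor
# coordination: enabled or disabled by context, quiescent until triggered.
from dataclasses import dataclass
from typing import Callable, Dict, Optional

@dataclass
class BasicBehavior:
    name: str
    trigger: Callable[[Dict[str, float]], bool]               # predicate over sensory signals
    action: Callable[[Dict[str, float]], Dict[str, float]]    # maps stimuli to a motor command
    enabled: bool = False

    def step(self, stimuli: Dict[str, float]) -> Optional[Dict[str, float]]:
        """Return a motor command only if enabled and the trigger fires."""
        if self.enabled and self.trigger(stimuli):
            return self.action(stimuli)
        return None

# Hypothetical reflex: withdraw the arm when a finger touch sensor fires.
withdraw = BasicBehavior(
    name="withdraw_on_contact",
    trigger=lambda s: s.get("finger_touch", 0.0) > 0.5,
    action=lambda s: {"arm_velocity": -0.1},
)
withdraw.enabled = True
print(withdraw.step({"finger_touch": 1.0}))   # -> {'arm_velocity': -0.1}
print(withdraw.step({"finger_touch": 0.0}))   # -> None (quiescent)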

III. Schema Theory

Since the behavior of animals is mediated by their nervous systems, the understanding of their behavior from first principles requires an understanding of nervous systems. Neuroscience has provided a structural description that includes neurons (individuals and networks) and layers, columns, and modules in the brain [15]. But the function of these structures is not completely understood, and it is function more than structure that determines behavior. Functional analysis is complicated by the fact that many of the neuronal structures participate in different functions. With certain exceptions there are no discernible one-to-one mappings of low-level structure to high-level function [8].

Arbib et al. employ schema theory as "a framework for the rigorous analysis of [animal] behavior that requires no prior commitment to hypotheses on the location of each schema (unit of functional analysis) but can be linked to a structural analysis as and when it becomes appropriate" [8] (p. 33). Thus schemas are descriptions of functions that are performed by networks of neurons and the muscles and appendages that they control. Schema theory enables the top-down analysis of a complex behavior by providing a structure for logically disassembling it; that is, it facilitates the analytical decomposition of a complex behavior into sets of simpler behaviors. On the other hand, schemas also enable the bottom-up analysis of sets of co-occurring behaviors. The collective behavior of a set of simple schemas can be deduced if the framework for their competition, cooperation, and sequencing is known. This collective behavior is a higher-level schema called an assemblage.

Not only are the behaviors of animals describable by schemas, but so are the control systems of behavior-based robots. BBRs are, given their modular architectures, particularly amenable to such description. The theory of behavior-based robotics is grounded on the idea that complex behavior in an agent emerges through the competition and cooperation of simple behaviors in the context of an environment, which is precisely the idea of an assemblage in schema theory.

To the extent that function can be separated from structure, a schema representation enables a specific behavior to be performed by agents with dissimilar computational hardware. In particular, a behavior observed in an animal that can be described accurately by schemas could be implemented on an appropriately structured robot. Schemas, therefore, provide for comparative analysis of similar behaviors on dissimilar agents, be they biochemical or electro-mechanical.

Arbib et al. group schemas in two categories. Motor schemas are the control systems which can be coordinated to effect the wide variety of movement. A set of basic motor schemas is hypothesized to provide simple, prototypical patterns of movement. Perceptual schemas are those used for perceptual analysis. They embody the processes whereby the system determines whether a given domain of interaction is present in the environment. They not only serve as pattern-recognition routines but can also "provide the appropriate parameters concerning the current relationship of the organism with its environment" [8] (p. 42).

Research in the ontogenesis of animals has demonstrated that the ability to move exists prior to an animal's ability to sense its environment. Arbib et al. state that this does not, however, imply that motility is an end in itself. Rather, this "motor foundation serves to group the later development of sensory maps and sensorimotor representations in a self-directed manner" [8] (p. 10). Thus, in animals the formation of the musculo-skeletal system and the neuro-circuits for motor control precedes the development of perceptual schemas. Such a development schedule makes sense. Perceptual schemas in animals, even if passed on phylogenetically, must be tuned; sensory stimulation is required for a perceptual modality to develop. Other perceptual schemas (e.g., a semantic description of a visual object) must be learned. On the other hand, an animal must, to a certain extent, hit the ground running to survive. Motion must precede perception so that the animal can move at birth and so that the effects of its motion can be perceived and learned. Perceptual schemas must therefore be learned or tuned in concert with motion. Simultaneously, motor schemas must be tuned to enable efficient sensing. Thus, sensory-motor coordination requires the coupling of perceptual schemas and motor schemas into assemblages. Perceptual schemas provide goal and trajectory information to the motor schemas, whereas the latter provide a physical framework within which a perceptual schema can extract salient information. Arbib et al. place motor schemas and perceptual schemas at the foundation of animal function. Under the influence of the environment these schemas self-organize to control an animal's behavior.

For the designers of robots the main implication of the onset of motility prior to sensation in animals is that reflexes are not primary (see [8], Sec. 2.1.1, p. 13 ff.). Put another way, basic behaviors are not truly basic. Motion is primary; it can happen without sensing. Reflexes develop with the onset of sensing. Then sensory signals modulate the signaling of motor circuits and reflexes emerge.

IV. Schemas and SMC in Behavior-Based Robots

The following four examples of behavior-based robot control systems depend on SMC and can be described through assemblages of schemas. Each of the architectures has basic behaviors at its foundation. In each case, the basic behaviors are selected by the designer of the robot. Each of the architectures can be designed to learn and, as a result, exhibit emergent SMC. The learning, however, occurs at levels above basic behaviors.

a) Brooks' subsumption architecture: Brooks' subsumption architecture controls a robot through a collection of augmented finite state machines (AFSM) organized into layers [9]. A subsumptive robot has no central planner or controller. Each AFSM can be activated by sensory inputs and produces outputs that drive actuators or are passed to the inputs of other modules. Within subsumption, the AFSMs are motor schemas. The sensory inputs are perceptual schemas. An AFSM with well-defined sensory input implements a basic behavior. Assemblages are formed dynamically as AFSMs at one level are activated or inhibited by AFSMs at a higher level. Usually the basic behaviors in the lowest layer are preprogrammed; the sensory signals that trigger an AFSM are not learned. Learning can take place in a subsumption architecture (e.g., Brooks' robot Ghengis [16]), but generally this occurs in layers above the first.
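As a rough illustration of the layered arbitration just described (the module internals and the suppression rule below are simplified assumptions, not Brooks' actual AFSM machinery), a higher layer can subsume a lower one by overriding its output:

# Minimal subsumption-style sketch: each module maps sensors to an optional
# motor command; higher layers suppress the outputs of lower layers.
from typing import Callable, Dict, List, Optional

Command = Dict[str, float]
Module = Callable[[Dict[str, float]], Optional[Command]]

def wander(sensors: Dict[str, float]) -> Optional[Command]:
    # Layer 0: default forward motion.
    return {"forward": 0.2, "turn": 0.0}

def avoid(sensors: Dict[str, float]) -> Optional[Command]:
    # Layer 1: turn away when an obstacle is near; otherwise stay silent.
    if sensors.get("range", 1.0) < 0.3:
        return {"forward": 0.0, "turn": 0.5}
    return None

def arbitrate(layers: List[Module], sensors: Dict[str, float]) -> Command:
    """The highest layer with a non-None output suppresses those below it."""
    for module in reversed(layers):           # highest priority first
        out = module(sensors)
        if out is not None:
            return out
    return {"forward": 0.0, "turn": 0.0}      # no layer active: stop

print(arbitrate([wander, avoid], {"range": 0.2}))  # avoid suppresses wander
print(arbitrate([wander, avoid], {"range": 0.9}))  # wander's output passes through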

b) Mataric's action-oriented representations: Mataric designed, using subsumption, a mobile robot that learns to navigate an environment through the use of action-oriented representations [17]. The robot both generates and operates from an action map of the environment. While wandering in the environment and reacting to sensory input according to its basic behaviors (e.g., wall following, object avoidance, etc.) the robot generates the map by building up a directed graph. Each node of the graph contains a description of the motor state at the time of its formation and a description of the sensory data that was received as the robot performed the actions described by the motor state. Adjacent nodes in the graph correspond to adjacent areas in the environment. Once the environment has been mapped, the robot can reach a physical location by activating the corresponding node of the graph. The graph is searched (using spreading activation) back from the goal node to the node that represents the current position of the robot. The nodes along the shortest connecting path are enabled. The robot reaches the goal by moving according to the motor commands of its current node until its sensory input more closely matches the data from the next node. Then it executes the motor commands from the next node and proceeds successively from node to node until the goal is reached.

Mataric's robot learns while acting by forming a spatio-temporal sensory-motor description of the environment. The map indicates the sensory and motor status of the robot at a particular point in space at a particular time relative to the current position. Thus, the robot learns how to sequence basic behaviors from sensory input. This is undoubtedly a form of SMC, but it learns the sequencing of basic behaviors rather than the SMC that defines the basic behaviors themselves.
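The following sketch conveys the flavor of such an action map under simplifying assumptions: nodes pair a motor state with a sensory signature, and a breadth-first sweep from the goal stands in for spreading activation. The class and the example graph are invented for illustration and do not reproduce Mataric's implementation.

# Sketch of an action-oriented map: nodes pair a motor state with the sensory
# signature observed there; a breadth-first sweep from the goal (a stand-in
# for spreading activation) recovers the node sequence to execute.
from collections import deque
from typing import Dict, List

class MapNode:
    def __init__(self, motor_state: str, sensory_signature: List[float]):
        self.motor_state = motor_state
        self.sensory_signature = sensory_signature
        self.neighbors: List["MapNode"] = []

def path_to_goal(current: MapNode, goal: MapNode) -> List[MapNode]:
    """Search backward from the goal; return the node sequence to execute."""
    parent: Dict[int, MapNode] = {}
    frontier = deque([goal])
    visited = {id(goal)}
    while frontier:
        node = frontier.popleft()
        if node is current:
            break
        for nb in node.neighbors:
            if id(nb) not in visited:
                visited.add(id(nb))
                parent[id(nb)] = node
                frontier.append(nb)
    # Walk from the current node toward the goal using the recorded parents.
    path, node = [], current
    while node is not goal:
        node = parent[id(node)]
        path.append(node)
    return path

# Hypothetical three-node corridor: a -- b -- c.
a, b, c = MapNode("forward", [0.1]), MapNode("turn_left", [0.7]), MapNode("stop", [0.9])
a.neighbors, b.neighbors, c.neighbors = [b], [a, c], [b]
print([n.motor_state for n in path_to_goal(a, c)])   # -> ['turn_left', 'stop']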

c) Arkin's Motor Schema: A robot controlled by Arkin's Motor Schema³ architecture follows gradients in a vector field map of its environment [18]. Computational modules such as collision detectors and obstacle or object recognizers are perceptual schemas since they compute the vectors at points in space that serve to impel the robot. Motor schemas (in Arbib's sense) within Arkin's architecture are assemblages of motor controllers that respond individually to components of the vector field map. A motor controller generally has a fixed response to its input vector. The response is a function of the magnitude and direction of the input vector, but that function is generally preprogrammed and does not change. Any learning that occurs happens in the perceptual schemas that compute the vector field.

³ Motor Schema is the name that Arkin has given his control architecture. It makes use of both perceptual schemas and motor schemas in the sense that Arbib describes them.
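A minimal sketch of gradient following over a summed vector field is given below; the attractive and repulsive fields, gains, and step size are invented examples rather than Arkin's actual schema parameters.

# Sketch of gradient following over a summed vector field: fixed,
# preprogrammed responses to the field components computed from perception.
import numpy as np

def attract(goal: np.ndarray, pos: np.ndarray, gain: float = 1.0) -> np.ndarray:
    """Field pulling the robot toward a goal point."""
    d = goal - pos
    n = np.linalg.norm(d)
    return gain * d / n if n > 1e-9 else np.zeros(2)

def repel(obstacle: np.ndarray, pos: np.ndarray, radius: float = 1.0, gain: float = 1.0) -> np.ndarray:
    """Field pushing the robot away from an obstacle within a given radius."""
    d = pos - obstacle
    n = np.linalg.norm(d)
    if n > radius or n < 1e-9:
        return np.zeros(2)
    return gain * (radius - n) / radius * d / n

pos = np.array([0.0, 0.0])
goal = np.array([3.0, 0.0])
obstacle = np.array([1.5, 0.3])
for _ in range(50):
    v = attract(goal, pos) + repel(obstacle, pos)   # sum the field components
    pos = pos + 0.1 * v                             # follow the summed gradient
print(np.round(pos, 2))   # ends near the goal after deflecting around the obstacle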

d) Pfeifer's SMC-based categorization: Pfeifer's robots are based on an extended Braitenberg architecture, another type of BBR [19]. A number of basic behavior modules (Pfeifer calls these "reflexes") operate in parallel, receiving sensory inputs (including proprioception) and summing their outputs onto the motor controllers [12]. The response of each behavior module to its inputs is preprogrammed. The overall robot system does learn, however, as it interacts with the environment, guided by a value system. Values are, essentially, the preprogrammed reflexes and reinforcement schemes that cause the robot to seek some sensory stimuli and to avoid others. Learning occurs through the adaptive modulation of sensory signals that are fed to the behavior modules. Pfeifer defines categorization of an object as the robot's appropriate interaction with the object. Through the value-based learning scheme the robot learns how to couple sensing with actuation so that appropriate behaviors are learned for different stimulus patterns. Thus the objects that project the different stimulus patterns are classified de facto without forming an abstract model of the object. Pfeifer's robot learns about objects by finding the correlations between sensory signals and behaviors that lead to favorable results and by decoupling behaviors from stimuli when that coupling leads to unfavorable results. Thus, an appropriate linkage between sensing and action at the task level is learned by trial and error.
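The sketch below illustrates the general idea of value-based modulation under stated assumptions: the reflex responses stay fixed while a value signal adapts the weights that gate sensory input to them. The update rule, learning rate, and stimulus pattern are invented for the example and are not Pfeifer's implementation.

# Sketch of value-based learning: fixed reflex responses, adaptive modulation
# of the sensory signals that feed them, driven by a scalar value signal.
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_behaviors = 4, 2
W = np.full((n_behaviors, n_sensors), 0.5)        # modulation of sensors -> behaviors
reflex = rng.random((n_behaviors, n_sensors))     # fixed, preprogrammed responses

def step(W: np.ndarray, stimulus: np.ndarray, value: float, lr: float = 0.1) -> np.ndarray:
    """Compute behavior activations, then adapt the gating weights by value."""
    gated = W * stimulus                          # modulated sensory input, per behavior
    activation = (reflex * gated).sum(axis=1)
    # Strengthen couplings between active behaviors and present stimuli when the
    # outcome was good (value > 0); weaken them when it was bad (value < 0).
    W += lr * value * activation[:, None] * stimulus[None, :]
    np.clip(W, 0.0, 1.0, out=W)
    return activation

graspable = np.array([1.0, 0.0, 1.0, 0.0])        # hypothetical stimulus pattern
for _ in range(20):
    step(W, graspable, value=+1.0)                # interacting with it satisfied a drive
print(np.round(W, 2))   # couplings for the stimulated sensors saturate toward 1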

V. Learning Basic Behaviors

Behavior-based robots employ schemas implicitly. Their complex behaviors emerge through the interaction of a canonical set of basic behaviors, each of which is a sensory-driven motor controller. Therefore, in a BBR high-level behavior emerges from assemblages of perceptual schemas linked to motor schemas, just as in animals. In terms of schema theory, the practice of designing BBRs differs from the ontogenesis of animals. The designer of a BBR must decide, ad hoc or through trial and error, exactly which coupling of sensory data to motor controller constitutes a useful basic behavior. And the designer must decide which basic behaviors to include in the canonical set. He or she determines the perceptual to motor schema linkage at the base level and decides which of these first-order assemblages to include on the robot.⁴ In other words, the designer programs SMC into the robot at the lowest level.

⁴ In some BBRs the higher-order assemblages are also completely specified by the designer. Such robots cannot learn.

BBRs that learn, such as those described in the previous section, learn at the level above basic behaviors. They learn which behaviors to activate and which to inhibit or to suppress under various sensory conditions, or they learn an appropriate sequence of behaviors in response to sensory input, or they learn a control function that modulates the sensory signals before they reach the basic behaviors. While these robots might work well, they are still subject to the errors and oversights of their designers in programming SMC into the functional base level of the robot.

How, then, does one enable a robot to learn SMC at the level of basic behaviors? There are at least two possibilities:

1) Design and implement on the robot the fundamental motor circuits that enable actuation. Have the robot move randomly while sensing. Reinforce any sensory-motor coupling (a temporal coincidence of sensory signals and motor actions) that leads to purposeful motion. An approach such as this is necessary for a fully autonomous agent, like an animal. This approach has been used successfully by researchers in artificial life [20]. Learning SMC this way with a robot could require much time.

2) Take advantage of the fact that a robot can be teleoperated. When a person teleoperates a robot, the person's SMC causes the robot to act purposefully. If the robot records all of its sensory signals during repeated teleoperations, through signal analysis it should be able to identify the sensory-motor couplings that accompany purposeful motion.

Both of these approaches require signal analysis algorithms that will detect signal correlations or coincidences. Moreover, the detected sensory-motor couplings must be used to construct basic behavior modules. Both of these problems are open research issues.
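As a minimal illustration of the kind of coincidence detection the second possibility calls for (the windowing rule and threshold below are assumptions made for the sketch, not the analysis actually used on ISAC), one can flag the sensor channels whose values change appreciably within a short interval around a motor event:

# Sketch: flag sensory channels that change within a short window around a
# motor event, i.e. candidate sensory-motor coincidences.
import numpy as np

def coincident_channels(signals: np.ndarray, motor_event: int,
                        window: int = 5, threshold: float = 0.5) -> list:
    """signals: array of shape (n_channels, n_samples) recorded in one trial."""
    lo = max(0, motor_event - window)
    hi = min(signals.shape[1], motor_event + window + 1)
    segment = signals[:, lo:hi]
    # A channel "co-occurs" with the event if its value changes appreciably
    # inside the window.
    change = segment.max(axis=1) - segment.min(axis=1)
    return list(np.where(change > threshold)[0])

rng = np.random.default_rng(1)
signals = rng.normal(0.0, 0.05, size=(3, 100))    # three quiet channels
signals[1, 52:] += 1.0                            # channel 1 jumps just after the event
print(coincident_channels(signals, motor_event=50))   # -> [1]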

VI. The ISAC Humanoid

ISAC is an upper-torso humanoid with two 6-degree-of-freedom arms actuated by McKibben Artificial Muscles attached to the arms as antagonistic pairs, and with anthropomorphic end-effectors. It has a color, stereo, active vision system with pan, tilt, and verge that implements the 5 basic oculomotor behaviors of primates: vergence, smooth pursuit, saccade, vestibular-ocular reflex, and opto-kinetic reflex. ISAC employs sonic localization, infrared motion detection, and speech I/O.

One of the two end effectors is the PneuHand I. It has a central palm plate to which three fingers and an opposing thumb are connected [21]. Each digit consists of two phalanges and two joints, proximal and distal. Each digit is individually actuated by a pneumatic piston. The one-actuator-per-digit design was intended to decrease the mass of the hand and to simplify its control. However, as a consequence of this, each digit of the PneuHand I has only two positions, opened or closed. The primary advantage of the PneuHand I over other simple grippers is its morphology. It has a human form factor that enables it to grasp and to hold any object (within the weight limits of the arm-hand system) that has a quasi-cylindrical handle (e.g., a hammer, a telephone handset, a drinking glass, etc.). The PneuHand I can grasp with its thumb and one, two, or three fingers. But it does not have the dexterity to do things other than grasping and holding.

The other end-effector, the PneuHand II, is similar to the first but has two more actuators (in addition to the pistons), one each on the thumb and forefinger (Fig. 2). The actuator is a DC micro-motor connected to a lead screw by a universal joint. The nut moves the drive link. These actuators give the thumb and forefinger a continuous range of positions between opened and closed. In a typical grasp with the PneuHand II, the motors are used to close the thumb and forefinger on an object. Then the pistons are engaged to close the other two fingers and to add strength to the grasp of the thumb and forefinger.

Fig. 2. Vanderbilt University's PneuHand II.

ISAC's sensor suite includes the stereo camera pair, the mics for sonic localization, and the motion detector array. It also has, between the arms and the end effectors (at the wrist joints), 6-axis force-torque sensors. Two photo-electric proximity sensors are installed in the center of each end effector's palm. The sensors generate binary signals. One sensor fires when an object is within approximately 100 mm of the palm, the other fires at 50 mm. On the end of each digit of the end effectors is a touch sensor with binary output which fires on contact.

VII. Methods and Procedures

The objective of this research is to enable ISAC to learn the sensory-motor control couplings that define a canonical set of basic behaviors and to learn the sensory signals that precede and follow behavior changes during task execution. The approach is to have a person teleoperate the robot through a task a number of times while the robot records the motor control sequence and the signals from its sensors. For the experiments reported herein, ISAC's task was to find, to reach toward, and to grasp a stationary object. This was accomplished through teleoperation, wherein the teleoperator controlled the action of the robot from a keyboard while observing the robot's actions. The signals recorded are listed in Table I.

TABLE I
Signals measured during teleoperation.
S1: base joint positions and velocities
S2: shoulder joint positions and velocities
S3: elbow joint positions and velocities
S4: wrist joint positions and velocities
S5: finger positions
S6: 6-axis FT sensor output
S7: proximity sensor 1 output
S8: proximity sensor 2 output
S9: finger touch sensor 1
S10: finger touch sensor 2

The sequence of actions for a single trial was:

1) Perform an active visual analysis of the scene: Grab a stereo image pair of the scene. Locate the object through the visual attention network. Fixate on the object. Estimate the object's 3D position in the workspace. Extract edges from the imagery. Estimate the medial axis and the limbs of the object. Combine the position and orientation to estimate the pose of the object.

2) Reach toward the object: using visual servoing toward the 3D position point, with a wrist position relative to the rest of the arm chosen visually by the operator, and with the hand open.

3) Close the hand: at an arm position close to the object chosen visually by the operator.

Figures 3 and 4 show the parallel behaviors which comprise the control architecture of the teleoperated grasp agent and the sensory-motor data flow pathways from the sensing and motor control agents.

During this process ISAC records the sensory-motor data stream. An example of a typical data set is shown in Figure 5. Upon the completion of the operation, the operator judges its success on a three-point scale: 0 (unsuccessful), 1 (partially or almost successful), and 2 (successful).

Fig. 3. Architecture of the grasp agent.
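For concreteness, a single teleoperated trial of the kind just described could be organized as in the following sketch. The field names mirror Table I, but the record layout itself is an assumption made for illustration, not the actual logging code.

# Sketch of one teleoperated trial record: the Table I signal streams, the
# motor command log, and the operator's 0/1/2 success score.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

import numpy as np

@dataclass
class TrialRecord:
    signals: Dict[str, np.ndarray] = field(default_factory=dict)       # name -> samples
    motor_log: List[Tuple[float, str]] = field(default_factory=list)   # (time, command)
    score: int = 0                              # 0 unsuccessful, 1 partial, 2 successful

trial = TrialRecord()
trial.signals["S6_force_torque"] = np.zeros((6, 1500))    # e.g. 6 axes x 1500 samples
trial.signals["S7_proximity_100mm"] = np.zeros(1500)
trial.motor_log.append((3.2, "begin_visual_servo"))
trial.motor_log.append((9.8, "close_hand"))
trial.score = 2
print(len(trial.motor_log), trial.score)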

VIII. Learning SMC

The motor control sequence within each trial will be used to determine the motor events, the times of transition between continuous motor operation states. The motor events from a trial will be used to partition all the sensory signals within that trial. Since the same task is repeated by the same operator several times, there should be the same number of motor events in each trial, although the time between them will vary. After all the trials are completed, the signals will be time-warped to align the motor events across trials. Then, in a time interval bracketing the motor event, the signals from a single sensor will be correlated across all trials to determine if there is a corresponding sensory event (the signal exhibits a change consistently near the motor event). Only the signals that exhibit a consistent sensory event within an interval of a motor event will be considered to be salient to that motor event and analyzed further. (A signal that is constant or that changes inconsistently near a motor event across multiple trials of the same task is presumed to be superfluous to the SMC of that event.) Through averaging (or some nonlinear combination such as median filtering) a characteristic signal for that sensory event at the given motor event will be formed. Then the signals from different sensors will be correlated within individual trials to determine which sensors react together near the motor events. To each motor event, the characteristic signals from the salient sensors are coupled to form a sensory-motor coordination event. An SMC event is, therefore, a motor state transition that is either preceded or followed by consistent signals in more than one sensor.

Fig. 4. Data flow.
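The following Python sketch illustrates one plausible reading of this procedure under simplifying assumptions: motor events are taken to be sample indices already extracted from the motor log, the time warping is reduced to a shift that aligns the k-th motor event of each trial, and cross-trial consistency is measured by the mean pairwise correlation of the bracketing segments. None of this is the authors' actual analysis code.

# Sketch: align windows around the k-th motor event across trials, test whether
# a sensor changes consistently there, and form a characteristic signal.
import numpy as np

def aligned_segments(trials, sensor, event_idx, half_width=10):
    """Collect the windows around the k-th motor event from every trial."""
    segs = []
    for t in trials:
        e = t["motor_events"][event_idx]
        segs.append(t["signals"][sensor][e - half_width: e + half_width])
    return np.vstack(segs)

def characteristic_signal(segs, min_corr=0.7):
    """Return the averaged window if the trials agree, else None."""
    n = len(segs)
    corrs = [np.corrcoef(segs[i], segs[j])[0, 1]
             for i in range(n) for j in range(i + 1, n)]
    if np.mean(corrs) < min_corr:
        return None             # inconsistent across trials: not a sensory event
    return segs.mean(axis=0)    # characteristic signal for this sensory event

# Three synthetic trials: a proximity-like channel steps up just before the event.
rng = np.random.default_rng(2)
trials = []
for k in range(3):
    e = 40 + 5 * k                                   # event time varies per trial
    sig = rng.normal(0, 0.05, 100)
    sig[e - 3:] += 1.0                               # consistent pre-event change
    trials.append({"motor_events": [e], "signals": {"proximity": sig}})

segs = aligned_segments(trials, "proximity", event_idx=0)
print(characteristic_signal(segs) is not None)       # -> True: salient to this event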

IX. Resultant Data

Figure 6 is a composite graph of the sensor data in the 3.5 seconds before the visual servoing (VS) algorithm was complete. The constant signals are the thumb and index finger position and the fingertip touch sensors, since the hand is open and not in contact with anything. The proximity sensors start firing during the final 2 seconds. Moreover, dx, dy, and dz show that the hand motion is slowing down as the hand approaches the target location. Therefore the stability of position and the firing of both proximity sensors are a strong indication that the hand has completed the servoing behavior and is in position to begin grasping.

Figure 7 depicts the signals during the grasp procedure. Here, the signals indicate the positions of the closing fingers and show when the touch sensors fire. Also, they show the hand position stabilizing. The proximity sensors continue to fire off and on, essentially at random, due apparently to noise.

Fig. 5. Typical sensor and motor data.
Fig. 6. Composite sensor data prior to end of VS.
Fig. 7. Composite sensor data prior to end of grasp.

X. Conclusions and Future Work

At the time of this writing, the first experiments in SMC data gathering during teleoperation had been performed. We found that the teleoperation procedure is repeatable in the way needed for the analysis: having the same number of motor events yet having sufficient variability to detect true sensory events and to average out the spurious ones. It remains to perform the analysis described in Sec. VIII.

Acknowledgments

This work has been partially funded by the DARPA Mobile Autonomous Robotics Systems (MARS) Program. For more information, please visit our web page at http://shogun.vuse.vanderbilt.edu

References

[1] Kawamura, K., R. A. Peters II, D. M. Wilkes, W. A. Alford, and T. E. Rogers, "Towards Interactive Human-Humanoid Teaming: Foundations and Achievements," IEEE Intelligent Systems, (to appear), 2000.
[2] Kawamura, K., R. A. Peters II, S. Bagchi, M. Iskarous, and M. Bishay, "Intelligent robotic systems in service of the disabled," IEEE Transactions on Rehabilitation Engineering, vol. 1, no. 3, pp. 14-21, March 1995, and Erratum, vol. 1, no. 11, November 1995.
[3] Pack, R. T., IMA: The Intelligent Machine Architecture, Ph.D. Dissertation, Vanderbilt University, May 1999.
[4] Peters, R. A. II, D. M. Wilkes, D. Gaines, and K. Kawamura, "A Software Agent Based Control System for Human-Robot Interaction," Proc. 2nd Int'l Symp. on Humanoid Robotics (HURO '99), Tokyo, Japan, October 8-9, 1999.
[5] Hamburger, V., "The developmental history of the motor neuron," Neurosciences Research Program Bulletin, no. 15, pp. 1-37, April 1977.
[6] Bushnell, E. W. and J. P. Boudreau, "Motor development and the mind: The potential role of motor abilities as a determinant of aspects of perceptual development," Child Development, vol. 64, pp. 1005-1021, 1993.
[7] Thelen, E. and L. B. Smith, A Dynamic Systems Approach to the Development of Cognition and Action, MIT Press, Cambridge, MA, 1994.
[8] Arbib, M. A., P. Érdi, and J. Szentágothai, Neural Organization: Structure, Function, and Dynamics, MIT Press (Bradford), Cambridge, MA, 1998.
[9] Brooks, R. A., "A Robust Layered Control System for a Mobile Robot," IEEE Journal of Robotics and Automation, vol. 2, no. 1, pp. 14-23, March 1986; also MIT AI Memo 864, September 1985.
[10] Arkin, R. C., Behavior-Based Robotics, MIT Press, Cambridge, MA, 1998.
[11] Scheier, C. and R. Pfeifer, "Classification as sensory-motor coordination," Proc. 3rd European Conf. on Artificial Life, pp. 656-667, 1995.
[12] Pfeifer, R. and C. Scheier, "Sensory-motor coordination: the metaphor and beyond," Robotics and Autonomous Systems, Special Issue on Practice and Future of Autonomous Agents, vol. 20, no. 2-4, pp. 157-178, 1997.
[13] Lambrinos, D. and C. Scheier, "Building complete autonomous agents: a case study on categorization," Proceedings IROS '96, IEEE/RSJ Int'l Conf. on Intelligent Robots and Systems, November 4-8, 1996.
[14] Pfeifer, R. and C. Scheier, "Representation in natural and artificial agents: an embodied cognitive science perspective," Zeitschrift für Naturforschung, vol. 53c, pp. 480-503, 1998.
[15] Carter, R., Mapping the Mind, University of California Press, Berkeley, CA, 1998.
[16] Brooks, R. A., "A Robot That Walks; Emergent Behaviors from a Carefully Evolved Network," MIT AI Lab Memo 1091, MIT Artificial Intelligence Laboratory, Cambridge, MA, February 1989.
[17] Mataric, M. J., "Navigating with a rat brain: a neurobiologically inspired model for robot spatial representation," in From Animals to Animats I, J.-A. Meyer and S. Wilson, eds., MIT Press, 1991.
[18] Arkin, R. C., "Motor schema-based navigation for a mobile robot: An approach to programming by behavior," Proc. IEEE Conf. on Robotics and Automation, Raleigh, NC, pp. 264-271, 1987.
[19] Lambrinos, D. and C. Scheier, "Extended Braitenberg Architectures," AILab Technical Report No. 95-10, University of Zurich, 1995.
[20] Terzopoulos, D., X. Tu, and R. Grzeszczuk, "Artificial fishes: Autonomous locomotion, perception, behavior, and learning in a simulated physical world," Artificial Life, vol. 1, no. 4, pp. 327-351, December 1994.
[21] Christopher, J. L., Jr., A PneuHand for Human-Like Grasping on a Humanoid Robot, M.S. Thesis, Vanderbilt University, May 1999.
