engagement_test Documentation
engagement_test is a collection of nodes used to test the engagement stack.
List of nodes:
- engagement_test/loopers
- engagement_test/demo
The loopers node continuously attempts to complete a selected connection event and verbalizes the success or failure of that event. For example, with directed gaze selected, the robot will say "Look here" while looking at and pointing at an object. The looper will then call the directed gaze recognition service and have the robot report the success of the connection event ("You successfully looked" or "You failed to look"). This can be done for adjacency pair, directed gaze, and mutual facial gaze. This node is used to test the recognition node; a sketch of one such iteration appears after the topic lists below.
Possible choices to recognize are:
- Adjacency Pair: The robot will say "Please respond" and will then announce the success or failure of the actor nodding/shaking his head, pointing/looking at an object, or saying anything within the timeout.
- Backchannel: The robot will speak for an extended period, and any time the actor completes a backchannel event during the speech, the robot will nod his head.
- Directed Gaze: The robot will say "Please look here" while pointing and looking at a plate. The robot will then announce the success or failure of the actor looking at the object within the timeout.
- Mutual Facial Gaze: The robot will say "Please look at me" while establishing mutual facial gaze with the actor. The robot will then announce the success or failure of the actor completing the mutual facial gaze within the timeout.
$ loopers [standard ROS args]
Subscribes to:
- "recognition/human/backchannel": [engagement::HumanBackchannel] An actor performed a backchannel event.
Publishes to:
- "control/track": [engagement::Track] Instruct the robot to track objects.
- "control/point": [engagement::Point] Instruct the robot to point at objects.
- "control/head": [engagement::Head] Instruct the robot to nod or shake his head.
- "control/say": [engagement::Say] Instruct the robot to speak a segment of text.
- "recognition/robot/adjacency_pair": [engagement::RobotAdjacencyPair] Attempt to recognize a robot initiated adjacency pair.
- "recognition/robot/directed_gaze": [engagement::RobotDirectedGaze] Attempt to recognize a robot initiated directed gaze.
- "recognition/robot/mutual_facial_gaze": [engagement::RobotMutualFacialGaze] Attempt to recognize a robot initiated mutual facial gaze.
Reads the following parameters from the parameter server:
- "looper/conf/verbosity": [int] The verbosity used for debug print statements (anything with a log level <= verbosity will be printed when in Debug mode).
- "looper/conf/output_color": [int] The color used for printing statements pertaining to the looper.
- "looper/conf/actor": [string] The actor to attempt to complete events with.
- "looper/conf/recognize": [string] The particular action to recognize (adjacency_pair, backchannel, directed_gaze, or mutual_facial_gaze).
- "looper/conf/point_to": [string] The object to point to (if using directed gaze).
This section describes the launch file used for launching this node:
- "looper.launch": Used to launch the loopers node. Contains all the parameters used for the loopers node.
$ roslaunch looper.launch
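For reference, a hypothetical sketch of what looper.launch might contain. The parameter names come from the list above; the pkg/type attributes and the values are assumptions.

    <launch>
      <!-- Parameter names are from this page; values are illustrative only. -->
      <param name="looper/conf/verbosity" value="3"/>
      <param name="looper/conf/output_color" value="2"/>
      <param name="looper/conf/actor" value="actor"/>
      <param name="looper/conf/recognize" value="directed_gaze"/>
      <param name="looper/conf/point_to" value="plate"/>
      <!-- pkg/type are assumed; adjust to the actual executable name. -->
      <node pkg="engagement_test" type="loopers" name="loopers" output="screen"/>
    </launch>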
The demo node tests the generation and realizer nodes by executing the samples provided in the samples folders of both engagement_generation and bml_realizer.
$ demo [standard ROS args]
- "generation/turn_fragment": [engagement_srvs::ExecTurnFragment] Call the generation node with the given EBML in order to execute it.
- "bml/realize": [engagement_srvs::BMLRealization] Call the realizer node with the given BML in order to execute it.
This section describes the launch file used for launching this node:
- "demo.launch": Used to launch the demo node. Contains all the parameters used for the demo node.