examples.robots package
Submodules
examples.robots.icub_head module
iCub Head Controller and Camera Viewer using Wrapyfi for Communication
This script demonstrates how to control the iCub robot’s head and view its camera feed using the MiddlewareCommunicator within the Wrapyfi library. Communication follows the PUB/SUB pattern, enabling message publishing and listening between processes or machines.
- Demonstrations:
Using Image messages for camera feed transmission.
Running publishers and listeners concurrently with the yarp.RFModule.
Using Wrapyfi for creating a port listener only.
- Requirements:
Wrapyfi: Middleware communication wrapper (refer to the Wrapyfi documentation for installation instructions)
YARP, ROS, ROS 2, ZeroMQ (refer to the Wrapyfi documentation for installation instructions)
- iCub Robot and Simulator: Ensure the robot and its simulator are installed and configured.
When running in simulation mode, the iCub_SIM must be running in a standalone terminal (refer to the Wrapyfi documentation for installation instructions)
NumPy: Used for creating arrays (installed with Wrapyfi)
SciPy: For applying smoothing filters to the facial expressions (refer to https://www.scipy.org/install.html for installation instructions)
Pexpect: To control the facial expressions using RPC
Install using pip:
pip install "scipy==1.9.0" pexpect
- Run:
# For the list of keyboard controls, refer to the comments in Keyboard Controls.
- # Alternative 1: Simulation Mode
# Ensure that the iCub_SIM is running in a standalone terminal.
# The listener displays images, and coordinates are published without utilizing Wrapyfi’s utilities.
python3 icub_head.py --simulation --get_cam_feed --control_head --control_expressions
- # Alternative 2: Physical Robot
# The listener displays images, and coordinates are published without utilizing Wrapyfi’s utilities.
python3 icub_head.py --get_cam_feed --control_head --control_expressions
- Keyboard Controls:
- Head Control:
Up/Down: Control the head pitch
Right/Left: Control the head yaw
A/D: Control the head roll (right/left)
R: Reset the head to the initial position
Esc: Quit the application
- Eye Control:
W/S: Control the eye pitch (up/down)
C/Z: Control the eye yaw (right/left)
R: Reset the eye to the initial position
Esc: Quit the application
- Facial Expressions:
0: Neutral
1: Happy
2: Sad
3: Surprise
4: Fear
5: Disgust
6: Anger
7: Contempt
8: Cunning
9: Shy
Esc: Quit the application
- Camera Feed:
Esc: Quit the application
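The digit-to-expression keyboard mapping above can be sketched as a small lookup table. This is a hypothetical illustration (the names `EXPRESSION_KEYS` and `expression_for_key` are not from the script, which may organize the mapping differently):

```python
# Hypothetical lookup table mirroring the documented keyboard controls;
# the actual icub_head.py may store this mapping differently.
EXPRESSION_KEYS = {
    ord("0"): "neutral",
    ord("1"): "happy",
    ord("2"): "sad",
    ord("3"): "surprise",
    ord("4"): "fear",
    ord("5"): "disgust",
    ord("6"): "anger",
    ord("7"): "contempt",
    ord("8"): "cunning",
    ord("9"): "shy",
}


def expression_for_key(cv2_key):
    """Return the expression name for a pressed digit key, or None."""
    return EXPRESSION_KEYS.get(cv2_key)
```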
- examples.robots.icub_head.cartesian_to_spherical(xyz=None, x=None, y=None, z=None, expand_return=None)[source]
Convert Cartesian coordinates to spherical coordinates.
- Parameters:
xyz – tuple: Cartesian coordinates (x, y, z)
x – float: Cartesian coordinate x
y – float: Cartesian coordinate y
z – float: Cartesian coordinate z
expand_return – bool: Whether to return the spherical coordinates as a dictionary or not
- Returns:
tuple: Spherical coordinates (p, t, r) or dictionary: Spherical coordinates {“p”: p, “t”: t, “r”: r}
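The conversion can be sketched as below, using one common convention (p as azimuth, t as elevation, r as radius). This is an assumption about the angle convention; the actual implementation in examples.robots.icub_head may differ:

```python
import math


def cartesian_to_spherical(xyz=None, x=None, y=None, z=None, expand_return=False):
    """Convert Cartesian coordinates to spherical coordinates.

    Sketch assuming p = azimuth, t = elevation, r = radius; the real
    function's angle convention may differ.
    """
    if xyz is not None:
        x, y, z = xyz
    r = math.sqrt(x ** 2 + y ** 2 + z ** 2)
    p = math.atan2(y, x)                            # azimuth in the x-y plane
    t = math.atan2(z, math.sqrt(x ** 2 + y ** 2))   # elevation from the plane
    if expand_return:
        return {"p": p, "t": t, "r": r}
    return p, t, r
```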
- examples.robots.icub_head.mode_smoothing_filter(time_series, default, window_length=6, min_count=None)[source]
Apply a smoothing filter to the time series using the mode of the last window_length values.
- Parameters:
time_series – list: Time series to apply the smoothing filter to
default – object: Default value to return if the mode count is less than min_count
window_length – int: Length of the window to apply the smoothing filter to
min_count – int: Minimum number of values in the window to apply the smoothing filter
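The documented behaviour (mode of the last window_length values, with a fall-back to default when the mode is not frequent enough) can be sketched as follows; the real implementation's details may differ:

```python
from collections import Counter


def mode_smoothing_filter(time_series, default, window_length=6, min_count=None):
    """Return the most frequent value among the last window_length entries.

    Falls back to `default` when the window is empty or the winning value
    occurs fewer than `min_count` times. Sketch of the documented behaviour.
    """
    window = list(time_series)[-window_length:]
    if not window:
        return default
    value, count = Counter(window).most_common(1)[0]
    if min_count is not None and count < min_count:
        return default
    return value
```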
- class examples.robots.icub_head.ICub(*args: Any, **kwargs: Any)[source]
Bases:
MiddlewareCommunicator
,RFModule
ICub head controller, facial expression transmitter, and camera viewer. Head control can be achieved in two ways: 1. the control_head_gaze method, which controls the head gaze in the spherical coordinate system; 2. the control_gaze_at_plane method, which controls the head gaze in the Cartesian coordinate system. Facial expressions are controlled with the update_facial_expressions method, and the camera feed is viewed with the receive_images method.
- FACIAL_EXPRESSIONS_QUEUE_SIZE = 7
- FACIAL_EXPRESSION_SMOOTHING_WINDOW = 6
- __init__(simulation=False, headless=False, get_cam_feed=True, img_width=320, img_height=240, control_head=True, set_head_coordinates=True, head_coordinates_port='/control_interface/head_coordinates', control_eyes=True, set_eye_coordinates=True, eye_coordinates_port='/control_interface/eye_coordinates', ikingaze=False, gaze_plane_coordinates_port='/control_interface/gaze_plane_coordinates', control_expressions=False, set_facial_expressions=True, facial_expressions_port='/control_interface/facial_expressions', mware='yarp')[source]
Initialize the ICub head controller, facial expression transmitter and camera viewer.
- Parameters:
simulation – bool: Whether to run the simulation or not
headless – bool: Whether to run the headless mode or not
get_cam_feed – bool: Whether to get (listen) the camera feed or not
img_width – int: Width of the image
img_height – int: Height of the image
control_head – bool: Whether to control the head
set_head_coordinates – bool: Whether to set (publish) the head coordinates
head_coordinates_port – str: Port to receive the head coordinates for controlling the head
control_eyes – bool: Whether to control the eyes
set_eye_coordinates – bool: Whether to set (publish) the eye coordinates
eye_coordinates_port – str: Port to receive the eye coordinates for controlling the eyes
ikingaze – bool: Whether to use the iKinGazeCtrl
gaze_plane_coordinates_port – str: Port to receive the gaze plane coordinates for controlling the head/eyes
control_expressions – bool: Whether to control the facial expressions
set_facial_expressions – bool: Whether to set (publish) the facial expressions
facial_expressions_port – str: Port to receive the facial expressions for controlling the facial expressions
mware – str: Middleware to use
- MWARE = 'yarp'
- FACIAL_EXPRESSIONS_PORT = '/control_interface/facial_expressions'
- GAZE_PLANE_COORDINATES_PORT = '/control_interface/gaze_plane_coordinates'
- HEAD_COORDINATES_PORT = '/control_interface/head_coordinates'
- EYE_COORDINATES_PORT = '/control_interface/eye_coordinates'
- CAP_PROP_FRAME_WIDTH = 320
- CAP_PROP_FRAME_HEIGHT = 240
- build()[source]
Updates the default method arguments according to constructor arguments. This method is called by the module constructor. It is not necessary to call it manually.
- acquire_head_coordinates(head_coordinates_port='/control_interface/head_coordinates', cv2_key=None, _mware='yarp', **kwargs)[source]
Acquire head coordinates for controlling the iCub.
- Parameters:
head_coordinates_port – str: Port to receive head coordinates
cv2_key – int: Key pressed by the user
- Returns:
dict: Head orientation coordinates
- acquire_eye_coordinates(eye_coordinates_port='/control_interface/eye_coordinates', cv2_key=None, _mware='yarp', **kwargs)[source]
Acquire eye coordinates for controlling the iCub.
- Parameters:
eye_coordinates_port – str: Port to receive eye coordinates
cv2_key – int: Key pressed by the user
- Returns:
dict: Eye orientation coordinates
- receive_gaze_plane_coordinates(gaze_plane_coordinates_port='/control_interface/gaze_plane_coordinates', _mware='yarp', **kwargs)[source]
Receive gaze plane (normalized x,y) coordinates for controlling the iCub.
- Parameters:
gaze_plane_coordinates_port – str: Port to receive gaze plane coordinates
- Returns:
dict: Gaze plane coordinates
- wait_for_gaze(reset=True, _mware='yarp')[source]
Wait for the gaze actuation to complete.
- Parameters:
reset – bool: Whether to reset the gaze location (centre)
_mware – str: Middleware to use
- Returns:
dict: Gaze waiting log for a given time step
- reset_gaze(_mware='yarp')[source]
Reset the eyes and head to their original position.
- Parameters:
_mware – str: Middleware to use
- Returns:
dict: Gaze reset log for a given time step
- update_head_gaze_speed(pitch=10.0, roll=10.0, yaw=20.0, head=0.8, _mware='yarp', **kwargs)[source]
Control the iCub head speed.
- Parameters:
pitch – float->pitch[deg/s]: Pitch speed
roll – float->roll[deg/s]: Roll speed
yaw – float->yaw[deg/s]: Yaw speed
head – float->speed[0,1]: Neck trajectory speed in normalized units (only when using iKinGazeCtrl)
_mware – str: Middleware to use
- Returns:
dict: Head orientation speed log for a given time step
- update_eye_gaze_speed(pitch=10.0, yaw=10.0, vergence=20.0, eye=0.5, _mware='yarp', **kwargs)[source]
Control the iCub eye speed.
- Parameters:
pitch – float->pitch[deg/s]: Pitch speed
yaw – float->yaw[deg/s]: Yaw speed
vergence – float->vergence[deg/s]: Speed of vergence shift between the eyes
eye – float->speed[0,1]: Eye trajectory speed in normalized units (only when using iKinGazeCtrl)
_mware – str: Middleware to use
- Returns:
dict: Eye orientation speed log for a given time step
- control_head_gaze(pitch=0.0, roll=0.0, yaw=0.0, order='xyz', _mware='yarp', **kwargs)[source]
Control the iCub head relative to previous coordinates following the roll, pitch, yaw convention (order=xyz) (initialized at 0 looking straight ahead).
- Parameters:
pitch – float->pitch[deg]: Pitch angle
roll – float->roll[deg]: Roll angle
yaw – float->yaw[deg]: Yaw angle
order – str: Euler axis order. Only accepts xyz (roll, pitch, yaw)
_mware – str: Middleware to use
- Returns:
dict: Head orientation coordinates log for a given time step
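Since the head is controlled relative to previous coordinates (starting at 0, looking straight ahead), each call effectively offsets a stored orientation. A minimal sketch of that accumulation, with a hypothetical helper name (the real method sends the command through the middleware instead of mutating a local dict):

```python
def accumulate_head_orientation(state, pitch=0.0, roll=0.0, yaw=0.0):
    """Hypothetical illustration of 'relative to previous coordinates':
    each call offsets the stored orientation, which starts at 0 deg
    (looking straight ahead)."""
    state["pitch"] += pitch
    state["roll"] += roll
    state["yaw"] += yaw
    return dict(state)
```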
- control_eye_gaze(pitch=0.0, yaw=0.0, vergence=0.0, _mware='yarp', **kwargs)[source]
Control the iCub eyes relative to previous coordinates (initialized at 0 looking straight ahead).
- Parameters:
pitch – float->pitch[deg]: Pitch angle
yaw – float->yaw[deg]: Yaw (version) angle
vergence – float->vergence[deg]: Vergence angle between the eyes
_mware – str: Middleware to use
- Returns:
dict: Eye orientation coordinates log for a given time step
- control_gaze_at_plane(x=0.0, y=0.0, limit_x=0.3, limit_y=0.3, control_eyes=True, control_head=True, _mware='yarp', **kwargs)[source]
Gaze at specific point in a normalized plane in front of the iCub.
- Parameters:
x – float->x[-1,1]: x coordinate in the plane limited to the range of -1 (left) and 1 (right)
y – float->y[-1,1]: y coordinate in the plane limited to the range of -1 (bottom) and 1 (top)
limit_x – float->limit_x[0,1]: x coordinate limit in the plane
limit_y – float->limit_y[0,1]: y coordinate limit in the plane
control_eyes – bool: Whether to control the eyes of the robot directly
control_head – bool: Whether to control the head of the robot directly
- Returns:
dict: Gaze coordinates log for a given time step
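Given that x and y are limited to [-1, 1] and scaled by limit_x and limit_y, the per-axis mapping can be sketched with a hypothetical helper (the real method then converts the result into gaze commands via the middleware):

```python
def scale_plane_coordinate(value, limit):
    """Clamp a normalized plane coordinate to [-1, 1] and scale it by the
    axis limit. Hypothetical helper illustrating how x/limit_x and
    y/limit_y might combine; not taken from the actual implementation."""
    return max(-1.0, min(1.0, value)) * limit
```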
- acquire_facial_expressions(facial_expressions_port='/control_interface/facial_expressions', cv2_key=None, _mware='yarp', **kwargs)[source]
Acquire facial expressions from the iCub.
- Parameters:
facial_expressions_port – str: Port to acquire facial expressions from
cv2_key – int: Key to press to set the facial expression
- Returns:
dict: Facial expressions log for a given time step
- update_facial_expressions(expression, part=False, smoothing='mode', _mware='yarp', **kwargs)[source]
Control facial expressions of the iCub.
- Parameters:
expression – str or tuple(str->part, str->emotion) or list[str] or list[tuple(str->part, str->emotion)]: Expression(s) as an abbreviation or matching lookup table entry. If a list is provided, the actions are executed in sequence
part – str: Abbreviation describing the parts to control (refer to the iCub documentation): mou, eli, leb, reb, all, raw, LIGHTS
smoothing – str: Name of smoothing filter to avoid abrupt changes in emotional expressions
- Returns:
dict: Emotion log for a given time step
- receive_images(cam_world_port, cam_left_port, cam_right_port, img_width=320, img_height=240, _rgb=True)[source]
Receive images from the iCub.
- Parameters:
cam_world_port – str: Port to receive images from the world camera
cam_left_port – str: Port to receive images from the left camera
cam_right_port – str: Port to receive images from the right camera
img_width – int: Width of the image
img_height – int: Height of the image
_rgb – bool: Whether the image is RGB or not
- Returns:
Images from the iCub