Why?
Robots stir visions of ceaseless motion. The study of their movement is a key to the expressivity they project, a language bound to their unique forms and limitations. To grasp this nuanced communication, to choreograph movements both clear and compelling, engineers and artists must forge a shared understanding. Whether in solitude or synchronized groups, robotic expressivity lies in the artistry of motion, a collaboration born of both form and function.
This page collects all the instructions required to design expressive motions in the context of the tutorial prepared by the INIT Robot team (Alexandra Mercader, Ali Imran and David St-Onge). The tutorial is presented for the first time at ICRA 2024 (tutorial page).
Getting ready
The tutorial team prepared several stations preset with the software ecosystem required for all the tasks. If you wish to run it on your own computer, we also made a Docker container publicly available.
Windows
Prerequisites
- Windows Subsystem for Linux (WSL):
wsl --install
- Docker Desktop
- vcXsrv for X11-forwarding (GUI)
Running
- Launch vcXsrv with access control disabled
- Find your IP address (see the note after this list) and launch the container:
docker run -ti --rm -e DISPLAY=xxx.xxx.xxx.xxx:0.0 --env="QT_X11_NO_MITSHM=1" --net=host aimrantw/icra24_tutorial:init_robots_tutorial
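If you are unsure which address to use, one common way to find it (an assumption about your setup; any method that reveals your host's IPv4 address works) is to run ipconfig in PowerShell and read the IPv4 address of your active network adapter:
ipconfig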
Linux
Prerequisites
- Clone the repository
git clone https://git.initrobots.ca/aimran/icra24_docker.git
- Install Docker
cd icra24_docker && ./docker_install.sh
- Pull our container
sudo docker pull aimrantw/icra24_tutorial:init_robots_tutorial
- If you have an NVIDIA card and want GPU support, run:
./docker_gpu_setup.sh
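Before moving on, you can double-check that the image was pulled correctly by listing your local images (standard Docker, nothing specific to our setup):
sudo docker images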
Running
- For CPU-based docker:
./docker_run_cpu.sh
- For GPU-based docker:
./docker_gpu_run.sh
For more detailed installation instructions and easier copy-pasting, have a look at our GitLab repository README.
If you are here as part of a tutorial session, before starting, please fill out the consent form provided by the research team.
Single-arm expressive motion
The first part of the tutorial aims at the design of expressive motions for a single six-degree-of-freedom robotic arm using a custom remote controller and an editing tool, Blender.
Actions
- Pick a team and a context/scenario: healthcare assistance or emergency?
- Brainstorm and explore with the PS4 remote.
- Record your motion(s) and edit them in Blender to speed up/slow down, replicate trajectories, add preset postures, etc.
- Present your motion to the other teams and assess the expressivity of their motions.
Teleoperation
- To run our custom remote controller, you must be inside our Docker container.
- Launch our ROS infrastructure to teleoperate either a real arm or a simulated one (replace [real/sim] with real or sim accordingly):
roslaunch blender-animator gen3_lite_[real/sim]_arm_teleop.launch
- When ready, record with:
rosbag record -O /mnt/shared/creation.bag /mobile_manip/joint_states
- Then import the recording and launch Blender (a sketch of what this import does follows this list):
rosrun blender-animator bag2blender.py /mnt/shared/creation.bag
- You can play the motion directly from Blender after launching:
roslaunch blender-animator gen3_lite_[real/sim]_blender_animation.launch
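For the curious, here is a minimal sketch of what the bag-to-Blender import does. It assumes an armature named gen3_lite whose bone names match the recorded joint names and whose joints rotate about their local Z axis; all of these names and conventions are assumptions for illustration, and the actual logic lives in bag2blender.py.
import rosbag
import bpy  # Blender's Python API; this runs inside Blender's interpreter

BAG_PATH = '/mnt/shared/creation.bag'
TOPIC = '/mobile_manip/joint_states'
FPS = 24  # keyframes inserted per second of recorded time

arm = bpy.data.objects['gen3_lite']  # assumed armature name
start = None
with rosbag.Bag(BAG_PATH) as bag:
    for _, msg, t in bag.read_messages(topics=[TOPIC]):
        if start is None:
            start = t.to_sec()  # first message defines frame 1
        frame = int((t.to_sec() - start) * FPS) + 1
        for name, angle in zip(msg.name, msg.position):
            bone = arm.pose.bones.get(name)  # map joints to bones by name
            if bone is None:
                continue
            bone.rotation_mode = 'XYZ'
            bone.rotation_euler.z = angle  # joint axis assumed to be local Z
            bone.keyframe_insert('rotation_euler', frame=frame)
Once imported, the motion is ordinary keyframe data, so retiming, duplicating trajectories, or inserting preset postures works like any other Blender animation.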
To get printed instructions: 1-page PDF summary.
The original ROS repository, including all scripts for Blender usage, is publicly available: Blender animator on INIT GitLab.
After watching the other participants' motions, please fill out the survey.
Choreographic swarm
The second part of the tutorial aims at the design of expressive motions for a swarm of 5 robotic arms using a domain-specific language, Buzz.
Actions
- Pick a team and a context/scenario: party or emergency?
- Brainstorm and explore with the Buzz tutorial script.
- Present your motion to the other teams and assess the expressivity of their motions.
Buzz scripting
- To run our Buzz stack, you must be inside our Docker container.
- Launch our ROS swarm simulation infrastructure with:
roslaunch dingo_buzz gen3_lite_swarm_sim.launch
- Edit the Buzz script (a minimal example sketch follows this list) with:
gedit tutorial.bzz
- Then launch the swarm behavior:
roslaunch dingo_buzz pybuzz.launch
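As a starting point for your own edits, a minimal tutorial.bzz-style script could phase-shift a sine wave across the swarm to create a travelling-wave effect. This is an illustrative sketch only: set_joint_target is a hypothetical closure standing in for whatever joint-command primitive our pybuzz bridge actually exposes.
# Wave parameters
amp = 0.5     # amplitude of the joint oscillation (rad)
freq = 0.05   # oscillation speed (rad per control step)
phase = 1.2   # phase offset between consecutive robot ids (rad)

function init() {
  t = 0
}

function step() {
  t = t + 1
  # every robot follows the same sine wave, shifted by its id,
  # so the motion ripples across the 5 arms
  set_joint_target(1, amp * math.sin(freq * t + id * phase))
}

function destroy() {
}
Because every robot runs the same script and differs only through its id, editing this one file reshapes the whole choreography.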
To get printed instructions: 1-page PDF summary.
The Buzz virtual machine used to build and run this code is publicly available: GitHub Buzz.
After watching the other participants' motions, please fill out the survey.
Finally, at the end of the tutorial, please fill out the final survey.