In human-robot collaboration, a robot's expression of hesitancy is a critical factor shaping human coordination strategies, attention allocation, and safety-related judgments. However, designing hesitant robot motion that generalizes is challenging, because the observer's inference depends strongly on embodiment and context.
To address these challenges, we introduce and open-source a multi-modal, dancer-generated dataset of hesitant motion, focusing on specific context-embodiment pairs (i.e., a manipulator or human upper limb approaching a Jenga tower, and anthropomorphic whole-body motion in free space).
The dataset includes (i) kinesthetic teaching demonstrations on a Franka Emika Panda reaching from a fixed start configuration to a fixed target with three graded hesitancy levels (slight, significant, extreme) and (ii) synchronized RGB-D motion capture of dancers performing the same reaching behavior using their upper limb across three hesitancy levels, plus full human body sequences for extreme hesitancy.
The formats, ROS topics, and directory structure are documented below for researchers.
We provide full state logs recorded directly from the franka_states ROS topic, alongside convenient formats for downstream learning.
dataset/
├── robot/
│ ├── slight/
│ │ ├── traj_01_q.csv # Joint angles q(t)
│ │ ├── traj_01_dq.csv # Joint velocities dq(t)
│ │ ├── traj_01_tau.csv # Joint efforts/torques tau(t)
│ │ ├── traj_01_ee_p.csv # End-effector position
│ │ └── traj_01_ee_r.csv # End-effector rotation
│ ├── significant/
│ └── extreme/
└── all_trajs_q.npz # Batched NPZ for quick loading
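For working with individual demonstrations, a small loader can gather the per-trajectory CSVs into one dictionary of arrays. This is a sketch, not part of the released tooling: the `load_trajectory` helper and its `prefix` argument are our own naming, and it assumes each CSV is a plain numeric table with one row per timestep and no header row (adjust `header=` if the actual files include one).

```python
import numpy as np
import pandas as pd

def load_trajectory(prefix):
    """Load one demonstration given its file prefix, e.g. 'dataset/robot/slight/traj_01'.

    Assumes headerless numeric CSVs, one row per timestep; a hypothetical
    helper, not part of the released dataset tooling.
    """
    q = pd.read_csv(f"{prefix}_q.csv", header=None).to_numpy()       # (T, 7) joint angles
    dq = pd.read_csv(f"{prefix}_dq.csv", header=None).to_numpy()     # (T, 7) joint velocities
    tau = pd.read_csv(f"{prefix}_tau.csv", header=None).to_numpy()   # (T, 7) joint torques
    ee_p = pd.read_csv(f"{prefix}_ee_p.csv", header=None).to_numpy() # (T, 3) end-effector position
    return {"q": q, "dq": dq, "tau": tau, "ee_p": ee_p}
```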
Motion capture was recorded from two RealSense RGB-D cameras and processed with OpenPose, yielding timestamped 2D/3D keypoints with per-joint confidence scores.
frame, joint_index, X, Y, Z, confidence
0, 0, 120.5, 45.2, 500.1, 0.95
0, 1, 118.0, 60.1, 480.2, 0.88
...
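For many learning pipelines, the long-format rows above are more convenient as dense arrays. The sketch below pivots them into a `(frames, joints, 3)` position array plus a `(frames, joints)` confidence array; the `keypoints_to_array` helper is our own, and `n_joints=25` is an assumption based on the OpenPose BODY_25 model, so adjust it to the keypoint model actually used.

```python
import numpy as np
import pandas as pd

def keypoints_to_array(df, n_joints=25):
    """Pivot long-format (frame, joint_index, X, Y, Z, confidence) rows into
    dense arrays; unobserved joints stay NaN with zero confidence.

    n_joints=25 assumes OpenPose's BODY_25 model (an assumption, not stated
    in the dataset card). Returns (n_frames, n_joints, 3) and (n_frames, n_joints).
    """
    n_frames = int(df["frame"].max()) + 1
    xyz = np.full((n_frames, n_joints, 3), np.nan)
    conf = np.zeros((n_frames, n_joints))
    f = df["frame"].to_numpy(int)
    j = df["joint_index"].to_numpy(int)
    xyz[f, j] = df[["X", "Y", "Z"]].to_numpy()
    conf[f, j] = df["confidence"].to_numpy()
    return xyz, conf
```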
Keypoints are filtered with a confidence threshold of MIN_CONF = 0.30; lower-confidence measurements are treated as missing data to prevent spurious joints.
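One way to apply this rule on dense keypoint arrays is to NaN-out low-confidence joints and then fill interior gaps by linear interpolation over time. This is a sketch of one possible preprocessing step, not part of the dataset: the `mask_low_confidence` helper is our own, and the interpolation strategy is our choice (leading/trailing gaps are left as NaN).

```python
import numpy as np
import pandas as pd

MIN_CONF = 0.30  # confidence threshold from the dataset card

def mask_low_confidence(xyz, conf, min_conf=MIN_CONF):
    """Treat keypoints below the confidence threshold as missing (NaN),
    then linearly interpolate interior gaps along the time axis.

    xyz: (T, J, 3) positions, conf: (T, J) confidences. The gap-filling
    strategy here is an illustrative choice, not prescribed by the dataset.
    """
    xyz = xyz.copy()
    xyz[conf < min_conf] = np.nan          # mask low-confidence joints
    T, J, _ = xyz.shape
    flat = pd.DataFrame(xyz.reshape(T, J * 3))
    filled = flat.interpolate(axis=0)      # linear fill of interior gaps
    return filled.to_numpy().reshape(T, J, 3)
```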
The dataset is curated for immediate use with standard Python scientific tooling (NumPy, pandas).
import numpy as np
import pandas as pd

# Load the batched robot joint angles (one array per hesitancy level)
robot_data = np.load('all_trajs_q.npz')
q_slight = robot_data['slight']  # shape (N_trajectories, Timesteps, 7)

# Load human 3D keypoints and drop low-confidence detections
MIN_CONF = 0.30
human_kp = pd.read_csv('keypoints_3d.csv')
valid_kps = human_kp[human_kp['confidence'] >= MIN_CONF]