This paper describes a multimodal body-machine interface (BoMI) that enables individuals with upper-limb disabilities to operate advanced assistive technology (AT) devices, such as robotic arms. The proposed system uses a wearable and wireless body sensor network (WBSN) supporting up to six sensor nodes to measure the user's natural upper-body gestures and translate them into control commands. Natural gestures of the head and upper body, as well as muscular activity, are measured with inertial measurement units (IMUs) and surface electromyography (sEMG) using custom-designed multimodal wireless sensor nodes. An IMU sensing node is attached to a headset worn by the user; it measures 2.9 cm × 2.9 cm, has a maximum power consumption of 31 mW, and provides an angular precision of 1°. Multimodal patch sensor nodes, including both IMU and sEMG sensing modalities, are placed over the user's able body parts to measure motion and muscular activity. These nodes measure 2.5 cm × 4.0 cm and have a maximum power consumption of 11 mW. The proposed BoMI runs on a Raspberry Pi. It can adapt to several types of users through different control scenarios based on head and shoulder motion, as well as muscular activity, and provides a power autonomy of up to 24 h. JACO, a 6-DoF assistive robotic arm, is used as a testbed to evaluate the performance of the proposed BoMI. Ten able-bodied subjects performed activities of daily living (ADLs) while operating the AT device, using the Test d'Évaluation des Membres Supérieurs de Personnes Âgées (TEMPA) to evaluate and compare the proposed BoMI with the conventional joystick controller. It is shown that users can perform all tasks with the proposed BoMI almost as fast as with the joystick controller, with only a 30% time overhead on average, while the BoMI is potentially more accessible to users with upper-body disabilities who cannot operate the conventional joystick controller. Tests show that control performance with the proposed BoMI improved by up to 17% on average after three trials.
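To make the control-scenario idea concrete, the sketch below illustrates one plausible mapping of the kind the abstract describes: head-tilt angles from the headset IMU are dead-zoned and scaled into Cartesian velocity commands for the arm, while a normalized shoulder sEMG level above a threshold switches which axes are controlled. This is a minimal hypothetical reconstruction, not the paper's implementation; all names, gains, and thresholds are assumptions for illustration.

```python
# Hypothetical sketch of one BoMI control scenario: head tilt drives Cartesian
# arm velocities, and shoulder sEMG acts as a mode switch. Gains, thresholds,
# and names are illustrative assumptions, not the paper's implementation.
from dataclasses import dataclass

DEAD_ZONE_DEG = 5.0            # ignore small involuntary head tilts (assumed)
GAIN_M_PER_S_PER_DEG = 0.004   # tilt-to-velocity gain (assumed)
EMG_SWITCH_THRESHOLD = 0.6     # normalized sEMG level that toggles modes (assumed)

@dataclass
class HeadPose:
    pitch_deg: float   # nod forward/back
    roll_deg: float    # tilt left/right

def _scaled(angle_deg: float) -> float:
    """Apply a dead zone, then scale the remaining tilt into a velocity (m/s)."""
    if abs(angle_deg) < DEAD_ZONE_DEG:
        return 0.0
    sign = 1.0 if angle_deg > 0 else -1.0
    return sign * (abs(angle_deg) - DEAD_ZONE_DEG) * GAIN_M_PER_S_PER_DEG

def head_to_velocity(pose: HeadPose, emg_level: float) -> dict:
    """Map head tilt to a velocity command; sEMG selects which axes it drives."""
    lateral = _scaled(pose.roll_deg)    # tilt left/right -> lateral motion
    sagittal = _scaled(pose.pitch_deg)  # nod forward/back -> forward/back motion
    if emg_level >= EMG_SWITCH_THRESHOLD:
        # A sustained muscle contraction redirects the nod axis to vertical
        # motion in this sketch, emulating a mode switch between scenarios.
        return {"vx": 0.0, "vy": lateral, "vz": sagittal}
    return {"vx": sagittal, "vy": lateral, "vz": 0.0}

if __name__ == "__main__":
    # A 12-degree forward nod with relaxed muscles yields a forward velocity
    # of roughly 0.028 m/s and no lateral or vertical motion.
    print(head_to_velocity(HeadPose(pitch_deg=12.0, roll_deg=-3.0), emg_level=0.2))
```

The dead zone in this sketch reflects a common design choice for head-controlled interfaces: small postural sway should not move the arm, so only tilts beyond a comfort threshold produce motion.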