Table 1 Overview of records reviewed

From: Relying on more sense for enhancing lower limb prostheses control: a review

| Study | Type / Group | Sensor selection | Sensor placement | Concept description |
|---|---|---|---|---|
| Vallery et al. (P, 2011) [10] | IES / 1 | 2 x angle & angular velocity sensors | C: hip & knee | Mapping function for control of knee prototype with estimated contralateral limb motion data. |
| Bernal-Torres et al. (H, 2018) [11, 12] | IES / 1 | 1 x IMU | C: thigh | Active biomimetic polycentric knee prototype with contralateral echo-control strategy. |
| Su et al. (P, 2019) [13] | IES / 1 | 3 x IMUs | C: thigh, shank & ankle | Intent recognition system based on convolutional neural network classification. |
| CYBERLEGs project series^1 (P, 2017) [15–18] | IES / 1 | 2 x pressure insoles; 7 x IMUs | B: shoe inlays; B: thighs, shanks, feet & 1 x trunk | Finite-state control of a powered ankle-knee coupled prototype using whole-body aware, noninvasive, distributed wireless sensor control. |
| Hu et al. (P, 2018) [19–21], extended by: | IES / 2 | 4 x IMUs; 4 x GONIOs; 14 x EMGs | B: thighs & shanks; B: knees & ankles; B: leg muscles | Classification error reduction through fusion of bilateral lower-limb neuromechanical signals, providing feasibility & benchmark datasets. |
| Krausz et al. (H, 2019) [22] | EES / 2 | 1 x IMU; 1 x depth camera | On the waist in a belt construction | Adding vision features to the prior concept, improving the classification. |
| Hu et al. (H, 2018) [23] | IES / 3 | 1 x IMU; 1 x depth camera | I: thigh | Bilateral gait segmentation from ipsilateral depth sensor with the contralateral leg in field of view. |
| Zhang et al. (H, 2018) [25] | IES / 3 | 1 x depth camera | On the waist with tilt angle | Depth signal from legs as input to an oscillator-based gait phase estimator. |
| Scandaroli et al. (T, 2010) [27] | EES / 4 | 2 x gyroscopes; 4 x infrared sensors | Built into a foot prototype | Infrared distance sensor setup for estimation of foot orientation with respect to ground. |
| Ishikawa et al. (H, 2018) [28] | EES / 4 | 2 x infrared sensors; 1 x IMU | Left & right on one normal shoe | Infrared distance sensor setup for estimation of foot clearance with respect to ground. |
| Kleiner et al. (T, 2011) [29] | EES / 5 | 1 x motion tracking; 1 x laser scanner | I: between artificial ankle & knee joint | Concept and prototype of a foresighted control system using a 2D laser scanner. |
| Huang's group^2 (P, 2016) [30–33] | EES / 5 | 1 x IMU; 1 x laser sensor | I: lateral side of the trunk | Terrain recognition based on laser distance, motion estimation and geometric constraints. |
| Carvalho et al. (H, 2019) [36] | EES / 5 | 1 x laser sensor | On the waist with 45° tilt angle | Terrain recognition based on laser distance information and geometric constraints. |
| Sahoo et al. (H, 2019) [37] | EES / 5 | 3/4 x range sensors; 1 x force resistor | I: on the shank & on the heel of the foot | Array of distance sensors for geometry-based obstacle recognition in front of the user. |
| Varol et al. and Massalin et al. (H, 2018) [38, 39] | EES / 5 | 1 x depth camera | I: shank | Intent recognition framework using a single depth camera and a cubic kernel support vector machine for real-time classification. |
| Laschowski et al. (H, 2019) [40] | EES / 5 | 1 x color camera | Wearable, chest-mounted | Terrain identification based on color images and deep convolutional network classification. |
| Yan et al. (H, 2018) [41] | EES / 5 | 1 x depth camera | On the trunk at 1.06 m height | Locomotion mode estimation based on depth feature extraction and finite-state classification. |
| Diaz et al. (H, 2018) [43] | EES / 5 | 1 x IMU; 1 x color camera | I: foot & shin | Terrain context identification and inclination estimation based on color image classification. |
| Krausz et al. (H, 2015) [45] | EES / 5 | 1 x depth camera; 1 x accelerometer | Fixed at 1.5 m height with -50° tilt angle | Stair segmentation strategy from depth sensing information of the environment. |
| Kleiner et al. (P, 2018) [46] | EES / 5 | 1 x IMU; 1 x radar sensor | I: thigh | Stair detection algorithm through fusion of motion trajectory and radar distance data. |
| Zhang et al. (P, 2019) [47, 48] | EES / 5 | 1 x IMU; 1 x depth camera | I: knee lateral | Environmental feature extraction based on neural network depth scene classification. |

  1. Publications through CYBERLEGs: Ambrozic et al. [15, 16], Gorsic et al. [17] and through CYBERLEGs++: Parri et al. [18]
  2. Research group of Huang: F. Zhang et al. [30], X. Zhang et al. [31], Wang et al. [32] and Liu et al. [33]
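
Several of the reviewed concepts (e.g. Su et al. [13] in group 1, and the learning-based approaches in groups 2 and 5) classify short windows of inertial data into locomotion modes with a convolutional neural network. The sketch below is purely illustrative and not taken from any of the reviewed studies; the sensor count, window length, number of classes and network layout are assumptions chosen only to show the general shape of such a pipeline in PyTorch.

```python
# Illustrative sketch only: a 1-D CNN that maps windows of IMU data to
# locomotion-mode logits, in the spirit of the intent-recognition concepts
# summarised above (e.g. Su et al. [13]). All shapes and hyperparameters
# below are assumptions, not values from the reviewed papers.

import torch
import torch.nn as nn

NUM_CHANNELS = 18   # assumption: 3 IMUs x (3-axis accel + 3-axis gyro)
WINDOW_LEN   = 100  # assumption: 1 s window sampled at 100 Hz
NUM_MODES    = 5    # assumption: level walk, ramp up/down, stair up/down

class IntentCNN(nn.Module):
    """Toy 1-D CNN over IMU windows -> locomotion-mode logits."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(NUM_CHANNELS, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # global average pooling over time
        )
        self.classifier = nn.Linear(64, NUM_MODES)

    def forward(self, x):              # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = IntentCNN()
    dummy = torch.randn(8, NUM_CHANNELS, WINDOW_LEN)  # synthetic IMU windows
    print(model(dummy).shape)          # -> torch.Size([8, 5])
```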
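
The laser- and depth-based concepts in group 5 (e.g. Huang's group [30–33], Carvalho et al. [36]) exploit a simple geometric relation: a downward-tilted range sensor mounted at a known height maps each range reading to a point on the ground ahead, from which steps, ramps and drops can be inferred. The following minimal sketch shows only that relation; the mounting height, tilt angle and sample readings are assumptions, not values from the cited works.

```python
# Minimal geometric sketch (assumed parameters): one range reading from a
# downward-tilted sensor -> (forward distance, ground elevation) in front of
# the user, as used conceptually by the terrain-recognition records in group 5.

import math

def ground_point(range_m: float, tilt_deg: float, sensor_height_m: float):
    """Map a range reading to (forward distance, height relative to level ground).

    tilt_deg is the downward tilt of the beam from horizontal.
    """
    tilt = math.radians(tilt_deg)
    forward = range_m * math.cos(tilt)                       # horizontal distance ahead
    elevation = sensor_height_m - range_m * math.sin(tilt)   # + = step up, - = drop
    return forward, elevation

if __name__ == "__main__":
    # Example: waist-mounted sensor at an assumed 1.0 m height, 45 deg tilt.
    for d in (1.41, 1.20, 1.65):   # synthetic range readings in metres
        x, z = ground_point(d, 45.0, 1.0)
        print(f"range {d:.2f} m -> {x:.2f} m ahead, elevation {z:+.2f} m")
```

Readings shorter than the level-ground distance indicate a raised surface (e.g. a stair edge), longer ones a drop; stringing consecutive readings together yields the coarse ground profile these concepts classify.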