Prerequisites

library(multree)

Intro

Goal

  • Construct a neural decision forest using the multree package.

Conclusion

  • The multree package can be used on raw signals.

Outline

  • Prep Data
  • Train model
  • Predict
  • Inspect

Data - Signals

Hand Extension Recording

Matrix

home <- "DavidHolds"
hand.ext <- read.table(paste0(home, "/Extension_1"), sep = ",")[, c(4:10, 3)]
head(hand.ext)
##   emg.sig1 emg.sig2 emg.sig3 emg.sig4 emg.sig5 emg.sig6 emg.sig7 emg.sig8
## 1       -2       -6       -1      -31       19      -25       -8       -3
## 2        4        8        6       36       17        0        3        3
## 3       -2       -3       14       98      114       37        3        2
## 4        0        5        9       20        0       10        2        2
## 5        2        1      -15      -76      -63      -28       -7       -1
## 6        4        6        6       -4     -110       -4        5        4
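The column selection `[, c(4:10, 3)]` both subsets and reorders: it keeps eight columns and moves column 3 to the end. A toy sketch of this behavior (hypothetical column names, not the EMG file):

```r
# Selecting columns by an index vector subsets and reorders them.
toy <- data.frame(a = 1, b = 2, c = 3, d = 4)
names(toy[, c(3, 1)])
## [1] "c" "a"
```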

Vectorized

v <- do.call(c, hand.ext)
plot(seq_along(v), v)
abline(v = (1:7) * 50, col = "red")
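`do.call(c, hand.ext)` stacks the data frame's columns end to end into one long vector. A minimal sketch on a toy data frame (hypothetical values):

```r
# do.call(c, df) concatenates a data frame's columns into one vector,
# column by column; unname() drops the generated element names.
toy <- data.frame(a = c(1, 2, 3), b = c(4, 5, 6))
unname(do.call(c, toy))
## [1] 1 2 3 4 5 6
```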

Prep Data

  • X: a matrix with one gesture per row and sample times as columns.
  • Y: a vector of gesture labels, one per row of X.
df <- c()
for (file in list.files(home)) {
  # Vectorize the eight signal columns of each recording ...
  a  <- do.call(c, read.table(paste0(home, "/", file), sep = ",")[, c(4:10, 3)])
  # ... and prepend the gesture label taken from the file name.
  df <- rbind(df, c(unlist(strsplit(file, "_"))[1], a))
}
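The label is recovered from the file name: everything before the first underscore. For example:

```r
# File names follow the pattern <Gesture>_<index>; the gesture label
# is the piece before the underscore.
unlist(strsplit("Extension_1", "_"))[1]
## [1] "Extension"
```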

Y <- as.factor(df[,1])
X <- as.data.frame(apply(df[,2:ncol(df)], 1:2, as.numeric))
table(Y)
## Y
##  Extension    Flexion HandClosed   HandOpen NoMovement  Pronation Supination 
##         30         30         30         30         30         29         30

Training/Test Set

Here I construct the training and test sets by holding out a random quarter of the rows. (No seed is set, so the split changes between runs.)

test <- sample(1:nrow(X), floor(1/4*nrow(X)))
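`test` holds a random quarter of the row indices; negative indexing (`X[-test, ]`) then selects the complementary training rows. A small sketch of the split (with a hypothetical seed, which the original does not set):

```r
set.seed(1)                               # hypothetical seed, for reproducibility
n <- 12
test_idx  <- sample(1:n, floor(1/4 * n))  # 3 held-out indices
train_idx <- (1:n)[-test_idx]             # the remaining 9, as in X[-test, ]
c(length(test_idx), length(train_idx))
## [1] 3 9
```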

Forest Presets

The multree package can be trained directly on vectorized signals; we first set up the feature space it will use.

feature.space <- list(window = "dbars", w = 50, k = 8, low = 10, high = 45, d = 2, features = "e")

Let’s describe each of these variables:

Train Neural Decision Forest

Now, let’s train a neural decision forest with 70 trees.

fit <- mulforest(Y[-test], X[-test,], "rnet", .99, feature.space = feature.space, size = 70)

While this is training, we can go over each of the parts:

  • Y: the vector of class labels.
  • X: the data matrix, whose rows are assigned labels in Y.
  • "rnet": denotes we want a randomized neural network trained.
  • .99: a leaf is formed at 99 percent class purity or higher.
  • feature.space: the feature space constructed in the previous section.
  • size: the number of trees in the forest.

Prediction

We now make predictions:

p  <- mf.predict(fit, X[test,])

We break down the accuracy in a confusion table:

table(Predicted = p, True = Y[test])
##             True
## Predicted    Extension Flexion HandClosed HandOpen NoMovement Pronation Supination
##   Extension          4       0          0        0          0         0          0
##   Flexion            0       9          0        0          0         0          0
##   HandClosed         1       0          5        0          0         0          0
##   HandOpen           0       1          0        9          0         1          0
##   NoMovement         0       0          0        0          7         0          0
##   Pronation          0       0          0        0          0         6          0
##   Supination         0       0          0        0          0         0          9
print(sum(p == Y[test])/length(Y[test]))
## [1] 0.9423077
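The reported accuracy can be checked against the confusion table: the diagonal holds the correct predictions, and the column totals sum to 52 test cases.

```r
# Correct counts per gesture = the diagonal of the confusion table above.
diag_correct <- c(4, 9, 5, 9, 7, 6, 9)
sum(diag_correct) / 52
## [1] 0.9423077
```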

Investigating

Using the force.graph command we can inspect the properties of an individual tree.

force.graph(fit$forest[[1]])