library(multree)
The multree package can be used on raw signals.

Hand Extension Recording
home <- "DavidHolds"
hand.ext <- read.table("DavidHolds/Extension_1", sep = ",")[, c(4:10, 3)]
head(hand.ext)
## emg.sig1 emg.sig2 emg.sig3 emg.sig4 emg.sig5 emg.sig6 emg.sig7 emg.sig8
## 1 -2 -6 -1 -31 19 -25 -8 -3
## 2 4 8 6 36 17 0 3 3
## 3 -2 -3 14 98 114 37 3 2
## 4 0 5 9 20 0 10 2 2
## 5 2 1 -15 -76 -63 -28 -7 -1
## 6 4 6 6 -4 -110 -4 5 4
v <- do.call(c, hand.ext)
plot(1:length(v), v)
abline(v=1:7*50, col = "red")
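As a side note, `do.call(c, hand.ext)` flattens the data frame column by column into one long vector, which is how the eight EMG channels end up laid end to end in the plot. A minimal illustration with a toy data frame:

```r
# do.call(c, df) concatenates a data frame's columns into one vector,
# column by column (names aside, the same as unlist(df)).
toy <- data.frame(a = 1:3, b = 4:6)
v <- do.call(c, toy)
unname(v)  # 1 2 3 4 5 6
```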
We now build:

- X: a matrix with rows = one gesture, columns = sample time.
- Y: a vector with the gesture labels.

df <- c()
for (file in list.files(home)) {
  a <- do.call(c, read.table(paste0(home, "/", file), sep = ",")[, c(4:10, 3)])
  df <- rbind(df, c(unlist(strsplit(file, "_"))[1], a))
}
Y <- as.factor(df[,1])
X <- as.data.frame(apply(df[,2:ncol(df)], 1:2, as.numeric))
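The gesture label for each recording is recovered from its file name in the loop above: the part before the underscore is the class. A minimal illustration of that step:

```r
# File names encode the gesture label before the underscore,
# e.g. "Extension_1" -> "Extension".
label_of <- function(file) unlist(strsplit(file, "_"))[1]
label_of("Extension_1")  # "Extension"
```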
table(Y)
## Y
## Extension Flexion HandClosed HandOpen NoMovement Pronation Supination
## 30 30 30 30 30 29 30
Here I’m just constructing a training and testing data set.
test <- sample(1:nrow(X), floor(1/4*nrow(X)))
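A quarter of the gestures are held out; negative indexing (`X[-test, ]`) later supplies the training set. A sketch of the same split with a hypothetical seed added for reproducibility:

```r
set.seed(1)                        # hypothetical seed, for reproducibility
n <- 209                           # gestures in this data set (see table(Y))
test <- sample(1:n, floor(n / 4))  # indices held out for testing
length(test)                       # 52 test gestures, leaving 157 for training
```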
So the multree package can be trained on vectorized signals.
feature.space <- list(window = "dbars", w = 50, k = 8, low = 10, high = 45, d = 2, features = "e")
Let’s describe each of these variables:

- window denotes that we want to duplicate bars over each window.
- w denotes how many data points are in a signal frame or window.
- k denotes how many signals there are.
- low denotes the smallest random window size.
- high denotes the largest random window size.
- d denotes the number of random windows.
- features denotes the statistic calculated in a random window; e.g. "e" is standard deviation.

Now, let’s train a neural decision forest with 70 trees.
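As an aside, the random-window feature described above can be sketched in base R. This is a hypothetical illustration of the idea, not multree's actual implementation:

```r
# For one w-point signal frame, compute the standard deviation ("e")
# over d randomly placed sub-windows with lengths between low and high.
random_window_sd <- function(frame, d = 2, low = 10, high = 45) {
  sapply(seq_len(d), function(i) {
    len   <- sample(low:high, 1)                       # random window size
    start <- sample(seq_len(length(frame) - len + 1), 1)
    sd(frame[start:(start + len - 1)])                 # feature for this window
  })
}
frame <- rnorm(50)       # one w = 50 point signal frame
random_window_sd(frame)  # d = 2 random-window features
```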
fit <- mulforest(Y[-test], X[-test,], "rnet", .99, feature.space = feature.space, size = 70)
While this is training, we can go over each of the parts:

- Y: the vector with class labels.
- X: the data matrix, whose rows are assigned labels in Y.
- "rnet" denotes that we want a randomized neural network trained.
- .99 denotes that a leaf is formed at 99 percent or greater class purity.
- feature.space: constructed in the previous section.
- size: the number of trees in the forest.

We now make predictions:
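(As an aside, the class-purity criterion above is just the share of the majority class among the labels reaching a node; a minimal sketch:)

```r
# Class purity of a set of labels: proportion of the most common class.
# Per the description above, a leaf forms once purity >= .99.
purity <- function(labels) max(table(labels)) / length(labels)
purity(c("A", "A", "A", "B"))  # 0.75: would keep splitting at a .99 threshold
purity(rep("A", 10))           # 1.00: pure node, forms a leaf
```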
p <- mf.predict(fit, X[test,])
We break down the accuracy in a confusion table:
table(Predicted = p, True = Y[test])
## True
## Predicted Extension Flexion HandClosed HandOpen NoMovement Pronation Supination
## Extension 4 0 0 0 0 0 0
## Flexion 0 9 0 0 0 0 0
## HandClosed 1 0 5 0 0 0 0
## HandOpen 0 1 0 9 0 1 0
## NoMovement 0 0 0 0 7 0 0
## Pronation 0 0 0 0 0 6 0
## Supination 0 0 0 0 0 0 9
print(sum(p == Y[test])/length(Y[test]))
## [1] 0.9423077
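The confusion table also yields per-class recall with base R alone; rebuilding the table printed above (rows = predicted, columns = true):

```r
cls <- c("Extension", "Flexion", "HandClosed", "HandOpen",
         "NoMovement", "Pronation", "Supination")
tab <- matrix(c(4, 0, 0, 0, 0, 0, 0,
                0, 9, 0, 0, 0, 0, 0,
                1, 0, 5, 0, 0, 0, 0,
                0, 1, 0, 9, 0, 1, 0,
                0, 0, 0, 0, 7, 0, 0,
                0, 0, 0, 0, 0, 6, 0,
                0, 0, 0, 0, 0, 0, 9),
              nrow = 7, byrow = TRUE,
              dimnames = list(Predicted = cls, True = cls))
round(diag(tab) / colSums(tab), 3)  # recall per gesture
sum(diag(tab)) / sum(tab)           # 49/52 = 0.9423077, the printed accuracy
```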
Using the force.graph command we may look at individual tree properties.
force.graph(fit$forest[[1]])