Author: Wolfgang Konen, wolfgang.konen@fh-koeln.de
Created: Nov’2009
Last modified: Nov’2009
This document is a summary of the changes in sfa-tk.V2 as compared to sfa-tk.V1.0.1:
- lcov_pca2.m: same interface as lcov_pca.m, but better numerical stability in the case where the covariance matrix becomes singular.
- sfa_step.m: new parameters method and dbg, which are passed on to sfa2_step.m. Default settings for method and dbg are added, depending on the current step.
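  A minimal call sketch for the new method parameter (the four-argument form is the one shown in the OLD_VERSION remark below; passing dbg as a fifth argument is an assumption here, so check the help text of sfa_step.m):

    % hdl: handle of an SFA object created before.
    % Closing 'sfa' step with the new SVD-based method; if method is omitted,
    % the default setting for this step is used.
    sfa_step(hdl, [], 'sfa', 'SVD_SFA');
    % hypothetical call with an explicit dbg value (argument position assumed):
    % sfa_step(hdl, [], 'sfa', 'SVD_SFA', 1);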
- sfa2_step.m:
  - In step 'sfa': method='GEN_EIG' is the original [Berkes03] code; method='SVD_SFA' is the new [Konen09b] method using lcov_pca2.m, along the lines of [WisSej02]. If dbg>0, both branches are executed and their results are compared with sfa2_dbg.m. If method='SVD_SFA', the matrices DSF and SF are reduced to those dimensions with eigenvalue > 0 in D.
  - Added a check and warning w.r.t. the ill-conditioning of the matrix B for method='GEN_EIG'. rank(B) is always printed.
  - In step 'expansion': method='TIMESERIES' is the original Berkes code, where the goal is to minimize the time-difference signal. method='CLASSIF' is new and intended for classification purposes, along the lines of [Berkes05]. Each data chunk is assumed to be a set of patterns belonging to the same class, and the goal is to minimize the pairwise pattern differences for all pairs existing in this chunk of data. (I.e., if the data of a certain class label are presented in several chunks, then only the pairs within each chunk are formed. It is therefore recommended to present each class label in one chunk, if possible; see the sketch after this list.)
  - In step 'preprocessing -> expansion': new branch for SFA_STRUCTS{hdl}.pp_type='PCA2': similar to 'PCA', but uses lcov_pca2.m instead of lcov_pca.m for whitening the input data and discards those dimensions with eigenvalues close to zero.
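  The following sketch illustrates the recommended chunk-wise presentation of training data for method='CLASSIF' (one chunk per class). Only the sfa_step calling convention is taken from this document; the creation and execution calls sfa2_create and sfa_execute, as well as their argument lists, are assumptions for illustration:

    % synthetic example data: three classes, 20 patterns each, 5 input dimensions;
    % xc{k} holds all training patterns of class k (one row per pattern), so that
    % all pairwise differences of a class are formed within a single chunk
    nclass = 3;
    for k = 1:nclass, xc{k} = randn(20,5) + k; end
    pp_dim = 5; xp_dim = 20;                           % assumed: dims before/after expansion
    hdl = sfa2_create(pp_dim, xp_dim);                 % assumed creation call
    for k = 1:nclass
        sfa_step(hdl, xc{k}, 'preprocessing');         % accumulate input statistics
    end
    for k = 1:nclass
        sfa_step(hdl, xc{k}, 'expansion', 'CLASSIF');  % pairwise differences per chunk
    end
    sfa_step(hdl, [], 'sfa', 'SVD_SFA');               % closing step, new SVD-based method
    y = sfa_execute(hdl, randn(10,5));                 % assumed execution call on test data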
- sfa2_dbg.m: called from sfa2_step.m if dbg>0 and method='SVD_SFA'. It is needed for debugging purposes only.
  - Several checks are performed on the results from 'GEN_EIG' and 'SVD_SFA'. See the richly documented source code for further details.
  - The part with OLD_VERSION==1 is only for debugging comparisons; it contains the old, no longer recommended version of 'SVD_SFA'. Note that in the case OLD_VERSION==1 the call to step 'sfa' in sfa_step.m (or sfa2_step.m) has to be
        sfa_step(hdl, x, 'sfa', method);
    instead of
        sfa_step(hdl, [], 'sfa', method);
    and the data in x have to contain all training data in one chunk.
- sfa2.m:
  - Additional check with the matrix Cslow, which prints a warning if some of the slow (training) signals have a variance different from 1.
  - Added parameter method (='GEN_EIG' or 'SVD_SFA') to the input of sfa2.
  - Added parameter pp_type (='PCA', 'PCA2' or 'PCAVAR') to the input of sfa2.
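  A sketch of the extended sfa2 call; the position of the new arguments method and pp_type, as well as the output argument, are assumptions here (see the help text of sfa2.m for the exact signature):

    x = randn(500, 5);                      % example input data (one row per sample)
    y = sfa2(x, 'SVD_SFA', 'PCA2');         % argument order and output assumed
    % the variance check corresponds to the internal test with Cslow:
    Cslow = cov(y);
    if any(abs(diag(Cslow) - 1) > 1e-6)
        warning('some slow signals do not have unit variance');
    end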
- gaussCreate.m: create a Gaussian classifier object.
- gaussClassifier.m: train or apply a Gaussian classifier HDL which has been created before with gaussCreate.m.
  Remark: if the data entered during training are not normally distributed, the inversion of the full covariance matrix can become numerically difficult. It is then recommended to call gaussClassifier.m with aligned=1, which zeroes the off-diagonal elements of the covariance matrix (and aligns the Gaussian ellipsoid with the coordinate axes). This is always numerically robust.
- gauss_clearall.m: clear all Gaussian classifier objects.
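  A hypothetical usage sketch of the Gaussian classifier functions; the argument lists (in particular the train/apply selection) are illustrations only and should be checked against the headers of gaussCreate.m and gaussClassifier.m:

    ytrain = randn(60, 3); labels = repmat((1:3)', 20, 1);    % example features and labels
    ytest  = randn(10, 3);
    hdl = gaussCreate(3, 3);                  % assumed arguments: #classes, #dimensions
    % training with aligned=1: off-diagonal elements of the covariance matrix are
    % zeroed, so that its inversion stays numerically robust
    gaussClassifier(hdl, ytrain, labels, 'train', 1);         % hypothetical argument order
    predicted = gaussClassifier(hdl, ytest, [], 'apply', 1);  % hypothetical argument order
    gauss_clearall;                           % remove all Gaussian classifier objects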
- drive1.m: demo for driving-force experiments [Konen09a, Wis03c]. The additional parameter noiseperc allows noise to be injected into the time series.
- class_demo1.m: demo for classification experiments à la [Berkes05] on synthetic data.
- class_demo2.m: demo for classification experiments à la [Berkes05] on real data (UCI repository or gesture data).
- class_CVdemo2.m: demo for classification experiments à la [Berkes05] with cross validation on real data (UCI repository or gesture data).
Literature
- [Wis03c] Wiskott, L. (2003). Estimating driving forces of nonstationary time series with slow feature analysis. arXiv.org e-Print archive, http://arxiv.org/abs/cond-mat/0312317.
- [Berkes05] Berkes, P. (2005). Pattern recognition with Slow Feature Analysis. Cognitive Sciences EPrint Archive (CogPrints) 4104, http://cogprints.org/4104/.
- [Berkes03] Berkes, P. SFA-TK: Slow Feature Analysis Toolkit for Matlab (v.1.0.1). http://itb.biologie.hu-berlin.de/~berkes/software/sfa-tk/sfa-tk.shtml or http://people.brandeis.edu/~berkes/software/sfa-tk/index.html.
- [Konen09a] Konen, W. (2009). How slow is slow? SFA detects signals that are slower than the driving force. arXiv.org e-Print archive, http://arXiv.org, in preparation.
- [Konen09b] Konen, W. (2009). On the numeric stability of the SFA implementation sfa-tk. arXiv.org e-Print archive, http://arXiv.org, in preparation.