Comparison to original libsvm

In its current state, libsvmTL provides the following features; each section lists libsvmTL's features first, followed by the corresponding features of the original LIBSVM:
Feature Vectors
  libsvmTL:
  • sparse Feature Vector
  • dense Feature Vector
  • your own Feature Vector class (it is just a template parameter)
  LIBSVM:
  • sparse storage Feature Vector
Kernels
  libsvmTL:
  • linear
  • radial basis function
  • polynomial
  • sigmoid
  • your own Kernel class (it is just a template parameter)
  LIBSVM:
  • linear
  • radial basis function
  • polynomial
  • sigmoid
  • your own kernel by modifying svm.cpp
One-class SVM's / Regression
  libsvmTL:
  • (not yet, work in progress) one-class SVM using a hyperplane
  • (not yet, work in progress) one-class SVM using a hypersphere
  • your own One-class SVM implementation (it is just a template parameter)
  LIBSVM:
  • epsilon-SVR
  • nu-SVR
  • probability estimates for SVR
  • one-class SVM using a hyperplane
Two-class SVM's
  libsvmTL:
  • C-SVC
  • nu-SVC
  • (planned) probability-estimating SVM
  • your own (guess what, yeah, it is just a template parameter)
  LIBSVM:
  • C-SVC
  • nu-SVC
  • probability estimates for C-SVC and nu-SVC
Multi-class Algorithms
  libsvmTL:
  • One vs. one
  • One vs. rest
  • your own (it is just a template parameter)
  LIBSVM:
  • One vs. one
  • (One vs. rest implementation available in LIBSVM Tools)
Data storage (training/test data, models, results)
  libsvmTL:
  • ASCII file (dense or sparse storage)
  • NetCDF
  • std::map based Container (keeps all data in memory)
  • (planned) interface to Matlab "Data Structures"
  • your own (it is just a template parameter)
  LIBSVM:
  • sparse vector format
Cross validation
  libsvmTL:
  • optimized cross validation (uses cached kernel matrices and retrains only those two-class SVM's whose support vectors belong to left-out feature vectors)
  • leave-one-out validation -- just use cross validation with nfold = number of training vectors
  LIBSVM:
  • basic cross validation
Grid search
  libsvmTL:
  • integrated, optimized grid search (e.g., reuses the cached kernel matrix from the previous grid point if only non-kernel parameters changed)
  • any parameter (e.g., tolerance of the termination criterion) can be used as a grid axis
  LIBSVM:
  • grid search via a Python script, which executes the shell command "svm-train" for each grid point
Full Kernel Matrix caching (for fast cross validation and grid search)
  libsvmTL:
  • integrated (via Kernel Wrapper)
  LIBSVM:
  • not directly available (you may use the "precomputed kernel matrices" extension provided in LIBSVM Tools)
Feature scaling
  libsvmTL:
  • integrated (via Kernel Wrapper). Scale factors are stored in the model and are applied on-the-fly to test data
  LIBSVM:
  • via the external program "svm-scale". Scale factors are stored in an extra file and must be applied manually to test data

Missing Features

Some features of Chih-Jen Lin's LIBSVM are not yet integrated. These are: