Table 4

Methods for performing ICA that we compared

| Algorithm | Variations | Abbreviation | Description | Reference | Software |
|---|---|---|---|---|---|
| Natural Gradient Maximum Likelihood Estimation | - | NMLE | Natural gradient is applied to MLE for efficient learning | [28,29] | [72] |
| Extended Information Maximization | - | ExtIM | NMLE for separating a mix of super- and sub-Gaussian sources | [32] | [73] |
| Fast Fixed-Point | Kurtosis with deflation | FP | Maximizing non-Gaussianity | [31] | [74] |
| | Symmetric orthogonalization | Fpsym | | | |
| | Tanh nonlinearity with symmetric orthogonalization | Fpsymth | | | |
| Joint Approximate Diagonalization of Eigenmatrices | - | JADE | Using the higher-order cumulant tensor | [30] | [75] |
| Nonlinear ICA | Gaussian RBF kernel | NICAgauss | Kernel-based approach | [34,37,50] | [50] |
| | Polynomial kernel | NICApoly | | | |

The eight methods are based on five algorithms. For each method we list its name, variations, abbreviation, a short description, references, and the software we used.
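To make the "Fast Fixed-Point" row concrete, below is a minimal NumPy sketch of the FP variant (kurtosis nonlinearity with deflation) run on a synthetic two-source mixture. This is a generic textbook FastICA iteration, not the implementation from [74]; the sources, mixing matrix, and iteration limits are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two non-Gaussian sources (invented for illustration): a sine wave
# and a uniform signal, both sub-Gaussian.
n = 2000
t = np.linspace(0, 8 * np.pi, n)
S = np.vstack([np.sin(t), rng.uniform(-1, 1, n)])

# Mix with a full-rank matrix.
A = np.array([[1.0, 0.5], [0.4, 1.0]])
X = A @ S

# Whiten: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = (E @ np.diag(d ** -0.5) @ E.T) @ X

def fastica_deflation(Z, n_comp, n_iter=200, tol=1e-9):
    """Fast fixed-point ICA with the kurtosis ('cube') nonlinearity
    and deflation, i.e. the 'FP' variant in Table 4."""
    W = np.zeros((n_comp, Z.shape[0]))
    for i in range(n_comp):
        w = rng.standard_normal(Z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            wx = w @ Z
            # Update: w <- E[z g(w^T z)] - E[g'(w^T z)] w, with g(u) = u^3,
            # so E[g'] = 3 E[(w^T z)^2] = 3 on whitened data.
            w_new = (Z * wx ** 3).mean(axis=1) - 3.0 * w
            # Deflation: orthogonalize against rows already found.
            w_new -= W[:i].T @ (W[:i] @ w_new)
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < tol
            w = w_new
            if converged:
                break
        W[i] = w
    return W

W = fastica_deflation(Z, 2)
S_hat = W @ Z

# Each recovered component should correlate strongly with one true
# source, up to permutation and sign.
corr = np.abs(np.corrcoef(np.vstack([S, S_hat]))[:2, 2:])
print(corr.max(axis=1))
```

The symmetric variants in the table (Fpsym, Fpsymth) differ only in updating all rows of W at once and re-orthogonalizing them jointly each iteration, instead of extracting components one at a time by deflation.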

Lee and Batzoglou Genome Biology 2003 4:R76   doi:10.1186/gb-2003-4-11-r76
