Isotropic Position

In the fields of machine learning, the theory of computation, and random matrix theory, a probability distribution over vectors is said to be in isotropic position if its covariance matrix is equal to the identity matrix.

## Formal definitions

Let ${\textstyle D}$ be a distribution over vectors in the vector space ${\textstyle \mathbb {R} ^{n}}$. Then ${\textstyle D}$ is in isotropic position if, for a vector ${\textstyle v}$ sampled from the distribution,

${\displaystyle \mathbb {E} \,vv^{T}=\mathrm {Id} .}$
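For illustration, this definition can be checked numerically. The sketch below (not part of the original text) uses NumPy to estimate the second-moment matrix ${\textstyle \mathbb {E} \,vv^{T}}$ of the standard Gaussian distribution, which is in isotropic position, and confirms it is close to the identity:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 200_000
# Draw m samples from the standard Gaussian N(0, Id), a distribution
# in isotropic position.
v = rng.standard_normal((m, n))
# Empirical second-moment matrix E[v v^T], averaged over the samples.
second_moment = v.T @ v / m
# The estimate should be close to the identity matrix.
print(np.allclose(second_moment, np.eye(n), atol=0.05))  # prints True
```

The sample average converges to ${\textstyle \mathrm {Id} }$ as the number of samples grows, with entrywise fluctuations on the order of ${\textstyle 1/{\sqrt {m}}}$.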

A set of vectors is said to be in isotropic position if the uniform distribution over that set is in isotropic position. In particular, an orthonormal basis of ${\textstyle \mathbb {R} ^{n}}$ is isotropic up to scaling: the uniform distribution over its ${\textstyle n}$ vectors has second-moment matrix ${\textstyle {\tfrac {1}{n}}\mathrm {Id} }$, so rescaling each vector by ${\textstyle {\sqrt {n}}}$ puts the set in isotropic position.
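As a concrete finite example (a NumPy sketch, not from the original text), the standard basis vectors of ${\textstyle \mathbb {R} ^{n}}$ rescaled by ${\textstyle {\sqrt {n}}}$ form a set in isotropic position; the rescaling is needed so the second moment is exactly the identity rather than a multiple of it:

```python
import numpy as np

n = 4
# The rows are the standard basis vectors of R^n, each rescaled by sqrt(n).
vectors = np.sqrt(n) * np.eye(n)
# Second moment of the uniform distribution over the set: (1/n) * sum of v v^T.
second_moment = sum(np.outer(v, v) for v in vectors) / n
print(np.allclose(second_moment, np.eye(n)))  # -> True
```

Without the ${\textstyle {\sqrt {n}}}$ factor, the same computation yields ${\textstyle {\tfrac {1}{n}}\mathrm {Id} }$.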

As a related definition, a convex body ${\textstyle K}$ in ${\textstyle \mathbb {R} ^{n}}$ is called isotropic if it has volume ${\textstyle |K|=1}$, center of mass at the origin, and there is a constant ${\textstyle \alpha >0}$ such that

${\displaystyle \int _{K}\langle x,y\rangle ^{2}dx=\alpha ^{2}|y|^{2},}$

for all vectors ${\textstyle y}$ in ${\textstyle \mathbb {R} ^{n}}$; here ${\textstyle |\cdot |}$ stands for the standard Euclidean norm.
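A standard worked example is the unit-volume cube ${\textstyle K=[-1/2,1/2]^{n}}$, which has volume 1 and center of mass at the origin; there the integral evaluates exactly to ${\textstyle |y|^{2}/12}$, i.e. ${\textstyle \alpha ^{2}=1/12}$. The Monte Carlo sketch below (illustrative, not from the original text; the direction ${\textstyle y}$ is an arbitrary choice) verifies this numerically:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 400_000
# Uniform samples from the cube K = [-1/2, 1/2]^n, which has volume 1
# and center of mass at the origin.
x = rng.uniform(-0.5, 0.5, size=(m, n))
y = np.array([1.0, -2.0, 0.5])  # an arbitrary test direction
# Since |K| = 1, the integral over K of <x, y>^2 dx equals the mean of
# <x, y>^2 under the uniform distribution on K.
integral = np.mean((x @ y) ** 2)
# For the cube, the exact value is |y|^2 / 12, so alpha^2 = 1/12.
print(np.allclose(integral, np.dot(y, y) / 12, rtol=0.05))  # prints True
```

Because the coordinates of a uniform point in the cube are independent with mean 0 and variance ${\textstyle 1/12}$, the cross terms in ${\textstyle \langle x,y\rangle ^{2}}$ vanish in expectation, leaving ${\textstyle \sum _{i}y_{i}^{2}/12}$.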