
Problem 2 - Algebra

Let $ A$ be an $ m \times n$ real matrix ($ m < n$) whose rank is $ m$. Define $ K(A)$ in the $ n$-dimensional real vector space $ \mathbb{R}^n$ as follows:

$\displaystyle K(A) = \{ \boldsymbol{x} \; \vert \; A \boldsymbol{x} = \boldsymbol{0}, \boldsymbol{x} \in \mathbb{R}^n \}.$

Answer the following questions:

1.
Show that $ K(A)$ is a vector subspace of $ \mathbb{R}^n$ and give its dimension.
2.
Let $ I(A)$ be the space spanned by the vectors obtained by transposing the rows of $ A$. Show that the vectors in $ I(A)$ and the vectors in $ K(A)$ are orthogonal.
3.
For a vector $ \boldsymbol{z} \in \mathbb{R}^n$, derive the form of $ \boldsymbol{x} \in K(A)$ and $ \boldsymbol{y} \in I(A)$ that achieve, respectively, the minima in the following:

$\displaystyle \min_{\boldsymbol{x} \in K(A)} \vert\vert\boldsymbol{z} - \boldsymbol{x}\vert\vert,$

$\displaystyle \min_{\boldsymbol{y} \in I(A)} \vert\vert\boldsymbol{z} - \boldsymbol{y}\vert\vert.$

Here, $ \vert\vert\cdot\vert\vert$ represents the $ L_2$ norm and $ \vert\vert\boldsymbol{x}-\boldsymbol{y}\vert\vert$ is the Euclidean distance between $ \boldsymbol{x}$ and $ \boldsymbol{y}$.


1.
We know that $ K(A) \subset \mathbb{R}^n$ and that $ \boldsymbol{0}\in K(A)$, so $ K(A)$ is non-empty. Moreover, for any $ x, y \in K(A)$ and any $ \lambda, \mu \in \mathbb{R}$, we have $ A(\lambda x + \mu y) = \lambda Ax + \mu Ay = \boldsymbol{0}$, so $ \lambda x + \mu y \in K(A)$. Thus, $ K(A)$ is a vector subspace of $ \mathbb{R}^n$.

According to the rank-nullity theorem, we have $ \dim(\hbox{ker}(A)) + \dim(\hbox{im}(A)) = \dim(\mathbb{R}^n) = n$, where $ A$ is understood as the linear map from $ \mathbb{R}^n$ to $ \mathbb{R}^m$ associated with the matrix $ A$. By definition, $ \dim(\hbox{im}(A)) = \hbox{rk}(A) = m$. Thus, $ \dim(\hbox{ker}(A)) = \dim(K(A)) = n - m > 0$.

2.
Assume that $ x\in K(A)$, and write $ \mathcal{L}_1, \ldots, \mathcal{L}_m$ for the rows of $ A$. Then:

$\displaystyle Ax = \left( \begin{array}{c} \mathcal{L}_1 \\ \vdots \\ \mathcal{L}_m \end{array} \right) x = \left( \begin{array}{c} \mathcal{L}_1 x \\ \vdots \\ \mathcal{L}_m x \end{array} \right) = \left( \begin{array}{c} 0 \\ \vdots \\ 0 \end{array} \right)$

Now assume that $ y\in I(A)$. Then, for some scalars $ \lambda_1, \ldots, \lambda_m \in \mathbb{R}$:

$\displaystyle y = \sum_{i=1}^m \lambda_i \mathcal{L}_i^T$

Forming the inner product of $ x$ and $ y$, and using the fact that $ \mathcal{L}_i x = 0$ for every $ i$, we see that it vanishes:

$\displaystyle \langle x \vert y \rangle = \sum_{i=1}^m \lambda_i \langle \mathcal{L}_i^T \vert x \rangle = 0$

That is, $ K(A)$ and $ I(A)$ are orthogonal (this orthogonality, together with the dimension count of question 1, is checked numerically in the first sketch after this solution).
3.
Both

$\displaystyle \min_{\boldsymbol{x} \in K(A)} \vert\vert\boldsymbol{z} - \boldsymbol{x}\vert\vert\quad\hbox{and}$

$\displaystyle \min_{\boldsymbol{y} \in I(A)} \vert\vert\boldsymbol{z} - \boldsymbol{y}\vert\vert$

are reached for $ x = p_{K(A)}(z)$ and $ y = p_{I(A)}(z)$, where $ p_{K(A)}$ and $ p_{I(A)}$ are the orthogonal projections onto $ K(A)$ and $ I(A)$ respectively. Indeed, by question 2 the subspaces $ K(A)$ and $ I(A)$ are orthogonal, and $ \dim(K(A)) + \dim(I(A)) = (n-m) + m = n$, so $ \mathbb{R}^n = K(A) \oplus I(A)$. Hence $ z$ can be written uniquely as $ z = z_{K(A)} + z_{I(A)}$ with $ z_{K(A)} = p_{K(A)}(z)$ and $ z_{I(A)} = p_{I(A)}(z)$, and these are the closest points to $ z$ in each subspace. Explicitly, since $ A$ has rank $ m$, the matrix $ AA^T$ is invertible and $ y = p_{I(A)}(z) = A^T(AA^T)^{-1}Az$, while $ x = p_{K(A)}(z) = z - A^T(AA^T)^{-1}Az$ (see the second sketch after this solution).
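
The following is a minimal numerical sketch of questions 1 and 2 (not part of the original solution), written in Python with NumPy/SciPy. It assumes a randomly generated $ A$, which has rank $ m$ with probability 1; the sizes $ m=3$, $ n=5$ are arbitrary illustrative choices.

import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)
m, n = 3, 5
A = rng.standard_normal((m, n))   # random A: full row rank m with probability 1

# Question 1: the columns of K form an orthonormal basis of K(A) = ker(A),
# so the number of columns is dim(K(A)), which should equal n - m.
K = null_space(A)
assert K.shape == (n, n - m)

# Question 2: every row of A (i.e. every generator of I(A)) is orthogonal
# to every basis vector of K(A), so A @ K should be numerically zero.
print(np.max(np.abs(A @ K)))      # on the order of machine precision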

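The following is a minimal numerical sketch of question 3 (not part of the original solution), under the same assumptions as above: it builds the orthogonal projector onto $ I(A)$ as $ A^T(AA^T)^{-1}A$, forms $ y$ and $ x$ as in the solution, and checks that they lie in the right subspaces and that $ x$ is no farther from $ z$ than another point of $ K(A)$.

import numpy as np

rng = np.random.default_rng(1)
m, n = 3, 5
A = rng.standard_normal((m, n))   # random A with rank m (with probability 1)
z = rng.standard_normal(n)

# Orthogonal projector onto I(A): P = A^T (A A^T)^{-1} A
# (A A^T is invertible because A has full row rank).
P = A.T @ np.linalg.solve(A @ A.T, A)
y = P @ z                          # minimizer over I(A)
x = z - y                          # minimizer over K(A), since z = x + y

assert np.allclose(A @ x, 0)       # x belongs to K(A)
assert np.allclose(P @ y, y)       # y belongs to I(A) (P is idempotent)

# Another point of K(A), obtained by perturbing x inside ker(A),
# is at least as far from z as x is.
V = np.linalg.svd(A)[2][m:].T      # columns span K(A)
x2 = x + V @ rng.standard_normal(n - m)
assert np.linalg.norm(z - x2) >= np.linalg.norm(z - x)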

Reynald AFFELDT
2000-06-08