Tag: applications of matrices and determinants

Questions Related to applications of matrices and determinants

Matrices $A$ and $B$ will be inverses of each other only if

  1. $AB=BA$

  2. $AB=0,BA=I$

  3. $AB=BA=0$

  4. $AB=BA=I$


Correct Option: D
Explanation:

We know that if $A$ is a square matrix of order $m$ and there exists another square matrix $B$ of the same order $m$ such that $AB=BA=I$, then $B$ is said to be the inverse of $A$.

By the symmetry of this condition, $A$ is then the inverse of $B$ as well.
Thus, matrices $A$ and $B$ will be inverses of each other only if $AB=BA=I.$
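
As a quick numerical illustration (a minimal NumPy sketch; the $2\times 2$ matrix below is an arbitrary invertible example):

```python
import numpy as np

# An arbitrary invertible matrix and its inverse.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.linalg.inv(A)

# Mutual inverses must satisfy both AB = I and BA = I.
print(np.allclose(A @ B, np.eye(2)))  # True
print(np.allclose(B @ A, np.eye(2)))  # True
```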


Let $A= \left[\begin{array}{lll}1 & 0 & 0\\2 & 1 & 0\\3 & 2 & 1\end{array}\right]$. If $\mathrm{u}_{1}$ and $\mathrm{u}_{2}$ are column matrices such that $A\mathrm{u}_{1}=\left[\begin{array}{l}1\\0\\0\end{array}\right]$ and $A\mathrm{u}_{2}=\left[\begin{array}{l}0\\1\\0\end{array}\right]$, then $\mathrm{u}_{1}+\mathrm{u}_{2}$ is equal to:

  1. $\left[\begin{array}{l} -1 \\ 1 \\ 0 \end{array}\right]$

  2. $\left[\begin{array}{l} -1 \\ 1 \\ -1 \end{array}\right]$

  3. $\left[\begin{array}{l} -1 \\ -1 \\ 0 \end{array}\right]$

  4. $\left[\begin{array}{l} 1 \\ -1 \\ -1 \end{array}\right]$


Correct Option: D
Explanation:
Given: the matrices
$A=\left[ \begin{matrix} 1 & 0  & 0 \\ 2 & 1  & 0 \\ 3 & 2  & 1 \end{matrix} \right]$

$A{u} _{1}=\left[ \begin{matrix} 1  \\  0 \\ 0 \end{matrix} \right]$ and 
$A{u} _{2}=\left[ \begin{matrix} 0  \\  1 \\ 0 \end{matrix} \right]$

To find: the matrix ${u} _{1}+{u} _{2}$

Since both $A{u} _{1}$ and $A{u} _{2}$ are given, adding them gives

$A{u} _{1}+A{u} _{2}=\left[ \begin{matrix} 1  \\  0 \\ 0 \end{matrix} \right]+\left[ \begin{matrix} 0  \\  1 \\ 0 \end{matrix} \right]$

$A\left({u} _{1}+{u} _{2}\right)=\left[ \begin{matrix} 1  \\  1 \\ 0 \end{matrix} \right]$

Since $A$ is a non-singular matrix, we have
$\left|A\right|\neq\,0$

Hence, pre-multiplying both sides by ${A}^{-1}$ (i.e., multiplying from the left), we get

${A}^{-1}A\left({u} _{1}+{u} _{2}\right)={A}^{-1}\left[ \begin{matrix} 1  \\  1 \\ 0 \end{matrix}\right]$

${u} _{1}+{u} _{2}={\left[ \begin{matrix} 1 & 0  & 0 \\ 2 & 1  & 0 \\ 3 & 2  & 1 \end{matrix} \right]}^{-1}\left[ \begin{matrix} 1  \\  1 \\ 0 \end{matrix}\right]$     ..........$(1)$

Now, $\left|A\right|=\left|\begin{matrix} 1 & 0  & 0 \\ 2 & 1  & 0 \\ 3 & 2  & 1 \end{matrix} \right|$

$=1\left| \begin{matrix} 1 & 0   \\ 2 & 1  \end{matrix} \right|-0+0$ (expanding the determinant along row $1$)

$\Rightarrow\,\left|A\right|=1$

Now, the co-factor matrix of $A$ (i.e., the matrix in which every element is replaced by its corresponding co-factor)

$=\left[\begin{matrix} \left| \begin{matrix} 1 & 0   \\ 2 & 1  \end{matrix} \right| & -\left| \begin{matrix} 2 & 0   \\ 3 & 1  \end{matrix} \right|  & \left| \begin{matrix} 2 & 1 \\ 3 & 2 \end{matrix} \right| \\ -\left| \begin{matrix} 0 & 0   \\ 2 & 1  \end{matrix} \right| & \left| \begin{matrix} 1 & 0   \\ 3 & 1  \end{matrix} \right|  & -\left| \begin{matrix} 1 & 0   \\ 3 & 2 \end{matrix} \right| \\ \left| \begin{matrix} 0 & 0   \\ 1 & 0  \end{matrix} \right| & -\left| \begin{matrix} 1 & 0   \\ 2 & 0  \end{matrix} \right|  & \left| \begin{matrix} 1 & 0   \\ 2 & 1  \end{matrix} \right| \end{matrix} \right]$

$=\left[\begin{matrix} 1 & -2 & 1 \\ 0 & 1  & -2 \\ 0 & 0 & 1 \end{matrix} \right]$
$\therefore\,adj\left(A\right)={\left[\begin{matrix} 1 & -2 & 1 \\ 0 & 1  & -2 \\ 0 & 0 & 1 \end{matrix} \right]}^{T}=\left[\begin{matrix} 1 & 0 & 0 \\ -2 & 1  & 0 \\ 1 & -2 & 1 \end{matrix} \right]$

$\Rightarrow\,{A}^{-1}=\dfrac{adj\left(A\right)}{\left|A\right|}$

$=\left[\begin{matrix} 1 & 0  & 0 \\ -2 & 1  & 0 \\ 1 & -2  & 1 \end{matrix} \right]\,\,\,\because\left|A\right|=1$

From equation $(1)$, we get

${u} _{1}+{u} _{2}={\left[\begin{matrix} 1 & 0  & 0 \\ 2 & 1  & 0 \\ 3 & 2  & 1 \end{matrix} \right]}^{-1}\times\left[ \begin{matrix} 1  \\  1 \\ 0 \end{matrix}\right]$ 

$=\left[\begin{matrix} 1 & 0  & 0 \\ -2 & 1  & 0 \\ 1 & -2  & 1 \end{matrix} \right]\times\left[ \begin{matrix} 1  \\  1 \\ 0 \end{matrix}\right]$ 

$=\left[ \begin{matrix} 1+0+0  \\  -2+1+0 \\ 1-2+0 \end{matrix}\right]$ 

$\therefore\,{u} _{1}+{u} _{2}=\left[ \begin{matrix} 1  \\  -1 \\ -1 \end{matrix}\right]$ 
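
As a numerical cross-check of this result (a minimal NumPy sketch; `np.linalg.solve` is used instead of forming ${A}^{-1}$ explicitly):

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [2.0, 1.0, 0.0],
              [3.0, 2.0, 1.0]])

# A(u1 + u2) = [1, 1, 0]^T, so u1 + u2 = A^{-1} [1, 1, 0]^T.
rhs = np.array([1.0, 1.0, 0.0])
print(np.linalg.solve(A, rhs))  # [ 1. -1. -1.]
```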

If a $3\times 3$ matrix $A$ has its inverse equal to $A$, then ${A}^{2}$ is equal to

  1. $\begin{bmatrix} 0 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 0 \end{bmatrix}$

  2. $\begin{bmatrix} 1 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & 1 \end{bmatrix}$

  3. $\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$

  4. $\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix}$


Correct Option: C
Explanation:

Given $A^{-1}=A$

$AA^{-1}=I$
$\implies A\cdot A=I$
$\implies A^{2}=I=\begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}$
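
As a numerical illustration (a minimal NumPy sketch; the reflection matrix below is an arbitrary example of a matrix that is its own inverse):

```python
import numpy as np

# A reflection matrix is its own inverse.
A = np.array([[1.0,  0.0, 0.0],
              [0.0, -1.0, 0.0],
              [0.0,  0.0, 1.0]])

print(np.allclose(np.linalg.inv(A), A))  # True: A^{-1} = A
print(np.allclose(A @ A, np.eye(3)))     # True: A^2 = I
```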

If $A$ is a $3\times 3$ non-singular matrix such that $AA'=A'A$ and $B=A^{-1}A'$, then $BB'$ equals

  1. $I+B^{-1}$

  2. $(B^{-1})$

  3. $I+B$

  4. $I$


Correct Option: D
Explanation:

$BB'=\left(A^{-1}A'\right)\left(A^{-1}A'\right)'=A^{-1}A'\,A\left(A'\right)^{-1}$
Since $A'A=AA'$, this equals $A^{-1}A\,A'\left(A'\right)^{-1}=I\cdot I=I$
$\therefore\,BB'=I$
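
As a quick numerical check (a minimal NumPy sketch; the circulant matrix below is an arbitrary non-singular example satisfying $AA'=A'A$):

```python
import numpy as np

# A circulant matrix commutes with its transpose and is non-singular here.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [1.0, 0.0, 2.0]])
assert np.allclose(A @ A.T, A.T @ A)     # AA' = A'A

B = np.linalg.inv(A) @ A.T               # B = A^{-1} A'
print(np.allclose(B @ B.T, np.eye(3)))   # True: BB' = I
```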

${( -A )}^{ -1 }$ is always equal to (where $A$ is an invertible square matrix of order $n$)

  1. ${ (-1) }^{ n }{ A }^{ -1 }$

  2. ${ -A }^{ -1 }$

  3. ${( -1) }^{ n-1 }{ A }^{ -1 }$

  4. none of these


Correct Option: B
Explanation:

We know that if $A^{-1}$ exists, then $(cA)^{-1}=\dfrac{1}{c}A^{-1}$, where $c$ is a non-zero scalar.
Hence $(-A)^{-1}=\dfrac{1}{-1}A^{-1}=-A^{-1}$
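
A quick numerical check (a minimal NumPy sketch with an arbitrary invertible matrix):

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

# (cA)^{-1} = (1/c) A^{-1} with c = -1 gives (-A)^{-1} = -A^{-1}.
print(np.allclose(np.linalg.inv(-A), -np.linalg.inv(A)))  # True
```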

If $A\left( \alpha ,\beta  \right) =\left[ \begin{matrix} \cos { \alpha  }  & \sin { \alpha  }  & 0 \\ -\sin { \alpha  }  & \cos { \alpha  }  & 0 \\ 0 & 0 & { e }^{ \beta  } \end{matrix} \right]$, then ${ A\left( \alpha ,\beta  \right)  }^{ -1 }$ is equal to

  1. $A{ \left( -\alpha ,-\beta \right) }$

  2. $A{ \left( -\alpha ,\beta \right) }$

  3. $A{ \left(\alpha ,-\beta \right) }$

  4. $A{ \left(\alpha ,\beta \right) }$


Correct Option: A
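
The upper-left $2\times 2$ block is a plane rotation, inverted by negating $\alpha$, while ${e}^{\beta}$ inverts to ${e}^{-\beta}$; hence the inverse is $A\left(-\alpha,-\beta\right)$. A quick NumPy check (a minimal sketch; the sample values for $\alpha$ and $\beta$ are arbitrary):

```python
import numpy as np

def A(alpha, beta):
    return np.array([[ np.cos(alpha), np.sin(alpha), 0.0],
                     [-np.sin(alpha), np.cos(alpha), 0.0],
                     [ 0.0,           0.0,           np.exp(beta)]])

a, b = 0.7, 1.3  # arbitrary sample values
print(np.allclose(np.linalg.inv(A(a, b)), A(-a, -b)))  # True
```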

Let $a, b, c$ be non-real numbers satisfying the equation $x^{5}=1$, and let $S$ be the set of all non-invertible matrices of the form $\begin{bmatrix} 1 & a & b \\ w & 1 & c \\ { w }^{ 2 } & w & 1 \end{bmatrix}$, where $w={ e }^{ \dfrac { 12\pi i }{ 5 }  }$. The number of distinct matrices in the set $S$ is

  1. $1$

  2. $28$

  3. $32$

  4. $4$


Correct Option: B
Explanation:

Expanding the determinant along the first row:
$\det=\left(1-cw\right)-a\left(w-cw^{2}\right)+b\left(w^{2}-w^{2}\right)=1-aw-cw+acw^{2}=\left(1-aw\right)\left(1-cw\right)$
So the matrix is non-invertible iff $aw=1$ or $cw=1$, i.e., $a=w^{4}$ or $c=w^{4}$ (since $w$ is a primitive fifth root of unity, $w^{4}$ is itself a non-real fifth root of unity). With $a,b,c$ each ranging over the four non-real fifth roots of unity, the number of triples is $4\cdot 4+4\cdot 4-4=28$.
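
A brute-force check (a minimal NumPy sketch; it assumes, as reconstructed above, that $w=e^{12\pi i/5}=e^{2\pi i/5}$):

```python
import numpy as np
from itertools import product

w = np.exp(12j * np.pi / 5)          # equals e^{2*pi*i/5}, a primitive 5th root
roots = [w**k for k in range(1, 5)]  # the four non-real fifth roots of unity

count = 0
for a, b, c in product(roots, repeat=3):
    M = np.array([[1,    a, b],
                  [w,    1, c],
                  [w**2, w, 1]])
    if abs(np.linalg.det(M)) < 1e-9:  # non-invertible (up to rounding)
        count += 1
print(count)  # 28
```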

If $A$ is an invertible matrix, then $\det\left ( A^{-1} \right )$ is equal to

  1. $\det\left ( A \right )$

  2. $\displaystyle \frac{1}{\det\left ( A \right )}$

  3. $1$

  4. none of these


Correct Option: B
Explanation:

We know that $|A^{n}|=|A|^{n}$ for any integer $n$ (for negative $n$ this requires $A$ to be invertible).
$\Rightarrow |A^{-1}|=|A|^{-1}=\displaystyle \frac{1}{|A|}$
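
A quick numerical check (a minimal NumPy sketch with an arbitrary invertible matrix):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [4.0, 2.0]])  # det(A) = 2, an arbitrary invertible example

print(np.isclose(np.linalg.det(np.linalg.inv(A)),
                 1.0 / np.linalg.det(A)))  # True
```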

If $\displaystyle \left | A \right |\neq 0$, then which of the following is not true?

  1. $\displaystyle (A^{2})^{-1}= (A^{-1})^{2}$

  2. $\displaystyle (A')^{-1}= (A^{-1})^{'}$

  3. $\displaystyle A^{-1}= \left | A \right |^{-1}$

  4. None of these


Correct Option: C
Explanation:

We know, $(A^{n})^{-1}=(A^{-1})^{n}$
So, $(A^{2})^{-1}=(A^{-1})^{2}$
Hence, option A is correct.

We know that the inverse of the transpose of a matrix equals the transpose of its inverse:
$(A^{-1})' =(A')^{-1}$
Hence, option B is correct.

For option C, the LHS is a matrix, while the RHS, $\left | A \right |^{-1}$, is a determinant, i.e., a single scalar value; a matrix cannot equal a scalar.
So, option C is not true.
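
A quick numerical check of options A and B (a minimal NumPy sketch with an arbitrary invertible matrix):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])  # |A| = -1, so A is invertible
Ainv = np.linalg.inv(A)

print(np.allclose(np.linalg.inv(A @ A), Ainv @ Ainv))  # (A^2)^{-1} = (A^{-1})^2
print(np.allclose(np.linalg.inv(A.T), Ainv.T))         # (A')^{-1} = (A^{-1})'
```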

Which of the following matrices is the inverse of itself?

  1. $\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix}$

  2. $\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$

  3. $\begin{bmatrix} 1 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & 1 \end{bmatrix}$

  4. $\begin{bmatrix} 0 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 0 \end{bmatrix}$


Correct Option: B
Explanation:

The inverse of the identity (unit) matrix is the identity matrix itself, since $I\cdot I=I$. Each of the other options has determinant $0$ and hence no inverse at all.
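
A one-line numerical confirmation (a minimal NumPy sketch):

```python
import numpy as np

# The identity matrix is its own inverse: I * I = I.
I = np.eye(3)
print(np.allclose(np.linalg.inv(I), I))  # True
```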