Tag: maths

Questions Related to maths

The number of $2\times 2$ matrices $A=\left[ \begin{matrix} a & b \\ c & d \end{matrix} \right] $ for which ${ \left[ \begin{matrix} a & b \\ c & d \end{matrix} \right]  }^{ -1 }$ $=\left[ \begin{matrix} \frac { 1 }{ a }  & \frac { 1 }{ b }  \\ \frac { 1 }{ c }  & \frac { 1 }{ d }  \end{matrix} \right] $, $(a,b,c,d\in R)$ is

  1. $0$

  2. $1$

  3. $2$

  4. Infinite


Correct Option: A
Explanation:

If $A=\begin{bmatrix} a & b \\ c & d \end{bmatrix}$, then $A^{-1}=\dfrac{1}{ad-bc}\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$.

For $\begin{bmatrix} 1/a & 1/b \\ 1/c & 1/d \end{bmatrix}$ to be defined, all of $a, b, c, d$ must be non-zero. Equating the $(1,1)$ entries of the two expressions for $A^{-1}$ gives $\dfrac{d}{ad-bc}=\dfrac{1}{a}$, i.e. $ad=ad-bc$, so $bc=0$. This contradicts $b\neq 0$ and $c\neq 0$, so the equality is not possible for any real values of $a, b, c, d$.
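As a sanity check (added here, not part of the original solution), the contradiction can be reproduced symbolically with SymPy: equating the $(1,1)$ entry of the true inverse with $1/a$ leaves exactly $bc$, which must therefore be zero.

```python
import sympy as sp

# All four entries must be non-zero for 1/a, 1/b, 1/c, 1/d to exist.
a, b, c, d = sp.symbols('a b c d', nonzero=True)
A = sp.Matrix([[a, b], [c, d]])

# True inverse via the adjugate formula: (1/(ad-bc)) [[d, -b], [-c, a]]
Ainv = A.adjugate() / A.det()

# Equating (1,1) entries: d/(ad-bc) = 1/a.  Multiplying the difference
# by a*(ad-bc) leaves exactly b*c, so the equation forces bc = 0,
# contradicting b != 0 and c != 0.
lhs = sp.expand((Ainv[0, 0] - 1/a) * a * A.det())
print(lhs)   # b*c
```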

Let $A=\begin{pmatrix} -5 & -8 & -7 \\ 3 & 5 & 4 \\ 2 & 3 & 3 \end{pmatrix},\ B=\begin{pmatrix} x \\ y \\ z \end{pmatrix}$. If $AB$ is a scalar $\left( \neq 0 \right)$ multiple of $B$, then $x+y=$

  1. $z$

  2. $-z$

  3. $0$

  4. $2z$


Correct Option: B
Explanation:
$A=\begin{pmatrix} -5 & -8 & -7 \\ 3 & 5 & 4 \\ 2 & 3 & 3 \end{pmatrix}\quad B=\begin{pmatrix} x \\ y \\ z \end{pmatrix}$
Given $AB=k\ B$
$AB=\begin{pmatrix} -5 & -8 & -7 \\ 3 & 5 & 4 \\ 2 & 3 & 3 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix}=k\begin{pmatrix} x \\ y \\ z \end{pmatrix}$
$\Rightarrow \begin{pmatrix} -5x-8y-7z \\ 3x+5y+4z \\ 2x+3y+3z \end{pmatrix}=k\begin{pmatrix} x \\ y \\ z \end{pmatrix}$
Adding all the entries on the left and right sides:
$(-5x-8y-7z)+(3x+5y+4z)+(2x+3y+3z)=k(x+y+z)$
$\Rightarrow \ 0=k(x+y+z)$
$k\neq 0$
$\Rightarrow \ x+y+z=0$
$\Rightarrow \ x+y=-z$
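The cancellation above happens because every column of $A$ sums to zero, so the row vector $(1,1,1)$ annihilates $A$ on the left and the entries of $AB$ sum to zero for any column $B$. A quick NumPy spot-check (illustrative only; the sample column is arbitrary):

```python
import numpy as np

A = np.array([[-5, -8, -7],
              [ 3,  5,  4],
              [ 2,  3,  3]])

# Each column of A sums to zero, so (1,1,1) A = 0 ...
print(A.sum(axis=0))          # [0 0 0]

# ... hence the entries of AB sum to 0 for ANY column B:
B = np.array([[1.0], [2.0], [-3.0]])   # arbitrary sample column
print(float((A @ B).sum()))
```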

If $A^{-1} = \alpha A + \beta I$ where $\alpha, \beta \in R$, then $\alpha + \beta$ is equal to (where $A^{-1}$ denotes inverse of matrix $A$)-

  1. $1$

  2. $\dfrac{4}{3}$

  3. $\dfrac{5}{3}$

  4. $\dfrac{1}{3}$


Correct Option: A

If $A=\begin{bmatrix} \alpha & 0 \\ 1 & 1 \end{bmatrix}$ and $B=\begin{bmatrix} 1 & 0 \\ 5 & 1 \end{bmatrix}$, find the values of $\alpha$ for which $A^2=B$.

  1. $\pm 1$

  2. $4$

  3. $0$

  4. No value


Correct Option: D
Explanation:

We have,

$A^2=B$
$\begin{bmatrix} \alpha & 0 \\ 1 & 1 \end{bmatrix} \begin{bmatrix} \alpha & 0 \\ 1 & 1 \end{bmatrix} =\begin{bmatrix} 1 & 0 \\ 5 & 1 \end{bmatrix}$

$\begin{bmatrix} \alpha^2 +0 & 0+0 \\ \alpha +1 & 0+1 \end{bmatrix}=\begin{bmatrix} 1 & 0 \\ 5 & 1 \end{bmatrix}$

$\begin{bmatrix} \alpha^2 & 0 \\ \alpha +1 & 1 \end{bmatrix} =\begin{bmatrix} 1 & 0 \\ 5 & 1 \end{bmatrix}$

$\alpha^2=1$ and $\alpha +1=5$
$\alpha =\pm 1$ and $\alpha =4$, which is not possible.
Hence, there is no value of $\alpha$ for which $A^2=B$ is true.
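As a quick numerical cross-check (added here, not part of the original solution), none of the candidate values forced by the individual entries actually satisfies $A^2=B$:

```python
import numpy as np

B = np.array([[1, 0],
              [5, 1]])

# A^2 = [[a^2, 0], [a+1, 1]] can never equal B: the diagonal forces
# a = +1 or -1, while the bottom-left corner forces a = 4.
for a in (1, -1, 4):
    A = np.array([[a, 0],
                  [1, 1]])
    print(a, np.array_equal(A @ A, B))   # False for every candidate
```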

Let $P=\begin{bmatrix} \cos { \dfrac { \pi  }{ 9 }  }  & \sin { \dfrac { \pi  }{ 9 }  }  \\ -\sin { \dfrac { \pi  }{ 9 }  }  & \cos { \dfrac { \pi  }{ 9 }  }  \end{bmatrix}$ and $\alpha,\ \beta,\ \gamma$ be non-zero real numbers such that $\alpha P^{6}+\beta P^{3}+\gamma I$ is the zero matrix. Then, $(\alpha^{2}+\beta^{2}+\gamma^{2})^{(\alpha-\beta)(\beta-\gamma)(\gamma-\alpha)}$ is

  1. $\pi$

  2. $\dfrac {\pi}{2}$

  3. $0$

  4. $1$


Correct Option: D
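Explanation (sketch added here; the original gives no working): $P$ has the rotation form with angle $\pi/9$, so $P^{n}$ is the same matrix with $\pi/9$ replaced by $n\pi/9$:
$P^{3}=\begin{bmatrix} \cos{\dfrac{\pi}{3}} & \sin{\dfrac{\pi}{3}} \\ -\sin{\dfrac{\pi}{3}} & \cos{\dfrac{\pi}{3}} \end{bmatrix},\qquad P^{6}=\begin{bmatrix} \cos{\dfrac{2\pi}{3}} & \sin{\dfrac{2\pi}{3}} \\ -\sin{\dfrac{2\pi}{3}} & \cos{\dfrac{2\pi}{3}} \end{bmatrix}$
Substituting into $\alpha P^{6}+\beta P^{3}+\gamma I=O$, the off-diagonal entries give $\dfrac{\sqrt{3}}{2}(\alpha+\beta)=0\Rightarrow \beta=-\alpha$, and the diagonal entries give $-\dfrac{\alpha}{2}+\dfrac{\beta}{2}+\gamma=0\Rightarrow \gamma=\alpha$.
Hence $\gamma-\alpha=0$, so the exponent $(\alpha-\beta)(\beta-\gamma)(\gamma-\alpha)$ vanishes, while $\alpha^{2}+\beta^{2}+\gamma^{2}=3\alpha^{2}\neq 0$; the expression equals $(3\alpha^{2})^{0}=1$.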

Consider three matrices $A=\begin{bmatrix} 2 & 1 \\ 4 & 1 \end{bmatrix}, B=\begin{bmatrix} 3 & 4 \\ 2 & 3 \end{bmatrix}$ and $C=\begin{bmatrix} 3 & -4 \\ -2 & 3 \end{bmatrix}$. Then the value of the sum $tr(A)+tr\left(\dfrac{ABC}{2}\right)+tr\left(\dfrac{A(BC)^{2}}{4}\right)+tr\left(\dfrac{A(BC)^{3}}{8}\right)+....+\infty$ is 

  1. $6$

  2. $9$

  3. $12$

  4. $3$


Correct Option: A
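Explanation (sketch added here; the original gives no working): $BC=\begin{bmatrix} 3 & 4 \\ 2 & 3 \end{bmatrix}\begin{bmatrix} 3 & -4 \\ -2 & 3 \end{bmatrix}=\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}=I$, so every term is $tr(A)/2^{n}$ and the sum is the geometric series $tr(A)\left(1+\dfrac{1}{2}+\dfrac{1}{4}+\cdots\right)=3\times 2=6$. A NumPy spot-check of the partial sums (illustrative only):

```python
import numpy as np

A = np.array([[2, 1], [4, 1]])
B = np.array([[3, 4], [2, 3]])
C = np.array([[3, -4], [-2, 3]])

# BC is the 2x2 identity, so A(BC)^n = A for every n.
print(np.array_equal(B @ C, np.eye(2, dtype=int)))   # True

# Partial sum of tr(A(BC)^n)/2^n for n = 0..59: geometric, ~ 6.
total = sum(np.trace(A @ np.linalg.matrix_power(B @ C, n)) / 2**n
            for n in range(60))
print(round(total, 6))
```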

If $A(\theta) = \begin{bmatrix}\sin  \theta & i  \cos  \theta \\ i  \cos  \theta & \sin  \theta\end{bmatrix}$, then which of the following are true?

  1. $A(\theta)^{-1} = A(\pi - \theta)$

  2. $A(\theta) + A(\pi + \theta)$ is a null matrix

  3. $A(\theta)$ is invertible for all $\theta \in R$

  4. $A(\theta)^{-1} = A(- \theta)$


Correct Option: A,B,C
Explanation:

Finding the inverse of the matrix $A(\theta)= \begin{bmatrix} \sin\theta & i\cos\theta \\ i\cos\theta & \sin\theta\end{bmatrix}$:


Determinant of $A(\theta)$ is $|A(\theta)|=\sin^2\theta-i^2\cos^2\theta$
                                                    $= \sin^2\theta+\cos^2\theta$
                                                    $=  1$

Therefore $A(\theta)$ is a non-singular matrix, so it is invertible for all $\theta \in R$.

$A(\theta)^{-1} = \begin{bmatrix} \sin\theta & -i\cos\theta \\ -i\cos\theta & \sin\theta \end{bmatrix}$

Now, $A(\pi -\theta)=\begin{bmatrix} \sin(\pi-\theta) & i\cos(\pi-\theta) \\ i\cos(\pi-\theta) & \sin(\pi-\theta) \end{bmatrix}$
                          $=\begin{bmatrix} \sin\theta  & -i\cos\theta \\ -i\cos\theta & \sin\theta \end{bmatrix}$
                          $= A(\theta)^{-1}$

Now, $A(\pi+\theta)= \begin{bmatrix} \sin(\pi+\theta) & i\cos(\pi+\theta) \\ i\cos(\pi+\theta) & \sin(\pi+\theta) \end{bmatrix} $
                          $= \begin{bmatrix} -\sin\theta & -i\cos\theta \\ -i\cos\theta & -\sin\theta \end{bmatrix}$
                          $= -A(\theta)$

Therefore, $A(\theta) + A(\pi+\theta)=0$.

Hence, the correct options are $(A), (B)$ and $(C)$.
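The three identities can be spot-checked numerically with complex matrices (illustrative only; the sample angle below is arbitrary):

```python
import numpy as np

def A(t: float) -> np.ndarray:
    """The matrix A(theta) = [[sin t, i cos t], [i cos t, sin t]]."""
    return np.array([[np.sin(t), 1j * np.cos(t)],
                     [1j * np.cos(t), np.sin(t)]])

t = 0.7  # arbitrary sample angle

print(np.allclose(np.linalg.inv(A(t)), A(np.pi - t)))  # True:  A^{-1} = A(pi - t)
print(np.allclose(A(t) + A(np.pi + t), 0))             # True:  A(t) + A(pi + t) = 0
print(np.allclose(np.linalg.inv(A(t)), A(-t)))         # False: A^{-1} != A(-t)
```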

Write the following transformation in matrix form
$\quad x _1 = \displaystyle\frac{\sqrt 3}{2}y _1 + \displaystyle\frac{1}{2}y _2; \quad x _2 = -\displaystyle\frac{1}{2}y _1 + \displaystyle\frac{\sqrt 3}{2}y _2$.
Hence find the transformation in matrix form which expresses $y _1, y _2$ in terms of $x _1, x _2$.

  1. $y _1 = \displaystyle\frac{\sqrt 3}{2}x _1 + \displaystyle\frac{1}{2}x _2; \quad y _2 = \displaystyle\frac{1}{2}x _1 + \displaystyle\frac{\sqrt 3}{2}x _2$

  2. $y _1 = \displaystyle\frac{\sqrt 3}{2}x _1 - \displaystyle\frac{1}{2}x _2; \quad y _2 = \displaystyle\frac{1}{2}x _1 + \displaystyle\frac{\sqrt 3}{2}x _2$

  3. $y _1 = \displaystyle\frac{\sqrt 3}{2}x _1 - \displaystyle\frac{1}{2}x _2; \quad y _2 = \displaystyle\frac{1}{2}x _1 - \displaystyle\frac{\sqrt 3}{2}x _2$

  4. None of these


Correct Option: B
Explanation:

$ \displaystyle  { x } _{ 1 }=\frac { \sqrt { 3 }  }{ 2 } { y } _{ 1 }+\frac { 1 }{ 2 } { y } _{ 2 }  $ and $\displaystyle { x } _{ 2 }=\frac { -1 }{ 2 } { y } _{ 1 }+\frac { \sqrt { 3 }  }{ 2 } { y } _{ 2 } $ 
We observe $ \displaystyle \frac { \sqrt { 3 }  }{ 2 } { x } _{ 1 }-\frac { 1 }{ 2 } { x } _{ 2 }=\frac { 3 }{ 4 } { y } _{ 1 }+\frac { \sqrt { 3 }  }{ 2 } .\frac { 1 }{ 2 } { y } _{ 2 }+\frac { 1 }{ 4 } { y } _{ 1 }-\frac { \sqrt { 3 }  }{ 2 } \frac { 1 }{ 2 } { y } _{ 2 } $
$ \displaystyle \Rightarrow \frac { \sqrt { 3 }  }{ 2 } { x } _{ 1 }-\frac { 1 }{ 2 } { x } _{ 2 }={ y } _{ 1 } $
Similarly $ \displaystyle \frac { 1 }{ 2 } { x } _{ 1 }+\frac { \sqrt { 3 }  }{ 2 } { x } _{ 2 }=\frac { 1 }{ 4 } { y } _{ 2 }+\frac { 3 }{ 4 } { y } _{ 2 }={ y } _{ 2 } $
$ \displaystyle \therefore { y } _{ 1 }=\frac { \sqrt { 3 }  }{ 2 } { x } _{ 1 }-\frac { 1 }{ 2 } { x } _{ 2 };{ y } _{ 2 }=\frac { 1 }{ 2 } { x } _{ 1 }+\frac { \sqrt { 3 }  }{ 2 } { x } _{ 2 }  $ 
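For completeness (the question also asks for the matrix form, which the working above skips), the two transformations can be written as
$\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}=\begin{pmatrix} \dfrac{\sqrt{3}}{2} & \dfrac{1}{2} \\ -\dfrac{1}{2} & \dfrac{\sqrt{3}}{2} \end{pmatrix}\begin{pmatrix} y_1 \\ y_2 \end{pmatrix}$ and $\begin{pmatrix} y_1 \\ y_2 \end{pmatrix}=\begin{pmatrix} \dfrac{\sqrt{3}}{2} & -\dfrac{1}{2} \\ \dfrac{1}{2} & \dfrac{\sqrt{3}}{2} \end{pmatrix}\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}$.
The coefficient matrix is a rotation, so its inverse is simply its transpose, which matches the coefficients derived above.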

If $A=\left[ \begin{array}{ll}{x} & {1} \\ {1} & {0}\end{array}\right]$ and $A^{2}=I$, $A^{-1}$ is equal to ...............

  1. $\left[ \begin{array}{ll}{0} & {1} \\ {1} & {0}\end{array}\right]$

  2. $\left[ \begin{array}{ll}{1} & {0} \\ {0} & {1}\end{array}\right]$

  3. $\left[ \begin{array}{ll}{1} & {1} \\ {1} & {1}\end{array}\right]$

  4. $\left[ \begin{array}{ll}{0} & {0} \\ {0} & {0}\end{array}\right]$


Correct Option: A
Explanation:
$A=\left[\begin{matrix} x & 1 \\ 1 & 0  \end{matrix}\right]$
Given: ${A}^{2}=I$ where $I$ is $2\times 2$ identity matrix
Let us find ${A}^{2}$
$=\left[\begin{matrix} x & 1 \\ 1 & 0  \end{matrix}\right]\left[\begin{matrix} x & 1 \\ 1 & 0  \end{matrix}\right]$
$=\left[\begin{matrix} {x}^{2}+1 & x+0 \\ x+0 & 1+0  \end{matrix}\right]$
Given ${A}^{2}=I$
$\Rightarrow \left[\begin{matrix} {x}^{2}+1 & x \\ x & 1  \end{matrix}\right]=\left[\begin{matrix} 1 & 0 \\ 0 & 1  \end{matrix}\right]$
Equating entries, we get
${x}^{2}+1=1$ and $x=0$, so $x=0$
Put $x=0$ in $A$
$A=\left[\begin{matrix} 0 & 1 \\ 1 & 0 \end{matrix}\right]$
We have ${A}^{2}=I$
Pre-multiplying both sides by ${A}^{-1}$, we get
${A}^{-1}{A}^{2}={A}^{-1}I$
$\Rightarrow A={A}^{-1}$
Hence,${A}^{-1}=\left[\begin{matrix} 0 & 1 \\ 1 & 0 \end{matrix}\right]$
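A quick NumPy confirmation (illustrative only) that the resulting matrix is its own inverse:

```python
import numpy as np

A = np.array([[0, 1],
              [1, 0]])    # the matrix obtained with x = 0

print(np.array_equal(A @ A, np.eye(2, dtype=int)))   # True: A^2 = I
print(np.allclose(np.linalg.inv(A), A))              # True: A^{-1} = A
```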

If $A$ and $B$ are any $2\times2$ matrices, then $\det(A+B)=0$ implies

  1. None of these

  2. det A=0 and det B=0

  3. det A=0 or det B=0

  4. det A + det B = 0


Correct Option: A
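The answer is "None of these" because $\det(A+B)=0$ constrains neither determinant individually. A concrete counterexample (added here for illustration): with $A=I$ and $B=-I$, $\det(A+B)=\det O=0$ while $\det A=\det B=1$.

```python
import numpy as np

# det(A + B) = 0 although det A and det B are both non-zero:
A = np.eye(2)
B = -np.eye(2)

print(np.linalg.det(A + B))                  # 0.0
print(np.linalg.det(A), np.linalg.det(B))    # both 1.0 (det(-I) = (-1)^2 = 1)
```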