Tag: time series

Questions related to time series

The method of least squares dictates that we choose the regression line for which the sum of the squares of the deviations of the points from the line is:

  1. Maximum

  2. Minimum

  3. Zero

  4. Positive


Correct Option: B
Explanation:

$\Rightarrow$  The method of least squares dictates that we choose the regression line for which the sum of the squares of the deviations of the points from the line is minimum.

$\Rightarrow$  Regression is a process by which we estimate the value of a dependent variable on the basis of one or more independent variables.
$\Rightarrow$  More specifically, regression analysis helps one understand how the typical value of the dependent variable (or 'criterion variable') changes when any one of the independent variables is varied while the other independent variables are held fixed.
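
As a minimal sketch (the data and the use of NumPy are assumptions for illustration, not part of the question), the snippet below fits a least-squares line and checks that nudging its coefficients only increases the sum of squared deviations:

```python
# Minimal sketch: the least-squares line minimizes the sum of squared deviations.
# The data values are made up purely for demonstration.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# np.polyfit returns [slope, intercept] for a degree-1 fit of y = a + b*x
b, a = np.polyfit(x, y, 1)

def sse(intercept, slope):
    """Sum of squared deviations of the points from the line."""
    return np.sum((y - (intercept + slope * x)) ** 2)

print("SSE at the least-squares line:", sse(a, b))
print("SSE with the slope nudged:    ", sse(a, b + 0.1))   # larger
print("SSE with the intercept nudged:", sse(a + 0.1, b))   # larger
```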

In simple linear regression, the number of unknown constants is: 

  1. One

  2. Two

  3. Three

  4. Four


Correct Option: B
Explanation:

$\Rightarrow$  In simple linear regression, the number of unknown constants is two.

$\Rightarrow$  The line of regression of $y$ on $x$ is given by $y=a+bx$, where $a$ and $b$ are unknown constants known as the intercept and slope of the line. This line is used to predict the unknown value of the variable $y$ when the value of the variable $x$ is known.
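
A minimal sketch (data made up for illustration) computing the two unknown constants from their closed-form least-squares expressions:

```python
# Minimal sketch with made-up data: estimating the two unknown constants
# a (intercept) and b (slope) of y = a + b*x by their closed-form formulas.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.9])

x_bar, y_bar = x.mean(), y.mean()
b = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)  # slope
a = y_bar - b * x_bar                                             # intercept

print(f"a = {a:.3f}, b = {b:.3f}")  # the two unknown constants
```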

If one regression coefficient is greater than one, then the other will be:

  1. Less than one

  2. More than one

  3. Equal to one

  4. None of these


Correct Option: A
Explanation:

Both regression coefficients $(b_{xy}, b_{yx})$ must have the same sign, i.e., if one of them is positive the other must be positive, and if one of them is negative the other must be negative.

Since their product satisfies $b_{xy}\cdot b_{yx}=r^2\le 1$, if one regression coefficient is greater than one, then the other coefficient must be less than one.
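
A minimal numerical check (the data are made up for illustration): the product of the two regression coefficients equals $r^2\le 1$, so when one exceeds one the other falls below one.

```python
# Minimal sketch with made-up data: b_yx * b_xy = r^2 <= 1.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.5, 4.2, 5.1, 8.3, 9.0])

cov_xy = np.cov(x, y, ddof=1)[0, 1]
b_yx = cov_xy / np.var(x, ddof=1)    # regression coefficient of y on x
b_xy = cov_xy / np.var(y, ddof=1)    # regression coefficient of x on y
r = np.corrcoef(x, y)[0, 1]

print("b_yx =", round(b_yx, 3), "(> 1)")
print("b_xy =", round(b_xy, 3), "(< 1)")
print("b_yx * b_xy =", round(b_yx * b_xy, 3), "  r^2 =", round(r**2, 3))
```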

The purpose of simple linear regression analysis is to: 

  1. Predict one variable from another variable

  2. Replace points on a scatter diagram by a straight-line

  3. Measure the degree to which two variables are linearly associated

  4. Obtain the expected value of the independent random variable for a given value of the dependent variable


Correct Option: A
Explanation:

The regression model gives the relation between two or more variables.

The linear regression model gives the relation between two or more variables using a straight line.

Using linear regression analysis, we can predict the value of one variable from another variable.
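
A minimal sketch (the variable names and numbers are hypothetical) of predicting one variable from another with a fitted simple linear regression:

```python
# Minimal sketch with hypothetical data: predicting the dependent variable
# (exam score) from the independent variable (hours studied).
import numpy as np

hours_studied = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # independent variable x
exam_score = np.array([52.0, 58.0, 65.0, 71.0, 78.0])  # dependent variable y

slope, intercept = np.polyfit(hours_studied, exam_score, 1)

new_x = 3.5                                   # an x value with unknown y
predicted_y = intercept + slope * new_x
print(f"Predicted score for {new_x} hours of study: {predicted_y:.1f}")
```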

Ayushi used the data from a scatterplot to determine a regression model showing the relationship between the population in the area where she lived and the number of years, $x$, after she was born. The result was an exponential growth equation of the form $y={x} _{0}{\left(1+r\right)}^{x}$. Then ${x} _{0}$ most likely represents

  1. The population in the year that she was born

  2. The rate of change of the population over time

  3. The maximum population reached during her lifetime

  4. The number of years after her birth when the population reached its maximum


Correct Option: A
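
Since $(1+r)^0=1$, the model gives $y=x_0$ at $x=0$, i.e., $x_0$ is the population in the year she was born. A minimal sketch follows (the population and growth rate are hypothetical numbers, not from the question):

```python
# Minimal sketch: in y = x0 * (1 + r)**x, setting x = 0 returns x0 itself.
# x0 and r are hypothetical values chosen only for illustration.
x0, r = 250_000, 0.02

def population(x):
    """Population x years after birth under the exponential growth model."""
    return x0 * (1 + r) ** x

print(population(0))            # 250000 -> equals x0, the birth-year population
print(round(population(10)))    # population ten years after birth
```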

The independent variable in a regression line is: 

  1. Non-random variable

  2. Random variable

  3. Qualitative variable

  4. None of these


Correct Option: A
Explanation:

A linear regression line has an equation of the form $Y=a+bX$


where $Y$ is called the dependent variable or response,
$X$ is called the independent variable, predictor, or explanatory variable,
$a$ is the $y$-intercept, and
$b$ is the slope of the line.

The independent variable is treated as a non-random variable; a non-random variable does not admit a probability measure.

In regression analysis, if the observed cost value is $50$ and the predicted cost value is $7$, then the disturbance term is

  1. $53$

  2. $37$

  3. $43$

  4. None of these


Correct Option: C
Explanation:

Given that the observed value is $50$ and the predicted value is $7$,

the disturbance term is $\text{observed} - \text{predicted} = 50 - 7 = 43$.

State true or false: The coefficient of correlation between two variables $x$ and $y$ is:

$r=\dfrac{\sigma_x^2+\sigma_y^2-\sigma_{x-y}^2}{2\sigma_x\sigma_y}$

  1. True

  2. False


Correct Option: A
Explanation:
The coefficient of correlation expresses the degree of linear relationship between two variables.
The coefficient of correlation between two variables $x$ and $y$ can be written as

$r=\dfrac{\sigma_x^2+\sigma_y^2-\sigma_{x-y}^2}{2\sigma_x\sigma_y}$

where $\sigma_x$ is the standard deviation of $x$,
$\sigma_y$ is the standard deviation of $y$, and
$\sigma_{x-y}$ is the standard deviation of $x-y$.
This follows from rearranging $\sigma_{x-y}^2=\sigma_x^2+\sigma_y^2-2r\,\sigma_x\sigma_y$.
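
A minimal numerical check (made-up data) that this identity agrees with the usual product-moment correlation; `np.corrcoef` is used only as an independent reference:

```python
# Minimal sketch with made-up data: check the identity
#   r = (var(x) + var(y) - var(x - y)) / (2 * sd(x) * sd(y))
# against NumPy's product-moment correlation.
import numpy as np

x = np.array([2.0, 4.0, 5.0, 7.0, 9.0])
y = np.array([1.0, 3.0, 6.0, 6.5, 8.0])

var_x, var_y = np.var(x, ddof=1), np.var(y, ddof=1)
var_diff = np.var(x - y, ddof=1)

r_identity = (var_x + var_y - var_diff) / (2 * np.sqrt(var_x) * np.sqrt(var_y))
r_direct = np.corrcoef(x, y)[0, 1]

print(round(r_identity, 6), round(r_direct, 6))  # the two values agree
```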

The sum of the differences between the actual values of $Y$ and the values obtained from the fitted regression line is always:

  1. Zero

  2. Positive

  3. Negative

  4. Minimum


Correct Option: A
Explanation:

Let the actual values be $y_1,y_2,...,y_n$

Let the fitted values obtained from the regression line be $\hat y_1,\hat y_2,...,\hat y_n$

A least-squares fit with an intercept satisfies the normal equation $\sum_{i}(y_i-\hat y_i)=0$, obtained by setting the derivative of $\sum_{i}(y_i-a-bx_i)^2$ with respect to $a$ equal to zero.

Therefore, the sum of the differences between the actual and fitted values is $(y_1-\hat y_1)+(y_2-\hat y_2)+...+(y_n-\hat y_n)=0$
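
A minimal sketch (made-up data) confirming numerically that the residuals of a least-squares line with an intercept sum to zero:

```python
# Minimal sketch with made-up data: the residuals of a least-squares line
# fitted with an intercept sum to zero (up to floating-point rounding).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.3, 2.9, 4.1, 4.8, 6.2, 6.9])

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (intercept + slope * x)

print(residuals.sum())  # ~0
```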

If all the actual and estimated values of $Y$ are same on the regression line, the sum of squares of error will be:

  1. Zero

  2. Minimum

  3. Maximum

  4. Unknown


Correct Option: A
Explanation:

Let the actual values be $y_1,y_2,...,y_n$

Let the estimated values be $\hat y_1,\hat y_2,...,\hat y_n$

The error is $\text{actual}-\text{estimated}$

Given that the actual and estimated values are equal,
$y_1=\hat y_1$, $y_2=\hat y_2$ , ... , $y_n=\hat y_n$

The sum of squares of the errors is $(y_1-\hat y_1)^2+(y_2-\hat y_2)^2+...+(y_n-\hat y_n)^2=0$
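
A minimal sketch using exactly collinear (made-up) data, so the fitted values coincide with the actual values and the sum of squared errors is zero:

```python
# Minimal sketch: exactly collinear data, so fitted values equal actual values
# and the sum of squared errors is zero (up to floating-point rounding).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 3.0 + 2.0 * x                      # points lying exactly on y = 3 + 2x

slope, intercept = np.polyfit(x, y, 1)
y_hat = intercept + slope * x

print(np.sum((y - y_hat) ** 2))        # 0.0 (or ~1e-30)
```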