# ISS (Statistical Services) Statistics Paper II (Old Subjective Pattern): Questions 1 - 8 of 39


## Question number: 1

» Statistical Quality Control » Sequential Sampling Plans

Appeared in Year: 2015

### Describe in Detail

Distinguish between the single sampling plan and the double sampling plan. Discuss how O.C. (operating characteristic) curves can be used to compare two sampling plans.

### Explanation

In a single sampling plan, the decision to accept or reject a lot is based on one inspected sample, whereas in a double sampling plan, if a decision about the acceptance or rejection of a lot has not been reached after the first sample from a submitted lot, a decision will always be reached when the second sampl…

… (259 more words) …
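Although the full comparison is truncated above, O.C. curves can be compared numerically: the probability of acceptance P _{a} (p) is computed for each plan over a grid of lot quality values p, and the curves are read side by side. The sketch below does this with binomial probabilities; the plan parameters (n = 50, c = 2 for the single plan; n₁ = n₂ = 30, c₁ = 1, r₁ = 4, c₂ = 3 for the double plan) are illustrative assumptions, not values from the original answer.

```python
from math import comb

def binom_pmf(d, n, p):
    """P(X = d) for X ~ Bin(n, p)."""
    return comb(n, d) * p**d * (1 - p)**(n - d)

def binom_cdf(c, n, p):
    """P(X <= c) for X ~ Bin(n, p); returns 0 if c < 0."""
    return sum(binom_pmf(d, n, p) for d in range(c + 1))

def oc_single(p, n, c):
    """OC curve of a single sampling plan (n, c):
    accept the lot if the sample of n items has at most c defectives."""
    return binom_cdf(c, n, p)

def oc_double(p, n1, c1, r1, n2, c2):
    """OC curve of a double sampling plan: accept on the first sample if
    d1 <= c1, reject if d1 >= r1; otherwise draw a second sample of n2
    items and accept if d1 + d2 <= c2."""
    pa = binom_cdf(c1, n1, p)
    for d1 in range(c1 + 1, r1):
        pa += binom_pmf(d1, n1, p) * binom_cdf(c2 - d1, n2, p)
    return pa

# Tabulate both OC curves at a few values of the lot fraction defective.
for p in (0.01, 0.05, 0.10):
    print(p, round(oc_single(p, 50, 2), 4), round(oc_double(p, 30, 1, 4, 30, 3), 4))
```

Plotting P _{a} (p) against p for both plans shows which plan is tighter at a given lot quality: the steeper curve discriminates better between good and bad lots.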

## Question number: 2

» Hypotheses Testing » Likelihood Ratio Test » ASN Function

Appeared in Year: 2015

### Describe in Detail

Derive the likelihood ratio test for comparing the means of k independent homoscedastic normal populations.

### Explanation

Given that there are k independent homoscedastic normal populations, that is, the variances are equal: σ _{i} ^{2} = σ ^{2}, i = 1, 2, …, k. We have to test H _{0}: μ _{1} = μ _{2} = … = μ _{k} against the alternative that at least two of the means differ.

In the i-th population, the sample is {x _{i1}, x _{i2}, …, x _{ini}}.

The parameter space is

The parameter space under H _{0} is

The likelihood function is

The unrestricted MLE is

The restricted MLE is

The…

… (41 more words) …
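Since the displayed equations did not survive extraction, here is a hedged sketch of the standard derivation, writing N = n₁ + … + n _{k} for the total sample size:

```latex
% Unrestricted MLEs:
\hat{\mu}_i = \bar{x}_i, \qquad
\hat{\sigma}^2 = \frac{1}{N}\sum_{i=1}^{k}\sum_{j=1}^{n_i}(x_{ij}-\bar{x}_i)^2,
\qquad N = \sum_{i=1}^{k} n_i
% Restricted MLEs under H_0 : \mu_1 = \cdots = \mu_k = \mu:
\hat{\mu} = \bar{x}, \qquad
\hat{\sigma}_0^2 = \frac{1}{N}\sum_{i=1}^{k}\sum_{j=1}^{n_i}(x_{ij}-\bar{x})^2
% Likelihood ratio:
\lambda = \frac{\sup_{\Theta_0} L}{\sup_{\Theta} L}
        = \left(\frac{\hat{\sigma}^2}{\hat{\sigma}_0^2}\right)^{N/2}
% Rejecting for small \lambda is equivalent to rejecting for large
F = \frac{\sum_i n_i(\bar{x}_i-\bar{x})^2/(k-1)}
         {\sum_i \sum_j (x_{ij}-\bar{x}_i)^2/(N-k)},
% the one-way ANOVA F statistic with (k-1, N-k) degrees of freedom.
```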

## Question number: 3

» Estimation » Optimal Properties » Complete Statistics

Appeared in Year: 2014

### Describe in Detail

Define completeness. Verify whether Bin (1, p) is complete.

### Explanation

**Completeness**: It is a property of a statistic in relation to a model for a set of observed data. In essence, it is a condition which ensures that the parameters of the probability distribution representing the model can all be estimated on the basis of the statistic: it ensures that the distributions corresponding to different values of the parame…

… (263 more words) …
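The verification for Bin (1, p) is short enough to sketch here; this is the standard argument, reconstructed since the explanation's equations are missing:

```latex
% Completeness of X ~ Bin(1, p), 0 < p < 1: suppose E_p[g(X)] = 0 for all p.
E_p[g(X)] = (1-p)\,g(0) + p\,g(1) = g(0) + \bigl(g(1)-g(0)\bigr)p = 0
\quad \text{for all } p \in (0,1)
% A polynomial in p that vanishes on an interval has all coefficients zero:
\Rightarrow\; g(0) = 0 \text{ and } g(1) = 0
\;\Rightarrow\; P_p\bigl(g(X) = 0\bigr) = 1 \text{ for every } p,
% so the family \{\,\mathrm{Bin}(1,p) : 0 < p < 1\,\} is complete.
```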

## Question number: 4

» Estimation » Estimation Methods » Methods of Moments

Appeared in Year: 2014

### Describe in Detail

For the Pareto distribution with pdf

Show that the method of moments fails if 0 < λ < 1. State the method of moments estimator when λ > 1. Is it consistent? Justify your answer.

### Explanation

Let X _{1}, X _{2}, …, X _{n} be a simple random sample of Pareto random variables with density

The mean and variance are respectively

In this we have only one parameter λ. Thus, we will only need to determine the first moment

To find the method of moments estimator for λ, we solve λ as a function of the mean µ.

Consequently, a method of m…

… (79 more words) …
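The density did not survive extraction; the sketch below assumes the standard Pareto form f (x) = λx ^{−(λ+1)}, x ≥ 1, which is consistent with a single parameter λ as stated above:

```latex
E(X) = \int_{1}^{\infty} x \cdot \lambda x^{-(\lambda+1)}\,dx
     = \lambda \int_{1}^{\infty} x^{-\lambda}\,dx
     = \begin{cases} \dfrac{\lambda}{\lambda-1}, & \lambda > 1,\\[4pt]
       \infty, & 0 < \lambda \le 1. \end{cases}
% For 0 < \lambda < 1 the first moment does not exist, so the method of
% moments fails. For \lambda > 1, equate \bar{X} to the mean:
\bar{X} = \frac{\hat{\lambda}}{\hat{\lambda}-1}
\;\Rightarrow\; \hat{\lambda} = \frac{\bar{X}}{\bar{X}-1}
% Consistency: \bar{X} \xrightarrow{P} \lambda/(\lambda-1) by the WLLN,
% and t \mapsto t/(t-1) is continuous at that point, so
% \hat{\lambda} \xrightarrow{P} \lambda by the continuous mapping theorem.
```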

## Question number: 5

» Estimation » Estimation Methods » Methods of Moments

Appeared in Year: 2014

### Describe in Detail

Let X _{1}, X _{2}, …, X _{n} be a random sample from U (0, θ). Obtain the moment estimator of θ. Also find its variance.

### Explanation

Let X _{1}, X _{2}, …, X _{n} be a random sample from U (0, θ). We know that

The estimating equation is

Solving the above equation for the parameter, we get the estimator by the method of moments

The variance of this estimator is

Here the samples are independent and identically distributed, and

So, using this

… (0 more words) …
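The missing equations follow the standard route; a reconstructed sketch:

```latex
% Moment estimator for U(0, \theta): equate first population and sample moments.
E(X) = \frac{\theta}{2} \;\Rightarrow\; \bar{X} = \frac{\hat{\theta}}{2}
\;\Rightarrow\; \hat{\theta} = 2\bar{X}
% Variance, using Var(X) = \theta^2/12 for U(0, \theta) and i.i.d. sampling:
\operatorname{Var}(\hat{\theta}) = 4\,\operatorname{Var}(\bar{X})
  = \frac{4}{n}\cdot\frac{\theta^2}{12} = \frac{\theta^2}{3n}
```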

## Question number: 6

» Linear Models » Theory of Linear Estimation » Gauss-Markoff Setup

Appeared in Year: 2014

### Describe in Detail

Define estimability of a linear parametric function in a Gauss-Markov model. State and prove a necessary and sufficient condition for estimability.

### Explanation

**Estimability**: The linear parametric function c′β is an estimable function if there exists a vector a ∈ **R** ^{n} such that E (a′Y) = c′β for all β.

If X is of full column rank then all linear combinations of β are estimable, since the least squares estimator of β is unique, that is, (X′X) ^{−1} X′Y.

Suppose we are dealing with the Gauss-Markov model. Then a necessary and sufficient condition for a line…

… (148 more words) …
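The condition itself is truncated above; a hedged sketch of the standard statement and proof:

```latex
% c'\beta is estimable iff some linear statistic a'Y is unbiased for it:
E(a'Y) = a'X\beta = c'\beta \quad \text{for all } \beta
\;\Longleftrightarrow\; a'X = c' \;\Longleftrightarrow\; X'a = c
% Necessity: unbiasedness for every \beta forces the identity a'X = c'.
% Sufficiency: if X'a = c, then a'Y is unbiased for c'\beta.
% Hence c'\beta is estimable iff c lies in the column space of X'
% (the row space of X); equivalently,
\operatorname{rank}(X') = \operatorname{rank}\bigl([\,X' \;\; c\,]\bigr),
% or, via a generalized inverse,
c'(X'X)^{-}(X'X) = c'.
```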

## Question number: 7

» Estimation » Estimation Methods » Maximum Likelihood

Appeared in Year: 2014

### Describe in Detail

X _{1}, X _{2}, …, X _{n} are i.i.d. random variables from N (θ, 1), where θ is an integer. Obtain the MLE of θ.

### Explanation

X _{1}, X _{2}, …, X _{n} are i.i.d. random variables from N (θ, 1). The density function of X is

The likelihood function is

The log likelihood function is

Differentiating this with respect to θ and equating to zero,

Thus, maximizing over all real values of θ gives the sample mean, which is unbiased, consistent and BAN; since θ is restricted to the integers, the MLE is the integer nearest to the sample mean.

… (0 more words) …
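Because the log-likelihood is a downward parabola in θ with vertex at the sample mean, the maximizer over the integers is one of the two integers bracketing the mean. A minimal sketch (the data values are made up for illustration):

```python
import math

def integer_mle(xs):
    """MLE of integer theta for N(theta, 1): maximize the log-likelihood
    over the integers. The log-likelihood is concave in theta with vertex
    at the sample mean, so only the two bracketing integers can win."""
    xbar = sum(xs) / len(xs)
    lo, hi = math.floor(xbar), math.ceil(xbar)

    def loglik(theta):
        # log-likelihood up to an additive constant
        return -0.5 * sum((x - theta) ** 2 for x in xs)

    return lo if loglik(lo) >= loglik(hi) else hi

print(integer_mle([4.9, 5.1, 5.2]))  # → 5
```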

## Question number: 8

» Hypotheses Testing » Hypothesis » Composite

Appeared in Year: 2014

### Describe in Detail

A sample of size n from the normal distribution N (θ, σ ^{2}) with σ ^{2} = 4 was observed. A 95% confidence interval for θ was computed from the above sample. Find the value of n if the confidence interval is (9.02, 10.98).

### Explanation

The margin of error is defined as

where z _{α/2} is the critical value, z _{0.025} = 1.96

σ is the standard deviation = 2

E is the margin of error = (10.98 - 9.02)/2 = 0.98

… (0 more words) …
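With E = z _{α/2} σ/√n, solving for n is a one-line computation; a quick check (variable names are my own):

```python
# Solve E = z * sigma / sqrt(n) for n, given the CI (9.02, 10.98).
z = 1.96                      # critical value z_{0.025}
sigma = 2.0                   # sqrt(sigma^2) with sigma^2 = 4
E = (10.98 - 9.02) / 2        # half-width of the confidence interval

n = (z * sigma / E) ** 2
print(round(n))  # → 16
```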