# Estimation-Optimal Properties (ISS (Statistical Services) Statistics Paper II (Old Subjective Pattern)): Questions 1 - 6 of 8


## Question number: 1

» Estimation » Optimal Properties » Complete Statistics

Appeared in Year: 2014

### Describe in Detail

Define completeness. Verify whether Bin (1, p) is complete.

### Explanation

**Completeness**: It is a property of a statistic in relation to a model for a set of observed data. In essence, it is a condition which ensures that the parameters of the probability distribution representing the model can all be estimated on the basis of the statistic.
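The verification can be sketched under the standard definition: a family of distributions of a statistic $T$ is complete if $E_p[g(T)] = 0$ for all $p$ forces $g(T) = 0$ almost surely.

```latex
\textbf{Claim.} The family $\{\mathrm{Bin}(1,p) : 0 < p < 1\}$ is complete.

Let $T \sim \mathrm{Bin}(1,p)$ and suppose $E_p[g(T)] = 0$ for all $p \in (0,1)$. Then
\[
E_p[g(T)] = g(0)(1-p) + g(1)\,p
           = g(0) + \bigl(g(1) - g(0)\bigr)p = 0
\quad \text{for all } p \in (0,1).
\]
A polynomial in $p$ that vanishes on an interval has all coefficients zero, so
$g(0) = g(1) = 0$, i.e.\ $g(T) = 0$ almost surely. Hence the family is complete.
```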

## Question number: 2

» Estimation » Optimal Properties » Sufficient Estimator

Appeared in Year: 2014

### Describe in Detail

Show that is not a sufficient estimator of the Bernoulli parameter θ.

### Explanation

Let X_{i} be the i-th random variable, following a Bernoulli distribution with parameter θ. Then the random variable is defined as

For i = 1, 2, …, n

Now

So, the pmf of Z is

The conditional distribution of (X_{1}, X_{2}, …, X_{n})
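For reference, the criterion being applied here is the conditional-distribution characterization of sufficiency: a statistic is sufficient exactly when the conditional distribution of the sample given the statistic is free of the parameter, so exhibiting residual dependence on θ is what establishes non-sufficiency.

```latex
T \text{ is sufficient for } \theta
\iff
P_\theta\bigl(X_1 = x_1, \ldots, X_n = x_n \mid T = t\bigr)
\text{ does not involve } \theta .
```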

## Question number: 3

» Estimation » Optimal Properties » Confidence Interval Estimation

Appeared in Year: 2015

### Describe in Detail

Let y_{1}, y_{2}, …, y_{n} be a random sample from N(µ, σ^{2}), where µ and σ^{2} are both unknown. Obtain a confidence interval for µ with confidence coefficient (1 − α).

### Explanation

When the population mean and population standard deviation are both unknown: if ȳ is the sample mean, replace σ by its estimate s, and let t_{α/2} be the critical value of Student's t such that half of the area α lies on the left-hand side and the other half on the right.
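The resulting interval is ȳ ± t_{α/2, n−1} · s/√n. A minimal sketch in Python, using only the standard library; the sample data and the tabulated critical value t_{0.025, 9} ≈ 2.262 are illustrative assumptions, not part of the question:

```python
from math import sqrt
from statistics import mean, stdev

def t_interval(sample, t_crit):
    """Two-sided (1 - alpha) CI for the mean with sigma unknown:
    ybar +/- t_{alpha/2, n-1} * s / sqrt(n).
    t_crit is the tabulated Student-t critical value for n-1 df."""
    n = len(sample)
    ybar = mean(sample)
    s = stdev(sample)            # sample SD, n-1 in the denominator
    half = t_crit * s / sqrt(n)  # half-width of the interval
    return ybar - half, ybar + half

# Hypothetical sample of n = 10 observations; t_{0.025, 9} ~ 2.262 (from tables)
y = [4.8, 5.1, 5.0, 4.9, 5.3, 5.2, 4.7, 5.0, 5.1, 4.9]
lo, hi = t_interval(y, 2.262)    # 95% CI, roughly (4.87, 5.13)
```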

## Question number: 4

» Estimation » Optimal Properties » Sufficient Estimator

Appeared in Year: 2015

### Describe in Detail

Obtain the sufficient statistics for the following distributions.

(i)

(ii)

### Explanation

By the factorization theorem, the condition is that

f_n(__x__; θ) = g(T(__x__), θ) · h(__x__),

where h(x) is free from θ and g(.) depends on __X__ only through T.

(i)

The joint pdf of random sample is

Let T = . By the factorization theorem,

h (x) =1,

So, h
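The specific distributions of the question are not reproduced above, so as an illustration of the method only, here is the same factorization carried out for a sample from N(θ, 1):

```latex
f_n(\mathbf{x};\theta)
= \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}}\, e^{-(x_i-\theta)^2/2}
= \underbrace{(2\pi)^{-n/2}\, e^{-\sum_i x_i^2/2}}_{h(\mathbf{x})}
  \cdot
  \underbrace{e^{\theta \sum_i x_i \,-\, n\theta^2/2}}_{g(T(\mathbf{x}),\,\theta)},
\qquad
T(\mathbf{x}) = \sum_{i=1}^{n} x_i .
```

Since $h$ is free of $\theta$ and $g$ depends on the data only through $T$, $T = \sum_i x_i$ is sufficient for $\theta$.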

## Question number: 5

» Estimation » Optimal Properties » Cramer-Rao Inequality

Appeared in Year: 2015

### Describe in Detail

Stating the regularity conditions, give the Cramer-Rao lower bound for the variance of an unbiased estimator of a parameter. Give one example each of a situation where the regularity conditions (i) do not hold, (ii) hold.

### Explanation

Suppose that X_{1}, …, X_{n} is a sample from a distribution with joint pdf f_{n}(__x__, θ) and T(__X__) is an estimator. Also assume that f_{n}(__x__, θ) satisfies the conditions that allow

(i) interchange of differentiation and integration operations, i.e.
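Under these regularity conditions the bound takes the usual form (a sketch, with ψ(θ) = E_θ[T] and I(θ) the per-observation Fisher information):

```latex
\operatorname{Var}_\theta\bigl(T(\mathbf{X})\bigr)
\;\ge\;
\frac{\bigl[\psi'(\theta)\bigr]^{2}}{n\, I(\theta)},
\qquad
I(\theta)
= E_\theta\!\left[\left(\frac{\partial}{\partial\theta}
  \log f(X;\theta)\right)^{\!2}\right].
```

For the examples asked for, the standard choices are $U(0,\theta)$, whose support depends on $\theta$ so the interchange condition fails, and Bernoulli$(\theta)$, for which the regularity conditions hold.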

## Question number: 6

» Estimation » Optimal Properties » Rao-Blackwell Theorem

Appeared in Year: 2015

### Describe in Detail

Explain how the Rao-Blackwell theorem helps one to find a uniformly minimum variance unbiased estimator (UMVUE) of an unknown parameter. What is the relevance of the Lehmann-Scheffé theorem in this scenario? If X_{1}, X_{2}, …, X_{n} are Bin (1, p) variates, find the UMVUE of p.

### Explanation

Let U be an unbiased estimator of θ and T be a sufficient statistic for θ; then E(U | T) is free from θ and is itself an estimator. Using the identity E[E(U | T)] = E(U), we have

E[E(U | T)] = E(U) = θ, true for all θ. Hence E(U | T) is unbiased for θ.

Next, we find
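A sketch of where the argument leads, following the standard Lehmann–Scheffé route for the Bin(1, p) part of the question:

```latex
T = \sum_{i=1}^{n} X_i \sim \mathrm{Bin}(n, p)
\quad \text{is complete and sufficient for } p,
\qquad
E_p\!\left[\bar{X}\right] = E_p\!\left[\frac{T}{n}\right] = p .
```

Since $\bar{X} = T/n$ is an unbiased estimator of $p$ that is a function of the complete sufficient statistic $T$, the Lehmann–Scheffé theorem gives $\bar{X}$ as the UMVUE of $p$.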