# Statistical Inference and Hypothesis Testing (ISS Statistics Paper II (New 2016 MCQ Pattern)): Questions 139 - 140 of 222


## Question number: 139

» Statistical Inference and Hypothesis Testing » Factorization Theorem

### Question

Which of the following definitions of sufficiency of an estimator is known as the Fisher-Neyman Factorization Theorem?

### Choices

| Choice (4) | Response |
|---|---|
| a. | An estimator T is said to be sufficient for θ if it provides all the information contained in the sample about the parametric function. |
| b. | If T is an estimator of parameter θ based on a sample of size n from the population with density f(x, θ), such that the conditional distribution of the sample given T is independent of θ, then T is a sufficient estimator of θ. |
| c. | A statistic T is a sufficient estimator of parameter θ if and only if the likelihood function (joint p.d.f. of the sample) can be expressed as L = g(t, θ) · h(x₁, x₂, …, xₙ), where g(t, θ) is the p.d.f. of the statistic T and h(x₁, x₂, …, xₙ) is a function of the sample observations only, independent of θ. |
| d. | All of the above |
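The definitions in choices (b) and (c) can be checked concretely. As a hedged illustration (not part of the original question bank), the sketch below takes an i.i.d. Bernoulli(θ) sample, where the joint pmf factorizes as θ^s(1−θ)^(n−s) · 1 with s = Σxᵢ, and verifies numerically that the conditional distribution of the sample given T = Σxᵢ does not depend on θ; the function names are illustrative, not from the source:

```python
from math import comb

def joint_pmf(x, theta):
    # Joint pmf of an i.i.d. Bernoulli(theta) sample:
    # theta^s * (1 - theta)^(n - s), which already has the
    # factorized form g(t, theta) * h(x) with h(x) = 1.
    s = sum(x)
    return theta**s * (1 - theta)**(len(x) - s)

def conditional_given_T(x, theta):
    # P(X = x | T = sum(x)) where T ~ Binomial(n, theta).
    # By the definition in choice (b), this should be free of theta:
    # it equals 1 / C(n, s) for every theta.
    n, s = len(x), sum(x)
    p_T = comb(n, s) * theta**s * (1 - theta)**(n - s)
    return joint_pmf(x, theta) / p_T

x = (1, 0, 1)  # a sample with s = 2, n = 3
for theta in (0.2, 0.7):
    print(round(conditional_given_T(x, theta), 6))  # 0.333333 = 1/C(3, 2) both times
```

Because the conditional probability comes out as 1/C(n, s) regardless of θ, T = Σxᵢ is sufficient, matching the factorization in choice (c).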

## Question number: 140

» Statistical Inference and Hypothesis Testing » Factorization Theorem

### Question

The factorization theorem for sufficiency is known as the

### Choices

| Choice (4) | Response |
|---|---|
| a. | Rao-Blackwell Theorem |
| b. | Fisher-Neyman Theorem |
| c. | Bernoulli Theorem |
| d. | Cramer-Rao Theorem |