Statistical Inference and Hypothesis Testing - Factorization Theorem (ISS Statistics Paper II, New 2016 MCQ Pattern): Questions 1-2 of 5


Question number: 1


MCQ

Question

$T = t(x_1, x_2, \ldots, x_n)$ is sufficient for $\theta$ if and only if the joint density function $f(x_1, x_2, \ldots, x_n; \theta)$ of the sample values can be expressed in the form

$$f(x_1, x_2, \ldots, x_n; \theta) = g_\theta[t(x)] \cdot h(x),$$

where $g_\theta[t(x)]$ depends on $\theta$ and $x$ only through the value of $t(x)$, and $h(x)$ is independent of $\theta$. This theorem is known as

Choices


a. Blackwell Theorem

b. Rao-Blackwell Theorem

c. Factorization Theorem

d. Cramér-Rao Theorem
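For intuition about the factorization stated in Question 1, here is a standard textbook worked instance (an illustrative sketch, not part of the original question): for a random sample $x_1, x_2, \ldots, x_n$ from a Poisson($\theta$) population,

$$f(x_1, \ldots, x_n; \theta) = \prod_{i=1}^{n} \frac{e^{-\theta} \theta^{x_i}}{x_i!} = \underbrace{e^{-n\theta}\, \theta^{\sum_{i=1}^{n} x_i}}_{g_\theta[t(x)]} \cdot \underbrace{\prod_{i=1}^{n} \frac{1}{x_i!}}_{h(x)},$$

so $t(x) = \sum_{i=1}^{n} x_i$ is sufficient for $\theta$.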

Question number: 2


MCQ

Question

Which of the following definitions of sufficiency of an estimator is known as the Fisher-Neyman Factorization Theorem?

Choices


a. An estimator $T_n$ is said to be sufficient for $\tau(\theta)$ if it provides all the information contained in the sample about the parametric function $\tau(\theta)$.

b. If $T = t(x_1, x_2, \ldots, x_n)$ is an estimator of the parameter $\theta$, based on a sample $x_1, x_2, \ldots, x_n$ of size $n$ from a population with density $f(x, \theta)$, such that the conditional distribution of $x_1, x_2, \ldots, x_n$ given $T$ is independent of $\theta$, then $T$ is a sufficient estimator of $\theta$.

c. A statistic $T = t(x_1, x_2, \ldots, x_n)$ is a sufficient estimator of the parameter $\theta$ if and only if the likelihood function (the joint p.d.f. of the sample) can be expressed as $L = \prod_{i=1}^{n} f(x_i, \theta) = g(t, \theta) \cdot k(x_1, x_2, \ldots, x_n)$, where $g(t, \theta)$ is the p.d.f. of the statistic $t$ and $k(x_1, x_2, \ldots, x_n)$ is a function of the sample observations only, independent of $\theta$.

d. All of the above
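As an illustrative check of the conditional-distribution definition in choice (b) (a standard textbook sketch, not part of the original question): for a Bernoulli($\theta$) sample with $T = \sum_{i=1}^{n} x_i$,

$$P(x_1, \ldots, x_n \mid T = t) = \frac{\theta^{t} (1 - \theta)^{n - t}}{\binom{n}{t} \theta^{t} (1 - \theta)^{n - t}} = \binom{n}{t}^{-1},$$

which is free of $\theta$, so $T$ is a sufficient estimator of $\theta$.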
