
by Morris H. DeGroot and Mark J. Schervish

Edition: 3rd

Copyright: 2002

Publisher: Addison-Wesley Longman, Inc.

Published: 2002

International: No


Summary

Probability & Statistics was written for a one- or two-semester probability and statistics course offered primarily at four-year institutions and taken mostly by sophomore- and junior-level students majoring in mathematics or statistics. Calculus is a prerequisite, and familiarity with the concepts and elementary properties of vectors and matrices is a plus. This revision of the well-respected text presents a balanced approach to classical and Bayesian methods and now includes a new chapter on simulation (including Markov chain Monte Carlo and the bootstrap), expanded coverage of residual analysis in linear models, and more examples using real data.

**Features:**

- NEW! A new chapter on simulation has been added. This includes methods for simulating specific distributions, importance sampling, Markov chain Monte Carlo, and the bootstrap.
- NEW! Expanded coverage of residual analysis in linear models.
- NEW! More examples now use real data.
- NEW! New sections or subsections on conditionally independent events and random variables, the lognormal distribution, quantiles, prediction and prediction intervals, improper priors, Bayes tests, power functions, M-estimators, residual plots in linear models, and Bayesian analysis of simple linear regression are now included.
- NEW! Brief introductions and summaries have been added to each technical section. The introductory paragraphs give readers a hint about what they are going to encounter. The summaries list the most important ideas.
- NEW! The authors have added special notes where it is useful to briefly summarize a point or connect it to a discussion elsewhere in the text.
- NEW! Some material has been reorganized. Independence is now introduced after conditional probability. The first five chapters of the text are devoted to probability and can serve as the text for a one-semester course on probability.
- In addition to examples using current data, some elementary concepts of probability are illustrated by famous examples such as the birthday problem, the tennis tournament problem, the matching problem, and the collector's problem.
- Included as a special feature are sections on Markov chains, the Gambler's Ruin problem, and utility and preferences among gambles. These topics are treated in a completely elementary fashion, and can be omitted without loss of continuity if time is limited.
- Optional sections of the book are indicated by an asterisk in the Table of Contents.
- Chapters 6 through 10 are devoted to statistical inference. Both classical and Bayesian statistical methods are developed in an integrated presentation which will be useful to students when applying the concepts to the real world.

Author Bio

**DeGroot, Morris H.**

**Schervish, Mark J.:** Carnegie Mellon University

Table of Contents

**1. Introduction to Probability**

The History of Probability.

Interpretations of Probability.

Experiments and Events.

Set Theory.

The Definition of Probability.

Finite Sample Spaces.

Counting Methods.

Combinatorial Methods.

Multinomial Coefficients.

The Probability of a Union of Events.

Statistical Swindles.

Supplementary Exercises.

**2. Conditional Probability**

The Definition of Conditional Probability.

Independent Events.

Bayes' Theorem.

Markov Chains.

The Gambler's Ruin Problem.

Supplementary Exercises.

**3. Random Variables and Distributions**

Random Variables and Discrete Distributions.

Continuous Distributions.

The Distribution Function.

Bivariate Distributions.

Marginal Distributions.

Conditional Distributions.

Multivariate Distributions.

Functions of a Random Variable.

Functions of Two or More Random Variables.

Supplementary Exercises.

**4. Expectation**

The Expectation of a Random Variable.

Properties of Expectations.

Variance.

Moments.

The Mean and The Median.

Covariance and Correlation.

Conditional Expectation.

The Sample Mean.

Utility.

Supplementary Exercises.

**5. Special Distributions**

Introduction.

The Bernoulli and Binomial Distributions.

The Hypergeometric Distribution.

The Poisson Distribution.

The Negative Binomial Distribution.

The Normal Distribution.

The Central Limit Theorem.

The Correction for Continuity.

The Gamma Distribution.

The Beta Distribution.

The Multinomial Distribution.

The Bivariate Normal Distribution.

Supplementary Exercises.

**6. Estimation**

Statistical Inference.

Prior and Posterior Distributions.

Conjugate Prior Distributions.

Bayes Estimators.

Maximum Likelihood Estimators.

Properties of Maximum Likelihood Estimators.

Sufficient Statistics.

Jointly Sufficient Statistics.

Improving an Estimator.

Supplementary Exercises.

**7. Sampling Distributions of Estimators**

The Sampling Distribution of a Statistic.

The Chi-Square Distribution.

Joint Distribution of the Sample Mean and Sample Variance.

The t Distribution.

Confidence Intervals.

Bayesian Analysis of Samples from a Normal Distribution.

Unbiased Estimators.

Fisher Information.

Supplementary Exercises.

**8. Testing Hypotheses**

Problems of Testing Hypotheses.

Testing Simple Hypotheses.

Uniformly Most Powerful Tests.

Two-Sided Alternatives.

The t Test.

Comparing the Means of Two Normal Distributions.

The F Distribution.

Bayes Test Procedures.

Foundational Issues.

Supplementary Exercises.

**9. Categorical Data and Nonparametric Methods**

Tests of Goodness-of-Fit.

Goodness-of-Fit for Composite Hypotheses.

Contingency Tables.

Tests of Homogeneity.

Simpson's Paradox.

Kolmogorov-Smirnov Test.

Robust Estimation.

Sign and Rank Tests.

Supplementary Exercises.

**10. Linear Statistical Models**

The Method of Least Squares.

Regression.

Statistical Inference in Simple Linear Regression.

Bayesian Inference in Simple Linear Regression.

The General Linear Model and Multiple Regression.

Analysis of Variance.

The Two-Way Layout.

The Two-Way Layout with Replications.

Supplementary Exercises.

**11. Simulation**

Why Is Simulation Useful?

Simulating Specific Distributions.

Importance Sampling.

Markov Chain Monte Carlo.

The Bootstrap.

Supplementary Exercises.
