Subset selection in regression - Info and Reading Options
By Miller, Alan J.

"Subset selection in regression" was published by Chapman & Hall/CRC in 2002 - Boca Raton, it has 238 pages and the language of the book is English.
“Subset selection in regression” Metadata:
- Title: Subset selection in regression
- Author: Miller, Alan J.
- Language: English
- Number of Pages: 238
- Publisher: Chapman & Hall/CRC
- Publish Date: 2002
- Publish Location: Boca Raton
“Subset selection in regression” Subjects and Themes:
- Subjects: Least squares - Regression analysis - Statistics - Probabilities - Least-Squares Analysis - Analyse de régression - Moindres carrés - MATHEMATICS - Probability & Statistics - Regressieanalyse - Lineaire regressie - Kleinste-kwadratenmethode
Edition Specifications:
- Pagination: xvii, 238 p.
Edition Identifiers:
- The Open Library ID: OL3558840M - OL4811038W
- Online Computer Library Center (OCLC) ID: 71014453
- Library of Congress Control Number (LCCN): 2002020214
- ISBN-10: 1584881712
- All ISBNs: 1584881712
"Subset selection in regression" Table Of Contents:
- 1 Objectives
- 1.1 Prediction, explanation, elimination or what?
- 1.2 How many variables in the prediction formula?
- 1.3 Alternatives to using subsets
- 1.4 'Black box' use of best-subsets techniques
- 2 Least-squares computations
- 2.1 Using sums of squares and products matrices
- 2.2 Orthogonal reduction methods
- 2.3 Gauss-Jordan v. orthogonal reduction methods
- 2.4 Interpretation of projections
- Appendix A. Operation counts for all-subsets regression
- A.1 Garside's Gauss-Jordan algorithm
- A.2 Planar rotations and a Hamiltonian cycle
- A.3 Planar rotations and a binary sequence
- A.4 Fast planar rotations
- 3 Finding subsets which fit well
- 3.1 Objectives and limitations of this chapter
- 3.2 Forward selection
- 3.3 Efroymson's algorithm
- 3.4 Backward elimination
- 3.5 Sequential replacement algorithms
- 3.6 Replacing two variables at a time
- 3.7 Generating all subsets
- 3.8 Using branch-and-bound techniques
- 3.9 Grouping variables
- 3.10 Ridge regression and other alternatives
- 3.11 The nonnegative garrote and the lasso
- 3.12 Some examples
- 3.13 Conclusions and recommendations
- Appendix A. An algorithm for the lasso
- 4 Hypothesis testing
- 4.1 Is there any information in the remaining variables?
- 4.2 Is one subset better than another?
- 4.2.1 Applications of Spjøtvoll's method
- 4.2.2 Using other confidence ellipsoids
- Appendix A. Spjøtvoll's method - detailed description
- 5 When to stop?
- 5.1 What criterion should we use?
- 5.2 Prediction criteria
- 5.2.1 Mean squared errors of prediction (MSEP)
- 5.2.2 MSEP for the fixed model
- 5.2.3 MSEP for the random model
- 5.2.4 A simulation with random predictors
- 5.3 Cross-validation and the PRESS statistic
- 5.4 Bootstrapping
- 5.5 Likelihood and information-based stopping rules
- 5.5.1 Minimum description length (MDL)
- Appendix A. Approximate equivalence of stopping rules
- A.1 F-to-enter
- A.2 Adjusted R² or Fisher's A-statistic
- A.3 Akaike's information criterion (AIC)
- 6 Estimation of regression coefficients
- 6.1 Selection bias
- 6.2 Choice between two variables
- 6.3 Selection bias and its reduction
- 6.3.1 Monte Carlo estimation of bias in forward selection
- 6.3.2 Shrinkage methods
- 6.3.3 Using the jack-knife
- 6.3.4 Independent data sets
- 6.4 Conditional likelihood estimation
- 6.5 Estimation of population means
- 6.6 Estimating least-squares projections
- Appendix A. Changing projections to equate sums of squares
- 7 Bayesian methods
- 7.1 Bayesian introduction
- 7.2 'Spike and slab' prior
- 7.3 Normal prior for regression coefficients
- 7.4 Model averaging
- 7.5 Picking the best model
- 8 Conclusions and some recommendations
- References
- Index
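Chapter 3 of the table of contents above covers search strategies such as forward selection, backward elimination, and the lasso. As a rough illustration of the simplest of these, here is a minimal sketch of forward selection (section 3.2) in Python using plain NumPy; the function names, the toy data, and the stopping rule (a fixed maximum number of variables) are illustrative assumptions, not code from the book.

```python
import numpy as np

def rss(X, y, cols):
    """Residual sum of squares after regressing y on the chosen columns of X."""
    Xs = X[:, cols]
    beta, _, _, _ = np.linalg.lstsq(Xs, y, rcond=None)
    resid = y - Xs @ beta
    return float(resid @ resid)

def forward_selection(X, y, max_vars):
    """Greedily add, one at a time, the predictor that most reduces the RSS."""
    chosen, remaining = [], list(range(X.shape[1]))
    while remaining and len(chosen) < max_vars:
        best = min(remaining, key=lambda j: rss(X, y, chosen + [j]))
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Toy example (illustrative data, not from the book): 50 observations,
# 6 candidate predictors, only two of which actually enter the model.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))
y = 2.0 * X[:, 1] - 1.5 * X[:, 4] + rng.normal(scale=0.5, size=50)
print(forward_selection(X, y, max_vars=2))  # typically selects columns 1 and 4
```

The same greedy loop underlies Efroymson's stepwise algorithm (section 3.3), which additionally allows previously added variables to be dropped after each step.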
Read “Subset selection in regression”:
Read “Subset selection in regression” by choosing from the options below.
Search for “Subset selection in regression” downloads:
Visit our Downloads Search page to see if downloads are available.
Borrow "Subset selection in regression" Online:
Check whether online borrowing is available. Please note that online borrowing is subject to copyright-based limitations and that ebook quality may vary.
- Is Online Borrowing Available: Yes
- Preview Status: full
- Check if available: The Open Library & The Internet Archive
Find “Subset selection in regression” in Libraries Near You:
Read or borrow “Subset selection in regression” from your local library.
- The WorldCat Libraries Catalog: Find a copy of “Subset selection in regression” at a library near you.
Buy “Subset selection in regression” online:
Shop for “Subset selection in regression” on popular online marketplaces.
- eBay: New and used books.