Downloads & Free Reading Options - Results
Bounds For Selection by Laurent Hyafil
Read "Bounds For Selection" by Laurent Hyafil using the free online access and download options below.
Books Results
Source: The Internet Archive
The Internet Archive Search Results
Available books for download and borrowing from the Internet Archive
1. Risk Bounds For Embedded Variable Selection In Classification Trees
By Servane Gey and Tristan Mary-Huard
The problems of model and variable selection for classification trees are jointly considered. A penalized criterion is proposed which explicitly takes into account the number of variables, and a risk bound inequality is provided for the tree classifier minimizing this criterion. This penalized criterion is compared to the one used during the pruning step of the CART algorithm. It is shown that the two criteria are similar under some specific margin assumptions. In practice, the tuning parameter of the CART penalty has to be calibrated by hold-out. Simulation studies are performed which confirm that the hold-out procedure mimics the form of the proposed penalized criterion.
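The hold-out calibration step the abstract describes can be sketched in a few lines. This is an illustrative simplification, not the paper's exact criterion: candidate pruned subtrees are scored by a CART-style penalty of the form risk + lam * n_leaves, and the penalty constant lam is chosen by hold-out risk. The candidate subtree statistics below are hypothetical.

```python
# Illustrative sketch (not the paper's exact criterion): select among
# candidate pruned subtrees using a CART-style penalty risk + lam * n_leaves,
# with lam calibrated on a hold-out set. Subtree statistics are made up.

def select_subtree(candidates, lam):
    """Pick the candidate minimizing empirical risk + lam * n_leaves."""
    return min(candidates, key=lambda c: c["train_risk"] + lam * c["n_leaves"])

def calibrate_by_holdout(candidates, lams):
    """Choose the penalty constant whose selected subtree attains the
    smallest hold-out risk (the usual CART calibration step)."""
    best = None
    for lam in lams:
        chosen = select_subtree(candidates, lam)
        if best is None or chosen["holdout_risk"] < best[1]["holdout_risk"]:
            best = (lam, chosen)
    return best

candidates = [
    {"n_leaves": 8, "train_risk": 0.05, "holdout_risk": 0.21},  # overfit
    {"n_leaves": 4, "train_risk": 0.10, "holdout_risk": 0.15},  # balanced
    {"n_leaves": 2, "train_risk": 0.20, "holdout_risk": 0.19},  # underfit
]
lam, tree = calibrate_by_holdout(candidates, [0.0, 0.01, 0.05, 0.1])
print(lam, tree["n_leaves"])
```

The paper's point is that the criterion selected this way should also account for the number of variables used by the tree, not just its number of leaves.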
“Risk Bounds For Embedded Variable Selection In Classification Trees” Metadata:
- Title: ➤ Risk Bounds For Embedded Variable Selection In Classification Trees
- Authors: Servane Gey, Tristan Mary-Huard
- Language: English
Edition Identifiers:
- Internet Archive ID: arxiv-1108.0757
Downloads Information:
The book is available for download in "texts" format. The file size is 8.96 MB; the files have been downloaded 57 times and went public on Sat Sep 21 2013.
Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF
Related Links:
- Whefi.com: Download
- Whefi.com: Review - Coverage
- Internet Archive: Details
- Internet Archive Link: Downloads
Online Marketplaces
Find Risk Bounds For Embedded Variable Selection In Classification Trees at online marketplaces:
- Amazon: Audible, Kindle and printed editions.
- Ebay: New & used books.
2. Bounds For Pach's Selection Theorem And For The Minimum Solid Angle In A Simplex
By Roman Karasev, Jan Kynčl, Pavel Paták, Zuzana Patáková and Martin Tancer
We estimate the selection constant in the following geometric selection theorem by Pach: For every positive integer $d$ there is a constant $c_d > 0$ such that whenever $X_1,..., X_{d+1}$ are $n$-element subsets of $\mathbb{R}^d$, then we can find a point $\mathbf{p} \in \mathbb{R}^d$ and subsets $Y_i \subseteq X_i$ for every $i \in [d+1]$, each of size at least $c_d n$, such that $\mathbf{p}$ belongs to all {\em rainbow} $d$-simplices determined by $Y_1,..., Y_{d+1}$, that is, simplices with one vertex in each $Y_i$. We show a super-exponentially decreasing upper bound $c_d\leq e^{-(1/2-o(1))(d \ln d)}$. The ideas used in the proof of the upper bound also help us prove Pach's theorem with $c_d \geq 2^{-2^{d^2 + O(d)}}$, which is a lower bound doubly exponentially decreasing in $d$ (up to some polynomial in the exponent). For comparison, Pach's original approach yields a triply exponentially decreasing lower bound. On the other hand, Fox, Pach, and Suk recently obtained a hypergraph density result implying a proof of Pach's theorem with $c_d \geq2^{-O(d^2\log d)}$. In our construction for the upper bound, we use the fact that the minimum solid angle of every $d$-simplex is super-exponentially small. This fact was previously unknown and might be of independent interest. For the lower bound, we improve the "separation" part of the argument by showing that in one of the key steps only $d+1$ separations are necessary, compared to $2^d$ separations in the original proof. We also provide a measure version of Pach's theorem.
“Bounds For Pach's Selection Theorem And For The Minimum Solid Angle In A Simplex” Metadata:
- Title: ➤ Bounds For Pach's Selection Theorem And For The Minimum Solid Angle In A Simplex
- Authors: Roman Karasev, Jan Kynčl, Pavel Paták, Zuzana Patáková, Martin Tancer
“Bounds For Pach's Selection Theorem And For The Minimum Solid Angle In A Simplex” Subjects and Themes:
- Subjects: Mathematics - Metric Geometry - Combinatorics
Edition Identifiers:
- Internet Archive ID: arxiv-1403.8147
Downloads Information:
The book is available for download in "texts" format. The file size is 0.34 MB; the files have been downloaded 22 times and went public on Sat Jun 30 2018.
Available formats:
Archive BitTorrent - Metadata - Text PDF
Related Links:
- Whefi.com: Download
- Whefi.com: Review - Coverage
- Internet Archive: Details
- Internet Archive Link: Downloads
Online Marketplaces
Find Bounds For Pach's Selection Theorem And For The Minimum Solid Angle In A Simplex at online marketplaces:
- Amazon: Audible, Kindle and printed editions.
- Ebay: New & used books.
3. SOCP Relaxation Bounds For The Optimal Subset Selection Problem Applied To Robust Linear Regression
By Salvador Flores
This paper deals with the problem of finding the globally optimal subset of h elements from a larger set of n elements in d space dimensions so as to minimize a quadratic criterion, with a special emphasis on applications to computing the Least Trimmed Squares Estimator (LTSE) for robust regression. The computation of the LTSE is a challenging subset selection problem involving a nonlinear program with continuous and binary variables, linked in a highly nonlinear fashion. The selection of a globally optimal subset using the branch and bound (BB) algorithm is limited to problems in very low dimension, typically d
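The combinatorial difficulty the abstract refers to is easy to see in a brute-force sketch of the Least Trimmed Squares estimator: enumerate all h-element subsets, fit ordinary least squares on each, and keep the subset with the smallest residual sum of squares. This exact enumeration costs C(n, h) fits, which is why branch-and-bound and SOCP relaxations are needed beyond toy sizes. The data here is synthetic.

```python
# Brute-force Least Trimmed Squares for tiny instances: for each h-subset
# of the n observations, fit OLS on that subset and keep the subset with
# the smallest sum of squared residuals. Cost is C(n, h) least-squares
# fits, exponential in n, hence the need for relaxations at scale.
from itertools import combinations
import numpy as np

def lts_bruteforce(X, y, h):
    best = (np.inf, None, None)
    for idx in combinations(range(len(y)), h):
        idx = list(idx)
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        rss = float(np.sum((y[idx] - X[idx] @ beta) ** 2))
        if rss < best[0]:
            best = (rss, beta, idx)
    return best

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(10), rng.normal(size=10)])
y = 2.0 + 3.0 * X[:, 1]      # exact linear model: intercept 2, slope 3
y[:2] += 50.0                # contaminate two observations
rss, beta, idx = lts_bruteforce(X, y, h=8)
print(np.round(beta, 3), idx)
```

With h = 8 and two gross outliers, the optimal subset is exactly the eight clean points, and the fitted coefficients recover the uncontaminated model.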
“SOCP Relaxation Bounds For The Optimal Subset Selection Problem Applied To Robust Linear Regression” Metadata:
- Title: ➤ SOCP Relaxation Bounds For The Optimal Subset Selection Problem Applied To Robust Linear Regression
- Author: Salvador Flores
- Language: English
“SOCP Relaxation Bounds For The Optimal Subset Selection Problem Applied To Robust Linear Regression” Subjects and Themes:
- Subjects: Statistics - Optimization and Control - Statistics Theory - Mathematics
Edition Identifiers:
- Internet Archive ID: arxiv-1505.08134
Downloads Information:
The book is available for download in "texts" format. The file size is 7.38 MB; the files have been downloaded 36 times and went public on Wed Jun 27 2018.
Available formats:
Abbyy GZ - Archive BitTorrent - DjVuTXT - Djvu XML - JPEG Thumb - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF
Related Links:
- Whefi.com: Download
- Whefi.com: Review - Coverage
- Internet Archive: Details
- Internet Archive Link: Downloads
Online Marketplaces
Find SOCP Relaxation Bounds For The Optimal Subset Selection Problem Applied To Robust Linear Regression at online marketplaces:
- Amazon: Audible, Kindle and printed editions.
- Ebay: New & used books.
4. Tight Lower Bounds For Differentially Private Selection
By Thomas Steinke and Jonathan Ullman
A pervasive task in the differential privacy literature is to select the $k$ items of "highest quality" out of a set of $d$ items, where the quality of each item depends on a sensitive dataset that must be protected. Variants of this task arise naturally in fundamental problems like feature selection and hypothesis testing, and also as subroutines for many sophisticated differentially private algorithms. The standard approaches to these tasks---repeated use of the exponential mechanism or the sparse vector technique---approximately solve this problem given a dataset of $n = O(\sqrt{k}\log d)$ samples. We provide a tight lower bound for some very simple variants of the private selection problem. Our lower bound shows that a sample of size $n = \Omega(\sqrt{k} \log d)$ is required even to achieve a very minimal accuracy guarantee. Our results are based on an extension of the fingerprinting method to sparse selection problems. Previously, the fingerprinting method has been used to provide tight lower bounds for answering an entire set of $d$ queries, but often only some much smaller set of $k$ queries are relevant. Our extension allows us to prove lower bounds that depend on both the number of relevant queries and the total number of queries.
“Tight Lower Bounds For Differentially Private Selection” Metadata:
- Title: ➤ Tight Lower Bounds For Differentially Private Selection
- Authors: Thomas Steinke, Jonathan Ullman
“Tight Lower Bounds For Differentially Private Selection” Subjects and Themes:
- Subjects: Cryptography and Security - Data Structures and Algorithms - Computing Research Repository
Edition Identifiers:
- Internet Archive ID: arxiv-1704.03024
Downloads Information:
The book is available for download in "texts" format. The file size is 0.34 MB; the files have been downloaded 20 times and went public on Sat Jun 30 2018.
Available formats:
Archive BitTorrent - Metadata - Text PDF
Related Links:
- Whefi.com: Download
- Whefi.com: Review - Coverage
- Internet Archive: Details
- Internet Archive Link: Downloads
Online Marketplaces
Find Tight Lower Bounds For Differentially Private Selection at online marketplaces:
- Amazon: Audible, Kindle and printed editions.
- Ebay: New & used books.
5. Nearly Instance Optimal Sample Complexity Bounds For Top-k Arm Selection
By Lijie Chen, Jian Li and Mingda Qiao
In the Best-$k$-Arm problem, we are given $n$ stochastic bandit arms, each associated with an unknown reward distribution. We are required to identify the $k$ arms with the largest means by taking as few samples as possible. In this paper, we make progress towards a complete characterization of the instance-wise sample complexity bounds for the Best-$k$-Arm problem. On the lower bound side, we obtain a novel complexity term to measure the sample complexity that every Best-$k$-Arm instance requires. This is derived by an interesting and nontrivial reduction from the Best-$1$-Arm problem. We also provide an elimination-based algorithm that matches the instance-wise lower bound within doubly-logarithmic factors. The sample complexity of our algorithm strictly dominates the state-of-the-art for Best-$k$-Arm (module constant factors).
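For contrast with the adaptive, elimination-based algorithm the abstract describes, here is the naive non-adaptive baseline for Best-k-Arm: pull every arm the same number of times and return the k arms with the largest empirical means. Instance-optimal methods instead adapt the number of pulls per arm to the gaps between the means. The Bernoulli arms below are hypothetical.

```python
# Naive baseline for Best-k-Arm (NOT the paper's elimination algorithm):
# pull every arm a fixed number of times and return the k arms with the
# largest empirical means. Its sample count ignores the gaps between arm
# means, which is exactly what instance-optimal algorithms exploit.
import random

def naive_top_k(pull, n_arms, k, pulls_per_arm):
    means = []
    for a in range(n_arms):
        means.append(sum(pull(a) for _ in range(pulls_per_arm)) / pulls_per_arm)
    ranked = sorted(range(n_arms), key=lambda a: means[a], reverse=True)
    return sorted(ranked[:k])

random.seed(0)
true_means = [0.9, 0.8, 0.5, 0.3, 0.1]      # hypothetical Bernoulli arms
pull = lambda a: 1.0 if random.random() < true_means[a] else 0.0
print(naive_top_k(pull, n_arms=5, k=2, pulls_per_arm=2000))
```

With 2000 pulls per arm the empirical means concentrate well within the 0.3 gap between the second- and third-best arms, so the top two arms are recovered; an adaptive algorithm would spend far fewer pulls on the clearly inferior arms.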
“Nearly Instance Optimal Sample Complexity Bounds For Top-k Arm Selection” Metadata:
- Title: ➤ Nearly Instance Optimal Sample Complexity Bounds For Top-k Arm Selection
- Authors: Lijie Chen, Jian Li, Mingda Qiao
“Nearly Instance Optimal Sample Complexity Bounds For Top-k Arm Selection” Subjects and Themes:
- Subjects: Learning - Machine Learning - Statistics - Data Structures and Algorithms - Computing Research Repository
Edition Identifiers:
- Internet Archive ID: arxiv-1702.03605
Downloads Information:
The book is available for download in "texts" format. The file size is 0.38 MB; the files have been downloaded 19 times and went public on Sat Jun 30 2018.
Available formats:
Archive BitTorrent - Metadata - Text PDF
Related Links:
- Whefi.com: Download
- Whefi.com: Review - Coverage
- Internet Archive: Details
- Internet Archive Link: Downloads
Online Marketplaces
Find Nearly Instance Optimal Sample Complexity Bounds For Top-k Arm Selection at online marketplaces:
- Amazon: Audible, Kindle and printed editions.
- Ebay: New & used books.
6. Model Confidence Bounds For Variable Selection
By Yang Li, Zhibing He, Yuetian Luo, Davide Ferrari and Yichen Qin
We introduce the model confidence bounds (MCBs) for variable selection in the context of nested parametric models. Similarly to the endpoints in the familiar confidence interval for parameter estimation, the MCBs identify two nested models (upper and lower confidence bound models) containing the true model at a given level of confidence. Instead of trusting a single selected model by a given selection method, the MCBs width and composition enable the practitioner to assess the overall model uncertainty. The MCBs methodology is implemented by a fast bootstrap algorithm which is shown to yield the correct asymptotic coverage under rather general conditions. A new graphical tool -- the model uncertainty curve (MUC) -- is introduced to visualize the variability of model selection and compare different model selection procedures. Our Monte Carlo simulations and real data examples confirm the validity and illustrate the advantages of the proposed method.
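The bootstrap step behind the MCBs can be illustrated with a simplified sketch: rerun a variable-selection rule on bootstrap resamples and rank variables by how often each is selected; nested lower/upper bound models can then be read off the frequency ordering. This is not the paper's full procedure, and the threshold-on-correlation selector below is a stand-in rule invented for illustration.

```python
# Simplified illustration of the bootstrap machinery behind MCBs (not the
# paper's full procedure): rerun a selection rule on bootstrap resamples,
# then compute per-variable inclusion frequencies. The correlation
# threshold selector is a toy stand-in for a real selection method.
import numpy as np

def select_vars(X, y, thresh=0.3):
    """Toy rule: keep variables whose |correlation| with y exceeds thresh."""
    cors = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    return {j for j, c in enumerate(cors) if c > thresh}

def bootstrap_inclusion_freq(X, y, B=200, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(B):
        idx = rng.integers(0, n, size=n)        # resample with replacement
        for j in select_vars(X[idx], y[idx]):
            counts[j] += 1
    return counts / B

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))
y = 2 * X[:, 0] + X[:, 1] + rng.normal(size=100)  # vars 2 and 3 are noise
freq = bootstrap_inclusion_freq(X, y)
print(np.round(freq, 2))
```

Variables selected in nearly every resample belong in the lower bound model, while variables almost never selected can be excluded from the upper bound model; the MCB construction formalizes this with a coverage guarantee.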
“Model Confidence Bounds For Variable Selection” Metadata:
- Title: ➤ Model Confidence Bounds For Variable Selection
- Authors: Yang Li, Zhibing He, Yuetian Luo, Davide Ferrari, Yichen Qin
“Model Confidence Bounds For Variable Selection” Subjects and Themes:
- Subjects: Methodology - Statistics
Edition Identifiers:
- Internet Archive ID: arxiv-1611.09509
Downloads Information:
The book is available for download in "texts" format. The file size is 0.51 MB; the files have been downloaded 22 times and went public on Fri Jun 29 2018.
Available formats:
Archive BitTorrent - Metadata - Text PDF
Related Links:
- Whefi.com: Download
- Whefi.com: Review - Coverage
- Internet Archive: Details
- Internet Archive Link: Downloads
Online Marketplaces
Find Model Confidence Bounds For Variable Selection at online marketplaces:
- Amazon: Audible, Kindle and printed editions.
- Ebay: New & used books.
7. Bounds For Selection
By Hyafil, Laurent
“Bounds For Selection” Metadata:
- Title: Bounds For Selection
- Author: Hyafil, Laurent
- Language: English
Edition Identifiers:
- Internet Archive ID: boundsforselecti651hyaf
Downloads Information:
The book is available for download in "texts" format. The file size is 27.67 MB; the files have been downloaded 182 times and went public on Tue Mar 12 2013.
Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - Cloth Cover Detection Log - DjVu - DjVuTXT - Djvu XML - Dublin Core - Item Tile - MARC - MARC Binary - MARC Source - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Original JP2 Tar - Single Page Processed JP2 ZIP - Text PDF - chOCR - hOCR
Related Links:
- Whefi.com: Download
- Whefi.com: Review - Coverage
- Internet Archive: Details
- Internet Archive Link: Downloads
Online Marketplaces
Find Bounds For Selection at online marketplaces:
- Amazon: Audible, Kindle and printed editions.
- Ebay: New & used books.
8. Lower Bounds On Active Learning For Graphical Model Selection
By Jonathan Scarlett and Volkan Cevher
We consider the problem of estimating the underlying graph associated with a Markov random field, with the added twist that the decoding algorithm can iteratively choose which subsets of nodes to sample based on the previous samples, resulting in an active learning setting. Considering both Ising and Gaussian models, we provide algorithm-independent lower bounds for high-probability recovery within the class of degree-bounded graphs. Our main results are minimax lower bounds for the active setting that match the best known lower bounds for the passive setting, which in turn are known to be tight in several cases of interest. Our analysis is based on Fano's inequality, along with novel mutual information bounds for the active learning setting, and the application of restricted graph ensembles. While we consider ensembles that are similar or identical to those used in the passive setting, we require different analysis techniques, with a key challenge being bounding a mutual information quantity associated with observed subsets of nodes, as opposed to full observations.
“Lower Bounds On Active Learning For Graphical Model Selection” Metadata:
- Title: ➤ Lower Bounds On Active Learning For Graphical Model Selection
- Authors: Jonathan Scarlett, Volkan Cevher
“Lower Bounds On Active Learning For Graphical Model Selection” Subjects and Themes:
- Subjects: ➤ Machine Learning - Mathematics - Information Theory - Learning - Statistics - Computing Research Repository - Social and Information Networks
Edition Identifiers:
- Internet Archive ID: arxiv-1607.02413
Downloads Information:
The book is available for download in "texts" format. The file size is 0.65 MB; the files have been downloaded 25 times and went public on Fri Jun 29 2018.
Available formats:
Archive BitTorrent - Metadata - Text PDF
Related Links:
- Whefi.com: Download
- Whefi.com: Review - Coverage
- Internet Archive: Details
- Internet Archive Link: Downloads
Online Marketplaces
Find Lower Bounds On Active Learning For Graphical Model Selection at online marketplaces:
- Amazon: Audible, Kindle and printed editions.
- Ebay: New & used books.
Source: The Open Library
The Open Library Search Results
Available books for download and borrowing from the Open Library
1. Bounds for selection
By Laurent Hyafil

“Bounds for selection” Metadata:
- Title: Bounds for selection
- Author: Laurent Hyafil
- Language: English
- Publisher: ➤ Dept. of Computer Science, University of Illinois at Urbana-Champaign
- Publish Date: 1974
- Publish Location: Urbana, Illinois
“Bounds for selection” Subjects and Themes:
- Subjects: Sorting (Electronic computers)
Edition Identifiers:
- The Open Library ID: OL33117309M - OL5171716M
- Online Computer Library Center (OCLC) ID: 1256724
- Library of Congress Control Number (LCCN): 74623738
Access and General Info:
- First Year Published: 1974
- Is Full Text Available: Yes
- Is The Book Public: Yes
- Access Status: Public
Online Access
Online Borrowing:
- Borrowing from Open Library: Borrowing link
- Borrowing from Archive.org: Borrowing link
Online Marketplaces
Find Bounds for selection at online marketplaces:
- Amazon: Audible, Kindle and printed editions.
- Ebay: New & used books.
Buy “Bounds For Selection” online:
Shop for “Bounds For Selection” on popular online marketplaces.
- Ebay: New and used books.