Downloads & Free Reading Options - Results

Algorithms And Complexity by Italian Conference On Algorithms And Complexity (4th, 2000, Rome, Italy)

Read "Algorithms And Complexity" by Italian Conference On Algorithms And Complexity (4th 2000 Rome%2c Italy) through these free online access and download options.

Book Results

Source: The Internet Archive

The Internet Archive Search Results

Books available for download or borrowing from the Internet Archive

1. DTIC ADA1022253: Research In Complexity Theory And Combinatorial Algorithms

By

Since October 1, 1979, research in Complexity Theory and Combinatorial Algorithms at the Department of Computer Science at the University of Illinois has been supported by the Office of Naval Research. During this period, research work was carried out in the areas of Computational Complexity Theory, Scheduling Algorithms, Graph Algorithms, Dynamic Programming, and Fault-Tolerant Computing. We summarize here our accomplishments and our future plans, and we wish to request continued support from ONR for research in these areas for the period of October 1, 1980 - September 30, 1982. Scheduling to meet deadlines -- The problem of scheduling jobs to meet their deadlines was studied. Given a set of jobs, each of which is specified by three parameters (ready time, deadline, and computation time), we want to schedule them on a computer system so that, if possible, all deadlines will be met. Furthermore, if indeed all deadlines can be met, we want to know the possibility of completing the execution of each job so that there will be a 'slack time' between the time of completion and the deadline. In particular, the following model is used: There is a single processor in the computing system. Each job consists of an infinite stream of periodic and identical requests. A request is ready when it arrives and should be completed prior to the arrival of the next request of the same job. The execution of a job can be interrupted and resumed later on.
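The preemptive periodic model sketched above is the classical setting in which earliest-deadline-first (EDF) scheduling is optimal on a single processor: when each request is due before the next arrival of the same job (relative deadline equal to the period) and preemption is allowed, all deadlines can be met exactly when the total utilization, the sum of computation time over period, is at most 1 (Liu and Layland). The minimal Python sketch of that feasibility test below is included for orientation only; it is not code from the report.

    def edf_feasible(tasks):
        """tasks: list of (computation_time, period) pairs, with the relative deadline
        equal to the period. Classical Liu-Layland condition: a preemptive EDF schedule
        on one processor meets every deadline iff total utilization <= 1."""
        utilization = sum(c / p for c, p in tasks)
        return utilization <= 1.0

    # Example: three periodic jobs given as (computation time, period)
    print(edf_feasible([(1, 4), (2, 6), (1, 8)]))  # 0.25 + 0.333 + 0.125 <= 1 -> True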

“DTIC ADA1022253: Research In Complexity Theory And Combinatorial Algorithms” Metadata:

  • Title: ➤  DTIC ADA1022253: Research In Complexity Theory And Combinatorial Algorithms
  • Author: ➤  
  • Language: English

“DTIC ADA1022253: Research In Complexity Theory And Combinatorial Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 9.11 MB; the files were downloaded 45 times and went public on Sun Feb 02 2020.

Available formats:
Abbyy GZ - Archive BitTorrent - DjVuTXT - Djvu XML - Item Tile - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Processed JP2 ZIP - Text PDF - chOCR - hOCR -

Related Links:

Online Marketplaces

Find DTIC ADA1022253: Research In Complexity Theory And Combinatorial Algorithms at online marketplaces:


2. DTIC ADA1022255: Research In Complexity Theory And Combinatorial Algorithms

By

Since October 1, 1979, research in Complexity Theory and Combinatorial Algorithms at the Department of Computer Science at the University of Illinois has been supported by the Office of Naval Research. During this period, research work was carried out in the areas of Computational Complexity Theory, Scheduling Algorithms, Graph Algorithms, Dynamic Programming, and Fault-Tolerant Computing. We summarize here our accomplishments and our future plans, and we wish to request continued support from ONR for research in these areas for the period of October 1, 1980 - September 30, 1982. Scheduling to meet deadlines -- The problem of scheduling jobs to meet their deadlines was studied. Given a set of jobs, each of which is specified by three parameters (ready time, deadline, and computation time), we want to schedule them on a computer system so that, if possible, all deadlines will be met. Furthermore, if indeed all deadlines can be met, we want to know the possibility of completing the execution of each job so that there will be a 'slack time' between the time of completion and the deadline. In particular, the following model is used: There is a single processor in the computing system. Each job consists of an infinite stream of periodic and identical requests. A request is ready when it arrives and should be completed prior to the arrival of the next request of the same job. The execution of a job can be interrupted and resumed later on.

“DTIC ADA1022255: Research In Complexity Theory And Combinatorial Algorithms” Metadata:

  • Title: ➤  DTIC ADA1022255: Research In Complexity Theory And Combinatorial Algorithms
  • Author: ➤  
  • Language: English

“DTIC ADA1022255: Research In Complexity Theory And Combinatorial Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 9.11 MB; the files were downloaded 94 times and went public on Sun Feb 02 2020.

Available formats:
Abbyy GZ - Archive BitTorrent - DjVuTXT - Djvu XML - Item Tile - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Processed JP2 ZIP - Text PDF - chOCR - hOCR -

Related Links:

Online Marketplaces

Find DTIC ADA1022255: Research In Complexity Theory And Combinatorial Algorithms at online marketplaces:


3. State Space Search : Algorithms, Complexity, And Applications

By


“State Space Search : Algorithms, Complexity, And Applications” Metadata:

  • Title: ➤  State Space Search : Algorithms, Complexity, And Applications
  • Author:
  • Language: English

“State Space Search : Algorithms, Complexity, And Applications” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 574.89 MB; the files were downloaded 14 times and went public on Tue Jul 25 2023.

Available formats:
ACS Encrypted PDF - Cloth Cover Detection Log - DjVuTXT - Djvu XML - Dublin Core - Item Tile - JPEG Thumb - JSON - LCP Encrypted EPUB - LCP Encrypted PDF - Log - MARC - MARC Binary - Metadata - OCR Page Index - OCR Search Text - PNG - Page Numbers JSON - RePublisher Final Processing Log - RePublisher Initial Processing Log - Scandata - Single Page Original JP2 Tar - Single Page Processed JP2 ZIP - Text PDF - Title Page Detection Log - chOCR - hOCR -

Related Links:

Online Marketplaces

Find State Space Search : Algorithms, Complexity, And Applications at online marketplaces:


4. Computational Complexity Of Sequential And Parallel Algorithms

By


“Computational Complexity Of Sequential And Parallel Algorithms” Metadata:

  • Title: ➤  Computational Complexity Of Sequential And Parallel Algorithms
  • Author:
  • Language: English

“Computational Complexity Of Sequential And Parallel Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 624.54 MB; the files were downloaded 77 times and went public on Fri Nov 16 2018.

Available formats:
ACS Encrypted EPUB - ACS Encrypted PDF - Abbyy GZ - Cloth Cover Detection Log - Contents - DjVuTXT - Djvu XML - Dublin Core - Item Tile - JSON - LCP Encrypted EPUB - LCP Encrypted PDF - Log - MARC - MARC Binary - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Original JP2 Tar - Single Page Processed JP2 ZIP - Text PDF - chOCR - hOCR -

Related Links:

Online Marketplaces

Find Computational Complexity Of Sequential And Parallel Algorithms at online marketplaces:


5. Complexity Of And Algorithms For Borda Manipulation

By

We prove that it is NP-hard for a coalition of two manipulators to compute how to manipulate the Borda voting rule. This resolves one of the last open problems in the computational complexity of manipulating common voting rules. Because of this NP-hardness, we treat computing a manipulation as an approximation problem in which we try to minimize the number of manipulators. Based on ideas from bin packing and multiprocessor scheduling, we propose two new approximation methods to compute manipulations of the Borda rule. Experiments show that these methods significantly outperform the previous best known approximation method. We are able to find optimal manipulations in almost all the randomly generated elections tested. Our results suggest that, whilst computing a manipulation of the Borda rule by a coalition is NP-hard, computational complexity may provide only a weak barrier against manipulation in practice.
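For readers unfamiliar with the voting rule itself: under Borda, a candidate in position i of an m-candidate ballot receives m-1-i points, and the candidate with the highest total wins. The manipulation and approximation algorithms studied in the paper operate on top of this tally; the sketch below only illustrates the scoring step and is not the authors' code.

    def borda_winner(ballots, candidates):
        """ballots: list of rankings (best candidate first). Each candidate in position i of
        an m-candidate ballot receives m-1-i points; the highest total score wins."""
        m = len(candidates)
        scores = {c: 0 for c in candidates}
        for ballot in ballots:
            for pos, cand in enumerate(ballot):
                scores[cand] += m - 1 - pos
        return max(scores, key=scores.get)

    # Example with three candidates and three sincere ballots
    print(borda_winner([["a", "b", "c"], ["b", "a", "c"], ["a", "c", "b"]],
                       ["a", "b", "c"]))  # -> "a"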

“Complexity Of And Algorithms For Borda Manipulation” Metadata:

  • Title: ➤  Complexity Of And Algorithms For Borda Manipulation
  • Authors:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 5.81 MB; the files were downloaded 96 times and went public on Mon Sep 23 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - JPEG Thumb - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Complexity Of And Algorithms For Borda Manipulation at online marketplaces:


6. Gröbner Bases Of Bihomogeneous Ideals Generated By Polynomials Of Bidegree (1,1): Algorithms And Complexity

By

Solving multihomogeneous systems, as a wide range of structured algebraic systems occurring frequently in practical problems, is of first importance. Experimentally, solving these systems with Gröbner bases algorithms seems to be easier than solving homogeneous systems of the same degree. Nevertheless, the reasons for this behaviour are not clear. In this paper, we focus on bilinear systems (i.e. bihomogeneous systems where all equations have bidegree (1,1)). Our goal is to provide a theoretical explanation of the aforementioned experimental behaviour and to propose new techniques to speed up the Gröbner basis computations by using the multihomogeneous structure of those systems. The contributions are theoretical and practical. First, we adapt the classical F5 criterion to avoid reductions to zero which occur when the input is a set of bilinear polynomials. We also prove an explicit form of the Hilbert series of bihomogeneous ideals generated by generic bilinear polynomials and give a new upper bound on the degree of regularity of generic affine bilinear systems. This leads to new complexity bounds for solving bilinear systems. We also propose a variant of the F5 Algorithm dedicated to multihomogeneous systems which exploits a structural property of the Macaulay matrix which occurs on such inputs. Experimental results show that this variant requires less time and memory than the classical homogeneous F5 Algorithm.

“Gröbner Bases Of Bihomogeneous Ideals Generated By Polynomials Of Bidegree (1,1): Algorithms And Complexity” Metadata:

  • Title: ➤  Gröbner Bases Of Bihomogeneous Ideals Generated By Polynomials Of Bidegree (1,1): Algorithms And Complexity
  • Authors:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 17.22 MB; the files were downloaded 83 times and went public on Sun Sep 22 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Gröbner Bases Of Bihomogeneous Ideals Generated By Polynomials Of Bidegree (1,1): Algorithms And Complexity at online marketplaces:


7. Convex Optimization: Algorithms And Complexity

By

This monograph presents the main complexity theorems in convex optimization and their corresponding algorithms. Starting from the fundamental theory of black-box optimization, the material progresses towards recent advances in structural optimization and stochastic optimization. Our presentation of black-box optimization, strongly influenced by Nesterov's seminal book and Nemirovski's lecture notes, includes the analysis of cutting plane methods, as well as (accelerated) gradient descent schemes. We also pay special attention to non-Euclidean settings (relevant algorithms include Frank-Wolfe, mirror descent, and dual averaging) and discuss their relevance in machine learning. We provide a gentle introduction to structural optimization with FISTA (to optimize a sum of a smooth and a simple non-smooth term), saddle-point mirror prox (Nemirovski's alternative to Nesterov's smoothing), and a concise description of interior point methods. In stochastic optimization we discuss stochastic gradient descent, mini-batches, random coordinate descent, and sublinear algorithms. We also briefly touch upon convex relaxation of combinatorial problems and the use of randomness to round solutions, as well as random walks based methods.
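As a small companion to the black-box material summarized above, the sketch below shows plain gradient descent with the standard 1/L step size for an L-smooth convex objective; it is a generic illustration under those assumptions, not an excerpt from the monograph.

    import numpy as np

    def gradient_descent(grad, x0, L, steps=200):
        """Minimize an L-smooth convex function via its gradient oracle `grad`,
        using the fixed step size 1/L that the standard analysis assumes."""
        x = np.asarray(x0, dtype=float)
        for _ in range(steps):
            x = x - (1.0 / L) * grad(x)
        return x

    # Example: f(x) = 0.5 * ||A x - b||^2 with gradient A^T (A x - b) and L = lambda_max(A^T A)
    A = np.array([[2.0, 0.0], [0.0, 1.0]])
    b = np.array([1.0, 1.0])
    L = np.linalg.eigvalsh(A.T @ A).max()
    print(gradient_descent(lambda x: A.T @ (A @ x - b), [0.0, 0.0], L))  # approaches [0.5, 1.0]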

“Convex Optimization: Algorithms And Complexity” Metadata:

  • Title: ➤  Convex Optimization: Algorithms And Complexity
  • Author:

“Convex Optimization: Algorithms And Complexity” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 1.13 MB; the files were downloaded 136 times and went public on Sat Jun 30 2018.

Available formats:
Archive BitTorrent - Metadata - Text PDF -

Related Links:

Online Marketplaces

Find Convex Optimization: Algorithms And Complexity at online marketplaces:


8. Exploring Heuristic Algorithms For The Knapsack Problem: A Comparative Analysis Of Program Complexity And Computational Efficiency

By

Abstract: The knapsack problem is an optimization problem in computer science which involves determining the most valuable combination of items that can be packed into a knapsack (a container) with a limited capacity (weight or volume); the goal is to maximize the total profit of the items included in the knapsack without exceeding its capacity. This study extensively analyzes the knapsack problem, exploring the application of three prevalent heuristics implemented in Python: greedy, dynamic programming, and FPTAS algorithms. The study aims to assess how these algorithms perform differently, focusing on program complexity and computational speed. Our main objective is to compare these algorithms, determine the most effective one for solving the knapsack problem, and guide researchers and developers when dealing with similar problems in real-world applications. Our methodology involved solving the knapsack problem using the three algorithms within a unified programming environment. We conducted experiments using varying input datasets and recorded the time complexities of the algorithms in each trial. Additionally, we performed Halstead complexity measurements to derive the volume of each algorithm for this study. Subsequently, we compared program complexity in Halstead metrics and computational speed for the three approaches. The research findings reveal that the greedy algorithm demonstrates superior computational efficiency compared to both dynamic programming (DP) and FPTAS algorithms across various test cases. To advance understanding of the knapsack problem, future research should focus on investigating the performance of other programming languages in addressing combinatorial optimization problems, which would provide valuable insights into the impact of language choice. Additionally, integrating parallel computing techniques could accelerate solution processes for large-scale problem instances.
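To make the comparison concrete, a minimal version of the greedy heuristic (sort items by profit-to-weight ratio and take whatever fits) is sketched below; this is a generic illustration rather than the code used in the study, and, as the example shows, the heuristic is not always optimal.

    def greedy_knapsack(items, capacity):
        """items: list of (profit, weight) pairs for the 0/1 knapsack problem.
        Greedy by profit/weight ratio; returns (total_profit, chosen_items)."""
        chosen, total, remaining = [], 0, capacity
        for profit, weight in sorted(items, key=lambda pw: pw[0] / pw[1], reverse=True):
            if weight <= remaining:
                chosen.append((profit, weight))
                total += profit
                remaining -= weight
        return total, chosen

    # Greedy returns profit 160 here, while the true optimum for capacity 50 is 220
    print(greedy_knapsack([(60, 10), (100, 20), (120, 30)], capacity=50))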

“Exploring Heuristic Algorithms For The Knapsack Problem: A Comparative Analysis Of Program Complexity And Computational Efficiency” Metadata:

  • Title: ➤  Exploring Heuristic Algorithms For The Knapsack Problem: A Comparative Analysis Of Program Complexity And Computational Efficiency
  • Author: ➤  
  • Language: English

“Exploring Heuristic Algorithms For The Knapsack Problem: A Comparative Analysis Of Program Complexity And Computational Efficiency” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 14.52 MB; the files were downloaded 10 times and went public on Sat Sep 14 2024.

Available formats:
Archive BitTorrent - DjVuTXT - Djvu XML - Item Tile - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Processed JP2 ZIP - Text PDF - chOCR - hOCR -

Related Links:

Online Marketplaces

Find Exploring Heuristic Algorithms For The Knapsack Problem: A Comparative Analysis Of Program Complexity And Computational Efficiency at online marketplaces:


9. Joint User Grouping And Linear Virtual Beamforming: Complexity, Algorithms And Approximation Bounds

By

In a wireless system with a large number of distributed nodes, the quality of communication can be greatly improved by pooling the nodes to perform joint transmission/reception. In this paper, we consider the problem of optimally selecting a subset of nodes from potentially a large number of candidates to form a virtual multi-antenna system, while at the same time designing their joint linear transmission strategies. We focus on two specific application scenarios: 1) multiple single antenna transmitters cooperatively transmit to a receiver; 2) a single transmitter transmits to a receiver with the help of a number of cooperative relays. We formulate the joint node selection and beamforming problems as cardinality constrained optimization problems with both discrete variables (used for selecting cooperative nodes) and continuous variables (used for designing beamformers). For each application scenario, we first characterize the computational complexity of the joint optimization problem, and then propose novel semi-definite relaxation (SDR) techniques to obtain approximate solutions. We show that the new SDR algorithms have a guaranteed approximation performance in terms of the gap to global optimality, regardless of channel realizations. The effectiveness of the proposed algorithms is demonstrated via numerical experiments.

“Joint User Grouping And Linear Virtual Beamforming: Complexity, Algorithms And Approximation Bounds” Metadata:

  • Title: ➤  Joint User Grouping And Linear Virtual Beamforming: Complexity, Algorithms And Approximation Bounds
  • Authors:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 15.48 MB; the files were downloaded 67 times and went public on Wed Sep 18 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Joint User Grouping And Linear Virtual Beamforming: Complexity, Algorithms And Approximation Bounds at online marketplaces:


10. Analysis Of The Computational Complexity Of Solving Random Satisfiability Problems Using Branch And Bound Search Algorithms

By

The computational complexity of solving random 3-Satisfiability (3-SAT) problems is investigated. 3-SAT is a representative example of hard computational tasks; it consists in knowing whether a set of alpha N randomly drawn logical constraints involving N Boolean variables can be satisfied altogether or not. Widely used solving procedures, such as the Davis-Putnam-Logemann-Loveland (DPLL) algorithm, perform a systematic search for a solution through a sequence of trials and errors represented by a search tree. In the present study, we identify, using theory and numerical experiments, easy (size of the search tree scaling polynomially with N) and hard (exponential scaling) regimes as a function of the ratio alpha of constraints per variable. The typical complexity is explicitly calculated in the different regimes, in very good agreement with numerical simulations. Our theoretical approach is based on the analysis of the growth of the branches in the search tree under the operation of DPLL. On each branch, the initial 3-SAT problem is dynamically turned into a more generic 2+p-SAT problem, where p and 1-p are the fractions of constraints involving three and two variables respectively. The growth of each branch is monitored by the dynamical evolution of alpha and p and is represented by a trajectory in the static phase diagram of the random 2+p-SAT problem. Depending on whether or not the trajectories cross the boundary between phases, single branches or full trees are generated by DPLL, resulting in easy or hard resolutions.
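For orientation, a compact version of the DPLL search procedure analyzed above (unit propagation plus branching on a literal) is sketched below; the splitting heuristic is deliberately the simplest possible and does not reproduce the variants studied in the paper.

    def dpll(clauses):
        """clauses: CNF formula as a list of lists of non-zero ints
        (positive int = variable, negative int = its negation).
        Returns True iff the formula is satisfiable."""
        while True:                               # unit propagation
            units = [c[0] for c in clauses if len(c) == 1]
            if not units:
                break
            lit = units[0]
            new_clauses = []
            for c in clauses:
                if lit in c:
                    continue                      # clause already satisfied
                reduced = [l for l in c if l != -lit]
                if not reduced:
                    return False                  # empty clause: contradiction
                new_clauses.append(reduced)
            clauses = new_clauses
        if not clauses:
            return True
        lit = clauses[0][0]                       # branch on the first remaining literal
        return dpll(clauses + [[lit]]) or dpll(clauses + [[-lit]])

    # (x1 or x2) and (not x1 or x3) and (not x2 or not x3) is satisfiable
    print(dpll([[1, 2], [-1, 3], [-2, -3]]))      # -> True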

“Analysis Of The Computational Complexity Of Solving Random Satisfiability Problems Using Branch And Bound Search Algorithms” Metadata:

  • Title: ➤  Analysis Of The Computational Complexity Of Solving Random Satisfiability Problems Using Branch And Bound Search Algorithms
  • Authors:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 22.36 MB; the files were downloaded 82 times and went public on Wed Sep 18 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Analysis Of The Computational Complexity Of Solving Random Satisfiability Problems Using Branch And Bound Search Algorithms at online marketplaces:


11. Representation Techniques For Relational Languages And The Worst Case Asymptotical Time Complexity Behaviour Of The Related Algorithms.

By

This thesis is aimed at determining the worst case asymptotical time complexity behavior of algorithms for relational operations that work on extensionally or intensionally represented binary relations. Those relational operations come from a relational language being designed at Naval Postgraduate School. One particular extensional representation technique and two intensional representation techniques are proposed. The above analysis in turn determines the feasibility of implementing a subset of the relational language on conventional architectures.

“Representation Techniques For Relational Languages And The Worst Case Asymptotical Time Complexity Behaviour Of The Related Algorithms.” Metadata:

  • Title: ➤  Representation Techniques For Relational Languages And The Worst Case Asymptotical Time Complexity Behaviour Of The Related Algorithms.
  • Author:
  • Language: English

“Representation Techniques For Relational Languages And The Worst Case Asymptotical Time Complexity Behaviour Of The Related Algorithms.” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 2122.30 MB; the files were downloaded 62 times and went public on Mon Feb 01 2021.

Available formats:
Archive BitTorrent - DjVuTXT - Djvu XML - Item Tile - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Processed JP2 ZIP - Text PDF - chOCR - hOCR -

Related Links:

Online Marketplaces

Find Representation Techniques For Relational Languages And The Worst Case Asymptotical Time Complexity Behaviour Of The Related Algorithms. at online marketplaces:


12. Microsoft Research Audio 104292: Dispersion Of Mass And The Complexity Of Randomized Algorithms

By

How much can randomness help computation? Motivated by this general question and by volume computation, one of the few instances where randomness provably helps, we analyze a notion of dispersion and connect it to asymptotic convex geometry. We obtain a nearly quadratic lower bound on the complexity of randomized volume algorithms for convex bodies in R^n (the current best algorithm has complexity roughly n^4 and is conjectured to be n^3). Our main tools, dispersion of random determinants and dispersion of the length of a random point from a convex body, are of independent interest and applicable more generally; in particular, the latter is closely related to the variance hypothesis from convex geometry. This geometric dispersion also leads to lower bounds for matrix problems and property testing. This is joint work with Luis Rademacher. ©2006 Microsoft Corporation. All rights reserved.

“Microsoft Research Audio 104292: Dispersion Of Mass And The Complexity Of Randomized Algorithms” Metadata:

  • Title: ➤  Microsoft Research Audio 104292: Dispersion Of Mass And The Complexity Of Randomized Algorithms
  • Author:
  • Language: English

“Microsoft Research Audio 104292: Dispersion Of Mass And The Complexity Of Randomized Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The recording is available for download in "audio" format. The file size is 47.56 MB; the files were downloaded 5 times and went public on Sat Nov 23 2013.

Available formats:
Archive BitTorrent - Item Tile - Metadata - Ogg Vorbis - PNG - VBR MP3 -

Related Links:

Online Marketplaces

Find Microsoft Research Audio 104292: Dispersion Of Mass And The Complexity Of Randomized Algorithms at online marketplaces:


13. Algebraic Diagonals And Walks: Algorithms, Bounds, Complexity

By

The diagonal of a multivariate power series F is the univariate power series Diag(F) generated by the diagonal terms of F. Diagonals form an important class of power series; they occur frequently in number theory, theoretical physics and enumerative combinatorics. We study algorithmic questions related to diagonals in the case where F is the Taylor expansion of a bivariate rational function. It is classical that in this case Diag(F) is an algebraic function. We propose an algorithm that computes an annihilating polynomial for Diag(F). We give a precise bound on the size of this polynomial and show that generically, this polynomial is the minimal polynomial and that its size reaches the bound. The algorithm runs in time quasi-linear in this bound, which grows exponentially with the degree of the input rational function. We then address the related problem of enumerating directed lattice walks. The insight given by our study leads to a new method for expanding the generating power series of bridges, excursions and meanders. We show that their first N terms can be computed in quasi-linear complexity in N, without first computing a very large polynomial equation.
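A classical worked example of the phenomenon described above, included here only for orientation (it is standard and not specific to the paper): the diagonal of the rational bivariate series F(x,y) = 1/(1-x-y) is algebraic. In LaTeX notation,

    F(x,y) = \frac{1}{1-x-y} = \sum_{i,j \ge 0} \binom{i+j}{i} x^i y^j,
    \qquad
    \operatorname{Diag}(F)(t) = \sum_{n \ge 0} \binom{2n}{n} t^n = \frac{1}{\sqrt{1-4t}},

so P(t) = Diag(F)(t) satisfies the polynomial equation (1 - 4t) P(t)^2 - 1 = 0, i.e. it is an algebraic function, as the general result quoted above guarantees.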

“Algebraic Diagonals And Walks: Algorithms, Bounds, Complexity” Metadata:

  • Title: ➤  Algebraic Diagonals And Walks: Algorithms, Bounds, Complexity
  • Authors:

“Algebraic Diagonals And Walks: Algorithms, Bounds, Complexity” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 0.75 MB; the files were downloaded 27 times and went public on Thu Jun 28 2018.

Available formats:
Archive BitTorrent - Metadata - Text PDF -

Related Links:

Online Marketplaces

Find Algebraic Diagonals And Walks: Algorithms, Bounds, Complexity at online marketplaces:


14. Dispersion Of Mass And The Complexity Of Randomized Geometric Algorithms

By

How much can randomness help computation? Motivated by this general question and by volume computation, one of the few instances where randomness provably helps, we analyze a notion of dispersion and connect it to asymptotic convex geometry. We obtain a nearly quadratic lower bound on the complexity of randomized volume algorithms for convex bodies in R^n (the current best algorithm has complexity roughly n^4, conjectured to be n^3). Our main tools, dispersion of random determinants and dispersion of the length of a random point from a convex body, are of independent interest and applicable more generally; in particular, the latter is closely related to the variance hypothesis from convex geometry. This geometric dispersion also leads to lower bounds for matrix problems and property testing.

“Dispersion Of Mass And The Complexity Of Randomized Geometric Algorithms” Metadata:

  • Title: ➤  Dispersion Of Mass And The Complexity Of Randomized Geometric Algorithms
  • Authors:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 13.93 MB; the files were downloaded 77 times and went public on Fri Sep 20 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Dispersion Of Mass And The Complexity Of Randomized Geometric Algorithms at online marketplaces:


15. Algorithms And Complexity

By


“Algorithms And Complexity” Metadata:

  • Title: Algorithms And Complexity
  • Author:
  • Language: English

“Algorithms And Complexity” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 515.27 MB; the files were downloaded 42 times and went public on Thu Feb 03 2022.

Available formats:
ACS Encrypted PDF - Cloth Cover Detection Log - DjVuTXT - Djvu XML - Dublin Core - Item Tile - JPEG Thumb - JSON - LCP Encrypted EPUB - LCP Encrypted PDF - Log - MARC - MARC Binary - Metadata - OCR Page Index - OCR Search Text - PNG - Page Numbers JSON - Scandata - Single Page Original JP2 Tar - Single Page Processed JP2 ZIP - Text PDF - Title Page Detection Log - chOCR - hOCR -

Related Links:

Online Marketplaces

Find Algorithms And Complexity at online marketplaces:


16. DTIC ADA214247: A Renormalization Group Approach To Image Processing. A New Computational Method For 3-Dimensional Shapes In Robot Vision, And The Computational Complexity Of The Cooling Algorithms

By

During the period of the contract, 6/15/86-7/31/89, we have developed: I) a parallel multilevel-multiresolution algorithm for Image Processing and low-level Robot vision tasks; II) a Bayesian/Geometric Framework for 3-D shape estimation from 2-D images, appropriate for object recognition and other Robot tasks; III) a procedure for rotation and scale invariant representation (coding) and recognition of textures, and a computationally efficient algorithm for estimating Markov Random Fields; IV) mathematical results concerning convergence and speed of convergence of computational algorithms such as the annealing algorithm, together with a mathematical study of the consistency and asymptotic normality of Maximum Likelihood Estimators for Gibbs distributions. Keywords: Computer vision. (KR)

“DTIC ADA214247: A Renormalization Group Approach To Image Processing. A New Computational Method For 3-Dimensional Shapes In Robot Vision, And The Computational Complexity Of The Cooling Algorithms” Metadata:

  • Title: ➤  DTIC ADA214247: A Renormalization Group Approach To Image Processing. A New Computational Method For 3-Dimensional Shapes In Robot Vision, And The Computational Complexity Of The Cooling Algorithms
  • Author: ➤  
  • Language: English

“DTIC ADA214247: A Renormalization Group Approach To Image Processing. A New Computational Method For 3-Dimensional Shapes In Robot Vision, And The Computational Complexity Of The Cooling Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 5.20 MB; the files were downloaded 48 times and went public on Fri Feb 23 2018.

Available formats:
Abbyy GZ - Archive BitTorrent - DjVuTXT - Djvu XML - Item Tile - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Processed JP2 ZIP - Text PDF - chOCR - hOCR -

Related Links:

Online Marketplaces

Find DTIC ADA214247: A Renormalization Group Approach To Image Processing. A New Computational Method For 3-Dimensional Shapes In Robot Vision, And The Computational Complexity Of The Cooling Algorithms at online marketplaces:


17. DTIC ADA431591: Change Detection And Estimation In Large Scale Sensor Networks: Linear Complexity Algorithms

By

We propose algorithms for nonparametric, sample-based spatial change detection and estimation in large scale sensor networks. We collect random samples containing the location of sensors and their local decisions, and assume that the local decisions can be 'stimulated' or 'normal', reflecting the local strength of some stimulating agent. A change in the location of the agent then manifests itself as a change in the distribution of stimulated sensors. In this paper, we aim at developing a test that, given two collections of samples, can decide whether the distribution generating the samples has changed or not, and give an estimate of the changed area if a change is indeed detected. The focus of this paper is to reduce the complexity of the detection and estimation algorithm. We propose two fast algorithms with almost linear complexity and analyze their completeness, flexibility and robustness.

“DTIC ADA431591: Change Detection And Estimation In Large Scale Sensor Networks: Linear Complexity Algorithms” Metadata:

  • Title: ➤  DTIC ADA431591: Change Detection And Estimation In Large Scale Sensor Networks: Linear Complexity Algorithms
  • Author: ➤  
  • Language: English

“DTIC ADA431591: Change Detection And Estimation In Large Scale Sensor Networks: Linear Complexity Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 4.50 MB; the files were downloaded 53 times and went public on Thu May 24 2018.

Available formats:
Abbyy GZ - Archive BitTorrent - DjVuTXT - Djvu XML - JPEG Thumb - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Processed JP2 ZIP - Text PDF - chOCR - hOCR -

Related Links:

Online Marketplaces

Find DTIC ADA431591: Change Detection And Estimation In Large Scale Sensor Networks: Linear Complexity Algorithms at online marketplaces:


18. DTIC ADA418278: Low-Complexity Interior Point Algorithms For Stochastic Programming: Derivation Analysis And Performance Evaluation

By

The broad purpose of this project was to investigate low-complexity interior point decomposition algorithms for stochastic programming. A specific objective was to evaluate algorithms using test problems arising from useful applications. The important direct results of this project include: (1) a new test problem collection that includes problem instances from a variety of application areas; (2) a new package of C-routines for converting SMPS input data into data structures more suitable for implementing algorithms; (3) a new software package, CPA, for two-stage stochastic linear programs. The test problems and input conversion routines have been developed in a general manner to be useful to other researchers. CPA includes volumetric center algorithms that proved to be successful in our computational evaluations. To the best of our knowledge, CPA is the only software for stochastic programming that includes volumetric center algorithms. Items (1), (2) and (3) are freely accessible over the Internet. The important theoretical results of this project include: (4) a new characterization of convexity-preserving maps; (5) a new coordinate-free foundation for projective spaces; (6) a new geometric characterization of one-dimensional projective spaces; (7) new algorithms for bound-constrained nonlinear optimization. These theoretical results are likely to be useful in computational optimization in general.

“DTIC ADA418278: Low-Complexity Interior Point Algorithms For Stochastic Programming: Derivation Analysis And Performance Evaluation” Metadata:

  • Title: ➤  DTIC ADA418278: Low-Complexity Interior Point Algorithms For Stochastic Programming: Derivation Analysis And Performance Evaluation
  • Author: ➤  
  • Language: English

“DTIC ADA418278: Low-Complexity Interior Point Algorithms For Stochastic Programming: Derivation Analysis And Performance Evaluation” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 30.05 MB; the files were downloaded 60 times and went public on Mon May 14 2018.

Available formats:
Abbyy GZ - Additional Text PDF - Archive BitTorrent - DjVuTXT - Djvu XML - Image Container PDF - JPEG Thumb - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Processed JP2 ZIP - chOCR - hOCR -

Related Links:

Online Marketplaces

Find DTIC ADA418278: Low-Complexity Interior Point Algorithms For Stochastic Programming: Derivation Analysis And Performance Evaluation at online marketplaces:


19. NASA Technical Reports Server (NTRS) 20020034969: The Computational Complexity, Parallel Scalability, And Performance Of Atmospheric Data Assimilation Algorithms

By

The computational complexity of algorithms for Four Dimensional Data Assimilation (4DDA) at NASA's Data Assimilation Office (DAO) is discussed. In 4DDA, observations are assimilated with the output of a dynamical model to generate best-estimates of the states of the system. It is thus a mapping problem, whereby scattered observations are converted into regular accurate maps of wind, temperature, moisture and other variables. The DAO is developing and using 4DDA algorithms that provide these datasets, or analyses, in support of Earth System Science research. Two large-scale algorithms are discussed. The first approach, the Goddard Earth Observing System Data Assimilation System (GEOS DAS), uses an atmospheric general circulation model (GCM) and an observation-space based analysis system, the Physical-space Statistical Analysis System (PSAS). GEOS DAS is very similar to global meteorological weather forecasting data assimilation systems, but is used at NASA for climate research. Systems of this size typically run at between 1 and 20 gigaflop/s. The second approach, the Kalman filter, uses a more consistent algorithm to determine the forecast error covariance matrix than does GEOS DAS. For atmospheric assimilation, the gridded dynamical fields typically have more than 10^6 variables, therefore the full error covariance matrix may be in excess of a teraword. For the Kalman filter this problem can easily scale to petaflop/s proportions. We discuss the computational complexity of GEOS DAS and our implementation of the Kalman filter. We also discuss and quantify some of the technical issues and limitations in developing efficient, in terms of wall clock time, and scalable parallel implementations of the algorithms.

“NASA Technical Reports Server (NTRS) 20020034969: The Computational Complexity, Parallel Scalability, And Performance Of Atmospheric Data Assimilation Algorithms” Metadata:

  • Title: ➤  NASA Technical Reports Server (NTRS) 20020034969: The Computational Complexity, Parallel Scalability, And Performance Of Atmospheric Data Assimilation Algorithms
  • Author: ➤  
  • Language: English

“NASA Technical Reports Server (NTRS) 20020034969: The Computational Complexity, Parallel Scalability, And Performance Of Atmospheric Data Assimilation Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 36.38 MB; the files were downloaded 65 times and went public on Wed Oct 19 2016.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find NASA Technical Reports Server (NTRS) 20020034969: The Computational Complexity, Parallel Scalability, And Performance Of Atmospheric Data Assimilation Algorithms at online marketplaces:


20. ICE: A General And Validated Energy Complexity Model For Multithreaded Algorithms

By

Like time complexity models that have significantly contributed to the analysis and development of fast algorithms, energy complexity models for parallel algorithms are desired as crucial means to develop energy efficient algorithms for ubiquitous multicore platforms. Ideal energy complexity models should be validated on real multicore platforms and applicable to a wide range of parallel algorithms. However, existing energy complexity models for parallel algorithms are either theoretical without model validation or algorithm-specific without the ability to analyze energy complexity for a wide range of parallel algorithms. This paper presents a new, general, validated energy complexity model for parallel (multithreaded) algorithms. The new model abstracts away possible multicore platforms by their static and dynamic energy of computational operations and data access, and derives the energy complexity of a given algorithm from its work, span and I/O complexity. The new model is validated by different sparse matrix vector multiplication (SpMV) algorithms and dense matrix multiplication (matmul) algorithms running on high performance computing (HPC) platforms (e.g., Intel Xeon and Xeon Phi). The new energy complexity model is able to characterize and compare the energy consumption of SpMV and matmul kernels according to three aspects: different algorithms, different input matrix types and different platforms. The prediction of the new model regarding which algorithm consumes more energy with different inputs on different platforms is confirmed by the experimental results. In order to improve the usability and accuracy of the new model for a wide range of platforms, the platform parameters of the ICE model are provided for eleven platforms, including HPC, accelerator and embedded platforms.
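To make the flavour of such a model concrete, the sketch below combines per-operation, per-data-access and static terms into a single estimate. The parameter names, units and the exact way the terms are combined are assumptions chosen for illustration only; they are not the published ICE formula or its measured platform parameters.

    def energy_estimate(work, span, io, eps_op, eps_io, static_power, freq):
        """Illustrative (hypothetical) energy estimate for a multithreaded algorithm:
        dynamic energy ~ per-operation and per-data-access costs times the work and I/O counts,
        static energy ~ static power times an idealized critical-path time (span / frequency)."""
        dynamic = eps_op * work + eps_io * io
        static = static_power * (span / freq)
        return dynamic + static

    # Example with made-up constants (Joules per op, Joules per access, Watts, ops/second)
    print(energy_estimate(work=2e9, span=3e3, io=1e8,
                          eps_op=1e-9, eps_io=2e-8, static_power=10.0, freq=2e9))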

“ICE: A General And Validated Energy Complexity Model For Multithreaded Algorithms” Metadata:

  • Title: ➤  ICE: A General And Validated Energy Complexity Model For Multithreaded Algorithms
  • Authors:

“ICE: A General And Validated Energy Complexity Model For Multithreaded Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 0.56 MB; the files were downloaded 18 times and went public on Fri Jun 29 2018.

Available formats:
Archive BitTorrent - Metadata - Text PDF -

Related Links:

Online Marketplaces

Find ICE: A General And Validated Energy Complexity Model For Multithreaded Algorithms at online marketplaces:


21. User-Base Station Association In HetSNets: Complexity And Efficient Algorithms

By

This work considers the problem of user association to small-cell base stations (SBSs) in a heterogeneous and small-cell network (HetSNet). Two optimization problems are investigated, which are maximizing the set of associated users to the SBSs (the unweighted problem) and maximizing the set of weighted associated users to the SBSs (the weighted problem), under signal-to-interference-plus-noise ratio (SINR) constraints. Both problems are formulated as linear integer programs. The weighted problem is known to be NP-hard and, in this paper, the unweighted problem is proved to be NP-hard as well. Therefore, this paper develops two heuristic polynomial-time algorithms to solve both problems. The computational complexity of the proposed algorithms is evaluated and is shown to be far more efficient than the complexity of the optimal brute-force (BF) algorithm. Moreover, the paper benchmarks the performance of the proposed algorithms against the BF algorithm, the branch-and-bound (B&B) algorithm and standard algorithms, through numerical simulations. The results demonstrate the close-to-optimal performance of the proposed algorithms. They also show that the weighted problem can be solved to provide solutions that are fair between users or to balance the load among SBSs.

“User-Base Station Association In HetSNets: Complexity And Efficient Algorithms” Metadata:

  • Title: ➤  User-Base Station Association In HetSNets: Complexity And Efficient Algorithms
  • Authors:

“User-Base Station Association In HetSNets: Complexity And Efficient Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 0.45 MB; the files were downloaded 20 times and went public on Sat Jun 30 2018.

Available formats:
Archive BitTorrent - Metadata - Text PDF -

Related Links:

Online Marketplaces

Find User-Base Station Association In HetSNets: Complexity And Efficient Algorithms at online marketplaces:


22. The Jones Polynomial: Quantum Algorithms And Applications In Quantum Complexity Theory

By

We analyze relationships between quantum computation and a family of generalizations of the Jones polynomial. Extending recent work by Aharonov et al., we give efficient quantum circuits for implementing the unitary Jones-Wenzl representations of the braid group. We use these to provide new quantum algorithms for approximately evaluating a family of specializations of the HOMFLYPT two-variable polynomial of trace closures of braids. We also give algorithms for approximating the Jones polynomial of a general class of closures of braids at roots of unity. Next we provide a self-contained proof of a result of Freedman et al. that any quantum computation can be replaced by an additive approximation of the Jones polynomial, evaluated at almost any primitive root of unity. Our proof encodes two-qubit unitaries into the rectangular representation of the eight-strand braid group. We then give QCMA-complete and PSPACE-complete problems which are based on braids. We conclude with direct proofs that evaluating the Jones polynomial of the plat closure at most primitive roots of unity is a #P-hard problem, while learning its most significant bit is PP-hard, circumventing the usual route through the Tutte polynomial and graph coloring.

“The Jones Polynomial: Quantum Algorithms And Applications In Quantum Complexity Theory” Metadata:

  • Title: ➤  The Jones Polynomial: Quantum Algorithms And Applications In Quantum Complexity Theory
  • Authors:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 17.22 MB; the files were downloaded 86 times and went public on Sun Sep 22 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find The Jones Polynomial: Quantum Algorithms And Applications In Quantum Complexity Theory at online marketplaces:


23. Path Computation In Multi-layer Networks: Complexity And Algorithms

By

Carrier-grade networks comprise several layers where different protocols coexist. Nowadays, most of these networks have different control planes to manage routing on different layers, leading to a suboptimal use of the network resources and additional operational costs. However, some routers are able to encapsulate, decapsulate and convert protocols and act as a liaison between these layers. A unified control plane would be useful to optimize the use of the network resources and automate the routing configurations. Software-Defined Networking (SDN) based architectures, such as OpenFlow, offer a chance to design such a control plane. One of the most important problems to deal with in this design is the path computation process. Classical path computation algorithms cannot resolve the problem as they do not take into account encapsulations and conversions of protocols. In this paper, we propose algorithms to solve this problem and study several cases: Path computation without bandwidth constraint, under bandwidth constraint and under other Quality of Service constraints. We study the complexity and the scalability of our algorithms and evaluate their performances on real topologies. The results show that they outperform the previous ones proposed in the literature.
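One standard way to see why classical shortest-path algorithms fall short here, and how encapsulation can be modelled, is to search over pairs (node, protocol stack) instead of plain nodes. The breadth-first sketch below is a simplified generic formulation of that idea under assumed input conventions; it is not the algorithms proposed in the paper.

    from collections import deque

    def multilayer_path(links, adapt, src, dst, start_proto, max_depth=4):
        """links: dict node -> list of (neighbor, protocol); an edge is usable only when
        `protocol` is on top of the current encapsulation stack.
        adapt: dict node -> list of ('push', proto) or ('pop', proto) capabilities
        (encapsulation pushes a protocol, decapsulation pops the matching one).
        Returns a list of (node, stack) states from src to dst, or None."""
        start = (src, (start_proto,))
        parent = {start: None}
        queue = deque([start])
        while queue:
            node, stack = queue.popleft()
            if node == dst and stack == (start_proto,):   # arrive with the original protocol
                path, state = [], (node, stack)
                while state is not None:
                    path.append(state)
                    state = parent[state]
                return path[::-1]
            successors = []
            for neigh, proto in links.get(node, []):
                if stack and stack[-1] == proto:
                    successors.append((neigh, stack))
            for action, proto in adapt.get(node, []):
                if action == 'push' and len(stack) < max_depth:
                    successors.append((node, stack + (proto,)))
                elif action == 'pop' and stack and stack[-1] == proto:
                    successors.append((node, stack[:-1]))
            for state in successors:
                if state not in parent:
                    parent[state] = (node, stack)
                    queue.append(state)
        return None

    # A reaches C only if B encapsulates the 'eth' traffic into 'ip' and C decapsulates it
    links = {'A': [('B', 'eth')], 'B': [('C', 'ip')]}
    adapt = {'B': [('push', 'ip')], 'C': [('pop', 'ip')]}
    print(multilayer_path(links, adapt, 'A', 'C', 'eth'))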

“Path Computation In Multi-layer Networks: Complexity And Algorithms” Metadata:

  • Title: ➤  Path Computation In Multi-layer Networks: Complexity And Algorithms
  • Authors:

“Path Computation In Multi-layer Networks: Complexity And Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 0.84 MB; the files were downloaded 25 times and went public on Fri Jun 29 2018.

Available formats:
Archive BitTorrent - Metadata - Text PDF -

Related Links:

Online Marketplaces

Find Path Computation In Multi-layer Networks: Complexity And Algorithms at online marketplaces:


24. Cooperative Task-oriented Computing : Algorithms And Complexity

By


“Cooperative Task-oriented Computing : Algorithms And Complexity” Metadata:

  • Title: ➤  Cooperative Task-oriented Computing : Algorithms And Complexity
  • Author:
  • Language: English

“Cooperative Task-oriented Computing : Algorithms And Complexity” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format. The file size is 460.27 MB; the files were downloaded 14 times and went public on Fri Jul 21 2023.

Available formats:
ACS Encrypted PDF - Cloth Cover Detection Log - DjVuTXT - Djvu XML - Dublin Core - EPUB - Item Tile - JPEG Thumb - JSON - LCP Encrypted EPUB - LCP Encrypted PDF - Log - MARC - MARC Binary - Metadata - OCR Page Index - OCR Search Text - PNG - Page Numbers JSON - RePublisher Final Processing Log - RePublisher Initial Processing Log - Scandata - Single Page Original JP2 Tar - Single Page Processed JP2 ZIP - Text PDF - Title Page Detection Log - chOCR - hOCR -

Related Links:

Online Marketplaces

Find Cooperative Task-oriented Computing : Algorithms And Complexity at online marketplaces:


25. Sparsity-aware Sphere Decoding: Algorithms And Complexity Analysis

By

Integer least-squares problems, concerned with solving a system of equations where the components of the unknown vector are integer-valued, arise in a wide range of applications. In many scenarios the unknown vector is sparse, i.e., a large fraction of its entries are zero. Examples include applications in wireless communications, digital fingerprinting, and array-comparative genomic hybridization systems. Sphere decoding, commonly used for solving integer least-squares problems, can utilize the knowledge about sparsity of the unknown vector to perform computationally efficient search for the solution. In this paper, we formulate and analyze the sparsity-aware sphere decoding algorithm that imposes an ℓ0-norm constraint on the admissible solution. Analytical expressions for the expected complexity of the algorithm for alphabets typical of sparse channel estimation and source allocation applications are derived and validated through extensive simulations. The results demonstrate superior performance and speed of the sparsity-aware sphere decoder compared to the conventional sparsity-unaware sphere decoding algorithm. Moreover, the variance of the complexity of the sparsity-aware sphere decoding algorithm for binary alphabets is derived. The search space of the proposed algorithm can be further reduced by imposing lower bounds on the value of the objective function. The algorithm is modified to allow for such a lower bounding technique and simulations illustrating efficacy of the method are presented. Performance of the algorithm is demonstrated in an application to sparse channel estimation, where it is shown that the sparsity-aware sphere decoder performs close to theoretical lower limits.

“Sparsity-aware Sphere Decoding: Algorithms And Complexity Analysis” Metadata:

  • Title: ➤  Sparsity-aware Sphere Decoding: Algorithms And Complexity Analysis
  • Authors:

“Sparsity-aware Sphere Decoding: Algorithms And Complexity Analysis” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 1.08 Mbs, the file-s for this book were downloaded 18 times, the file-s went public at Sat Jun 30 2018.

Available formats:
Archive BitTorrent - Metadata - Text PDF -

Related Links:

Online Marketplaces

Find Sparsity-aware Sphere Decoding: Algorithms And Complexity Analysis at online marketplaces:


26DTIC ADA1022251: Research In Complexity Theory And Combinatorial Algorithms

By

Since October 1, 1979, research in Complexity Theory and Combinatorial Algorithms at the Department of Computer Science at the University of Illinois was supported by the Office of Naval Research. During this period of time, research work was carried out in the areas of Computational Complexity Theory, Scheduling Algorithms, Graph Algorithms, Dynamic Programming, and Fault-Tolerant Computing. We summarize here our accomplishments and our future plans, and we wish to request continued support for the period of October 1, 1980 - September 30, 1982 from ONR for research in these areas. Scheduling to meet deadlines -- The problem of scheduling jobs to meet their deadlines was studied. Given a set of jobs, each of which is specified by three parameters (ready time, deadline, and computation time), we want to schedule them on a computer system so that, if possible, all deadlines will be met. Furthermore, if indeed all deadlines can be met, we want to know the possibility of completing the execution of each job so that there will be a 'slack time' between the time of completion and the deadline. In particular, the following model is used: There is a single processor in the computing system. Each job consists of an infinite stream of periodic and identical requests. A request is ready when it arrives and should be completed prior to the arrival of the next request of the same job. The execution of a job can be interrupted and resumed later on.
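For the periodic, preemptive, single-processor model described in this abstract (each request due before the next request of the same job), a standard feasibility test is the Liu and Layland utilization bound for earliest-deadline-first scheduling; the snippet below restates that classical bound for illustration and is not drawn from the report itself.

# Utilization-based feasibility check for the periodic model described above:
# with preemptive earliest-deadline-first scheduling on one processor, and each
# request due before the next request of the same job, the job set is
# schedulable iff the total utilization sum(C_i / T_i) does not exceed 1.

def edf_feasible(jobs):
    """jobs: list of (computation_time, period) pairs."""
    utilization = sum(c / t for c, t in jobs)
    return utilization <= 1.0

# Example: three periodic jobs; total utilization 0.5/2 + 1/4 + 2/8 = 0.75.
print(edf_feasible([(0.5, 2), (1, 4), (2, 8)]))  # True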

“DTIC ADA1022251: Research In Complexity Theory And Combinatorial Algorithms” Metadata:

  • Title: ➤  DTIC ADA1022251: Research In Complexity Theory And Combinatorial Algorithms
  • Author: ➤  
  • Language: English

“DTIC ADA1022251: Research In Complexity Theory And Combinatorial Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 9.11 Mbs, the file-s for this book were downloaded 51 times, the file-s went public at Sun Feb 02 2020.

Available formats:
Abbyy GZ - Archive BitTorrent - DjVuTXT - Djvu XML - Item Tile - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Processed JP2 ZIP - Text PDF - chOCR - hOCR -

Related Links:

Online Marketplaces

Find DTIC ADA1022251: Research In Complexity Theory And Combinatorial Algorithms at online marketplaces:


27DTIC ADA443743: Quantum Algorithms And Complexity For Certain Continuous And Related Discrete Problems

By

This thesis contains an analysis of two computational problems. The first problem is discrete quantum Boolean summation, which is a building block of quantum algorithms for many continuous problems, such as integration, approximation, differential equations, and path integration. The second problem is continuous multivariate Feynman-Kac path integration, which is a special case of path integration. The quantum Boolean summation problem can be solved by the quantum summation (QS) algorithm of Brassard, Hoyer, Mosca and Tapp, which approximates the arithmetic mean of a Boolean function. The author improves the error bound of Brassard et al. for the worst-probabilistic setting. The error bound is sharp. He also presents new sharp error bounds in the average-probabilistic and worst-average settings. His average-probabilistic error bounds prove the optimality of the QS algorithm for a certain choice of its parameters. The study of the worst-average error shows that the QS algorithm is not optimal in this setting; one needs to use a certain number of repetitions to regain its optimality. The multivariate Feynman-Kac path integration problem for smooth multivariate functions suffers from the provable curse of dimensionality in the worst-case deterministic setting (i.e., the minimal number of function evaluations needed to compute an approximation depends exponentially on the number of variables). He shows that, in both the randomized and quantum settings, the curse of dimensionality is vanquished (i.e., the minimal number of function evaluations and/or quantum queries required to compute an approximation depends only polynomially on the reciprocal of the desired accuracy and has a bound independent of the number of variables). The exponents of these polynomials are 2 in the randomized setting and 1 in the quantum setting. These exponents can be lowered at the expense of the dependence on the number of variables.

“DTIC ADA443743: Quantum Algorithms And Complexity For Certain Continuous And Related Discrete Problems” Metadata:

  • Title: ➤  DTIC ADA443743: Quantum Algorithms And Complexity For Certain Continuous And Related Discrete Problems
  • Author: ➤  
  • Language: English

“DTIC ADA443743: Quantum Algorithms And Complexity For Certain Continuous And Related Discrete Problems” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 46.36 Mbs, the file-s for this book were downloaded 65 times, the file-s went public at Thu May 31 2018.

Available formats:
Abbyy GZ - Archive BitTorrent - DjVuTXT - Djvu XML - JPEG Thumb - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Processed JP2 ZIP - Text PDF - chOCR - hOCR -

Related Links:

Online Marketplaces

Find DTIC ADA443743: Quantum Algorithms And Complexity For Certain Continuous And Related Discrete Problems at online marketplaces:


28DTIC ADA046860: General Theory Of Optimal Error Algorithms And Analytic Complexity. Part A. General Information Model.

By

This is the first of a series of papers constructing an information-based general theory of optimal errors and analytic computational complexity. Among the applications are such traditionally diverse areas as approximation, boundary-value problems, quadrature, and nonlinear equations in a finite- or infinite-dimensional space. Traditionally, algorithms are often derived by ad hoc criteria. The information-based theory rationalizes the synthesis of algorithms by showing how to construct algorithms which minimize or nearly minimize the error. For certain classes of problems it shows how to construct algorithms (linear optimal error algorithms) which enjoy essentially optimal complexity with respect to all possible algorithms. The existence of strongly non-computable problems is demonstrated. In contrast with the gap theorem of recursively computable functions, it is shown that every monotonic real function is the complexity of some problem.

“DTIC ADA046860: General Theory Of Optimal Error Algorithms And Analytic Complexity. Part A. General Information Model.” Metadata:

  • Title: ➤  DTIC ADA046860: General Theory Of Optimal Error Algorithms And Analytic Complexity. Part A. General Information Model.
  • Author: ➤  
  • Language: English

“DTIC ADA046860: General Theory Of Optimal Error Algorithms And Analytic Complexity. Part A. General Information Model.” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 50.48 Mbs, the file-s for this book were downloaded 78 times, the file-s went public at Thu Jan 05 2017.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - Cloth Cover Detection Log - DjVuTXT - Djvu XML - Item Tile - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Processed JP2 ZIP - Text PDF - chOCR - hOCR -

Related Links:

Online Marketplaces

Find DTIC ADA046860: General Theory Of Optimal Error Algorithms And Analytic Complexity. Part A. General Information Model. at online marketplaces:


29Representation Techniques For Relational Languages And The Worst Case Asymptotical Time Complexity Behaviour Of The Related Algorithms.

By

ADA121995

“Representation Techniques For Relational Languages And The Worst Case Asymptotical Time Complexity Behaviour Of The Related Algorithms.” Metadata:

  • Title: ➤  Representation Techniques For Relational Languages And The Worst Case Asymptotical Time Complexity Behaviour Of The Related Algorithms.
  • Author:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 298.48 Mbs, the file-s for this book were downloaded 122 times, the file-s went public at Mon Oct 05 2015.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Representation Techniques For Relational Languages And The Worst Case Asymptotical Time Complexity Behaviour Of The Related Algorithms. at online marketplaces:


30Algorithms And Complexity : Second Italian Conference, CIAC '94, Rome, Italy, February 23-25, 1994 : Proceedings

By

“Algorithms And Complexity : Second Italian Conference, CIAC '94, Rome, Italy, February 23-25, 1994 : Proceedings” Metadata:

  • Title: ➤  Algorithms And Complexity : Second Italian Conference, CIAC '94, Rome, Italy, February 23-25, 1994 : Proceedings
  • Author: ➤  
  • Language: English

“Algorithms And Complexity : Second Italian Conference, CIAC '94, Rome, Italy, February 23-25, 1994 : Proceedings” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 433.10 Mbs, the file-s for this book were downloaded 11 times, the file-s went public at Tue May 02 2023.

Available formats:
ACS Encrypted PDF - Cloth Cover Detection Log - DjVuTXT - Djvu XML - Dublin Core - Extra Metadata JSON - Item Tile - JPEG Thumb - JSON - LCP Encrypted EPUB - LCP Encrypted PDF - Log - MARC - MARC Binary - Metadata - Metadata Log - OCR Page Index - OCR Search Text - PNG - Page Numbers JSON - RePublisher Final Processing Log - RePublisher Initial Processing Log - Scandata - Single Page Original JP2 Tar - Single Page Processed JP2 ZIP - Text PDF - Title Page Detection Log - chOCR - hOCR -

Related Links:

Online Marketplaces

Find Algorithms And Complexity : Second Italian Conference, CIAC '94, Rome, Italy, February 23-25, 1994 : Proceedings at online marketplaces:


31DTIC ADA442586: Quantum Complexity, Algorithms, And Primitives

By

The project undertook theoretical research in quantum algorithms, complexity of quantum computation, quantum primitives, and quantum communication protocols. In the area of complexity, it compared quantum computation models with classical ones, finding counting complexity classes between BQP and AWPP that are likely different from both. It investigated small-depth quantum circuits (both with and without unbounded fan-in gates such as quantum AND) and found lower and upper bounds on their power and complexity. In the area of new quantum primitives, the project found Hamiltonians for the quantum fan-out gate, based on spin-exchange interactions. In the area of quantum algorithms, the project showed that there are efficient quantum algorithms for various group theoretic problems, for example, group intersection and double coset membership for certain classes of solvable groups. It also found a network of efficient quantum reducibilities between these and other group-theoretic problems. These are the project's successes. The project was unsuccessful in some endeavors. It has so far failed to find natural problems in these intermediate classes between BQP and AWPP, or to isolate the more robust classes among these. It did not find further evidence that BQP does not contain NP. There was no significant progress on quantum communication protocols.

“DTIC ADA442586: Quantum Complexity, Algorithms, And Primitives” Metadata:

  • Title: ➤  DTIC ADA442586: Quantum Complexity, Algorithms, And Primitives
  • Author: ➤  
  • Language: English

“DTIC ADA442586: Quantum Complexity, Algorithms, And Primitives” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 9.72 Mbs, the file-s for this book were downloaded 41 times, the file-s went public at Tue May 29 2018.

Available formats:
Abbyy GZ - Archive BitTorrent - DjVuTXT - Djvu XML - JPEG Thumb - Metadata - OCR Page Index - OCR Search Text - Page Numbers JSON - Scandata - Single Page Processed JP2 ZIP - Text PDF - chOCR - hOCR -

Related Links:

Online Marketplaces

Find DTIC ADA442586: Quantum Complexity, Algorithms, And Primitives at online marketplaces:


32Algorithms : Their Complexity And Efficiency

By

“Algorithms : Their Complexity And Efficiency” Metadata:

  • Title: ➤  Algorithms : Their Complexity And Efficiency
  • Author:
  • Language: English

“Algorithms : Their Complexity And Efficiency” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 572.38 Mbs, the file-s for this book were downloaded 29 times, the file-s went public at Tue Mar 30 2021.

Available formats:
ACS Encrypted PDF - Cloth Cover Detection Log - DjVuTXT - Djvu XML - Dublin Core - Item Tile - JPEG Thumb - JSON - LCP Encrypted EPUB - LCP Encrypted PDF - Log - MARC - MARC Binary - Metadata - OCR Page Index - OCR Search Text - PNG - Page Numbers JSON - Scandata - Single Page Original JP2 Tar - Single Page Processed JP2 ZIP - Text PDF - Title Page Detection Log - chOCR - hOCR -

Related Links:

Online Marketplaces

Find Algorithms : Their Complexity And Efficiency at online marketplaces:


33NASA Technical Reports Server (NTRS) 20090034945: Trajectory-Oriented Approach To Managing Traffic Complexity: Trajectory Flexibility Metrics And Algorithms And Preliminary Complexity Impact Assessment

By

This document describes exploratory research on a distributed, trajectory oriented approach for traffic complexity management. The approach is to manage traffic complexity based on preserving trajectory flexibility and minimizing constraints. In particular, the document presents metrics for trajectory flexibility; a method for estimating these metrics based on discrete time and degree of freedom assumptions; a planning algorithm using these metrics to preserve flexibility; and preliminary experiments testing the impact of preserving trajectory flexibility on traffic complexity. The document also describes an early demonstration capability of the trajectory flexibility preservation function in the NASA Autonomous Operations Planner (AOP) platform.

“NASA Technical Reports Server (NTRS) 20090034945: Trajectory-Oriented Approach To Managing Traffic Complexity: Trajectory Flexibility Metrics And Algorithms And Preliminary Complexity Impact Assessment” Metadata:

  • Title: ➤  NASA Technical Reports Server (NTRS) 20090034945: Trajectory-Oriented Approach To Managing Traffic Complexity: Trajectory Flexibility Metrics And Algorithms And Preliminary Complexity Impact Assessment
  • Author: ➤  
  • Language: English

“NASA Technical Reports Server (NTRS) 20090034945: Trajectory-Oriented Approach To Managing Traffic Complexity: Trajectory Flexibility Metrics And Algorithms And Preliminary Complexity Impact Assessment” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 56.99 Mbs, the file-s for this book were downloaded 56 times, the file-s went public at Fri Nov 04 2016.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find NASA Technical Reports Server (NTRS) 20090034945: Trajectory-Oriented Approach To Managing Traffic Complexity: Trajectory Flexibility Metrics And Algorithms And Preliminary Complexity Impact Assessment at online marketplaces:


34Minimum Degree Up To Local Complementation: Bounds, Parameterized Complexity, And Exact Algorithms

By

The local minimum degree of a graph is the minimum degree that can be reached by means of local complementation. For any n, there exist graphs of order n whose local minimum degree is at least 0.189n, or at least 0.110n when restricted to bipartite graphs. Regarding the upper bound, we show that for any graph of order n, its local minimum degree is at most 3n/8 + o(n), and at most n/4 + o(n) for bipartite graphs, improving the known n/2 upper bound. We also prove that the local minimum degree is smaller than half of the vertex cover number (up to a logarithmic term). The local minimum degree problem is NP-complete and hard to approximate. We show that this problem, even when restricted to bipartite graphs, is in W[2] and FPT-equivalent to the EvenSet problem, whose W[1]-hardness is a long-standing open question. Finally, we show that the local minimum degree can be computed by an O*(1.938^n)-time algorithm, and by an O*(1.466^n)-time algorithm for bipartite graphs.
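As a point of reference, a brute-force check is straightforward for very small graphs, assuming the definition quoted above (the smallest minimum degree reachable by sequences of local complementations); the sketch below explores the whole orbit exhaustively and is emphatically not the O*(1.938^n) or O*(1.466^n) algorithm of the paper.

# Brute-force orbit exploration for tiny graphs.  Local complementation at v
# complements the subgraph induced by the neighborhood of v; we take the
# minimum, over all reachable graphs, of the minimum degree.
from itertools import combinations

def local_complement(edges, v, vertices):
    nbrs = {u for u in vertices if frozenset((u, v)) in edges}
    toggled = {frozenset(p) for p in combinations(sorted(nbrs), 2)}
    return frozenset(edges ^ toggled)        # XOR flips edges inside N(v)

def local_min_degree(vertices, edge_list):
    start = frozenset(frozenset(e) for e in edge_list)
    seen, frontier, best = {start}, [start], len(vertices)
    while frontier:
        edges = frontier.pop()
        degrees = [sum(frozenset((u, v)) in edges for u in vertices if u != v)
                   for v in vertices]
        best = min(best, min(degrees))
        for v in vertices:
            g2 = local_complement(edges, v, vertices)
            if g2 not in seen:
                seen.add(g2)
                frontier.append(g2)
    return best

# Example: the 5-cycle C5.
print(local_min_degree(range(5), [(i, (i + 1) % 5) for i in range(5)]))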

“Minimum Degree Up To Local Complementation: Bounds, Parameterized Complexity, And Exact Algorithms” Metadata:

  • Title: ➤  Minimum Degree Up To Local Complementation: Bounds, Parameterized Complexity, And Exact Algorithms
  • Authors:
  • Language: English

“Minimum Degree Up To Local Complementation: Bounds, Parameterized Complexity, And Exact Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 5.71 Mbs, the file-s for this book were downloaded 43 times, the file-s went public at Wed Jun 27 2018.

Available formats:
Abbyy GZ - Archive BitTorrent - DjVuTXT - Djvu XML - JPEG Thumb - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Minimum Degree Up To Local Complementation: Bounds, Parameterized Complexity, And Exact Algorithms at online marketplaces:


35Idempotent And Tropical Mathematics. Complexity Of Algorithms And Interval Analysis

By

A very brief introduction to tropical and idempotent mathematics is presented. Tropical mathematics can be treated as the result of a dequantization of traditional mathematics as the Planck constant tends to zero while taking imaginary values. In the framework of idempotent mathematics, constructions and algorithms are usually simpler than their traditional analogs. We especially examine algorithms of tropical/idempotent mathematics generated by a collection of basic semiring (or semifield) operations and other "good" operations. Every algorithm of this type has an interval version. The complexity of this interval version coincides with the complexity of the initial algorithm. The interval version of an algorithm of this type gives exact interval estimates for the corresponding output data. Algorithms of linear algebra over idempotent semirings are examined. In this case, basic algorithms are polynomial, as are their interval versions. This situation is very different from traditional linear algebra, where basic algorithms are polynomial but the corresponding interval versions are NP-hard and interval estimates are not exact.
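A small, self-contained illustration of the "basic semiring operations" flavor is the (min, +) tropical semiring, where matrix powering computes shortest paths; the example below is generic textbook material, not code from the paper.

# Tropical (min,+) matrix product: the same code would work verbatim with
# interval entries if min and + were replaced by their interval versions.
INF = float("inf")

def tropical_matmul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    return [[min(A[i][k] + B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

# Edge-weight matrix of a small digraph (INF = no edge, zero diagonal).
# Squaring twice gives cheapest walks of at most 4 edges, i.e., all-pairs
# shortest paths for this 4-node graph.
W = [[0,   3,  INF, 7],
     [8,   0,   2, INF],
     [5, INF,   0,   1],
     [2, INF, INF,   0]]
W2 = tropical_matmul(W, W)
W4 = tropical_matmul(W2, W2)
print(W4)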

“Idempotent And Tropical Mathematics. Complexity Of Algorithms And Interval Analysis” Metadata:

  • Title: ➤  Idempotent And Tropical Mathematics. Complexity Of Algorithms And Interval Analysis
  • Author:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 13.07 Mbs, the file-s for this book were downloaded 84 times, the file-s went public at Wed Sep 18 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Idempotent And Tropical Mathematics. Complexity Of Algorithms And Interval Analysis at online marketplaces:


36Coordinate Descent With Arbitrary Sampling I: Algorithms And Complexity

By

We study the problem of minimizing the sum of a smooth convex function and a convex block-separable regularizer and propose a new randomized coordinate descent method, which we call ALPHA. At every iteration, our method updates a random subset of coordinates drawn from an arbitrary distribution. No coordinate descent method capable of handling an arbitrary sampling had previously been studied in the literature for this problem. ALPHA is a remarkably flexible algorithm: in special cases, it reduces to deterministic and randomized methods such as gradient descent, coordinate descent, parallel coordinate descent and distributed coordinate descent -- in both nonaccelerated and accelerated variants. The variants with arbitrary (or importance) sampling are new. We provide a complexity analysis of ALPHA, from which we deduce as a direct corollary complexity bounds for its many variants, all matching or improving the best known bounds.
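For orientation, the snippet below shows one of the simplest special cases the abstract mentions (serial randomized coordinate descent with an importance-sampling law proportional to the coordinate Lipschitz constants, on a smooth quadratic); it is a hedged sketch of the family, not the ALPHA method itself, and the test problem is invented.

# Serial randomized coordinate descent on f(x) = 0.5 x^T A x - b^T x, sampling
# coordinate i with probability proportional to L_i = A[i, i].
import numpy as np

def coordinate_descent(A, b, iters=5000, rng=np.random.default_rng(0)):
    n = len(b)
    L = np.diag(A).copy()               # coordinate-wise Lipschitz constants
    probs = L / L.sum()                 # importance sampling law
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(n, p=probs)
        grad_i = A[i] @ x - b[i]        # i-th partial derivative
        x[i] -= grad_i / L[i]           # exact minimization along coordinate i
    return x

# Example on a random positive-definite quadratic.
rng = np.random.default_rng(1)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)
b = rng.standard_normal(20)
x = coordinate_descent(A, b)
print(np.linalg.norm(A @ x - b))        # residual shrinks with more iterations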

“Coordinate Descent With Arbitrary Sampling I: Algorithms And Complexity” Metadata:

  • Title: ➤  Coordinate Descent With Arbitrary Sampling I: Algorithms And Complexity
  • Authors:

“Coordinate Descent With Arbitrary Sampling I: Algorithms And Complexity” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 0.39 Mbs, the file-s for this book were downloaded 14 times, the file-s went public at Sat Jun 30 2018.

Available formats:
Archive BitTorrent - Metadata - Text PDF -

Related Links:

Online Marketplaces

Find Coordinate Descent With Arbitrary Sampling I: Algorithms And Complexity at online marketplaces:


37Can Everything Be Computed? - On The Solvability Complexity Index And Towers Of Algorithms

By

This paper establishes some of the fundamental barriers in the theory of computations and finally settles the long-standing computational spectral problem. Due to these barriers, there are problems at the heart of computational theory that do not fit into classical complexity theory. Many computational problems can be solved as follows: a sequence of approximations is created by an algorithm, and the solution to the problem is the limit of this sequence. However, as we demonstrate, for several basic problems in computations (computing spectra of operators, inverse problems or roots of polynomials using rational maps) such a procedure based on one limit is impossible. Yet, one can compute solutions to these problems, but only by using several limits. This may come as a surprise; however, it touches on the boundaries of computational mathematics. To analyze this phenomenon we use the Solvability Complexity Index (SCI). The SCI is the smallest number of limits needed in the computation. We show that the SCI of spectra and essential spectra of operators is equal to three, and that the SCI of spectra of self-adjoint operators is equal to two, thus providing the lower bound barriers and the first algorithms to compute such spectra in two and three limits. This finally settles the long-standing computational spectral problem. In addition, we provide bounds for the SCI of spectra of classes of Schr\"{o}dinger operators; thus we affirmatively answer the long-standing question of whether or not these spectra can actually be computed. The SCI yields a framework for understanding barriers in computations. It has a direct link to the Arithmetical Hierarchy, and we demonstrate how the impossibility result of McMullen on polynomial root finding with rational maps in one limit and the results of Doyle and McMullen on solving the quintic in several limits can be put in the SCI framework.

“Can Everything Be Computed? - On The Solvability Complexity Index And Towers Of Algorithms” Metadata:

  • Title: ➤  Can Everything Be Computed? - On The Solvability Complexity Index And Towers Of Algorithms
  • Authors:
  • Language: English

“Can Everything Be Computed? - On The Solvability Complexity Index And Towers Of Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 42.15 Mbs, the file-s for this book were downloaded 42 times, the file-s went public at Thu Jun 28 2018.

Available formats:
Abbyy GZ - Archive BitTorrent - DjVuTXT - Djvu XML - JPEG Thumb - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Can Everything Be Computed? - On The Solvability Complexity Index And Towers Of Algorithms at online marketplaces:


38Software Radio Architecture With Smart Antennas A Tutorial On Algorithms And Complexity

Software Radio Architecture With Smart Antennas A Tutorial On Algorithms And Complexity

“Software Radio Architecture With Smart Antennas A Tutorial On Algorithms And Complexity” Metadata:

  • Title: ➤  Software Radio Architecture With Smart Antennas A Tutorial On Algorithms And Complexity
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 12.48 Mbs, the file-s for this book were downloaded 379 times, the file-s went public at Tue Feb 09 2016.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - JPEG Thumb - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Software Radio Architecture With Smart Antennas A Tutorial On Algorithms And Complexity at online marketplaces:


39Probabilistic Robustness Analysis -- Risks, Complexity And Algorithms

By

It is becoming increasingly apparent that probabilistic approaches can overcome conservatism and computational complexity of the classical worst-case deterministic framework and may lead to designs that are actually safer. In this paper we argue that a comprehensive probabilistic robustness analysis requires a detailed evaluation of the robustness function and we show that such evaluation can be performed with essentially any desired accuracy and confidence using algorithms with complexity linear in the dimension of the uncertainty space. Moreover, we show that the average memory requirements of such algorithms are absolutely bounded and well within the capabilities of today's computers. In addition to efficiency, our approach permits control over statistical sampling error and the error due to discretization of the uncertainty radius. For a specific level of tolerance of the discretization error, our techniques provide an efficiency improvement upon conventional methods which is inversely proportional to the accuracy level; i.e., our algorithms get better as the demands for accuracy increase.
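The sampling viewpoint behind such analyses can be sketched as follows, assuming a Hoeffding/Chernoff choice of the sample size N >= ln(2/delta) / (2 eps^2); the stability specification and the uniform uncertainty box below are illustrative stand-ins, not the paper's setup.

# Monte Carlo estimate of the probability that a robustness spec holds over a
# random uncertainty, with additive accuracy eps and confidence 1 - delta.
import math
import numpy as np

def estimate_robustness(spec_holds, sample_uncertainty, eps=0.01, delta=1e-3,
                        rng=np.random.default_rng(0)):
    n = math.ceil(math.log(2 / delta) / (2 * eps**2))   # Hoeffding sample size
    hits = sum(spec_holds(sample_uncertainty(rng)) for _ in range(n))
    return hits / n, n

# Toy example: "robust" means a perturbed cubic stays stable (all roots in the
# open left half-plane); the uncertainty is a uniform box on the coefficients.
def sample_uncertainty(rng):
    return rng.uniform(-2.0, 2.0, size=3)

def spec_holds(q):
    roots = np.roots([1.0, 3.0 + q[0], 3.0 + q[1], 1.0 + q[2]])
    return bool(np.all(roots.real < 0))

p_hat, n = estimate_robustness(spec_holds, sample_uncertainty)
print(f"estimated robustness {p_hat:.3f} from {n} samples")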

“Probabilistic Robustness Analysis -- Risks, Complexity And Algorithms” Metadata:

  • Title: ➤  Probabilistic Robustness Analysis -- Risks, Complexity And Algorithms
  • Authors:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 11.36 Mbs, the file-s for this book were downloaded 69 times, the file-s went public at Wed Sep 18 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Probabilistic Robustness Analysis -- Risks, Complexity And Algorithms at online marketplaces:


40Asynchronous Parallel Algorithms For Nonconvex Big-Data Optimization. Part II: Complexity And Numerical Results

By

We present complexity and numerical results for a new asynchronous parallel algorithmic method for the minimization of the sum of a smooth nonconvex function and a convex nonsmooth regularizer, subject to both convex and nonconvex constraints. The proposed method hinges on successive convex approximation techniques and a novel probabilistic model that captures key elements of modern computational architectures and asynchronous implementations in a more faithful way than state-of-the-art models. In the companion paper we provided a detailed description of the probabilistic model and gave convergence results for a diminishing-stepsize version of our method. Here, we provide theoretical complexity results for a fixed-stepsize version of the method and report extensive numerical comparisons on both convex and nonconvex problems, demonstrating the efficiency of our approach.

“Asynchronous Parallel Algorithms For Nonconvex Big-Data Optimization. Part II: Complexity And Numerical Results” Metadata:

  • Title: ➤  Asynchronous Parallel Algorithms For Nonconvex Big-Data Optimization. Part II: Complexity And Numerical Results
  • Authors:

“Asynchronous Parallel Algorithms For Nonconvex Big-Data Optimization. Part II: Complexity And Numerical Results” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 0.63 Mbs, the file-s for this book were downloaded 27 times, the file-s went public at Sat Jun 30 2018.

Available formats:
Archive BitTorrent - Metadata - Text PDF -

Related Links:

Online Marketplaces

Find Asynchronous Parallel Algorithms For Nonconvex Big-Data Optimization. Part II: Complexity And Numerical Results at online marketplaces:


41Matchings With Lower Quotas: Algorithms And Complexity

By

We study a natural generalization of the maximum weight many-to-one matching problem. We are given an undirected bipartite graph $G= (A \cup P, E)$ with weights on the edges in $E$, and with lower and upper quotas on the vertices in $P$. We seek a maximum weight many-to-one matching satisfying two sets of constraints: vertices in $A$ are incident to at most one matching edge, while vertices in $P$ are either unmatched or they are incident to a number of matching edges between their lower and upper quota. This problem, which we call maximum weight many-to-one matching with lower and upper quotas (WMLQ), has applications to the assignment of students to projects within university courses, where there are constraints on the minimum and maximum numbers of students that must be assigned to each project. In this paper, we provide a comprehensive analysis of the complexity of WMLQ from the viewpoints of classic polynomial time algorithms, fixed-parameter tractability, as well as approximability. We draw the line between NP-hard and polynomially tractable instances in terms of degree and quota constraints and provide efficient algorithms to solve the tractable ones. We further show that the problem can be solved in polynomial time for instances with bounded treewidth; however, the corresponding runtime is exponential in the treewidth with the maximum upper quota $u_{\max}$ as basis, and we prove that this dependence is necessary unless FPT = W[1]. The approximability of WMLQ is also discussed: we present an approximation algorithm for the general case with performance guarantee $u_{\max}+1$, which is asymptotically best possible unless P = NP. Finally, we elaborate on how most of our positive results carry over to matchings in arbitrary graphs with lower quotas.
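To fix the problem statement, the following brute-force reference solver (exponential in |A|, usable only for tiny sanity checks and not one of the paper's algorithms) enumerates all assignments and enforces the lower/upper quotas directly; the example instance is invented.

# Tiny brute-force WMLQ solver: every vertex in A is matched to at most one
# project or left unmatched, and every project is either unused or receives a
# number of vertices between its lower and upper quota; maximize total weight.
from itertools import product

def wmlq_brute_force(A, P, weights, lower, upper):
    """weights: dict (a, p) -> weight, defined for admissible pairs only."""
    best = (0.0, {})
    choices = [[None] + [p for p in P if (a, p) in weights] for a in A]
    for assignment in product(*choices):
        load = {p: 0 for p in P}
        for p in assignment:
            if p is not None:
                load[p] += 1
        if any(0 < load[p] < lower[p] or load[p] > upper[p] for p in P):
            continue                       # a used project violates its quotas
        w = sum(weights[(a, p)] for a, p in zip(A, assignment) if p is not None)
        if w > best[0]:
            best = (w, dict(zip(A, assignment)))
    return best

# Example: two projects with lower quota 2; only one can be opened.
A = ["a1", "a2", "a3"]
P = ["p1", "p2"]
weights = {("a1", "p1"): 3, ("a2", "p1"): 2, ("a2", "p2"): 4, ("a3", "p2"): 4}
print(wmlq_brute_force(A, P, weights,
                       lower={"p1": 2, "p2": 2}, upper={"p1": 2, "p2": 2}))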

“Matchings With Lower Quotas: Algorithms And Complexity” Metadata:

  • Title: ➤  Matchings With Lower Quotas: Algorithms And Complexity
  • Authors:

“Matchings With Lower Quotas: Algorithms And Complexity” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 0.31 Mbs, the file-s for this book were downloaded 21 times, the file-s went public at Sat Jun 30 2018.

Available formats:
Archive BitTorrent - Metadata - Text PDF -

Related Links:

Online Marketplaces

Find Matchings With Lower Quotas: Algorithms And Complexity at online marketplaces:


42On Practical Algorithms For Entropy Estimation And The Improved Sample Complexity Of Compressed Counting

By

Estimating the p-th frequency moment of a data stream is a very heavily studied problem. The problem is actually trivial when p = 1, assuming the strict Turnstile model. The sample complexity of our proposed algorithm is essentially O(1) near p = 1. This is a very large improvement over the previously believed O(1/eps^2) bound. The proposed algorithm makes the long-standing problem of entropy estimation an easy task, as verified by the experiments included in the appendix.
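One way to see the connection between frequency moments near p = 1 and entropy is through the Renyi entropy, which tends to the Shannon entropy as p -> 1; the sketch below illustrates this with exact counts standing in for the streaming estimates that a Compressed Counting-style algorithm would supply, and the toy stream is invented.

# Shannon entropy versus the moment-based Renyi approximation
# H_p = log(F_p / F_1^p) / (1 - p), where F_p = sum_i f_i^p.
import math
from collections import Counter

def frequency_moment(counts, p):
    return sum(f**p for f in counts)

def renyi_entropy(counts, p):
    f1 = frequency_moment(counts, 1)
    return math.log(frequency_moment(counts, p) / f1**p) / (1 - p)

def shannon_entropy(counts):
    f1 = sum(counts)
    return -sum((f / f1) * math.log(f / f1) for f in counts)

stream = list("abracadabra") * 50 + list("zzzyx") * 7
counts = list(Counter(stream).values())
print("Shannon        :", shannon_entropy(counts))
print("Renyi, p = 1.05:", renyi_entropy(counts, 1.05))
print("Renyi, p = 0.95:", renyi_entropy(counts, 0.95))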

“On Practical Algorithms For Entropy Estimation And The Improved Sample Complexity Of Compressed Counting” Metadata:

  • Title: ➤  On Practical Algorithms For Entropy Estimation And The Improved Sample Complexity Of Compressed Counting
  • Author:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 10.87 Mbs, the file-s for this book were downloaded 63 times, the file-s went public at Sun Sep 22 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find On Practical Algorithms For Entropy Estimation And The Improved Sample Complexity Of Compressed Counting at online marketplaces:


43Complexity And Algorithms For Euler Characteristic Of Simplicial Complexes

By

We consider the problem of computing the Euler characteristic of an abstract simplicial complex given by its vertices and facets. We show that this problem is #P-complete and present two new practical algorithms for computing Euler characteristic. The two new algorithms are derived using combinatorial commutative algebra and we also give a second description of them that requires no algebra. We present experiments showing that the two new algorithms can be implemented to be faster than previous Euler characteristic implementations by a large margin.
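For concreteness, the exponential-time baseline implicit in the problem statement simply enumerates every face from the facets and takes the alternating sum; the sketch below does exactly that for tiny complexes and is not one of the paper's two algorithms.

# Naive Euler characteristic from a facet description:
# chi = f_0 - f_1 + f_2 - ..., where f_k counts faces with k+1 vertices.
from itertools import combinations

def euler_characteristic(facets):
    faces = set()
    for facet in facets:
        for k in range(1, len(facet) + 1):
            faces.update(frozenset(c) for c in combinations(facet, k))
    return sum((-1) ** (len(face) - 1) for face in faces)

# Example: the hollow tetrahedron (boundary of a 3-simplex): 4 - 6 + 4 = 2.
facets = [frozenset(s) for s in combinations(range(4), 3)]
print(euler_characteristic(facets))   # 2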

“Complexity And Algorithms For Euler Characteristic Of Simplicial Complexes” Metadata:

  • Title: ➤  Complexity And Algorithms For Euler Characteristic Of Simplicial Complexes
  • Authors:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 16.08 Mbs, the file-s for this book were downloaded 87 times, the file-s went public at Tue Sep 24 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Complexity And Algorithms For Euler Characteristic Of Simplicial Complexes at online marketplaces:


44On Cooperative Patrolling: Optimal Trajectories, Complexity Analysis, And Approximation Algorithms

By

The subject of this work is the patrolling of an environment with the aid of a team of autonomous agents. We consider both the design of open-loop trajectories with optimal properties, and of distributed control laws converging to optimal trajectories. As performance criteria, the refresh time and the latency are considered, i.e., respectively, the time gap between any two visits of the same region, and the time necessary to inform every agent about an event that occurred in the environment. We associate a graph with the environment, and we study separately the cases of a chain, a tree, and a cyclic graph. For the case of a chain graph, we first describe a minimum refresh time and latency team trajectory, and we propose a polynomial time algorithm for its computation. Then, we describe a distributed procedure that steers the robots toward an optimal trajectory. For the case of a tree graph, a polynomial time algorithm is developed for the minimum refresh time problem, under the technical assumption of a constant number of robots involved in the patrolling task. Finally, we show that the design of a minimum refresh time trajectory for a cyclic graph is NP-hard, and we develop a constant factor approximation algorithm.

“On Cooperative Patrolling: Optimal Trajectories, Complexity Analysis, And Approximation Algorithms” Metadata:

  • Title: ➤  On Cooperative Patrolling: Optimal Trajectories, Complexity Analysis, And Approximation Algorithms
  • Authors:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 18.88 Mbs, the file-s for this book were downloaded 66 times, the file-s went public at Sun Sep 22 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - JPEG Thumb - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find On Cooperative Patrolling: Optimal Trajectories, Complexity Analysis, And Approximation Algorithms at online marketplaces:


45Convergence Radius And Sample Complexity Of ITKM Algorithms For Dictionary Learning

By

In this work we show that iterative thresholding and K-means (ITKM) algorithms can recover a generating dictionary with K atoms from noisy $S$-sparse signals up to an error $\tilde \varepsilon$, as long as the initialisation is within a convergence radius, that is, up to a $\log K$ factor inversely proportional to the dynamic range of the signals, and the sample size is proportional to $K \log K \tilde \varepsilon^{-2}$. The results are valid for arbitrary target errors if the sparsity level is of the order of the square root of the signal dimension $d$, and for target errors down to $K^{-\ell}$ if $S$ scales as $S \leq d/(\ell \log K)$.

“Convergence Radius And Sample Complexity Of ITKM Algorithms For Dictionary Learning” Metadata:

  • Title: ➤  Convergence Radius And Sample Complexity Of ITKM Algorithms For Dictionary Learning
  • Author:
  • Language: English

“Convergence Radius And Sample Complexity Of ITKM Algorithms For Dictionary Learning” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 21.30 Mbs, the file-s for this book were downloaded 36 times, the file-s went public at Wed Jun 27 2018.

Available formats:
Abbyy GZ - Archive BitTorrent - DjVuTXT - Djvu XML - JPEG Thumb - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Convergence Radius And Sample Complexity Of ITKM Algorithms For Dictionary Learning at online marketplaces:


46Complexity Results And Practical Algorithms For Logics In Knowledge Representation

By

Description Logics (DLs) are used in knowledge-based systems to represent and reason about terminological knowledge of the application domain in a semantically well-defined manner. In this thesis, we establish a number of novel complexity results and give practical algorithms for expressive DLs that provide different forms of counting quantifiers. We show that, in many cases, adding local counting in the form of qualifying number restrictions to DLs does not increase the complexity of the inference problems, even if binary coding of numbers in the input is assumed. On the other hand, we show that adding different forms of global counting restrictions to a logic may increase the complexity of the inference problems dramatically. We provide exact complexity results and a practical, tableau based algorithm for the DL SHIQ, which forms the basis of the highly optimized DL system iFaCT. Finally, we describe a tableau algorithm for the clique guarded fragment (CGF), which we hope will serve as the basis for an efficient implementation of a CGF reasoner.

“Complexity Results And Practical Algorithms For Logics In Knowledge Representation” Metadata:

  • Title: ➤  Complexity Results And Practical Algorithms For Logics In Knowledge Representation
  • Author:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 105.07 Mbs, the file-s for this book were downloaded 133 times, the file-s went public at Sun Sep 22 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Complexity Results And Practical Algorithms For Logics In Knowledge Representation at online marketplaces:


47Near-Optimal Sensor Scheduling For Batch State Estimation: Complexity, Algorithms, And Limits

By

In this paper, we focus on batch state estimation for linear systems. This problem is important in applications such as environmental field estimation, robotic navigation, and target tracking. Its difficulty lies in the fact that limited operational resources among the sensors, e.g., shared communication bandwidth or battery power, constrain the number of sensors that can be active at each measurement step. As a result, sensor scheduling algorithms must be employed. Notwithstanding, current sensor scheduling algorithms for batch state estimation scale poorly with the system size and the time horizon. In addition, current sensor scheduling algorithms for Kalman filtering, although they scale better, provide no performance guarantees or approximation bounds for the minimization of the batch state estimation error. In this paper, one of our main contributions is to provide an algorithm that enjoys both the estimation accuracy of the batch state scheduling algorithms and the low time complexity of the Kalman filtering scheduling algorithms. In particular: 1) our algorithm is near-optimal: it achieves a solution up to a multiplicative factor 1/2 from the optimal solution, and this factor is close to the best approximation factor 1/e one can achieve in polynomial time for this problem; 2) our algorithm has (polynomial) time complexity that is not only lower than that of the current algorithms for batch state estimation; it is also lower than, or similar to, that of the current algorithms for Kalman filtering. We achieve these results by proving two properties of our batch state estimation error metric, which quantifies the squared error of the minimum variance linear estimator of the batch state vector: a) it is supermodular in the choice of the sensors; b) it has a sparsity pattern (it involves matrices that are block tri-diagonal) that facilitates its evaluation at each sensor set.
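The greedy selection pattern for which such guarantees are typically proved can be sketched as follows; the log-determinant score used here is a common surrogate chosen for illustration, not the paper's batch-error metric, and the measurement model is invented.

# Greedy sensor selection under a per-step budget: repeatedly pick the sensor
# whose (unit-variance) measurement most increases the log-determinant of the
# information matrix.
import numpy as np

def greedy_schedule(C_list, budget, prior_info):
    """C_list: list of 1 x n measurement rows, one per candidate sensor."""
    info = prior_info.copy()
    chosen = []
    remaining = set(range(len(C_list)))
    for _ in range(budget):
        def gain(i):
            cand = info + C_list[i].T @ C_list[i]
            return np.linalg.slogdet(cand)[1] - np.linalg.slogdet(info)[1]
        best = max(remaining, key=gain)
        chosen.append(best)
        info = info + C_list[best].T @ C_list[best]
        remaining.remove(best)
    return chosen, info

# Example: 6 candidate sensors observing a 3-dimensional state.
rng = np.random.default_rng(0)
C_list = [rng.standard_normal((1, 3)) for _ in range(6)]
chosen, info = greedy_schedule(C_list, budget=3, prior_info=0.1 * np.eye(3))
print("selected sensors:", chosen)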

“Near-Optimal Sensor Scheduling For Batch State Estimation: Complexity, Algorithms, And Limits” Metadata:

  • Title: ➤  Near-Optimal Sensor Scheduling For Batch State Estimation: Complexity, Algorithms, And Limits
  • Authors:

“Near-Optimal Sensor Scheduling For Batch State Estimation: Complexity, Algorithms, And Limits” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 0.23 Mbs, the file-s for this book were downloaded 26 times, the file-s went public at Fri Jun 29 2018.

Available formats:
Archive BitTorrent - Metadata - Text PDF -

Related Links:

Online Marketplaces

Find Near-Optimal Sensor Scheduling For Batch State Estimation: Complexity, Algorithms, And Limits at online marketplaces:


48Query Evaluation In P2P Systems Of Taxonomy-based Sources: Algorithms, Complexity, And Optimizations

By

In this study, we address the problem of answering queries over a peer-to-peer system of taxonomy-based sources. A taxonomy states subsumption relationships between negation-free DNF formulas on terms and negation-free conjunctions of terms. To lay the foundations of our study, we first consider the centralized case, deriving the complexity of the decision problem and of query evaluation. We conclude by presenting an algorithm that is efficient in data complexity and is based on hypergraphs. More expressive forms of taxonomies are also investigated, which however lead to intractability. We then move to the distributed case, and introduce a logical model of a network of taxonomy-based sources. On such a network, a distributed version of the centralized algorithm is then presented, based on a message passing paradigm, and its correctness is proved. We finally discuss optimization issues, and relate our work to the literature.

“Query Evaluation In P2P Systems Of Taxonomy-based Sources: Algorithms, Complexity, And Optimizations” Metadata:

  • Title: ➤  Query Evaluation In P2P Systems Of Taxonomy-based Sources: Algorithms, Complexity, And Optimizations
  • Authors:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 20.59 Mbs, the file-s for this book were downloaded 92 times, the file-s went public at Sat Sep 21 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Query Evaluation In P2P Systems Of Taxonomy-based Sources: Algorithms, Complexity, And Optimizations at online marketplaces:


49Microsoft Research Video 104292: Dispersion Of Mass And The Complexity Of Randomized Algorithms

By

How much can randomness help computation? Motivated by this general question and by volume computation, one of the few instances where randomness probably helps, we analyze a notion of dispersion and connect it to asymptotic convex geometry. We obtain a nearly quadratic lower bound on the complexity of randomized volume algorithms for convex bodies in R^n (the current best algorithm has complexity roughly n^4 and is conjectured to be n^3). Our main tools, dispersion of random determinants and dispersion of the length of a random point from a convex body, are of independent interest and applicable more generally; in particular, the latter is closely related to the variance hypothesis from convex geometry. This geometric dispersion also leads to lower bounds for matrix problems and property testing. This is joint work with Luis Rademacher. ©2006 Microsoft Corporation. All rights reserved.

“Microsoft Research Video 104292: Dispersion Of Mass And The Complexity Of Randomized Algorithms” Metadata:

  • Title: ➤  Microsoft Research Video 104292: Dispersion Of Mass And The Complexity Of Randomized Algorithms
  • Author:
  • Language: English

“Microsoft Research Video 104292: Dispersion Of Mass And The Complexity Of Randomized Algorithms” Subjects and Themes:

Edition Identifiers:

Downloads Information:

The book is available for download in "movies" format, the size of the file-s is: 726.49 Mbs, the file-s for this book were downloaded 51 times, the file-s went public at Thu May 01 2014.

Available formats:
Animated GIF - Archive BitTorrent - Item Tile - Metadata - Ogg Video - Thumbnail - Windows Media - h.264 -

Related Links:

Online Marketplaces

Find Microsoft Research Video 104292: Dispersion Of Mass And The Complexity Of Randomized Algorithms at online marketplaces:


50Cooperative Cognitive Networks: Optimal, Distributed And Low-Complexity Algorithms

By

This paper considers the cooperation between a cognitive system and a primary system where multiple cognitive base stations (CBSs) relay the primary user's (PU) signals in exchange for more opportunity to transmit their own signals. The CBSs use amplify-and-forward (AF) relaying and coordinated beamforming to relay the primary signals and transmit their own signals. The objective is to minimize the overall transmit power of the CBSs given the rate requirements of the PU and the cognitive users (CUs). We show that the relaying matrices have unit rank and perform two functions: Matched filter receive beamforming and transmit beamforming. We then develop two efficient algorithms to find the optimal solution. The first one has linear convergence rate and is suitable for distributed implementation, while the second one enjoys superlinear convergence but requires centralized processing. Further, we derive the beamforming vectors for the linear conventional zero-forcing (CZF) and prior zero-forcing (PZF) schemes, which provide much simpler solutions. Simulation results demonstrate the improvement in terms of outage performance due to the cooperation between the primary and cognitive systems.

“Cooperative Cognitive Networks: Optimal, Distributed And Low-Complexity Algorithms” Metadata:

  • Title: ➤  Cooperative Cognitive Networks: Optimal, Distributed And Low-Complexity Algorithms
  • Authors:
  • Language: English

Edition Identifiers:

Downloads Information:

The book is available for download in "texts" format, the size of the file-s is: 12.86 Mbs, the file-s for this book were downloaded 84 times, the file-s went public at Sat Sep 21 2013.

Available formats:
Abbyy GZ - Animated GIF - Archive BitTorrent - DjVu - DjVuTXT - Djvu XML - Item Tile - Metadata - Scandata - Single Page Processed JP2 ZIP - Text PDF -

Related Links:

Online Marketplaces

Find Cooperative Cognitive Networks: Optimal, Distributed And Low-Complexity Algorithms at online marketplaces:


Buy “Algorithms And Complexity” online:

Shop for “Algorithms And Complexity” on popular online marketplaces.