Item type | Current location | Call number | Status | Date due | Barcode |
---|---|---|---|---|---|
Books | | 519.544 CYG (Browse shelf) | Checked out | 16/12/2024 | 032012 |
Browse shelf:

- 519.542 SIV Data analysis : a Bayesian tutorial
- 519.542 WIL Bayesian nets and causality : philosophical and computational foundations
- 519.5420285 KOL Probabilistic graphical models : principles and techniques
- 519.544 CYG Parameterized algorithms
- 519.544 DOK Pathwise estimation and inference for diffusion market models
- 519.544 HIL Methods of statistical model estimation
- 519.544 KAI Linear estimation
Includes bibliographical references and indexes.
This comprehensive textbook presents a clean and coherent account of the most fundamental tools and techniques in parameterized algorithms and is a self-contained guide to the area. The book covers many of the field's recent developments, including the application of important separators, branching based on linear programming, Cut & Count to obtain faster algorithms on tree decompositions, algorithms based on representative families of matroids, and the use of the Strong Exponential Time Hypothesis. A number of older results are revisited and explained in a modern and didactic way.

The book provides a toolbox of algorithmic techniques. Part I is an overview of basic techniques, with each chapter discussing a particular algorithmic paradigm; the material in this part can be used for an introductory course on fixed-parameter tractability. Part II discusses more advanced and specialized algorithmic ideas, bringing the reader to the cutting edge of current research. Part III presents complexity results and lower bounds, giving negative evidence by way of W[1]-hardness, the Exponential Time Hypothesis, and kernelization lower bounds.

All results and concepts are introduced at a level accessible to graduate students and advanced undergraduates. Every chapter is accompanied by exercises, many with hints, while the bibliographic notes point to original publications and related work.
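To give a flavor of fixed-parameter tractability, the sketch below shows the classic bounded-search-tree algorithm for k-Vertex Cover, a standard introductory example of the branching paradigm (the function name and representation are our own illustration, not code from the book). The running time is O(2^k · m): exponential only in the parameter k, not in the graph size.

```python
def has_vertex_cover(edges, k):
    """Decide whether the graph given by `edges` has a vertex cover of size <= k.

    Branching idea: any uncovered edge (u, v) forces u or v into the cover,
    so try both choices, each time spending one unit of the budget k.
    The search tree has depth at most k, hence at most 2^k leaves.
    """
    if not edges:
        return True   # no edges left: the vertices chosen so far cover everything
    if k == 0:
        return False  # edges remain but the budget is exhausted
    u, v = edges[0]
    # Branch 1: put u in the cover and delete all edges incident to u.
    if has_vertex_cover([e for e in edges if u not in e], k - 1):
        return True
    # Branch 2: put v in the cover instead.
    return has_vertex_cover([e for e in edges if v not in e], k - 1)


# Example: a triangle needs two vertices to cover all three edges.
triangle = [(1, 2), (2, 3), (1, 3)]
print(has_vertex_cover(triangle, 1))  # False
print(has_vertex_cover(triangle, 2))  # True
```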