
Books by Panos M. Pardalos

  • by Panos M. Pardalos, Antonios Fytopoulos & Rohit Ramachandran
    338,95 kr.

  • by Panos M. Pardalos, Francesco Archetti, Ilias S. Kotsireas, et al.
    878,95 kr.

  • by Panos M. Pardalos, Stamatina Th. Rassia & Arsenios Tsokas
    1.208,95 kr.

  • by Panos M. Pardalos, Alexander Panchenko, Olessia Koltsova, et al.
    658,95 kr.

  • by Panos M. Pardalos & Themistocles M. Rassias
    2.288,95 kr.

  • by Ashkan Nikeghbali
    753,95 - 823,95 kr.

    This volume presents extensive research devoted to a broad spectrum of mathematics, with emphasis on interdisciplinary aspects of Optimization and Probability. Chapters also emphasize applications to Data Science, a timely field with a high impact in our modern society. The discussion presents modern, state-of-the-art research results and advances in areas including non-convex optimization, decentralized distributed convex optimization, surrogate-based reduced-dimension global optimization in process systems engineering, the projection of a point onto a convex set, optimal sampling for learning sparse approximations in high dimensions, the split feasibility problem, higher order embeddings, codifferentials and quasidifferentials of the expectation of nonsmooth random integrands, adjoint circuit chains associated with a random walk, analysis of the trade-off between sample size and precision in truncated ordinary least squares, spatial deep learning, efficient location-based tracking for IoT devices using compressive sensing and machine learning techniques, and nonsmooth mathematical programs with vanishing constraints in Banach spaces. The book is a valuable source for graduate students as well as researchers working on Optimization, Probability and their various interconnections with a variety of other areas. Chapter 12 is available open access under a Creative Commons Attribution 4.0 International License via link.springer.com.

  • by Panos M. Pardalos, Michael N. Vrahatis & Varvara Rasskazova
    1.210,95 kr.

  • by Panos M. Pardalos & J. Ben Rosen
    455,95 kr.

  • - An Object-Oriented and UML Approach
    by Panos M. Pardalos & Petraq Papajorgji
    1.137,95 kr.

    Software Engineering Techniques Applied to Agricultural Systems presents cutting-edge software engineering techniques for designing and implementing better agricultural software systems based on the object-oriented paradigm and the Unified Modeling Language (UML). The book is divided into two parts: the first part presents concepts of the object-oriented paradigm and the UML notation for these concepts, and the second part provides a number of examples of applications that use the material presented in the first part. The examples illustrate the techniques discussed, focusing on how to construct better models using objects and UML diagrams. More advanced concepts, such as distributed systems, and examples of how to build such systems are presented in the last chapter of the book. The book presents a step-by-step approach for modeling agricultural systems, starting with a conceptual diagram representing elements of the system and their relationships. Furthermore, diagrams such as sequence and collaboration diagrams are used to explain the dynamic and static aspects of the software system.

  • by Sergiy Butenko
    1.308,95 kr.

    A cooperative system is a collection of dynamical objects which communicate and cooperate in order to achieve a common or shared objective. The cooperation of entities is achieved through communication, either explicitly by message passing or implicitly via observation of other entities' states. As in natural systems, cooperation may assume a hierarchical form, and the control processes may be distributed or decentralized. Due to the dynamic nature of the individual entities and the interaction between them, the problems associated with cooperative systems typically involve many uncertainties. Moreover, in many cases cooperative systems are required to operate in a noisy or hazardous environment, which creates special challenges for designing the control process. During the last decades, considerable progress has been made in all aspects of the study of cooperative systems, including modeling of cooperative systems, resource allocation, discrete event driven dynamical control, continuous and hybrid dynamical control, and the theory of the interaction of information, control, and hierarchy. Solution methods have been proposed using control and optimization approaches, emergent rule based techniques, and game theoretic and team theoretic approaches. Measures of performance have been suggested that include the effects of hierarchies and information structures on solutions, performance bounds, concepts of convergence and stability, and problem complexity. These and other topics were discussed at the Second Annual Conference on Cooperative Control and Optimization in Gainesville, Florida. Refereed papers written by selected conference participants are gathered in this volume, which presents problem models, theoretical results, and algorithms for various aspects of cooperative control. Audience: the book is addressed to faculty, graduate students, and researchers in optimization and control, computer science, and engineering.

  • by Panos M. Pardalos & H. Edwin Romeijn
    1.734,95 kr.

  • by Panos M. Pardalos, Ding-Zhu Du & Weili Wu
    1.308,95 kr.

  • by Panos M. Pardalos & Christodoulos A. Floudas
    1.308,95 kr.

  • by Panos M. Pardalos
    1.308,95 kr.

    The technique of randomization has been employed to solve numerous problems of computing, both sequentially and in parallel. Examples of randomized algorithms that are asymptotically better than their deterministic counterparts in solving various fundamental problems abound. Randomized algorithms have the advantages of simplicity and better performance, both in theory and often in practice. This book is a collection of articles written by renowned experts in the area of randomized parallel computing. A brief introduction to randomized algorithms: in the analysis of algorithms, at least three different measures of performance can be used: the best case, the worst case, and the average case. Often, the average case run time of an algorithm is much smaller than the worst case. For instance, the worst case run time of Hoare's quicksort is O(n²), whereas its average case run time is only O(n log n). The average case analysis is conducted with an assumption on the input space. The assumption made to arrive at the O(n log n) average run time for quicksort is that each input permutation is equally likely. Clearly, any average case analysis is only as good as the validity of the assumption made on the input space. Randomized algorithms achieve superior performance without making any assumptions on the inputs, by making coin flips within the algorithm (see the sketch below). Any analysis of randomized algorithms is therefore valid for all possible inputs.
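    As an illustrative aside (not taken from the book), a minimal randomized quicksort sketch in Python: the pivot is drawn uniformly at random, so the expected O(n log n) running time holds for every input rather than relying on an assumed input distribution.

        import random

        def randomized_quicksort(items):
            # Quicksort with a uniformly random pivot: the "coin flip" happens
            # inside the algorithm, so no assumption is made about the input order.
            if len(items) <= 1:
                return items
            pivot = random.choice(items)
            smaller = [x for x in items if x < pivot]
            equal = [x for x in items if x == pivot]
            larger = [x for x in items if x > pivot]
            return randomized_quicksort(smaller) + equal + randomized_quicksort(larger)

        print(randomized_quicksort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]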

  • by Panos M. Pardalos, Mahdi Fathi & Marzieh Khakifirooz
    1.052,95 kr.

  • by James Abello
    6.513,95 kr.

    The proliferation of massive data sets brings with it a series of special computational challenges. This "data avalanche" arises in a wide range of scientific and commercial applications. With advances in computer and information technologies, many of these challenges are beginning to be addressed by diverse interdisciplinary groups that include computer scientists, mathematicians, statisticians and engineers, working in close cooperation with application domain experts. High profile applications include astrophysics, biotechnology, demographics, finance, geographical information systems, government, medicine, telecommunications, the environment and the internet. John R. Tucker of the Board on Mathematical Sciences has stated: "My interest in this problem (Massive Data Sets) is that I see it as the most important cross-cutting problem for the mathematical sciences in practical problem solving for the next decade, because it is so pervasive." The Handbook of Massive Data Sets is comprised of articles written by experts on selected topics that deal with some major aspect of massive data sets. It contains chapters on information retrieval both on the internet and in the traditional sense, web crawlers, massive graphs, string processing, data compression, clustering methods, wavelets, optimization, external memory algorithms and data structures, the US national cluster project, high performance computing, data warehouses, data cubes, semi-structured data, data squashing, data quality, billing in the large, fraud detection, and data processing in astrophysics, air pollution, biomolecular data, earth observation and the environment.

  • by Panos M. Pardalos & Themistocles M. Rassias
    881,95 kr.

  • - Models, Algorithms, Diagnostics, and Therapeutic Applications
    by Panos M. Pardalos
    1.308,95 kr.

    Advances in the fields of signal processing, nonlinear dynamics, statistics, and optimization theory, combined with marked improvements in instrumentation and the development of computer systems, have made it possible to apply the power of mathematics to the task of understanding the human brain. This veritable revolution has already resulted in the widespread availability of high resolution neuroimaging devices in clinical as well as research settings. Breakthroughs in functional imaging are not far behind. Mathematical techniques developed for the study of complex nonlinear systems and chaos are already being used to explore the complex nonlinear dynamics of human brain physiology. Global optimization is being applied to data mining expeditions in an effort to find knowledge in the vast amount of information being generated by neuroimaging and neurophysiological investigations. These breakthroughs in the ability to obtain, store and analyze large datasets offer, for the first time, exciting opportunities to explore the mechanisms underlying normal brain function as well as the effects of diseases such as epilepsy, sleep disorders, movement disorders, and cognitive disorders that affect millions of people every year. Application of these powerful tools to the study of the human brain requires, by necessity, collaboration among scientists, engineers, neurobiologists and clinicians. Each discipline brings to the table unique knowledge, unique approaches to problem solving, and a unique language.

  • by Panos M. Pardalos, Panagiotis D. Panagiotopoulos & R. P. Gilbert
    1.308,95 kr.

  • - Honoring the Memory of C. Caratheodory (1873-1950)
    by Nicolas Hadjisavvas
    1.308,95 kr.

    There has been much recent progress in global optimization algorithms for nonconvex continuous and discrete problems, from both a theoretical and a practical perspective. Convex analysis plays a fundamental role in the analysis and development of global optimization algorithms. This is due essentially to the fact that virtually all nonconvex optimization problems can be described using differences of convex functions and differences of convex sets. A conference on Convex Analysis and Global Optimization was held during June 5-9, 2000 at Pythagorion, Samos, Greece. The conference honored the memory of C. Caratheodory (1873-1950) and was endorsed by the Mathematical Programming Society (MPS) and by the Society for Industrial and Applied Mathematics (SIAM) Activity Group in Optimization. The conference was sponsored by the European Union (through the EPEAEK program), the Department of Mathematics of the Aegean University, the Center for Applied Optimization of the University of Florida, the General Secretariat of Research and Technology of Greece, the Ministry of Education of Greece, and several local Greek government agencies and companies. This volume contains a selective collection of refereed papers based on invited and contributed talks presented at this conference. The two themes of convexity and global optimization pervade this book. The conference provided a forum for researchers working on different aspects of convexity and global optimization to present their recent discoveries and to interact with people working on complementary aspects of mathematical programming.

  • by Boris I. Goldengorin
    455,95 kr.

    Data Correcting Approaches in Combinatorial Optimization focuses on algorithmic applications of the well known polynomially solvable special cases of computationally intractable problems. The purpose of this text is to design practically efficient algorithms for solving wide classes of combinatorial optimization problems. Researchers, students and engineers will benefit from new bounds and branching rules in developing efficient branch-and-bound type computational algorithms. This book examines applications for solving the Traveling Salesman Problem and its variations, the Maximum Weight Independent Set Problem, different classes of allocation and cluster analysis problems, as well as some classes of scheduling problems. Data Correcting Algorithms in Combinatorial Optimization introduces the data correcting approach to algorithms, which provides an answer to the following questions: how to construct a bound for the original intractable problem, and on which element of the corrected instance one should branch so that the total size of the search tree is minimized. The computing time needed for solving intractable problems can then be adjusted to the requirements of solving real world problems.

  • - Computational Methods and Applications
    by Christodoulos A. Floudas
    1.734,95 kr.

    Optimization problems abound in most fields of science, engineering, and technology. In many of these problems it is necessary to compute the global optimum (or a good approximation) of a multivariable function. The variables that define the function to be optimized can be continuous and/or discrete and, in addition, often must satisfy certain constraints. Global optimization problems belong to the complexity class of NP-hard problems. Such problems are very difficult to solve. Traditional descent optimization algorithms based on local information are not adequate for solving these problems. In most cases of practical interest the number of local optima increases, on average, exponentially with the size of the problem (number of variables). Furthermore, most of the traditional approaches fail to escape from a local optimum in order to continue the search for the global solution. Global optimization has received a lot of attention in the past ten years, due to the success of new algorithms for solving large classes of problems from diverse areas such as engineering design and control, computational chemistry and biology, structural optimization, computer science, operations research, and economics. This book contains refereed invited papers presented at the conference on "State of the Art in Global Optimization: Computational Methods and Applications" held at Princeton University, April 28-30, 1995. The conference presented current research on global optimization and related applications in science and engineering. The papers included in this book cover a wide spectrum of approaches for solving global optimization problems and applications.

  • by A. Migdalas
    1.734,95 kr.

    Researchers working with nonlinear programming often claim "the world is nonlinear," indicating that real applications require nonlinear modeling. The same is true for other areas such as multi-objective programming (there are always several goals in a real application), stochastic programming (all data is uncertain and therefore stochastic models should be used), and so forth. In this spirit we claim: the world is multilevel. In many decision processes there is a hierarchy of decision makers, and decisions are made at different levels in this hierarchy. One way to handle such hierarchies is to focus on one level and include other levels' behaviors as assumptions. Multilevel programming is the research area that focuses on the whole hierarchy structure. In terms of modeling, the constraint domain associated with a multilevel programming problem is implicitly determined by a series of optimization problems which must be solved in a predetermined sequence. If only two levels are considered, we have one leader (associated with the upper level) and one follower (associated with the lower level), as in the formulation sketched below.
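    As an illustrative aside (the notation is not taken from the book), a generic two-level leader/follower program can be sketched in LaTeX, with hypothetical symbols F and G for the leader's objective and constraints and f and g for the follower's:

        \begin{aligned}
        \min_{x}\quad & F\bigl(x, y^{*}(x)\bigr) \\
        \text{s.t.}\quad & G\bigl(x, y^{*}(x)\bigr) \le 0, \\
        & y^{*}(x) \in \arg\min_{y} \bigl\{\, f(x, y) : g(x, y) \le 0 \,\bigr\},
        \end{aligned}

    so the leader's feasible region is defined implicitly by the follower's optimization problem, which is solved once the leader has fixed x.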

  • by Sergiy Butenko
    881,95 kr.

    Over the past several years, cooperative control and optimization has unquestionably been established as one of the most important areas of research in the military sciences. Even so, cooperative control and optimization transcends the military in its scope, having become quite relevant to a broad class of systems with many exciting commercial applications. One reason for all the excitement is that research has been so incredibly diverse, spanning many scientific and engineering disciplines. This latest volume in the Cooperative Systems book series clearly illustrates this trend towards diversity and creative thought. And no wonder: cooperative systems are among the hardest systems control science has endeavored to study, hence creative approaches to modeling, analysis, and synthesis are a must. The definition of cooperation itself is a slippery issue. As you will see in this and previous volumes, cooperation has been cast into many different roles and therefore has assumed many diverse meanings. Perhaps the most we can say to unite these disparate concepts is that cooperation (1) requires more than one entity, (2) the entities must have some dynamic behavior that influences the decision space, (3) the entities share at least one common objective, and (4) the entities are able to share information about themselves and their environment. Optimization and control have long been active fields of research in engineering.

  • by Renato De Leone
    1.308,95 kr.

    This book contains a selection of papers presented at the conference on High Performance Software for Nonlinear Optimization (HPSNO97), which was held in Ischia, Italy, in June 1997. The rapid progress of computer technologies, including new parallel architectures, has stimulated a large amount of research devoted to building software environments and defining algorithms able to fully exploit this new computational power. In some sense, numerical analysis has to conform itself to the new tools. The impact of parallel computing in nonlinear optimization, which had a slow start at the beginning, now seems to increase at a fast rate, and it is reasonable to expect an even greater acceleration in the future. As with the first HPSNO conference, the goal of the HPSNO97 conference was to supply a broad overview of the more recent developments and trends in nonlinear optimization, emphasizing the algorithmic and high performance software aspects. Bringing together new computational methodologies with theoretical advances and new computer technologies is an exciting challenge that involves all scientists willing to develop high performance numerical software. This book contains several important contributions from different and complementary standpoints. Obviously, the articles in the book do not cover all the areas of the conference topic or all the most recent developments, because of the large number of new theoretical and computational ideas of the last few years.

  • by Panos M. Pardalos, Sanguthevar Rajasekaran, Jose Rolim, et al.
    455,95 kr.

  • - State of the Art
    by William W. Hager
    1.734,95 kr.

    On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U.S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics, with an emphasis on algorithm development, real world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, the Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored the attendance of thirteen graduate students from universities in the United States and abroad. Accurate modeling of scientific problems often leads to the formulation of large-scale optimization problems involving thousands of continuous and/or discrete variables. Large scale optimization has seen a dramatic increase in activity in the past decade. This has been a natural consequence of new algorithmic developments and of the increased power of computers. For example, decomposition ideas proposed by G. Dantzig and P. Wolfe in the 1960's are now implementable in distributed processing systems, and today many optimization codes have been implemented on parallel machines.

  • by Panos M. Pardalos, Nguyen Van Thoai & R. Horst
    1.308,95 kr.

  • by Panos M. Pardalos & Christodoulos A. Floudas
    455,95 kr.

  • by Petros Xanthopoulos, Panos M. Pardalos & Theodore B. Trafalis
    455,95 kr.
