Books published by now publishers Inc

  • by Arjun Chaudhuri
    1.023,95 kr.

    The rapid growth in big data from mobile, Internet of Things (IoT), and edge devices, and the continued demand for higher computing power, have established deep learning as the cornerstone of most artificial intelligence (AI) applications today. Recent years have seen a push towards deep learning implemented on domain-specific AI accelerators that support custom memory hierarchies, variable precision, and optimized matrix multiplication. Commercial AI accelerators have shown superior energy and footprint efficiency compared to GPUs for a variety of inference tasks. This monograph discusses the roadblocks that need to be understood and analyzed to ensure functional robustness in emerging AI accelerators. It presents state-of-the-art practices adopted for structural and functional testing of the accelerators, as well as methodologies for assessing the functional criticality of hardware faults in AI accelerators, which reduce test time by targeting the functionally critical faults. The monograph highlights recent research efforts to improve the test and reliability of neuromorphic computing systems built using non-volatile memory (NVM) devices such as spin-transfer-torque MRAM (STT-MRAM) and resistive RAM (ReRAM). Also covered are the robustness of silicon-photonic neural networks and the reliability concerns arising from manufacturing defects and process variations in monolithic 3D (M3D) based near-memory computing systems.

  • by Anamika Dubey
    1.063,95 kr.

    Recent fire-related damages and fatalities caused by high-voltage transmission lines coupled with dry weather are costing billions of dollars annually, with the only practical solution being to de-energize the lines and disrupt the power supply to millions of customers. The recent advances in the distribution grid, including the integration of distributed generation (DG), distributed energy resources (DERs), and microgrids, provide potential means to improve the grid's operational resilience. An advanced decision-support system is needed to plan and manage grid operations by proactively managing the grid's variable, uncertain, and distributed resources. Consequently, resilient operational solutions for power distribution grids have drawn significant attention. These applications range from leveraging recent advances in smart grid technology, such as remote control capabilities and DER integration, to enabling advanced grid services such as frequency and voltage support for the bulk grid and resilient operations through intentional islanding to support critical services during disruptions. This monograph provides a much-needed primer on optimization methods used in active power distribution systems for advanced operations, with the goal of benefiting researchers working in this field. Graduate students and young researchers working in the area of DERs and distribution system operations need a background not only in topics related to power distribution engineering but also in a wide variety of interdisciplinary subjects to address the upcoming challenges. The monograph will benefit a diverse pool of researchers and industry practitioners by building the necessary background on modeling distribution systems (with DERs) and on system optimization methods for provisioning grid services.

  • by Luca Carlone
    1.063,95 kr.

    Geometric perception is the problem of estimating unknown geometric models such as poses, rotations, and 3D structure from sensor data, such as camera images, lidar scans, inertial data, and wheel odometry. Geometric perception has been at the center stage of robotics and computer vision research since their inception. Outlier-robust estimation is a fundamental problem and has been extensively investigated by statisticians and practitioners. The last few years have seen a convergence across research fields towards "algorithmic robust statistics", which focuses on developing tractable outlier-robust techniques for high-dimensional estimation problems. Despite this convergence, research efforts across fields have been mostly disconnected from one another. This monograph bridges recent work on certifiable outlier-robust estimation for geometric perception in robotics and computer vision with parallel work in robust statistics. In particular, recent results on robust linear regression and list-decodable regression are adapted and extended to the setup commonly found in robotics and vision, where (i) variables belong to a non-convex domain, (ii) measurements are vector-valued, and (iii) the number of outliers is not known a priori. The emphasis here is on performance guarantees: rather than proposing radically new algorithms, conditions are provided on the input measurements under which modern estimation algorithms are guaranteed to recover an estimate close to the ground truth in the presence of outliers. These conditions are what we call an "estimation contract". The monograph also provides numerical experiments to shed light on the applicability of the theoretical results and to showcase the potential of list-decodable regression algorithms in geometric perception. Besides the proposed extensions of existing results, the main contributions of this monograph are (i) to unify parallel research lines by pointing out commonalities and differences, (ii) to introduce advanced material (e.g., sum-of-squares proofs) in an accessible and self-contained presentation for the practitioner, and (iii) to point out a few immediate opportunities and open questions in outlier-robust geometric perception.
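
    As one concrete instance of the outlier-robust formulations surveyed here (a representative form from this line of work, not necessarily the exact one adopted in the monograph), truncated least squares (TLS) replaces the quadratic cost with a saturated one, so that any single outlier can contribute at most a constant:

    $$ \min_{x \in \mathcal{X}} \sum_{i=1}^{N} \min\left( \frac{r_i(x)^2}{\beta_i^2},\ \bar{c}^2 \right) $$

    Here $r_i(x)$ is the residual of measurement $i$, $\beta_i$ is a noise bound, and $\bar{c}$ is the saturation threshold: inliers are treated as in least squares, while outliers are capped and cannot dominate the estimate.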

  • by Omer Bulakçi
    1.343,95 kr.

    While the 5th Generation (5G) system is being widely deployed across the globe, the information and communication technology (ICT) industry's research, standardization, and consensus building for the 6th Generation (6G) are already well underway, with high expectations towards the merger of digital, physical, and human worlds. The main goal of this book is to introduce the upcoming 6G technologies and outline the foreseen challenges, enablers, and architectural design trends that will be instrumental in realizing a Sustainable and Trustworthy 6G system in the coming years. The envisioned 6G system promises to offer a more advanced and comprehensive user experience not only by achieving higher speeds, larger capacity, and lower latency, but also much improved reliability, greater energy efficiency, and an enhanced security and privacy-preserving framework, while natively integrating intelligence end-to-end (E2E). Achieving these goals will require innovative technological solutions and a holistic system design that considers the needs of various stakeholders and future 6G use cases. Capitalizing on the European 5G Public-Private Partnership (5G PPP) Phase 3 projects working on 5G & Beyond and 6G research in recent years, and the joint efforts between the Architecture Working Group (WG) and the 6G flagship Hexa-X project, this book delves into the critical challenges and enablers of the 6G system, including new network architectures and novel enhancements as well as the role of regulators, network operators, industry players, application developers, and end-users. Accordingly, this book provides a comprehensive landscape of the current research activities on 6G in Europe and sets a solid cornerstone for 6G development towards a more connected, intelligent, and sustainable world. Furthermore, as 5G PPP Phase 3 consists of the last calls of the Horizon 2020 program, this book aims to lay the architectural foundation for the next European program towards 6G, i.e., the Smart Networks & Services (SNS) Joint Undertaking (JU).

  • by Haifeng Luo
    863,95 kr.

    Sixth-generation (6G) wireless communication networks will transform the connected things of 5G into connected intelligence. The networks can have human-like cognition capabilities by enabling many potential services, such as high-accuracy localization and tracking, augmented human senses, gesture and activity recognition, etc. For this purpose, many emerging applications in 6G have stringent requirements on transmission throughput and latency. With the explosion of devices in the connected intelligence world, spectrum utilization has to be enhanced to meet these stringent requirements. In-band full-duplex has been reported as a promising technique to enhance spectral efficiency and reduce end-to-end latency. However, simultaneous transmission and reception over the same frequency introduce additional interference compared to conventional half-duplex radios. The receiver is exposed to the transmitter of the same node operating in in-band full-duplex mode, causing self-interference. Due to the significant power difference between self-interference and the signal of interest, self-interference must be effectively suppressed to benefit from in-band full-duplex operation. In addition to self-interference, uplink users will interfere with downlink users within range, which is known as co-channel interference. This interference could be significant in cellular networks, so it has to be appropriately processed to maximize the in-band full-duplex gain. The objective of this monograph is to present a timely overview of self-interference cancellation techniques and discuss the challenges and possible solutions to implement effective self-interference cancellation in 6G networks. Also included is beamforming to manage the complex interference and maximize the in-band full-duplex gain in cellular networks. Furthermore, a deep insight into the benefits of in-band full-duplex operations on various emerging applications is presented, such as integrated access and backhaul networks, integrated sensing and communications, and physical layer security.
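
    A minimal sketch of the digital self-interference cancellation idea described above, under toy assumptions (known transmitted samples, a short unknown SI channel estimated by least squares; all names and values are illustrative, not taken from the monograph):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, taps = 1000, 4
    tx = rng.standard_normal(n)                     # known transmitted samples
    h_true = np.array([0.9, -0.3, 0.1, 0.05])       # unknown SI channel (toy)
    si = np.convolve(tx, h_true)[:n]                # self-interference at the receiver
    soi = 0.01 * rng.standard_normal(n)             # weak signal of interest
    rx = si + soi                                   # received signal

    # Regressor matrix of delayed copies of tx (one column per channel tap).
    X = np.column_stack([np.concatenate([np.zeros(d), tx[:n - d]]) for d in range(taps)])
    h_hat, *_ = np.linalg.lstsq(X, rx, rcond=None)  # least-squares SI channel estimate
    residual_si = si - X @ h_hat                    # SI remaining after cancellation
    print("SI suppression: %.1f dB" % (10 * np.log10(np.mean(si**2) / np.mean(residual_si**2))))
    ```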

  • by Lin Zhou
    1.063,95 kr.

    In modern 5G networks and beyond, finite blocklength lossy source coding is essential to provide ultra-reliable and low-latency communications. The analysis of point-to-point and multiterminal settings from the perspective of finite blocklength lossy source coding is therefore of great interest to 5G system designers and is also related to other long-standing problems in information theory. In this monograph, the authors provide a comprehensive review of recent advances in second-order asymptotics for lossy source coding. The monograph is divided into three parts. In Part I, the reader is introduced to the topic with a description of the mathematical tools used and a basic example of lossless source coding. In Part II, the authors present five generalizations of the rate-distortion problem to tackle various aspects of practical quantization tasks: noisy source, noisy channel, mismatched code, Gauss-Markov source and fixed-to-variable length compression. In Part III, the authors present four multi-terminal generalizations of the rate-distortion problem to consider multiple encoders, decoders or source sequences: the Kaspi problem, the successive refinement problem, the Fu-Yeung problem and the Gray-Wyner problem. This self-contained, complete review of a topic at the heart of 5G technology is essential reading for all students, researchers and designers of modern day communication systems.
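
    For orientation, the classical object that these second-order results refine is the rate-distortion function, and the standard second-order (Gaussian) approximation of the minimum rate at blocklength $n$ and excess-distortion probability $\varepsilon$ takes the generic form (precise conditions are in the monograph):

    $$ R(D) = \min_{P_{\hat{X}|X}:\ \mathbb{E}[d(X,\hat{X})] \le D} I(X;\hat{X}), \qquad R(n,\varepsilon,D) \approx R(D) + \sqrt{\frac{V(D)}{n}}\, Q^{-1}(\varepsilon), $$

    where $V(D)$ is the rate-dispersion function and $Q^{-1}$ is the inverse of the Gaussian complementary CDF.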

  • by Jeremy Bertomeu
    1.063,95 kr.

    A Primer on Structural Estimation in Accounting Research provides an introduction for researchers interested in incorporating structural models into their analysis. The monograph is designed for researchers with little or no prior knowledge of structural models, with the objective of making barriers to entry into this literature no greater than in other theoretical or empirical areas. The emphasis is on adequate use of the methods in applied work. While most examples are drawn from accounting research, many of the methods are applicable more generally to other related areas such as finance, marketing, and economics. This primer is divided into eight sections, which are interconnected but can also be read individually. Section 1 presents two simple guided examples of structural estimation exercises where the logic of the main tools can be absorbed with minimal formalism. Section 2 presents a step-by-step approach to structural estimation, generalizing the methods applied in the two examples. Section 3 discusses the econometric methods in more detail for readers interested in applying statistical concepts and widely used mathematical formulas for estimators and their standard errors. Section 4 discusses special topics required in estimation approaches using dynamic models, including dynamic programming. Sections 5, 6 and 7 discuss contemporary advances in the context of principal-agent theory, disclosure theory, and earnings management, respectively. Section 8 concludes the monograph.

  • by Sebastian Bruch
    968,95 kr.

    Information retrieval researchers develop algorithmic solutions to hard problems and insist on a proper, multifaceted evaluation of ideas. As we move towards ever more complex deep learning models in a wide range of applications, questions of efficiency once again resurface with renewed urgency. Efficiency is no longer limited to time and space but has found new, challenging dimensions that stretch to resource, sample, and energy efficiency, with ramifications for researchers, users, and the environment. This monograph takes a step towards promoting the study of efficiency in the era of neural information retrieval by offering a comprehensive survey of the literature on efficiency and effectiveness in ranking and retrieval. It is inspired by the parallels between the challenges of neural network-based ranking solutions and those of their predecessors, decision forest-based learning-to-rank models, and by the connections between the solutions the literature to date has to offer. By understanding the fundamentals underpinning these algorithmic and data structure solutions, one can better identify future directions and more efficiently determine the merits of ideas.

  • by Gaoqing Zhang
    1.063,95 kr.

    Models of Accounting Disclosure by Banking Institutions examines an emerging stream of accounting literature that deploys economic models to study issues of accounting disclosure by banking institutions. Motivating the focus on a specific industry are two banking specificities: banks are vulnerable to the risk of runs, and banks are heavily regulated. The author shows that, considering these banking specificities, accounting disclosure by banks can play a prominent role in influencing the stability and the efficiency of the banking system. Workhorse models are presented that can be adapted as building blocks to capture the roles of accounting disclosure in the banking industry. Recent studies illustrating specific accounting applications of the workhorse models are presented and discussed with respect to their potential to generate implications that inform policy debates and empirical tests.

  • by Zoya Bylinskii
    593,95 kr.

    Most research in computer graphics and image synthesis produces outputs for human consumption. In many cases, these algorithms operate largely automatically; in other cases, interactive tools allow professionals or everyday users to author or edit images, video, textures, geometry, or animation. Online crowdsourcing platforms have made it increasingly easy to perform evaluations of algorithm outputs with survey questions like "which image is better, A or B?", leading to their proliferation in vision and graphics research papers. Results of these studies are often used as quantitative evidence in support of a paper's contributions. When conducted hastily as an afterthought, such studies can lead to uninformative and potentially misleading conclusions. On the other hand, in these same communities, user research is underutilized in driving project direction and forecasting user needs and reception. Increased attention is needed in both the design and reporting of user studies in computer vision and graphics papers towards (1) improved replicability and (2) improved project direction. This monograph focuses on these aspects and also presents an overview of methodologies from user experience research (UXR), human-computer interaction (HCI), and applied perception to increase exposure to the available methodologies and best practices. Foundational user research methods (e.g., need finding) that are presently underutilized in computer vision and graphics research but can provide valuable project direction are included. Also given are further pointers to the literature for readers interested in exploring other UXR methodologies, along with broader open issues and recommendations for the research community.

  • by Paul G. A. Jespers
    593,95 kr.

    Mainstream textbooks explain how electronic circuits work, but cover very little on how to conceive them. That is the aim of this monograph: to show readers how they can determine the currents, channel lengths, and widths of CMOS circuits so as to optimally satisfy design specifications. The idea underlying the methodology described in this monograph is the use of a set of lookup tables embodying device data extracted from systematic runs of an advanced circuit simulator, the same one used for final design verification. In this way, all parameters put to use during the sizing procedure incorporate not only the effects of bias conditions and geometry, but also every second-order effect present in the simulator's model, in particular short-channel effects. Consequently, the number of verification simulations one has to perform is substantially reduced, and the designer may concentrate on actual design strategies without being bothered by inconsistencies caused by poor models or inappropriate parameters. This monograph will be of use to engineers and researchers who work on the design of electronic circuits and systems.
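
    A minimal sketch of the lookup-table-driven sizing flow described above, with made-up table values standing in for simulator-extracted device data (the real methodology uses dense, multi-dimensional tables):

    ```python
    import numpy as np

    # Toy gm/ID-style table: current density ID/W versus gm/ID (illustrative values only).
    gm_id_grid = np.array([5.0, 10.0, 15.0, 20.0, 25.0])   # gm/ID in 1/V
    id_w_grid = np.array([8e-5, 2e-5, 5e-6, 1e-6, 2e-7])   # ID/W in A/um

    def size_transistor(gm_target, gm_id_target):
        """Pick bias current and width for a target gm at a chosen gm/ID."""
        id_w = np.interp(gm_id_target, gm_id_grid, id_w_grid)  # interpolated current density
        i_d = gm_target / gm_id_target                          # required bias current
        w = i_d / id_w                                          # required width
        return i_d, w

    i_d, w = size_transistor(gm_target=1e-3, gm_id_target=15.0)
    print(f"ID = {i_d * 1e6:.1f} uA, W = {w:.1f} um")
    ```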

  • by Yibo Yang
    808,95 kr.

    The goal of data compression is to reduce the number of bits needed to represent useful information. Neural, or learned, compression is the application of neural networks and related machine learning techniques to this task. This monograph aims to serve as an entry point for machine learning researchers interested in compression by reviewing the prerequisite background and representative methods in neural compression. Recent advances in statistical machine learning have opened up new possibilities for data compression, allowing compression algorithms to be learned end-to-end from data using powerful generative models such as normalizing flows, variational autoencoders, diffusion probabilistic models, and generative adversarial networks. The monograph introduces this field of research to a broader machine learning audience by reviewing the necessary background in information theory (e.g., entropy coding, rate-distortion theory) and computer vision (e.g., image quality assessment, perceptual metrics), and by providing a curated guide through the essential ideas and methods in the literature thus far. Instead of surveying the vast literature, essential concepts and methods in neural compression are covered, with a reader in mind who is versed in machine learning but not necessarily data compression.
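
    The objective at the heart of most of the learned codecs reviewed here can be written generically as the rate-distortion Lagrangian

    $$ \mathcal{L} = \mathbb{E}\big[d(x,\hat{x})\big] + \lambda\, \mathbb{E}\big[-\log_2 p_{\hat{z}}(\hat{z})\big], $$

    where $\hat{z}$ is the quantized latent representation, $p_{\hat{z}}$ is a learned entropy model whose negative log-likelihood approximates the bit rate, and $\lambda$ trades off rate against reconstruction quality.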

  • by Amritanshu Pandey
    543,95 kr.

    This monograph, written in a tutorial style, focuses on electromagnetic transient (EMT) simulation. Such simulations are becoming increasingly common, with significant active research, due to the introduction of inverter-based resources on the grid. A step-by-step tutorial is provided on performing an electromagnetic transient simulation for simple networks from first principles, and the concepts covered range from circuit simulation and Newton-Raphson iteration to numerical integration with difference methods. This monograph should help graduate and senior undergraduate students, and working professionals, learn the inner workings of tools that perform EMT simulations. The tutorial uses a circuit-theory based approach to EMT simulation, and is introductory in that it shows the workings through several examples, expecting the reader to thereafter build further expertise through more detailed research. Nonetheless, the monograph should provide an excellent first step for anyone interested in the subject of electromagnetic transient simulation, especially those interested in building their own tools.
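
    A minimal sketch of the kind of first-principles step such a tutorial builds on: trapezoidal-rule integration of a series RC circuit charging from a step source (toy values; the closed-form update comes from solving the implicit trapezoidal step of dv/dt = (Vs - v)/(RC)):

    ```python
    import numpy as np

    R, C, Vs = 1e3, 1e-6, 1.0          # 1 kOhm, 1 uF, 1 V step (toy values)
    dt, T = 1e-5, 5e-3                 # time step and total simulated time
    steps = int(T / dt)
    v = np.zeros(steps + 1)            # capacitor voltage, v[0] = 0
    a = dt / (2 * R * C)
    for k in range(steps):
        # Trapezoidal rule on dv/dt = (Vs - v)/(RC), solved for v[k+1].
        v[k + 1] = ((1 - a) * v[k] + 2 * a * Vs) / (1 + a)
    print(f"v(T) = {v[-1]:.4f} V")     # approaches Vs for T >> RC (~0.993 V here)
    ```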

  • by Peng Zhang
    1.128,95 kr.

    The introduction of Quantum Theory (QT) provides a unified mathematical framework for Information Retrieval (IR). Compared with the classical IR framework, the quantum-inspired IR framework is based on user-centered modeling methods to model non-classical cognitive phenomena in human relevance judgment in the IR process. With the increase of data and computing resources, neural IR methods have been applied to the text matching and understanding tasks of IR. Neural networks have a strong ability to learn effective representations and to generalize matching patterns from raw data. This monograph provides a systematic introduction to quantum-inspired neural IR, including quantum-inspired neural language representation, matching and understanding. The cross-field research on QT, neural networks and IR is not only helpful for modeling non-classical phenomena in IR but also for breaking the theoretical bottleneck of neural networks and designing more transparent neural IR models. The authors first introduce the language representation method based on QT. Second, they introduce quantum-inspired text matching and decision-making models in the neural network setting, which show theoretical advantages in document ranking, relevance matching, and multimodal IR, and can be integrated with neural networks to jointly promote the development of IR. Finally, the latest progress in quantum language understanding is introduced, and further topics on QT and language modeling give readers more material for thought.

  • by Pierre Jinghong Liang
    928,95 kr.

    Bookkeeping Graphs: Computational Theory and Applications first describes the graph or network representation of Double-Entry bookkeeping both in theory and in practice. The representation serves as the intellectual basis for a series of applied computational works on pattern recognition and anomaly detection in corporate journal-entry audit settings. The second part of the monograph reviews the computational theory of pattern recognition and anomaly detection built on the Minimum Description Length (MDL) principle. The main part of the monograph describes how the computational MDL theory is applied to recognize patterns and detect anomalous transactions in graphs representing the journal entries of a large set of transactions extracted from real-world corporate entities' bookkeeping data.

  • by Pascal Seguin
    858,95 kr.

    Since the birth of the first industrial robot in the early 1960s, robotics has often replaced humans for tedious and repetitive tasks in the industrial world. To meet these challenges, industrial robots have needed to become specialized: they have been designed according to the task that needs to be performed. In the early 1980s, the ambition to equip robots with robotic hands with universal capabilities led to the development of robotic grasping research. The emergence of more agile industry and of collaborative robotics requires the development of a new generation of grippers: more versatile, with not only adaptive grasping capabilities but also dexterous manipulation capabilities. The development of flexible multi-fingered grippers with both adaptive grasping and in-hand manipulation capabilities remains a complex issue for human-like dexterous manipulation. After four decades of research in dexterous manipulation, many robotic hands have been developed. The development of these hands nevertheless remains a key challenge, as the dexterity of robot hands is far from human capabilities. The aim of this monograph is to show, through the evolution of robotics from industrial and manufacturing robotics to service and collaborative robotics, the evolution of the grasping function: from industrial grippers to dexterous robot hands, and the challenges inherent today in new robotic applications in open, dynamic environments where humans are likely to be present.

  • by Changho Suh
    1.568,95 kr.

    Information theory deals with mathematical laws that govern the flow, representation and transmission of information, just as the field of physics concerns laws that govern the behavior of the physical universe. The foundation was laid in the context of communication, by characterizing the fundamental limits of communication and offering codes (sometimes called algorithms) to achieve them. The most significant achievement of the field is the invention of digital communication, which forms the basis of our daily-life digital products such as smartphones, laptops and IoT devices. Recently it has also found important roles in a spotlight field that has been revolutionized during the past decades: data science. This book aims at demonstrating modern roles of information theory in a widening array of data science applications. The first and second parts of the book cover the core concepts of information theory: basic concepts of several key notions; and the celebrated source and channel coding theorems, which concern the fundamental limits of communication. The last part focuses on applications that arise in data science, including social networks, ranking, and machine learning. The book is written as a text for senior undergraduate and graduate students working on information theory and communications, and it should also prove to be a valuable reference for professionals and engineers in these fields.
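
    The two fundamental limits referred to above can be stated compactly: a memoryless source cannot be compressed below its entropy $H(X)$ bits per symbol, and a channel supports reliable communication at any rate below its capacity $C$:

    $$ H(X) = -\sum_{x} p(x) \log_2 p(x), \qquad C = \max_{p(x)} I(X;Y). $$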

  • by Anastasios N. Angelopoulos
    928,95 kr.

    Black-box machine learning models are now routinely used in high-risk settings, like medical diagnostics, which demand uncertainty quantification to avoid consequential model failures. Conformal prediction is a user-friendly paradigm for creating statistically rigorous uncertainty sets/intervals for the predictions of such models. One can use conformal prediction with any pre-trained model, such as a neural network, to produce sets that are guaranteed to contain the ground truth with a user-specified probability, such as 90%. It is easy to understand, easy to use, and applies naturally to problems arising in computer vision, natural language processing, and deep reinforcement learning, among other fields. In this hands-on introduction the authors provide the reader with a working understanding of conformal prediction and related distribution-free uncertainty quantification techniques. They lead the reader through practical theory and examples of conformal prediction and describe its extensions to complex machine learning tasks involving structured outputs, distribution shift, time series, outliers, models that abstain, and more. Throughout, there are many explanatory illustrations, examples, and code samples in Python. With each code sample comes a Jupyter notebook implementing the method on a real-data example. This hands-on tutorial, full of practical and accessible examples, is essential reading for all students, practitioners and researchers working on systems that deploy machine learning techniques.
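
    A minimal split conformal regression sketch in the spirit of the book's Python examples (toy data and a stand-in predictor; the quantile uses the standard (n+1) finite-sample correction):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    alpha = 0.1                                    # target miscoverage: 90% coverage

    # Toy data y = 2x + noise; `predict` stands in for any pretrained model.
    x_cal, x_test = rng.uniform(0, 1, 500), rng.uniform(0, 1, 200)
    y_cal = 2 * x_cal + 0.1 * rng.standard_normal(500)
    y_test = 2 * x_test + 0.1 * rng.standard_normal(200)
    predict = lambda x: 2 * x

    # Conformal scores on held-out calibration data: absolute residuals.
    scores = np.abs(y_cal - predict(x_cal))
    n = len(scores)
    qhat = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

    # Interval [f(x) - qhat, f(x) + qhat] attains >= 90% coverage on average.
    covered = np.abs(y_test - predict(x_test)) <= qhat
    print(f"empirical coverage: {covered.mean():.2f} (target {1 - alpha:.2f})")
    ```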

  • by Faisal Nawab
    1.098,95 kr.

    Consensus is the problem of making distributed nodes reach agreement. It is a basic building block that can be used in more complex distributed data management systems while retaining correctness guarantees of the state of the data and its recovery. Solving the intricacies of distributed coordination, network uncertainties, and failures in such complex data management problems is a daunting challenge. This has led many systems designers to utilize consensus as a tool to build more complex distributed protocols. Consensus has thus influenced data management systems and research for many decades. This monograph provides a foundation for the reader to understand the use of consensus protocols in data management systems and aims to empower data management researchers and practitioners to pursue work that utilizes and innovates on consensus for their data management applications. It presents the foundations of consensus and of consensus in data management by pointing out work that has been influential or representative of the data management areas the authors explore. They start with an introduction to the principles of consensus and then present background on the use of consensus in data management. They show how consensus is used for the distributed atomic commit problem and how it is used in replication protocols where data copies are distributed across different nodes. They further expand the scope of the crash-tolerant commit protocols to handle arbitrary failures by exploring the seminal fault-tolerant consensus protocol known as Practical Byzantine Fault Tolerance (Pbft). For each data management problem, the authors present a basic solution and highlight the shortcomings that invite the utilization of consensus. They then demonstrate the integration of consensus to overcome these shortcomings and provide the desired design features, giving examples of each type of integration as well as an analysis of the integration and its implications. The monograph concludes with a summary and a discussion of future directions.
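
    A toy contrast between the two decision rules discussed here, highly simplified (real protocols such as Paxos or Pbft involve multiple message rounds, leaders, and failure handling):

    ```python
    def two_phase_commit(votes):
        """Atomic commit (2PC): commit only if every participant votes yes,
        so one failed or negative participant aborts (or blocks) the decision."""
        return all(votes)

    def majority_consensus(votes):
        """Quorum-based consensus: a majority suffices to decide,
        so progress tolerates a minority of failed or slow nodes."""
        return sum(votes) > len(votes) / 2

    print(two_phase_commit([True, True, False]))    # False: a single 'no' aborts
    print(majority_consensus([True, True, False]))  # True: majority decides
    ```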

  • by Xingchao Jian
    783,95 kr.

    In modern data analysis, massive measurements from a network require novel signal processing techniques, which are expected to be adapted to the network topology, have distributed implementations, and be flexible enough for various applications. Graph signal processing (GSP) theories and techniques are geared towards these goals. GSP has seen rapid developments in recent years; since its introduction around ten years ago, numerous new ideas and practical applications related to the field have emerged. In this monograph, an overview of recent advances in generalizing GSP is presented, with a focus on the extension to high-dimensional spaces, models, and structures. Alongside new frameworks proposed to tackle such problems, many new mathematical tools are introduced. In the first part of the monograph, traditional GSP is reviewed, the challenges that it faces are highlighted, and efforts to overcome such challenges are motivated. These efforts then become the theme for the rest of the publication. Included are the generalization of GSP to high-dimensional vertex signal spaces, the theory of random shift operators and wide-sense stationary (WSS) statistical signal models, and the treatment of high dimensionality in graph structures and generalized graph-like structures. The monograph concludes with an outline of possible future directions.
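
    A minimal sketch of the classical GSP pipeline that such generalizations build on: form a graph Laplacian, use its eigendecomposition as the graph Fourier basis, and filter a signal in the graph frequency domain (toy 4-node path graph):

    ```python
    import numpy as np

    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)   # adjacency of a 4-node path graph
    L = np.diag(A.sum(axis=1)) - A              # combinatorial graph Laplacian
    lam, U = np.linalg.eigh(L)                  # graph frequencies and Fourier basis

    x = np.array([1.0, 2.0, 3.0, 4.0])          # a graph signal
    x_hat = U.T @ x                             # graph Fourier transform
    x_lp = U @ (x_hat * (lam < 1.0))            # crude low-pass graph filter
    print(np.round(x_hat, 3), np.round(x_lp, 3))
    ```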

  • by Albert N. Link
    723,95 kr.

    The U.S. Small Business Technology Transfer (STTR) Program: An Assessment and an Evaluation of the Program is intended to expand on the National Academies' report and to set the stage for more in-depth future studies of the STTR program by offering a systematic analytical overview of the program and tying that overview to a qualitative/quantitative assessment and evaluation, given the limited data collected by and available from the NRC. In addition, it is an effort to orient readers to a number of nuances of the STTR program that were beyond the scope of the National Academies' report. A secondary purpose is to highlight the economic importance of the STTR program itself. The remainder of this monograph is organized as follows. The legislative background for the STTR program is discussed in Section 2. Section 3 explains program assessments and program evaluations from a conceptual perspective. Section 4 describes the nature of the NRC's dataset used in this study, and based on that dataset a systematic analytical overview of the STTR program is presented. Section 5 presents a qualitative/quantitative assessment of the STTR program, followed by a qualitative/quantitative evaluation of the program in Section 6. Section 7 offers a summary and some concluding observations, along with additional suggestions for future NRC-led studies.

  • by Leonardo Rezende Juracy
    723,95 kr.

    The past decade has witnessed the consolidation of Artificial Intelligence technology, thanks to the popularization of Machine Learning (ML) models. The technological boom of ML models started in 2012, when the world was stunned by the record-breaking classification performance achieved by combining an ML model with a high-performance graphics processing unit (GPU). Since then, ML models have received ever-increasing attention, being applied in different areas such as computer vision, virtual reality, voice assistants, chatbots, and self-driving vehicles. The most popular ML models are brain-inspired models such as Neural Networks (NNs), including Convolutional Neural Networks (CNNs) and, more recently, Deep Neural Networks (DNNs). They resemble the human brain, performing data processing by mimicking synapses using thousands of interconnected neurons in a network. In this growing environment, GPUs have become the de facto reference platform for the training and inference phases of CNNs and DNNs, due to their high processing parallelism and memory bandwidth. However, GPUs are power-hungry architectures. To enable the deployment of CNN and DNN applications on energy-constrained devices (e.g., IoT devices), industry and academic research have moved towards hardware accelerators. Following the evolution of neural networks from CNNs to DNNs, this monograph sheds light on the impact of this architectural shift and discusses hardware accelerator trends in terms of design, exploration, simulation, and frameworks developed in both academia and industry.

  • by Kuize Zhang
    1.098,95 kr.

    Real-world problems are often formulated as diverse properties of different types of dynamical systems. Hence, property verification and synthesis have been long-standing research interests. The supervisory control framework developed in the 1980s provides a closed-loop property enforcement framework for discrete-event systems, which usually consist of discrete states and transitions between states caused by spontaneous occurrences of labeled events. In this comprehensive review, the author develops an open-loop property enforcement framework for discrete-event systems which scales better and can be implemented in more models. The author demonstrates the practicality of this framework using a tool called concurrent composition, and uses this tool to unify multiple inference-based properties and concealment-based properties in discrete-event systems. In the second part, the author introduces a new model called labeled weighted automata over monoids (LWAMs). LWAMs provide a natural generalization of labeled finite-state automata in the sense that each transition therein carries a weight from a monoid, and the weight of a run is the product of the weights of the run's transitions. This book introduces the reader to a new paradigm in discrete event dynamic systems. It provides researchers, students and practitioners with the basic theory and a set of implementable tools that will have a significant impact on systems of the future.

  • by Valentin Kammerlohr
    723,95 kr.

    We encounter trust every day in our lives, but it becomes increasingly important in technology-based transactions, as traditional interpersonal trust factors cannot be applied as usual. As technology becomes more and more ubiquitous in our lives, we need to understand how trust in technology contexts is created, maintained, destroyed, and possibly rebuilt. This knowledge is important for the developers of technology, to create successful use, and for the users of technology, to be aware of the vulnerabilities and potential risks of technology use. This monograph examines the rich history of trust research outside of a technology context to assess existing trust studies in technology contexts and to inform the design and execution of future trust research in technology contexts. Because trust is a very complex construct, the authors first review the term. The rest of the review is organized in the context of personal, professional, and organizational relationships, looking at initial trust and the long-term evolution of trust. An overview of existing technology-based trust studies published in MIS Quarterly, Information Systems Research, and other Information Systems research outlets is provided. Finally, the authors identify where research and practical gaps and opportunities exist for future technology-based trust studies by balancing academic and practical relevance.

  • by Florin Rusu
    1.098,95 kr.

    Multidimensional arrays are one of the fundamental computing abstractions to represent data across virtually all areas of science and engineering, and beyond. Due to their ubiquity, multidimensional arrays have been studied extensively across many areas of computer science. This survey provides a comprehensive guide for past, present, and future research in array data management from a database perspective. Unlike previous surveys that are limited to raster processing in the context of scientific data, this survey considers all types of arrays: rasters, data cubes, and tensors. The author's goal is to identify and analyze the most important research ideas on arrays and to serve two objectives: first, to summarize the most relevant work on multidimensional array data management by identifying the major research problems; and second, to organize this material to provide an accurate perspective on the state-of-the-art and future directions in array processing. Multidimensional Array Data Management covers all data management aspects, from array algebras and query languages to storage strategies, execution techniques, and operator implementations. Moreover, the author discusses which research ideas are adopted in real systems and how they are integrated in complete data processing pipelines. Finally, the author compares arrays with the relational data model. The result is a thorough survey on array data management that is an excellent resource for anyone interested in this topic, independent of experience level.

  • by Nicolas Guigui
    1.098,95 kr.

    As data is a predominant resource in applications, Riemannian geometry is a natural framework to model and unify complex nonlinear sources of data. However, the development of computational tools from the basic theory of Riemannian geometry is laborious. In this monograph the authors present a self-contained exposition of the basic concepts of Riemannian geometry from a computational viewpoint, providing illustrations and examples at each step. They proceed to demonstrate how these concepts are implemented in the open-source project Geomstats, explaining the choices that were made and the conventions chosen. The reader thus learns in one self-contained volume the theory of Riemannian geometry and geometric statistics, and their implementation to perform statistics and machine learning on manifolds. Containing many practical Python examples, this monograph is a valuable resource both for mathematicians and applied scientists to learn the theory of Riemannian geometry and its use in practice, implemented with the Geomstats package, where most of the difficulties are hidden under high-level functions.
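
    A minimal NumPy sketch of one primitive the monograph develops and that Geomstats wraps behind high-level classes: the Riemannian exponential map on the unit sphere (this is the standard closed form for the sphere; the function name here is ours, not the Geomstats API):

    ```python
    import numpy as np

    def sphere_exp(p, v):
        """Follow the geodesic from unit-norm point p along tangent vector v."""
        norm_v = np.linalg.norm(v)
        if norm_v < 1e-12:
            return p
        return np.cos(norm_v) * p + np.sin(norm_v) * (v / norm_v)

    p = np.array([0.0, 0.0, 1.0])          # north pole of S^2
    v = np.array([np.pi / 2, 0.0, 0.0])    # tangent vector at p
    print(sphere_exp(p, v))                # lands on the equator: [1, 0, 0]
    ```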

  • by Emanuela Carbonara
    893,95 kr.

    The Impact of Constitutional Protection of Economic Rights on Entrepreneurship: A Taxonomic Survey highlights how characteristics of the legal infrastructure of a country can create conditions that enhance new business creation. It is an exploration of the institutional determinants of entrepreneurship and the way these can affect the observed cross-country differences in the creation of new firms. The main aim is to analyze 195 constitutions, singling out the provisions that enhance economic freedom and are thus likely to create an institutional and legal setup favorable to new business creation. The study tries to answer a question of primary importance for the analysis of entrepreneurship. Does the constitutional protection of principles and values usually associated with a country's endowment of entrepreneurship capital and presence of small firms positively influence the rate of new firm formation and the total endowment of entrepreneurship capital in that country? The remainder of this monograph is structured as follows. Section 2 discusses the importance of institutions in shaping the entrepreneurship capital of a country and favoring the emergence of entrepreneurial ecosystems. Section 3 describes how higher-rank formal institutions represented by actual constitutional provisions affect the design of lower rank norms and regulations of primary importance for economic activity. Section 4 outlines the features of 'economic constitutions', i.e. of constitutional provisions playing a key role in the management of a country's economy. Section 5 gives an overview of alternative measures of entrepreneurship and discusses their implications for empirical analysis. Section 6 focuses on the countries that have adopted the principles of 'economic constitutions' in their written constitutions and discusses the impact of the de jure and de facto implementation of such principles on entrepreneurship. Section 7 shows how the prevailing psychological traits of a country's population may shape the impact of constitutional provisions on its proneness to entrepreneurship and sheds light on the relationship between constitutional provisions and the observed cross-country and cross-industry differences in labor productivity. Finally, Section 8 concludes and provides some recommendations for entrepreneurship policy.

  • by Wayes Tushar
    818,95 kr.

    Globally, the demand for electricity is increasing significantly and thus there is a need for more power generation. As a result, environmental pollution from burning fossil fuels is accelerating climate change at an unprecedented pace, as evidenced by recent extreme weather events across the world. As such, several paradigm shifts in power and energy systems are happening to protect the environment, societies, and economies against climate change. As the world plans for a future with low carbon emissions, today's power system is transitioning from its traditional hierarchical structure to a more decentralized framework through innovative energy management techniques, such as peer-to-peer (P2P) sharing. Due to the potential benefits that P2P sharing can offer to electricity prosumers, consumers, and the grid, research, development, and pilot trials of P2P are advancing rapidly. To capture these developments in this emerging energy management paradigm, presented in this monograph is a comprehensive review of various features of P2P sharing. To do so, first introduced are the network and market structures that are required to facilitate P2P sharing within a local community. Thereafter, a comprehensive overview of the various challenges of P2P energy-sharing mechanisms at both the virtual and physical layers is provided, followed by a discussion of technical approaches used in the literature to address these challenges. Third, some emerging technological innovations that will be relevant to, and important for, the development of P2P sharing in future markets are introduced and discussed. Fourth, a summary of existing pilot P2P projects is provided, followed by a summary of potential future research directions and a conclusion. Thus, by providing a holistic view of challenges and contributions to both the virtual and physical layers of P2P energy systems, simultaneously and in a structured way, this monograph delivers a comprehensive understanding of the core challenges that hinder the integration of P2P sharing in the current market model.

  • by Lisha Chen
    1.113,95 kr.

    Deep learning has achieved remarkable success in many machine learning tasks such as image classification, speech recognition, and game playing. However, these breakthroughs are often difficult to translate into real-world engineering systems because deep learning models require a massive number of training samples, which are costly to obtain in practice. To address labeled data scarcity, few-shot meta-learning optimizes learning algorithms that can adapt efficiently to new tasks. While meta-learning is gaining significant interest in the machine learning literature, its working principles and theoretical fundamentals are not as well understood in the engineering community. This review monograph provides an introduction to meta-learning covering principles, algorithms, theory, and engineering applications. After introducing meta-learning in comparison with conventional and joint learning, the main meta-learning algorithms are described, as well as a general bilevel optimization framework for the definition of meta-learning techniques. Then, known results on the generalization capabilities of meta-learning from a statistical learning viewpoint are summarized. Applications to communication systems, including decoding and power allocation, are discussed next, followed by an introduction to aspects related to the integration of meta-learning with emerging computing technologies, namely neuromorphic and quantum computing. The monograph concludes with an overview of open research challenges.
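
    The bilevel structure mentioned above can be written generically: the outer level optimizes shared meta-parameters $\phi$ across tasks $\tau$, while the inner level adapts task-specific parameters on each task's training data:

    $$ \min_{\phi} \sum_{\tau} \mathcal{L}^{\mathrm{val}}_{\tau}\big(\theta^{*}_{\tau}(\phi)\big) \quad \text{s.t.} \quad \theta^{*}_{\tau}(\phi) = \arg\min_{\theta} \mathcal{L}^{\mathrm{tr}}_{\tau}(\theta; \phi). $$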

  • by Lingfei Wu
    1.098,95 kr.

    Deep learning has become the dominant approach in addressing various tasks in Natural Language Processing (NLP). Although text inputs are typically represented as sequences of tokens, there is a rich variety of NLP problems that can be best expressed with a graph structure. As a result, there is a surge of interest in developing new deep learning techniques on graphs for a large number of NLP tasks. In this monograph, the authors present a comprehensive overview of Graph Neural Networks (GNNs) for Natural Language Processing. They propose a new taxonomy of GNNs for NLP, which systematically organizes existing research along three axes: graph construction, graph representation learning, and graph-based encoder-decoder models. They further introduce a large number of NLP applications that exploit the power of GNNs and summarize the corresponding benchmark datasets, evaluation metrics, and open-source code. Finally, they discuss various outstanding challenges for making full use of GNNs for NLP as well as future research directions. This is the first comprehensive overview of Graph Neural Networks for Natural Language Processing. It provides students and researchers with a concise and accessible resource to quickly get up to speed with an important area of machine learning research.
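
    A minimal NumPy sketch of the basic graph representation learning step underlying the surveyed models: one graph convolution layer with symmetric normalization (graph, features, and weights are random toy values):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    A = np.array([[0, 1, 1],
                  [1, 0, 0],
                  [1, 0, 0]], dtype=float)    # toy 3-node graph (e.g., a dependency tree)
    A_hat = A + np.eye(3)                     # add self-loops
    d = A_hat.sum(axis=1)
    A_norm = A_hat / np.sqrt(np.outer(d, d))  # D^{-1/2} (A + I) D^{-1/2}
    H = rng.standard_normal((3, 4))           # node features (e.g., word embeddings)
    W = rng.standard_normal((4, 2))           # learnable layer weights
    H_next = np.maximum(A_norm @ H @ W, 0.0)  # one message-passing layer with ReLU
    print(H_next.shape)                       # (3, 2)
    ```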
