
Books in The Springer International Series in Engineering and Computer Science

  • - A Data Mining Perspective
    by Huan Liu
    2.154,95 kr.

    There is broad interest in feature extraction, construction, and selection among practitioners from statistics, pattern recognition, and data mining to machine learning. Data preprocessing is an essential step in the knowledge discovery process for real-world applications. This book compiles contributions from many leading and active researchers in this growing field and paints a picture of the state-of-the-art techniques that can boost the capabilities of many existing data mining tools. The objective of this collection is to increase the awareness of the data mining community of research on feature extraction, construction and selection, which is currently conducted mainly in isolation. This book is part of our endeavor to produce a contemporary overview of modern solutions, to create synergy among these seemingly different branches, and to pave the way for developing meta-systems and novel approaches. Even with today's advanced computer technologies, discovering knowledge from data can still be fiendishly hard due to the characteristics of computer-generated data. Feature extraction, construction and selection are a set of techniques that transform and simplify data so as to make data mining tasks easier. Feature construction and selection can be viewed as two sides of the representation problem.
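
A loose illustration of the filter-style selection described above (an invented sketch, not an example from the book): construct one extra feature, rank all features by absolute correlation with the target, and keep the top two. The synthetic data, the product feature and the use of correlation as the scoring criterion are all assumptions made for the example.

```python
import numpy as np

# Toy data: 100 samples, 5 raw features; only the first two carry signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.normal(size=100)

# Feature construction: add a product feature (a new representation).
X_constructed = np.column_stack([X, X[:, 0] * X[:, 1]])

# Feature selection: rank features by |correlation with target|, keep the top k.
def select_top_k(X, y, k):
    scores = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    keep = np.sort(np.argsort(scores)[::-1][:k])
    return keep, scores

keep, scores = select_top_k(X_constructed, y, k=2)
print("correlation scores:", np.round(scores, 2))
print("selected feature indices:", keep)   # expect the two informative features
```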

  • by Jonathan Schaeffer
    1.682,95 kr.

    Major advances in computing are occurring at an ever-increasing pace. This is especially so in the area of high performance computing (HPC), where today's supercomputer is tomorrow's workstation. High Performance Computing Systems and Applications is a record of HPCS'98, the 12th annual Symposium on High Performance Computing Systems and Applications. The quality of the conference was significantly enhanced by the high proportion of keynote and invited speakers. This book presents the latest research in HPC architecture, networking, applications and tools. Of special note are the sections on computational biology and physics. High Performance Computing Systems and Applications is suitable as a secondary text for a graduate-level course on computer architecture and networking, and as a reference for researchers and practitioners in industry.

  • by Laurence Tianruo Yang
    1.662,95 kr.

    Parallel Numerical Computations with Applications contains selected edited papers presented at the 1998 Frontiers of Parallel Numerical Computations and Applications Workshop, along with invited papers from leading researchers around the world. These papers cover a broad spectrum of topics on parallel numerical computation with applications, such as advanced parallel numerical and computational optimization methods, novel parallel computing techniques, numerical fluid mechanics, and other applications related to material sciences, signal and image processing, semiconductor technology, and electronic circuits and systems design. This state-of-the-art volume will be an up-to-date resource for researchers in the areas of parallel and distributed computing.

  • by Dimiter R. Avresky
    2.158,95 kr.

    Dependable Network Computing provides insights into various problems facing millions of global users resulting from the 'Internet revolution'. It covers real-time problems involving software, servers, and large-scale storage systems with adaptive fault-tolerant routing and dynamic reconfiguration techniques. Also included is material on routing protocols, QoS, and deadlock- and livelock-freedom issues. All chapters are written by leading specialists in their respective fields. Dependable Network Computing provides useful information for scientists, researchers, and application developers building networks from commercial off-the-shelf components.

  • - Frameworks, Middleware and Environments
    by Elias N. Houstis
    1.114,95 kr.

    Enabling Technologies for Computational Science assesses future application computing needs, identifies research directions in problem-solving environments (PSEs), addresses multi-disciplinary environments operating on the Web, proposes methodologies and software architectures for building adaptive and human-centered PSEs, and describes the role of symbolic computing in scientific and engineering PSEs. The book also includes an extensive bibliography of over 400 references. Enabling Technologies for Computational Science illustrates the extremely broad and interdisciplinary nature of the creation and application of PSEs. Authors represent academia, government laboratories and industry, and come from eight distinct disciplines (chemical engineering, computer science, ecology, electrical engineering, mathematics, mechanical engineering, psychology and wood sciences). This breadth and diversity extends into the computer science aspects of PSEs. These papers deal with topics such as artificial intelligence, computer-human interaction, control, data mining, graphics, language design and implementation, networking, numerical analysis, performance evaluation, and symbolic computing. Enabling Technologies for Computational Science provides an assessment of the state of the art and a road map to the future in the area of problem-solving environments for scientific computing. This book is suitable as a reference for scientists from a variety of disciplines interested in using PSEs for their research.

  • - Analysis and Control
    by R. Boel
    1.111,95 kr.

    Discrete Event Systems: Analysis and Control is the proceedings of WODES2000 (the 5th Workshop on Discrete Event Systems, held in Ghent, Belgium, on August 21-23, 2000). This book provides a survey of the current state of the art in the field of modeling, analysis and control synthesis of discrete event systems, lecture notes for a mini course on sensitivity analysis for performance evaluation of timed discrete event systems, and 48 carefully selected papers covering all areas of discrete event theory and the most important application domains. Topics include automata theory and supervisory control (12 papers); Petri net based models for discrete event systems, and their control synthesis (11); (max,+) and timed automata models (9); applications related to scheduling, failure detection, and implementation of supervisory controllers (7); formal description of PLCs (6); and finally, stochastic models of discrete event systems (3).
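
To make the "(max,+)" modelling mentioned above concrete, here is a minimal sketch (not drawn from the proceedings) of the max-plus recursion x(k+1) = A ⊗ x(k) that underlies timed event graph models; the 2x2 delay matrix is invented for illustration.

```python
import numpy as np

NEG_INF = -np.inf  # the max-plus "zero": no dependency between the two events

def maxplus_matvec(A, x):
    """Max-plus product: (A ⊗ x)_i = max_j (A_ij + x_j)."""
    return np.max(A + x[np.newaxis, :], axis=1)

# A_ij = time that must elapse after the k-th firing of event j
# before event i can fire for the (k+1)-th time.
A = np.array([[3.0, NEG_INF],
              [5.0, 2.0]])

x = np.array([0.0, 0.0])          # firing times of the first occurrence
for k in range(1, 5):
    x = maxplus_matvec(A, x)      # x(k) = A ⊗ x(k-1)
    print(f"occurrence {k}: firing times {x}")
# The firing times grow by 3 per cycle, the max-plus eigenvalue of A,
# i.e. the cycle time (throughput bottleneck) of this timed event graph.
```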

  • by Michael Lautenschlager
    1.659,95 kr.

    Climate and Environmental Database Systems contains the papers presented at the Second International Workshop on Climate and Environmental Database Systems, held November 21-23, 1995, in Hamburg, Germany. Climate and environmental data may be separated into two classes: large amounts of well structured data and smaller amounts of less structured data. The large amounts are produced by numerical climate models and by satellites, with data volumes on the order of 100 Tbytes at climate modelling sites and 1000 Tbytes for the recording and processing of satellite data. The smaller amounts of poorly structured data are the environmental data, which come mainly from observations and measurements. Present-day problems in data management are connected with this variety of data types. Climate and Environmental Database Systems addresses the state of the art, practical experience, and future perspectives for climate and environmental database systems, and may be used as a text for a graduate level course on the subject or as a reference for researchers or practitioners in industry.

  • by David Touretzky
    1.082,95 kr.

    arise automatically as a result of the recursive structure of the task and the continuous nature of the SRN's state space. Elman also introduces a new graphical technique for studying network behavior based on principal components analysis. He shows that sentences with multiple levels of embedding produce state space trajectories with an intriguing self-similar structure. The development and shape of a recurrent network's state space is the subject of Pollack's paper, the most provocative in this collection. Pollack looks more closely at a connectionist network as a continuous dynamical system. He describes a new type of machine learning phenomenon: induction by phase transition. He then shows that under certain conditions, the state space created by these machines can have a fractal or chaotic structure, with a potentially infinite number of states. This is graphically illustrated using a higher-order recurrent network trained to recognize various regular languages over binary strings. Finally, Pollack suggests that it might be possible to exploit the fractal dynamics of these systems to achieve a generative capacity beyond that of finite-state machines.
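
A rough sketch of the kind of analysis the Elman technique above refers to: collect the hidden-state trajectory of a small recurrent network over binary strings and project it onto its first two principal components. The network here is untrained and randomly initialised (the papers of course analyse trained networks), and every size and parameter below is an assumption made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# A tiny Elman-style network: h(t) = tanh(W h(t-1) + U x(t)).
hidden, inputs = 8, 1
W = rng.normal(scale=0.9, size=(hidden, hidden))
U = rng.normal(size=(hidden, inputs))

def run(bits):
    """Collect the hidden-state trajectory for one binary string."""
    h, states = np.zeros(hidden), []
    for b in bits:
        h = np.tanh(W @ h + U @ np.array([float(b)]))
        states.append(h.copy())
    return np.array(states)

# Hidden states gathered over a batch of random binary strings.
states = np.vstack([run(rng.integers(0, 2, size=20)) for _ in range(50)])

# Principal components analysis via SVD of the centred state matrix.
centred = states - states.mean(axis=0)
_, s, Vt = np.linalg.svd(centred, full_matrices=False)
projected = centred @ Vt[:2].T               # trajectories in the PC1/PC2 plane
explained = (s**2 / np.sum(s**2))[:2]
print("variance explained by first two PCs:", np.round(explained, 2))
print("projected state-space trajectory shape:", projected.shape)
```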

  • by Masaru Tomita
    2.145,95 kr.

  • by Raul Camposano
    2.151,95 kr.

    The time has come for high-level synthesis. When research into synthesizing hardware from abstract, program-like descriptions started in the early 1970s, there was no automated path from the register-transfer design produced by high-level synthesis to a complete hardware implementation. As a result, it was very difficult to measure the effectiveness of high-level synthesis methods; it was also hard to justify to users the need to automate architecture design when low-level design had to be completed manually. Today's more mature CAD techniques help close the gap between an automatically synthesized design and a manufacturable design. Market pressures encourage designers to make use of any and all automated tools. Layout synthesis, logic synthesis, and specialized datapath generators make it feasible to quickly implement a register-transfer design in silicon, leaving designers more time to consider architectural improvements. As IC design becomes more automated, customers are increasing their demands; today's leading edge designers using logic synthesis systems are training themselves to be tomorrow's consumers of high-level synthesis systems. The need for very fast turnaround, a competitive fabrication market which makes small-quantity ASIC manufacturing possible, and the ever-growing complexity of the systems being designed all make higher-level design automation inevitable.

  • by Giovanni L. Sicuranza
    1.658,95 kr.

    A color time-varying image can be described as a three-dimensional vector (representing the colors in an appropriate color space) defined on a three-dimensional spatiotemporal space. In conventional analog television a one-dimensional signal suitable for transmission over a communication channel is obtained by sampling the scene in the vertical and temporal directions and by frequency-multiplexing the luminance and chrominance information. In digital processing and transmission systems, sampling is applied in the horizontal direction, too, on a signal which has already been scanned in the vertical and temporal directions, or directly in three dimensions when using a solid-state sensor. As a consequence, in recent years it has been considered quite natural to assess the potential advantages arising from an entirely multidimensional approach to the processing of video signals. As a simple but significant example, a composite color video signal, such as the conventional PAL or NTSC signal, possesses a three-dimensional spectrum which, by using suitable three-dimensional filters, permits horizontal sampling at a rate lower than that required for correctly sampling the equivalent one-dimensional signal. More recently it has been widely recognized that improving picture quality in current and advanced television systems requires well-chosen signal processing algorithms which are multidimensional in nature, within the demanding constraints of a real-time implementation.

  • by Jean Mermet
    2.136,95 kr.

    The success of VHDL since it was balloted in 1987 as an IEEE standard may look incomprehensible to the large population of hardware designers who had never heard of Hardware Description Languages before (at least 90% of them), as well as to the few hundred specialists who had been working on these languages for a long time (25 years for some of them). Until 1988, only a very small subset of designers, in a few large companies, were accustomed to describing their designs using a proprietary HDL, or sometimes an HDL inherited from a university when some software environment happened to be developed around it, allowing usability by third parties. A number of benefits were clearly recognized in this practice, such as functional verification of a specification through simulation, first performance evaluation of a tentative design, and sometimes automatic microprogram generation or even automatic high-level synthesis. As there was apparently no market for HDLs, the ECAD vendors did not care about them, start-up companies were seldom able to survive in this area, and large users of proprietary tools were devoting more and more people and money just to maintain their internal systems.

  • by Bishnu S. Atal
    2.152,95 kr.

    Speech coding has been an ongoing area of research for several decades, yet the level of activity and interest in this area has expanded dramatically in the last several years. Important advances in algorithmic techniques for speech coding have recently emerged and excellent progress has been achieved in producing high quality speech at bit rates as low as 4.8 kb/s. Although the complexity of the newer more sophisticated algorithms greatly exceeds that of older methods (such as ADPCM), today's powerful programmable signal processor chips allow rapid technology transfer from research to product development and permit many new cost-effective applications of speech coding. In particular, low bit rate voice technology is converging with the needs of the rapidly evolving digital telecommunication networks. The IEEE Workshop on Speech Coding for Telecommunications was held in Vancouver, British Columbia, Canada, from September 5 to 8, 1989. The objective of the workshop was to provide a forum for discussion of recent developments and future directions in speech coding. The workshop attracted over 130 researchers from several countries and its technical program included 51 papers.
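
For a sense of scale behind the bit rates quoted above, a quick back-of-the-envelope comparison against standard 64 kb/s log-PCM telephony (the list of coded rates is illustrative, chosen to bracket the figures mentioned in the blurb):

```python
# 64 kb/s = 8 kHz sampling * 8-bit mu-law/A-law samples (standard digital telephony).
pcm_rate = 64_000                              # bits per second
coded_rates = [16_000, 9_600, 4_800, 2_400]    # bits per second

for rate in coded_rates:
    ratio = pcm_rate / rate
    print(f"{rate / 1000:>4.1f} kb/s -> {ratio:4.1f}x compression, "
          f"{rate / 8:6.0f} bytes per second of speech")
```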

  • by L. Richard Carley
    881,95 kr.

    Computer-Aided Design of Analog Circuits and Systems brings together in one place important contributions and state-of-the-art research results in the rapidly advancing area of computer-aided design of analog circuits and systems. This book serves as an excellent reference, providing insights into some of the most important issues in the field.

  • by Francky Catthoor
    1.664,95 kr.

    Application-Driven Architecture Synthesis describes the state of the art of architectural synthesis for complex real-time processing. In order to deal with the stringent timing requirements and the intricacies of complex real-time signal and data processing, target architecture styles and target application domains have been adopted to make the synthesis approach feasible. These approaches are also heavily application-driven, which is illustrated by many realistic demonstrations, used as examples in the book. The focus is on domains where application-specific solutions are attractive, such as significant parts of audio, telecom, instrumentation, speech, robotics, medical and automotive processing, image and video processing, TV, multi-media, radar, sonar. Application-Driven Architecture Synthesis is of interest to both academics and senior design engineers and CAD managers in industry. It provides an excellent overview of what capabilities to expect from future practical design tools, and includes an extensive bibliography.

  • - A Special Issue of Analog Integrated Circuits and Signal Processing
    by Lawrence P. Huelsman
    1.080,95 kr.

    This book brings together important contributions and state-of-the-art research results in the rapidly advancing area of symbolic analysis of analog circuits. It is also of interest to those working in analog CAD. The book is an excellent reference, providing insights into some of the most important issues in the symbolic analysis of analog circuits.

  • by Bishnu S. Atal
    2.142,95 kr.

    Speech and Audio Coding for Wireless and Network Applications contains 34 chapters, loosely grouped into six topical areas. The chapters in this volume reflect the progress and present the state of the art in low-bit-rate speech coding, primarily at bit rates from 2.4 kbit/s to 16 kbit/s. Together they represent important contributions from leading researchers in the speech coding community. Speech and Audio Coding for Wireless and Network Applications contains contributions describing technologies that are under consideration as standards for such applications as digital cellular communications (the half-rate American and European coding standards). A brief Introduction is followed by a section dedicated to low-delay speech coding, a research direction which emerged as a result of the CCITT requirement for a universal low-delay 16 kbit/s speech coding technology and now continues with the objective of achieving toll quality with moderate delay at a rate of 8 kbit/s. A section on the important topic of speech quality evaluation is then presented. This is followed by a section on speech coding for wireless transmission, and a section on audio coding which covers not only 7 kHz bandwidth speech, but also wideband coding applicable to high fidelity music. The book concludes with a section on speech coding for noisy transmission channels, followed by a section addressing future research directions. Speech and Audio Coding for Wireless and Network Applications presents a cross-section of the key contributions in speech and audio coding which have emerged recently. For this reason, the book is a valuable reference for all researchers and graduate students in the speech coding community.

  • - A Special Issue of MACHINE LEARNING
    by Ryszard S. Michalski
    2.131,95 kr.

    Most machine learning research has been concerned with the development of systems that implement one type of inference within a single representational paradigm. Such systems, which can be called monostrategy learning systems, include those for empirical induction of decision trees or rules, explanation-based generalization, neural net learning from examples, genetic algorithm-based learning, and others. Monostrategy learning systems can be very effective and useful if the learning problems to which they are applied are sufficiently narrowly defined. Many real-world applications, however, pose learning problems that go beyond the capability of monostrategy learning methods. In view of this, recent years have witnessed a growing interest in developing multistrategy systems, which integrate two or more inference types and/or paradigms within one learning system. Such multistrategy systems take advantage of the complementarity of different inference types or representational mechanisms. Therefore, they have the potential to be more versatile and more powerful than monostrategy systems. On the other hand, due to their greater complexity, their development is significantly more difficult and represents a great new challenge to the machine learning community. Multistrategy Learning contains contributions characteristic of the current research in this area.

  • - Technology and Protocols
    by Ahmed N. Tantawy
    1.662,95 kr.

    In the last few years, the world of information networks has undergone significant changes that will revolutionize the future of communications. Data rates have reached the gigabit per second range. Optical fibers have become the transmission medium of choice. Standardization activities have very aggressively produced a set of well-established standards for future LANs, MANs and WANs. It has become very difficult for computer and communications professionals to follow these rapidly evolving technologies and standards. High Performance Networks: Technology and Protocols provides a timely technical overview of the state-of-the-art in high performance networking. Chapters cover lightweight protocols, high performance protocol implementation techniques, high speed MAC protocols, optical networks, as well as emerging standards, including ATM, SMDS, B-ISDN, SONET, FCS and HIPPI. Professionals, engineers, and researchers in communications and computers, who need to understand the underlying technologies of high performance (gigabit) networks, will find this volume to be an invaluable reference. The book is also suitable for use as a text for advanced courses on the subject.

  • by Osamu Wada
    2.160,95 kr.

    As we approach the end of the present century, the elementary particles of light (photons) are seen to be competing increasingly with the elementary particles of charge (electrons/holes) in the task of transmitting and processing the insatiable amounts of information needed by society. The massive enhancements in electronic signal processing that have taken place since the discovery of the transistor elegantly demonstrate how we have learned to make use of the strong interactions that exist between assemblages of electrons and holes, disposed in suitably designed geometries, and replicated on an increasingly fine scale. On the other hand, photons interact extremely weakly amongst themselves, and all-photonic active circuit elements, where photons control photons, are presently very difficult to realise, particularly in small volumes. Fortunately, rapid developments in the design and understanding of semiconductor injection lasers, coupled with newly recognized quantum phenomena that arise when device dimensions become comparable with electronic wavelengths, have clearly demonstrated how efficient and fast the interaction between electrons and photons can be. This latter situation has therefore provided a strong incentive to devise and study monolithic integrated circuits which involve both electrons and photons in their operation. As Chapter 1 notes, it is barely fifteen years since the first simple optoelectronic integrated circuits were demonstrated using III-V compound semiconductors; these combined either a laser/driver or a photodetector/preamplifier combination.

  • by Peter Marwedel
    2.143,95 kr.

    Modern electronics is driven by the explosive growth of digital communications and multi-media technology. A basic challenge is to design first-time-right complex digital systems that meet stringent constraints on performance and power dissipation. In order to combine this growing system complexity with an increasingly short time-to-market, new system design technologies are emerging based on the paradigm of embedded programmable processors. This concept introduces modularity, flexibility and re-use in the electronic system design process. However, its success will critically depend on the availability of efficient and reliable CAD tools to design, programme and verify the functionality of embedded processors. Recently, new research efforts have emerged on the edge between software compilation and hardware synthesis, to develop high-quality code generation tools for embedded processors. Code Generation for Embedded Systems provides a survey of these new developments. Although not limited to these targets, the main emphasis is on code generation for modern DSP processors. Important themes covered by the book include: the scope of general purpose versus application-specific processors, machine code quality for embedded applications, retargetability of the code generation process, machine description formalisms, and code generation methodologies. Code Generation for Embedded Systems is the essential introduction to this fast developing field of research for students, researchers, and practitioners alike.

  • - A Special Issue of Analog Integrated Circuits and Signal Processing: An International Journal, Volume 8, No. 1 (1995)
    by Wouter A. Serdijn
    1.093,95 kr.

    Low-Voltage Low-Power Analog Integrated Circuits brings together in one place important contributions and state-of-the-art research results in this rapidly advancing area. Low-Voltage Low-Power Analog Integrated Circuits serves as an excellent reference, providing insight into some of the most important issues in the field.

  • by Donald Fussell
    2.142,95 kr.

    Responsive Computer Systems: Steps Towards Fault-Tolerant Real-Time Systems provides an extensive treatment of the most important issues in the design of modern Responsive Computer Systems. It lays the groundwork for a more comprehensive model that allows critical design issues to be treated in ways that more traditional disciplines of computer research have inhibited. It breaks important ground in the development of a fruitful, modern perspective on computer systems as they are currently developing and as they may be expected to develop over the next decade. Audience: An interesting and important road map to some of the most important emerging issues in computing, suitable as a secondary text for graduate level courses on responsive computer systems and as a reference for industrial practitioners.

  • by MengChu Zhou
    2.151,95 kr.

    Over the past two decades, research in the theory of Petri nets and the development of graphical tools has yielded a powerful methodology. The contributions in Petri Nets in Flexible and Agile Automation present theoretical development of Petri nets as well as industrial applications to areas such as discrete-event control design, scheduling, performance evaluation and deadlock avoidance. These contributions also include comparative studies of Petri nets and other approaches. A primary theme of this book is to provide a unified approach to the applications of Petri nets in flexible and agile automation and, in that regard, a common notation and terminology is used. The book also allows readers to evaluate the benefits and applicability of state-of-the-art Petri net methods and apply CAD tools to problems of interest. Petri Nets in Flexible and Agile Automation is not only an essential reference for researchers, it is also a very useful tool for engineers, analysts and managers who are responsible for the design, implementation and operation of the next generation of manufacturing systems.
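
For readers new to the formalism, a minimal token-game sketch of a Petri net (an illustration only, using invented place and transition names rather than the book's notation or tools): places hold tokens, a transition is enabled when each of its input places holds enough tokens, and firing it moves tokens from input places to output places.

```python
# Arc weights of a tiny load/unload net.
pre  = {"load":   {"idle": 1, "parts": 1},   # tokens consumed per firing
        "unload": {"busy": 1}}
post = {"load":   {"busy": 1},               # tokens produced per firing
        "unload": {"idle": 1, "done": 1}}

marking = {"idle": 1, "busy": 0, "parts": 3, "done": 0}

def enabled(t, m):
    return all(m[p] >= n for p, n in pre[t].items())

def fire(t, m):
    assert enabled(t, m), f"{t} is not enabled"
    for p, n in pre[t].items():
        m[p] -= n
    for p, n in post[t].items():
        m[p] += n

# Process the three available parts by alternating the two transitions.
for _ in range(3):
    fire("load", marking)
    fire("unload", marking)
print(marking)   # {'idle': 1, 'busy': 0, 'parts': 0, 'done': 3}
```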

  • by Lizy Kurian John
    2.138,95 kr.

    The formal study of program behavior has become an essential ingredient in guiding the design of new computer architectures. Accurate characterization of applications leads to efficient design of high performing architectures. Quantitative and analytical characterization of workloads is important to understand and exploit the interesting features of workloads. This book includes ten chapters on various aspects of workload characterization. File caching characteristics of the industry-standard web-serving benchmark SPECweb99 are presented by Keller et al. in Chapter 1, while value locality of SPECJVM98 benchmarks is characterized by Rychlik et al. in Chapter 2. SPECJVM98 benchmarks are visited again in Chapter 3, where Tao et al. study the operating system activity in Java programs. In Chapter 4, KleinOsowski et al. describe how the SPEC2000 CPU benchmark suite may be adapted for computer architecture research and present the small, representative input data sets they created to reduce simulation time without compromising on accuracy. Their research has been recognized by the Standard Performance Evaluation Corporation (SPEC) and is listed on the official SPEC website, http://www.spec.org/osg/cpu2000/research/umnl. The main contribution of Chapter 5 is the proposal of a new measure called the locality surface to characterize locality of reference in programs. Sorenson et al. describe how a three-dimensional surface can be used to represent both temporal and spatial locality of programs. In Chapter 6, Thornock et al.
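
A much simpler relative of the locality surface mentioned above (the measure of Sorenson et al. is a three-dimensional surface; the sketch below is not it): stride and reuse-distance histograms for a small synthetic address trace, two classical one-dimensional views of spatial and temporal locality. The trace itself is made up.

```python
from collections import Counter

trace = [0, 1, 2, 3, 0, 1, 2, 3, 100, 0, 1, 2, 3, 100]   # hypothetical block addresses

# Spatial locality: histogram of strides between consecutive references.
strides = Counter(b - a for a, b in zip(trace, trace[1:]))

# Temporal locality: reuse distance = number of distinct blocks touched
# since the previous reference to the same block.
def reuse_distances(trace):
    last_seen, dists = {}, []
    for i, addr in enumerate(trace):
        if addr in last_seen:
            dists.append(len(set(trace[last_seen[addr] + 1:i])))
        last_seen[addr] = i
    return Counter(dists)

print("stride histogram:        ", dict(strides))
print("reuse-distance histogram:", dict(reuse_distances(trace)))
```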

  • by Roy Ladner
    1.085,95 kr.

    We are facing a rapidly growing capability to collect more and more data regarding our environment. With that, we must have the ability to extract more insightful knowledge about the environmental processes at work on the earth. Spatio-Temporal Information Systems (STIS) will especially prove beneficial in producing useful knowledge about changes in our world from these ever burgeoning collections of environmental data. STIS provide the ability to store, analyze and represent the dynamic properties of the environment, that is, geographic information in space and time. An STIS, for example, can produce a weather map, but more importantly, it can present a user with information in map or report form indicating how precipitation progresses in space over time to affect a watershed. Other uses include forestry and even electrical systems management. Forestry experts using an STIS are able to examine the rates of movement of forest fires, how they evolve over time, and their impact on forest growth over long periods of time. A large electrical network system manager uses an STIS to track the failures and repairs of electrical transformers. Use of an STIS in this case allows the reconstruction of the status of the network at any given past time. Mining Spatio-Temporal Information Systems, an edited volume, is composed of chapters from leading experts in the field of Spatio-Temporal Information Systems and addresses the many issues in support of modeling, creation, querying, visualizing and mining. Mining Spatio-Temporal Information Systems is intended to bring together a coherent body of recent knowledge relating to STIS data modeling, design, implementation and STIS in knowledge discovery. In particular, the reader is exposed to the latest techniques for the practical design of STIS, essential for complex query processing. Mining Spatio-Temporal Information Systems is structured to meet the needs of practitioners and researchers in industry and graduate-level students in Computer Science.
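
A minimal illustration of the "reconstruct the network status at any past time" use case described above (purely a sketch; the transformer name, dates and schema are invented): keep a time-ordered status history per unit and answer point-in-time queries with a binary search over the valid times.

```python
import bisect
from datetime import date

# Valid-time history: each unit keeps a time-ordered list of (date, status) records.
history = {
    "transformer-7": [(date(2000, 1, 1), "in service"),
                      (date(2000, 6, 3), "failed"),
                      (date(2000, 6, 9), "repaired")],
}

def status_at(unit, when):
    """Status of `unit` on day `when`: the most recent record not after that day."""
    records = history[unit]
    dates = [d for d, _ in records]
    i = bisect.bisect_right(dates, when) - 1
    return records[i][1] if i >= 0 else "unknown"

print(status_at("transformer-7", date(2000, 6, 5)))   # -> failed
print(status_at("transformer-7", date(2001, 1, 1)))   # -> repaired
```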

  • - For Authentication in an E-World
    by David D. Zhang
    1.680,95 kr.

    In today's complex, geographically mobile, electronically wired information society, the problem of identifying a person continues to pose a great challenge. Since the conventional technology of using a Personal Identification Number (PIN) or password hardly meets the requirements of an authentication system, biometric-based authentication is emerging as the most reliable method. Biometric Solutions for Authentication in an E-World provides a collection of sixteen chapters containing tutorial articles and new material in a unified manner. This includes the basic concepts, theories, and characteristic features of integrating/formulating different facets of biometric solutions for authentication, with recent developments and significant applications in an E-world. This book provides the reader with the basic concepts of biometrics and an in-depth discussion exploring biometric technologies in various applications in an E-world. It also includes a detailed description of typical biometric-based security systems and up-to-date coverage of how these issues are developed. Experts from all over the world demonstrate the various ways this integration can be made to efficiently design methodologies, algorithms, architectures, and implementations for biometric-based applications in an E-world. Biometric Solutions for Authentication in an E-World meets the needs of a professional audience composed of researchers and practitioners in industry and graduate-level students in computer science and engineering. Researchers and practitioners in research and development laboratories working in fields of security systems design, biometrics, immigration, law enforcement, control, pattern recognition, and the Internet will benefit from this book.

  • by Soha Hassoun
    1.680,95 kr.

    Research and development of logic synthesis and verification have matured considerably over the past two decades. Many commercial products are available, and they have been critical in harnessing advances in fabrication technology to produce today's plethora of electronic components. While this maturity is assuring, the advances in fabrication continue to present seemingly unwieldy challenges. Logic Synthesis and Verification provides a state-of-the-art view of logic synthesis and verification. It consists of fifteen chapters, each focusing on a distinct aspect. Each chapter presents key developments, outlines future challenges, and lists essential references. Two unique features of this book are technical strength and comprehensiveness. The book chapters are written by twenty-eight recognized leaders in the field and reviewed by equally qualified experts. The topics collectively span the field. Logic Synthesis and Verification fills a current gap in the existing CAD literature. Each chapter contains essential information to study a topic at great depth, and to understand further developments in the field. The book is intended for seniors, graduate students, researchers, and developers of related Computer-Aided Design (CAD) tools. From the foreword: "The commercial success of logic synthesis and verification is due in large part to the ideas of many of the authors of this book. Their innovative work contributed to design automation tools that permanently changed the course of electronic design." - Aart J. de Geus, Chairman and CEO, Synopsys, Inc.

  • by Mohammed Ismail
    1.086,95 kr.

    Very large scale integration (VLSI) technologies are now maturing, with a current emphasis toward submicron structures and sophisticated applications combining digital as well as analog circuits on a single chip. Abundant examples are found in today's advanced systems for telecommunications, robotics, automotive electronics, image processing, intelligent sensors, etc. Exciting new applications are being unveiled in the field of neural computing, where the massive use of analog/digital VLSI technologies will have a significant impact. To match such a fast technological trend towards single chip analog/digital VLSI systems, researchers worldwide have long realized the vital need of producing advanced computer aided tools for designing both digital and analog circuits and systems for silicon integration. Architecture and circuit compilation, device sizing and layout generation are but a few familiar tasks in the world of digital integrated circuit design which can be efficiently accomplished by mature computer aided tools. In contrast, the art of tools for designing and producing analog or even analog/digital integrated circuits is quite primitive and still lacking the industrial penetration and acceptance already achieved by digital counterparts. In fact, analog design is commonly perceived to be one of the most knowledge-intensive design tasks, and analog circuits are still designed, largely by hand, by experts intimately familiar with nuances of the target application and integrated circuit fabrication process. The techniques needed to build good analog circuits seem to exist solely as expertise invested in individual designers.

  • - A Special Issue of Machine Learning on Knowledge Acquisition
    by Sandra Marcus
    1.083,95 kr.

    What follows is a sampler of work in knowledge acquisition. It comprises three technical papers and six guest editorials. The technical papers give an in-depth look at some of the important issues and current approaches in knowledge acquisition. The editorials were produced by authors who were basically invited to sound off. I've tried to group and order the contributions somewhat coherently. The following annotations emphasize the connections among the separate pieces. Buchanan's editorial starts on the theme of "Can machine learning offer anything to expert systems?" He emphasizes the practical goals of knowledge acquisition and the challenge of aiming for them. Lenat's editorial briefly describes experience in the development of CYC that straddles both fields. He outlines a two-phase development that relies on an engineering approach early on and aims for a crossover to more automated techniques as the size of the knowledge base increases. Bareiss, Porter, and Murray give the first technical paper. It comes from a laboratory of machine learning researchers who have taken an interest in supporting the development of knowledge bases, with an emphasis on how development changes with the growth of the knowledge base. The paper describes two systems. The first, Protos, adjusts the training it expects and the assistance it provides as its knowledge grows. The second, KI, is a system that helps integrate knowledge into an already very large knowledge base.
