The book provides timely coverage of the paradigm of knowledge distillation, an efficient approach to model compression. Knowledge distillation is positioned in the general setting of transfer learning, in which a lightweight student model is effectively learned from a large teacher model. The book covers a variety of training schemes, teacher-student architectures, and distillation algorithms, along with a wealth of topics including recent developments in vision and language learning, relational architectures, multi-task learning, and representative applications to image processing, computer vision, edge intelligence, and autonomous systems. The book is relevant to a broad audience, including researchers and practitioners active in the area of machine learning and pursuing fundamental and applied research in advanced learning paradigms.
This book provides timely studies on the multi-view facets of data analytics, covering recent trends in processing and reasoning about data originating from an array of local sources. The multi-view nature of data analytics arises in a variety of real-world scenarios, including clustering, consensus building in decision processes, computer vision, knowledge representation, big data, and data streaming, among others. The chapters demonstrate recent pursuits in the methodology, theory, advanced algorithms, and applications of multi-view data analytics, and bring new perspectives on data interpretation. The book will appeal to a broad readership, including both researchers and practitioners interested in gaining exposure to the rapidly growing trend of multi-view data analytics and intelligent systems.