A systematic investigation of growth in nature and society, from tiny organisms to the trajectories of empires and civilizations. Growth has been both an unspoken and an explicit aim of our individual and collective striving. It governs the lives of microorganisms and galaxies; it shapes the capabilities of our extraordinarily large brains and the fortunes of our economies. Growth is manifested in annual increments of continental crust, a rising gross domestic product, a child's growth chart, the spread of cancerous cells. In this magisterial book, Vaclav Smil offers a systematic investigation of growth in nature and society, from tiny organisms to the trajectories of empires and civilizations. Smil takes readers from bacterial invasions through animal metabolisms to megacities and the global economy. He begins with organisms whose mature sizes range from microscopic to enormous, looking at disease-causing microbes, the cultivation of staple crops, and human growth from infancy to adulthood. He examines the growth of energy conversions and man-made objects that enable economic activities--developments that have been essential to civilization. Finally, he looks at growth in complex systems, beginning with the growth of human populations and proceeding to the growth of cities. He considers the challenges of tracing the growth of empires and civilizations, explaining that we can chart the growth of organisms across individual and evolutionary time, but that the progress of societies and economies, not so linear, encompasses both decline and renewal. The trajectory of modern civilization, driven by competing imperatives of material growth and biospheric limits, Smil tells us, remains uncertain.
Why the United States lags behind other industrialized countries in sharing the benefits of innovation with workers, and how we can remedy the problem.

The United States has too many low-quality, low-wage jobs. Every country has its share, but those in the United States are especially poorly paid and often without benefits. Meanwhile, overall productivity increases steadily and new technology has transformed large parts of the economy, enhancing the skills and paychecks of higher-paid knowledge workers. What's wrong with this picture? Why have so many workers benefited so little from decades of growth? The Work of the Future shows that technology is neither the problem nor the solution. We can build better jobs if we create institutions that leverage technological innovation and also support workers through long cycles of technological transformation.

Building on findings from the multiyear MIT Task Force on the Work of the Future, the book argues that we must foster institutional innovations that complement technological change. Skills programs that emphasize work-based and hybrid learning (in person and online), for example, empower workers to become and remain productive in a continuously evolving workplace. Industries fueled by new technology that augments workers can supply good jobs, and federal investment in R&D can help make these industries worker-friendly. We must act to ensure that the labor market of the future offers benefits, opportunity, and a measure of economic security to all.
A comprehensive update of a widely used textbook, with new material on matchings in bipartite graphs, online algorithms, machine learning, and other topics.

Some books on algorithms are rigorous but incomplete; others cover masses of material but lack rigor. Introduction to Algorithms uniquely combines rigor and comprehensiveness. It covers a broad range of algorithms in depth, yet makes their design and analysis accessible to all levels of readers. Since the publication of the first edition, Introduction to Algorithms has become a widely used text in universities worldwide as well as the standard reference for professionals. This fourth edition has been updated throughout, with new chapters on matchings in bipartite graphs, online algorithms, and machine learning, and new material on such topics as solving recurrence equations, hash tables, potential functions, and suffix arrays.

Each chapter is relatively self-contained, presenting an algorithm, a design technique, an application area, or a related topic, and can be used as a unit of study. The algorithms are described in English and in a pseudocode designed to be readable by anyone who has done a little programming. The explanations have been kept elementary without sacrificing depth of coverage or mathematical rigor. The fourth edition has 140 new exercises and 22 new problems, and color has been added to improve visual presentations. The writing has been revised throughout and made clearer, more personal, and gender neutral. The book's website offers supplemental material.
An introduction to a broad range of topics in deep learning, covering mathematical and conceptual background, deep learning techniques used in industry, and research perspectives. "Written by three experts in the field, Deep Learning is the only comprehensive book on the subject." --Elon Musk, cochair of OpenAI; cofounder and CEO of Tesla and SpaceX. Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms.
A website offers supplementary material for both readers and instructors.
The significantly expanded and updated new edition of a widely used text on reinforcement learning, one of the most active research areas in artificial intelligence.
Reinforcement learning, one of the most active research areas in artificial intelligence, is a computational approach to learning whereby an agent tries to maximize the total amount of reward it receives while interacting with a complex, uncertain environment. In Reinforcement Learning, Richard Sutton and Andrew Barto provide a clear and simple account of the field's key ideas and algorithms. This second edition has been significantly expanded and updated, presenting new topics and updating coverage of other topics.
Like the first edition, this second edition focuses on core online learning algorithms, with the more mathematical material set off in shaded boxes. Part I covers as much of reinforcement learning as possible without going beyond the tabular case for which exact solutions can be found. Many algorithms presented in this part are new to the second edition, including UCB, Expected Sarsa, and Double Learning. Part II extends these ideas to function approximation, with new sections on such topics as artificial neural networks and the Fourier basis, and offers expanded treatment of off-policy learning and policy-gradient methods. Part III has new chapters on reinforcement learning's relationships to psychology and neuroscience, as well as an updated case-studies chapter including AlphaGo and AlphaGo Zero, Atari game playing, and IBM Watson's wagering strategy. The final chapter discusses the future societal impacts of reinforcement learning.
Structure and Interpretation of Computer Programs has had a dramatic impact on computer science curricula over the past decade. This long-awaited revision contains changes throughout the text. There are new implementations of most of the major programming systems in the book, including the interpreters and compilers, and the authors have incorporated many small changes that reflect their experience teaching the course at MIT since the first edition was published. A new theme has been introduced that emphasizes the central role played by different approaches to dealing with time in computational models: objects with state, concurrent programming, functional programming and lazy evaluation, and nondeterministic programming. There are new example sections on higher-order procedures in graphics and on applications of stream processing in numerical programming, and many new exercises. In addition, all the programs have been reworked to run in any Scheme implementation that adheres to the IEEE standard.
A comprehensive introduction to the foundations of model checking, a fully automated technique for finding flaws in hardware and software; with extensive examples and both practical and theoretical exercises. Our growing dependence on increasingly complex computer and software systems necessitates the development of formalisms, techniques, and tools for assessing functional properties of these systems. One such technique that has emerged in the last twenty years is model checking, which systematically (and automatically) checks whether a model of a given system satisfies a desired property such as deadlock freedom, invariants, and request-response properties. This automated technique for verification and debugging has developed into a mature and widely used approach with many applications. Principles of Model Checking offers a comprehensive introduction to model checking that is not only a text suitable for classroom use but also a valuable reference for researchers and practitioners in the field. The book begins with the basic principles for modeling concurrent and communicating systems, introduces different classes of properties (including safety and liveness), presents the notion of fairness, and provides automata-based algorithms for these properties. It introduces the temporal logics LTL and CTL, compares them, and covers algorithms for verifying these logics, discussing real-time systems as well as systems subject to random phenomena. Separate chapters treat such efficiency-improving techniques as abstraction and symbolic manipulation. The book includes an extensive set of examples (most of which run through several chapters) and a complete set of basic results accompanied by detailed proofs. Each chapter concludes with a summary, bibliographic notes, and an extensive list of exercises of both practical and theoretical nature.
A comprehensive political and design theory of planetary-scale computation proposing that The Stack--an accidental megastructure--is both a technological apparatus and a model for a new geopolitical architecture. What has planetary-scale computation done to our geopolitical realities? It takes different forms at different scales--from energy and mineral sourcing and subterranean cloud infrastructure to urban software and massive universal addressing systems; from interfaces drawn by the augmentation of the hand and eye to users identified by self-quantification and the arrival of legions of sensors, algorithms, and robots. Together, how do these distort and deform modern political geographies and produce new territories in their own image? In The Stack, Benjamin Bratton proposes that these different genres of computation--smart grids, cloud platforms, mobile apps, smart cities, the Internet of Things, automation--can be seen not as so many species evolving on their own, but as forming a coherent whole: an accidental megastructure called The Stack that is both a computational apparatus and a new governing architecture. We are inside The Stack and it is inside of us. In an account that is both theoretical and technical, drawing on political philosophy, architectural theory, and software studies, Bratton explores six layers of The Stack: Earth, Cloud, City, Address, Interface, User. Each is mapped on its own terms and understood as a component within the larger whole built from hard and soft systems intermingling--not only computational forms but also social, human, and physical forces. This model, informed by the logic of the multilayered structure of protocol "stacks," in which network technologies operate within a modular and vertical order, offers a comprehensive image of our emerging infrastructure and a platform for its ongoing reinvention.
The Stack is an interdisciplinary design brief for a new geopolitics that works with and for planetary-scale computation. Interweaving the continental, urban, and perceptual scales, it shows how we can better build, dwell within, communicate with, and govern our worlds. thestack.org
An essential guide to designing, conducting, and analyzing event-related potential (ERP) experiments, completely updated for this edition.

The event-related potential (ERP) technique, in which neural responses to specific events are extracted from the EEG, provides a powerful noninvasive tool for exploring the human brain. This volume describes practical methods for ERP research along with the underlying theoretical rationale. It offers researchers and students an essential guide to designing, conducting, and analyzing ERP experiments. This second edition has been completely updated, with additional material, new chapters, and more accessible explanations. Freely available supplementary material, including several online-only chapters, offers expanded or advanced treatment of selected topics.

The first half of the book presents essential background information, describing the origins of ERPs, the nature of ERP components, and the design of ERP experiments. The second half of the book offers a detailed treatment of the main steps involved in conducting ERP experiments, covering such topics as recording the EEG, filtering the EEG and ERP waveforms, and quantifying amplitudes and latencies. Throughout, the emphasis is on rigorous experimental design and relatively simple analyses. New material in the second edition includes entire chapters devoted to components, artifacts, measuring amplitudes and latencies, and statistical analysis; updated coverage of recording technologies; concrete examples of experimental design; and many more figures. Online chapters cover such topics as overlap, localization, writing and reviewing ERP papers, and setting up and running an ERP lab.
What altered states of consciousness--the dissolution of feelings of time and self--can tell us about the mystery of consciousness. During extraordinary moments of consciousness--shock, meditative states and sudden mystical revelations, out-of-body experiences, or drug intoxication--our senses of time and self are altered; we may even feel time and self dissolving. These experiences have long been ignored by mainstream science, or considered crazy fantasies. Recent research, however, has located the neural underpinnings of these altered states of mind. In this book, neuropsychologist Marc Wittmann shows how experiences that disturb or widen our everyday understanding of the self can help solve the mystery of consciousness. Wittmann explains that the relationship between consciousness of time and consciousness of self is close; in extreme circumstances, the experiences of time and self intensify and weaken together. He considers the emergence of the self in waking life and dreams; how our sense of time is distorted by extreme situations ranging from terror to mystical enlightenment; the experience of the moment; and the loss of time and self in such disorders as depression, schizophrenia, and epilepsy. Dostoyevsky reported godly bliss during epileptic seizures; neurologists are now investigating the phenomenon of the epileptic aura. Wittmann describes new studies of psychedelics that show how the brain builds consciousness of self and time, and discusses pilot programs that use hallucinogens to treat severe depression, anxiety, and addiction. If we want to understand our consciousness, our subjectivity, Wittmann argues, we must not be afraid to break new ground. Studying altered states of consciousness leads us directly to the heart of the matter: time and self, the foundations of consciousness.
An updated edition of a classic: an indispensable companion for a new era in cycling. The bicycle is almost unique among human-powered machines in that it uses human muscles in a near-optimum way. This essential volume offers a comprehensive account of the history of bicycles, how human beings propel them, what makes them go faster--and what keeps them from going even faster. Over the years, and through three previous editions, Bicycling Science has become the bible of technical bicycling not only for designers and builders of bicycles but also for cycling enthusiasts. After a brief history of bicycles and bicycling that demolishes many widespread myths, this fourth edition covers recent experiments and research on human-powered transportation, with updated material on cycling achievements, human-powered machines for use on land and in air and water, power-assisted bicycles, and human physiology. The authors have also added new information on aerodynamics, rolling drag, transmission of power from rider to wheels, braking, heat management, steering and stability, power and speed, and other topics. This edition also includes many new references and figures. With racks of bikeshare bikes on city sidewalks, and new restrictions on greenhouse gas-emitting cars, bicycle use will only grow. This book is the indispensable companion for a new era in cycling.
A novel, integrative approach to cities as complex adaptive systems, applicable to issues ranging from innovation to economic prosperity to settlement patterns.
Human beings around the world increasingly live in urban environments. In Introduction to Urban Science, Luis Bettencourt takes a novel, integrative approach to understanding cities as complex adaptive systems, claiming that they require us to frame the field of urban science in a way that goes beyond existing theory in such traditional disciplines as sociology, geography, and economics. He explores the processes facilitated by and, in many cases, unleashed for the first time by urban life through the lenses of social heterogeneity, complex networks, scaling, circular causality, and information.
Though the idea that cities are complex adaptive systems has become mainstream, until now those who study cities have lacked a comprehensive theoretical framework for understanding cities and urbanization, for generating useful and falsifiable predictions, and for constructing a solid body of empirical evidence so that the discipline of urban science can continue to develop. Bettencourt applies his framework to such issues as innovation and development across scales, human reasoning and strategic decision-making, patterns of settlement and mobility and their influence on socioeconomic life and resource use, inequality and inequity, biodiversity, and the challenges of sustainable development in both high- and low-income nations. It is crucial, says Bettencourt, to realize that cities are not zero-sum games and that knowledge, human cooperation, and collective action can build a better future.
The latest edition of the essential text and professional reference, with substantial new material on such topics as vEB trees, multithreaded algorithms, dynamic programming, and edge-based flow. Some books on algorithms are rigorous but incomplete; others cover masses of material but lack rigor. Introduction to Algorithms uniquely combines rigor and comprehensiveness. The book covers a broad range of algorithms in depth, yet makes their design and analysis accessible to all levels of readers. Each chapter is relatively self-contained and can be used as a unit of study. The algorithms are described in English and in a pseudocode designed to be readable by anyone who has done a little programming. The explanations have been kept elementary without sacrificing depth of coverage or mathematical rigor. The first edition became a widely used text in universities worldwide as well as the standard reference for professionals. The second edition featured new chapters on the role of algorithms, probabilistic analysis and randomized algorithms, and linear programming. The third edition has been revised and updated throughout. It includes two completely new chapters, on van Emde Boas trees and multithreaded algorithms, substantial additions to the chapter on recurrence (now called "Divide-and-Conquer"), and an appendix on matrices. It features improved treatment of dynamic programming and greedy algorithms and a new notion of edge-based flow in the material on flow networks. Many exercises and problems have been added for this edition. The international paperback edition is no longer available; the hardcover is available worldwide.
The second edition of a comprehensive introduction to all aspects of mobile robotics, from algorithms to mechanisms. Mobile robots range from the Mars Pathfinder mission's teleoperated Sojourner to the cleaning robots in the Paris Metro. This text offers students and other interested readers an introduction to the fundamentals of mobile robotics, spanning the mechanical, motor, sensory, perceptual, and cognitive layers the field comprises. The text focuses on mobility itself, offering an overview of the mechanisms that allow a mobile robot to move through a real-world environment to perform its tasks, including locomotion, sensing, localization, and motion planning. It synthesizes material from such fields as kinematics, control theory, signal analysis, computer vision, information theory, artificial intelligence, and probability theory. The book presents the techniques and technology that enable mobility in a series of interacting modules. Each chapter treats a different aspect of mobility, as the book moves from low-level to high-level details. It covers all aspects of mobile robotics, including software and hardware design considerations, related technologies, and algorithmic techniques. This second edition has been revised and updated throughout, with 130 pages of new material on such topics as locomotion, perception, localization, and planning and navigation. Problem sets have been added at the end of each chapter. Bringing together all aspects of mobile robotics into one volume, Introduction to Autonomous Mobile Robots can serve as a textbook or a working tool for beginning practitioners. A curriculum developed by Dr. Robert King, Colorado School of Mines, and Dr. James Conrad, University of North Carolina-Charlotte, to accompany the National Instruments LabVIEW Robotics Starter Kit, is available. Included are 13 laboratory exercises (6 by Dr. King and 7 by Dr. Conrad) for using the LabVIEW Robotics Starter Kit to teach mobile robotics concepts.
The new edition of an introductory text that teaches students the art of computational problem solving, covering topics ranging from simple algorithms to information visualization.
This book introduces students with little or no prior programming experience to the art of computational problem solving using Python and various Python libraries, including PyLab. It provides students with skills that will enable them to make productive use of computational techniques, including some of the tools and techniques of data science for using computation to model and interpret data. The book is based on an MIT course (which became the most popular course offered through MIT's OpenCourseWare) and was developed for use not only in a conventional classroom but in a massive open online course (MOOC). This new edition has been updated for Python 3, reorganized to make it easier to use for courses that cover only a subset of the material, and offers additional material including five new chapters.
Students are introduced to Python and the basics of programming in the context of such computational concepts and techniques as exhaustive enumeration, bisection search, and efficient approximation algorithms. Although it covers such traditional topics as computational complexity and simple algorithms, the book focuses on a wide range of topics not found in most introductory texts, including information visualization, simulations to model randomness, computational techniques to understand data, and statistical techniques that inform (and misinform) as well as two related but relatively advanced topics: optimization problems and dynamic programming. This edition offers expanded material on statistics and machine learning and new chapters on Frequentist and Bayesian statistics.
An updated edition of a comprehensive study of the theory that mind exists, in some form, in all living and nonliving things. In Panpsychism in the West, the first comprehensive study of the subject, David Skrbina argues for the importance of panpsychism--the theory that mind exists, in some form, in all living and nonliving things--in consideration of the nature of consciousness and mind. Panpsychism, with its conception of mind as a general phenomenon of nature, uniquely links being and mind. More than a theory of mind, it is a meta-theory--a statement about theories of mind rather than a theory in itself. Panpsychism can parallel almost every current theory of mind; it simply holds that, no matter how one conceives of mind, such mind applies to all things. After a brief discussion of general issues surrounding philosophy of mind, Skrbina examines the panpsychist views of philosophers from the pre-Socratics to the post-structuralists. The original edition of Panpsychism in the West helped to reinvigorate a neglected and important aspect of philosophic thinking. This revised edition offers expanded and updated material that reflects the growth of panpsychism as a subdiscipline. It covers the problem of emergence of mind from a non-mental reality and the combination problem in greater detail. It offers expanded coverage of the pre-Socratics and Plato; a new section on Augustine; expanded discussions of Continental panpsychism, scientific arguments, Nietzsche, and Whitehead; and a new section on Russellian monism. With this edition, Panpsychism in the West will continue to be the standard work on the topic.
How to achieve a happier and healthier game design process by connecting the creative aspects of game design with techniques for effective project management.

This book teaches game designers, aspiring game developers, and game design students how to take a digital game project from start to finish--from conceptualizing and designing to building, playtesting, and iterating--while avoiding the uncontrolled overwork known among developers as crunch. Written by a legendary game designer, A Playful Production Process outlines a process that connects the creative aspects of game design with proven techniques for effective project management. The book outlines four project phases--ideation, preproduction, full production, and post-production--that give designers and developers the milestones they need to advance from the first glimmerings of an idea to a finished game.
How information can make us happy or miserable, and why we sometimes avoid it and sometimes seek it out. How much information is too much? Do we need to know how many calories are in the giant vat of popcorn that we bought on our way into the movie theater? Do we want to know if we are genetically predisposed to a certain disease? Can we do anything useful with next week's weather forecast for Paris if we are not in Paris? In Too Much Information, Cass Sunstein examines the effects of information on our lives. Policymakers emphasize "the right to know," but Sunstein takes a different perspective, arguing that the focus should be on human well-being and what information contributes to it. Government should require companies, employers, hospitals, and others to disclose information not because of a general "right to know" but when the information in question would significantly improve people's lives. Sunstein argues that the information on warnings and mandatory labels is often confusing or irrelevant, yielding no benefit. He finds that people avoid information if they think it will make them sad (and seek information they think will make them happy). Our information avoidance and information seeking are notably heterogeneous--some of us do want to know the popcorn calorie count, others do not. Of course, says Sunstein, we are better off with stop signs, warnings on prescription drugs, and reminders about payment due dates. But sometimes less is more. What we need is more clarity about what information is actually doing or achieving.
A comprehensive introduction to machine learning that uses probabilistic models and inference as a unifying approach. Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic methods, the book stresses a principled model-based approach, often using the language of graphical models to specify models in a concise and intuitive way. Almost all the models described have been implemented in a MATLAB software package--PMTK (probabilistic modeling toolkit)--that is freely available online. The book is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.