The last 20 years have witnessed a considerable increase in the use of time series techniques in econometrics. The articles in this important set have been chosen to illustrate the main themes in time series work as it relates to econometrics. The editor has written a new concise introduction to accompany the articles. Sections covered include: Ad Hoc Forecasting Procedures, ARIMA Modelling, Structural Time Series Models, Unit Roots, Detrending and Non-stationarity, Seasonality, Seasonal Adjustment and Calendar Effects, Dynamic Regression and Intervention Analysis, Multivariate Models, Causality, Exogeneity and Expectations, State Space Models and the Kalman Filter, Non-Linear and Non-Gaussian Models.
This is a two-volume collection of major papers which have shaped the development of econometrics. Part I includes articles which together provide an overview of the history of econometrics, Part II addresses the relationship between econometrics and statistics, the articles in Part III constitute early applied studies, and Part IV includes articles concerned with the role and method of econometrics. The work comprises 42 articles, dating from 1921 to 1991, and contributors include E.W. Gilboy, W.C. Mitchell, J.J. Spengler, R. Stone, H.O. Wold and S. Wright.
Takeshi Amemiya has made a significant contribution to econometric theory over the past 30 years. This volume brings together 34 of his key articles and papers on areas such as limited dependent variables, non-linear simultaneous equations models, time series analysis and error components models. Many of the articles reprinted in this volume are indispensable references for researchers in the relevant fields. The specially written preface outlines the influences and motivations behind Professor Amemiya's work. Studies in Econometric Theory presents in a single volume the most significant work of one of the most influential econometricians of our time.
The methodology of econometrics is concerned with rules governing the building of statistical models in economics. These two volumes draw together 62 previously published studies in economics and statistics. The volumes are divided into 12 sections covering controversial topics ranging from the earliest days of econometrics to the present. Sections include policy analysis, exogeneity, causality, Bayesian perspective and British econometrics.
This two-volume set brings together a key selection of papers written by Jacques J. Polak over the last 50 years in the fields of economics, econometrics and finance. Presented under five broad headings, the collection begins with his work on international and national business cycles - a subject on which the author worked with Nobel Prize winner Jan Tinbergen - and on problems of international trade and balance of payments adjustment. Later sections examine exchange rates and how they affect the balance of payments, inflation and hyperinflation; the monetary approach to the balance of payments, a subject that the author pioneered in the IMF and that became the framework of the conditionality of IMF credits; and international liquidity, with particular reference to the special drawing right (SDR). The final section features the author's essays on the international monetary system itself, including topics such as the international coordination of national economic policies, the changes over time in the objectives of national policy making in the main industrial countries and reform of the system. Economic Theory and Financial Policy will be welcomed by researchers, students and practitioners concerned with economics, government finance, banking and international economic relations.
Data Mining for Business Analytics: Concepts, Techniques, and Applications in XLMiner(R), Third Edition presents an applied approach to data mining and predictive analytics with clear exposition, hands-on exercises, and real-life case studies. Readers will work with all of the standard data mining methods using the Microsoft(R) Office Excel(R) add-in XLMiner(R) to develop predictive models and learn how to obtain business value from Big Data. Featuring updated topical coverage on text mining, social network analysis, collaborative filtering, ensemble methods, uplift modeling and more, the Third Edition also includes: * Real-world examples to build a theoretical and practical understanding of key data mining methods * End-of-chapter exercises that help readers better understand the presented material * Data-rich case studies to illustrate various applications of data mining techniques * Completely new chapters on social network analysis and text mining * A companion site with additional data sets, instructor materials that include solutions to exercises and case studies, and Microsoft PowerPoint(R) slides * A free 140-day license to use XLMiner for Education software Data Mining for Business Analytics: Concepts, Techniques, and Applications in XLMiner(R), Third Edition is an ideal textbook for upper-undergraduate and graduate-level courses as well as professional programs on data mining, predictive modeling, and Big Data analytics. The new edition is also a unique reference for analysts, researchers, and practitioners working with predictive analytics in the fields of business, finance, marketing, computer science, and information technology. Praise for the Second Edition: "...full of vivid and thought-provoking anecdotes...needs to be read by anyone with a serious interest in research and marketing." - Research Magazine "Shmueli et al. have done a wonderful job in presenting the field of data mining - a welcome addition to the literature." - ComputingReviews.com
"Excellent choice for business analysts...The book is a perfect fit for its intended audience." - Keith McCormick, Consultant and Author of SPSS Statistics For Dummies, Third Edition and SPSS Statistics for Data Analysis and Visualization. Galit Shmueli, PhD, is Distinguished Professor at National Tsing Hua University's Institute of Service Science. She has designed and instructed data mining courses since 2004 at the University of Maryland, Statistics.com, the Indian School of Business, and National Tsing Hua University, Taiwan. Professor Shmueli is known for her research and teaching in business analytics, with a focus on statistical and data mining methods in information systems and healthcare. She has authored over 70 journal articles, books, textbooks and book chapters. Peter C. Bruce is President and Founder of the Institute for Statistics Education at www.statistics.com. He has written multiple journal articles and is the developer of Resampling Stats software. He is the author of Introductory Statistics and Analytics: A Resampling Perspective, also published by Wiley. Nitin R. Patel, PhD, is Chairman and cofounder of Cytel, Inc., based in Cambridge, Massachusetts. A Fellow of the American Statistical Association, Dr. Patel has also served as a Visiting Professor at the Massachusetts Institute of Technology and at Harvard University. He is a Fellow of the Computer Society of India and was a professor at the Indian Institute of Management, Ahmedabad for 15 years.
This text offers a comprehensive presentation of the mathematics required to tackle problems in economic analyses. To give a better understanding of the mathematical concepts, the text follows the logic of the development of mathematics rather than that of an economics course. The only prerequisite is high school algebra, but the book goes on to cover all the mathematics needed for undergraduate economics. It is also a useful reference for graduate students.
"Structural Macroeconometrics" provides a thorough overview and in-depth exploration of methodologies, models, and techniques used to analyze forces shaping national economies. In this thoroughly revised second edition, David DeJong and Chetan Dave emphasize time series econometrics and unite theoretical and empirical research, while taking into account important new advances in the field.
The authors detail strategies for solving dynamic structural models and present the full range of methods for characterizing and evaluating empirical implications, including calibration exercises, method-of-moment procedures, and likelihood-based procedures, both classical and Bayesian. The authors look at recent strides that have been made to enhance numerical efficiency, consider the expanded applicability of dynamic factor models, and examine the use of alternative assumptions involving learning and rational inattention on the part of decision makers. The treatment of methodologies for obtaining nonlinear model representations has been expanded, and linear and nonlinear model representations are integrated throughout the text. The book offers a rich array of implementation algorithms, sample empirical applications, and supporting computer code.
"Structural Macroeconometrics" is the ideal textbook for graduate students seeking an introduction to macroeconomics and econometrics, and for advanced students pursuing applied research in macroeconomics. The book's historical perspective, along with its broad presentation of alternative methodologies, makes it an indispensable resource for academics and professionals.
Continuous-Time Models in Corporate Finance synthesizes four decades of research to show how stochastic calculus can be used in corporate finance. Combining mathematical rigor with economic intuition, Santiago Moreno-Bromberg and Jean-Charles Rochet analyze corporate decisions such as dividend distribution, the issuance of securities, and capital structure and default. They pay particular attention to financial intermediaries, including banks and insurance companies. The authors begin by recalling the ways that option-pricing techniques can be employed for the pricing of corporate debt and equity. They then present the dynamic model of the trade-off between taxes and bankruptcy costs and derive implications for optimal capital structure. The core chapter introduces the workhorse liquidity-management model--where liquidity and risk management decisions are made in order to minimize the costs of external finance. This model is used to study corporate finance decisions and specific features of banks and insurance companies. The book concludes by presenting the dynamic agency model, where financial frictions stem from the lack of interest alignment between a firm's manager and its financiers. The appendix contains an overview of the main mathematical tools used throughout the book. Requiring some familiarity with stochastic calculus methods, Continuous-Time Models in Corporate Finance will be useful for students, researchers, and professionals who want to develop dynamic models of firms' financial decisions.
Econometric issues have provoked a lively and sometimes adversarial debate in the economics profession. The excitement and intellectual vitality of that debate is captured here for the reader in a lucid overview of econometric approaches, describing their advantages and limitations. This ambitious book focuses on the underlying methodological issues rather than concentrating upon econometric techniques. The limits of econometric investigations are identified through a critical appraisal of three different approaches associated with the work of Professors Hendry, Leamer and Sims. After explaining why the early optimism in econometrics was misplaced, it argues that rejection is not an appropriate response. It offers a rich spectrum of approaches to a problem of central importance in the development of modern economics. The book will appeal not only to all econometricians whatever their persuasion but also to all those with an interest in the methodology of economics.
Discover how statistical information impacts decisions in today's business world as Anderson/Sweeney/Williams/Camm/Cochran/Fry/Ohlmann's leading ESSENTIALS OF STATISTICS FOR BUSINESS AND ECONOMICS, 9E connects concepts in each chapter to real-world practice. This edition delivers sound statistical methodology, a proven problem-scenario approach and meaningful applications that reflect the latest developments in business and statistics today. More than 350 new and proven real business examples, a wealth of practical cases and meaningful hands-on exercises highlight statistics in action. You gain practice using leading professional statistical software with exercises and appendices that walk you through using JMP (R) Student Edition 14 and Excel (R) 2016. WebAssign's online course management system further strengthens this business statistics approach and helps you maximize your course success.
A concise treatment of modern econometrics and statistics, including underlying ideas from linear algebra, probability theory, and computer programming. This book offers a cogent and concise treatment of econometric theory and methods along with the underlying ideas from statistics, probability theory, and linear algebra. It emphasizes foundations and general principles, but also features many solved exercises, worked examples, and code listings. After mastering the material presented, readers will be ready to take on more advanced work in different areas of quantitative economics and to understand papers from the econometrics literature. The book can be used in graduate-level courses on foundational aspects of econometrics or on fundamental statistical principles. It will also be a valuable reference for independent study. One distinctive aspect of the text is its integration of traditional topics from statistics and econometrics with modern ideas from data science and machine learning; readers will encounter ideas that are driving the current development of statistics and increasingly filtering into econometric methodology. The text treats programming not only as a way to work with data but also as a technique for building intuition via simulation. Many proofs are followed by a simulation that shows the theory in action. As a primer, the book offers readers an entry point into the field, allowing them to see econometrics as a whole rather than as a profusion of apparently unrelated ideas.
High-frequency trading is an algorithm-based computerized trading practice that allows firms to trade stocks in milliseconds. Over the last fifteen years, the use of statistical and econometric methods for analyzing high-frequency financial data has grown exponentially. This growth has been driven by the increasing availability of such data, the technological advancements that make high-frequency trading strategies possible, and the need of practitioners to analyze these data. This comprehensive book introduces readers to these emerging methods and tools of analysis.
Yacine Ait-Sahalia and Jean Jacod cover the mathematical foundations of stochastic processes, describe the primary characteristics of high-frequency financial data, and present the asymptotic concepts that their analysis relies on. Ait-Sahalia and Jacod also deal with estimation of the volatility portion of the model, including methods that are robust to market microstructure noise, and address estimation and testing questions involving the jump part of the model. As they demonstrate, the practical importance and relevance of jumps in financial data are universally recognized, but only recently have econometric methods become available to rigorously analyze jump processes.
Ait-Sahalia and Jacod approach high-frequency econometrics with a distinct focus on the financial side of matters while maintaining technical rigor, which makes this book invaluable to researchers and practitioners alike.
Panel Data Econometrics with R provides a tutorial for using R in the field of panel data econometrics. Illustrated throughout with examples in econometrics, political science, agriculture and epidemiology, this book presents classic methodology and applications as well as more advanced topics and recent developments in this field, including error component models, spatial panels and dynamic models. The authors have developed the software programming in R and host replicable material on the book's accompanying website.
This textbook offers a unique blend of theory and practical application. Taking students from a basic level up to an advanced understanding in an intuitive, step-by-step fashion, it provides perfect preparation for doing applied econometric work. Econometric tests and methods of estimation are presented clearly, and practical guidance on using several types of software packages is given. Real-world data are used throughout, with emphasis on interpreting the results and the conclusions to be drawn from them in econometric work. This book will be core reading for undergraduate and Master's students on an Economics or Finance degree who take a course in applied econometrics. Its practical nature makes it perfect for modules requiring a research project.
This book is aimed at the reader who wishes to gain a working knowledge of time series and forecasting methods as applied to economics, engineering and the natural and social sciences. It assumes knowledge only of basic calculus, matrix algebra and elementary statistics. This third edition contains detailed instructions for the use of the professional version of the Windows-based computer package ITSM2000, now available as a free download from the Springer Extras website. The logic and tools of time series model-building are developed in detail. Numerous exercises are included and the software can be used to analyze and forecast data sets of the user's own choosing. The book can also be used in conjunction with other time series packages such as those included in R. The programs in ITSM2000, however, are menu-driven and can be used with minimal investment of time in the computational details. The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space models, with an optional chapter on spectral analysis. Many additional special topics are also covered. New to this edition: * A chapter devoted to financial time series * Introductions to Brownian motion, Lévy processes and Itô calculus * An expanded section on continuous-time ARMA processes
"Students of econometrics and their teachers will find this book to be the best introduction to the subject at the graduate and advanced undergraduate level. Starting with least squares regression, Hayashi provides an elegant exposition of all the standard topics of econometrics, including a detailed discussion of stationary and non-stationary time series. The particular strength of the book is the excellent balance between econometric theory and its applications, using GMM as an organizing principle throughout. Each chapter includes a detailed empirical example taken from classic and current applications of econometrics."--Dale Jorgenson, Harvard University
""Econometrics" will be a very useful book for intermediate and advanced graduate courses. It covers the topics with an easy to understand approach while at the same time offering a rigorous analysis. The computer programming tips and problems should also be useful to students. I highly recommend this book for an up-to-date coverage and thoughtful discussion of topics in the methodology and application of econometrics."--Jerry A. Hausman, Massachusetts Institute of Technology
""Econometrics" covers both modern and classic topics without shifting gears. The coverage is quite advanced yet the presentation is simple. Hayashi brings students to the frontier of applied econometric practice through a careful and efficient discussion of modern economic theory. The empirical exercises are very useful. . . . The projects are carefully crafted and have been thoroughly debugged."--Mark W. Watson, Princeton University
""Econometrics" strikes a good balance between technical rigor and clear exposition. . . . The use of empirical examples is well done throughout. I very much like the use of old 'classic' examples. It gives students a sense of history--and shows that great empirical econometrics is a matter of having important ideas and good data, not just fancy new methods. . . . The style is just great, informal and engaging."--James H. Stock, John F. Kennedy School of Government, Harvard University
Despite the transition from apartheid to democracy, South Africa is the most unequal country in the world. Its extremes of wealth and poverty undermine intensifying struggles for a better life for all. The wide-ranging essays in this sixth volume of the New South African Review demonstrate how the consequences of inequality extend throughout society and the political economy, crippling the quest for social justice, polarising the politics, skewing economic outcomes and bringing devastating environmental consequences in their wake. Contributors survey the extent and consequences of inequality across fields as diverse as education, disability, agrarian reform, nuclear geography and small towns, and tackle some of the most difficult social, political and economic issues. How has the quest for greater equality affected progressive political discourse? How has inequality reproduced itself, despite best intentions in social policy, to the detriment of the poor and the historically disadvantaged? How have shifts in mining and the financialisation of the economy reshaped the contours of inequality? How does inequality reach into the daily social life of South Africans, and shape the way in which they interact? How does the extent and shape of inequality in South Africa compare with that of other major countries of the global South which themselves are notorious for their extremes of wealth and poverty? South African extremes of inequality reflect increasing inequality globally, and The Crisis of Inequality will speak to all those - general readers, policy makers, researchers and students - who are demanding a more equal world.
For courses in introductory econometrics. Engaging applications bring the theory and practice of modern econometrics to life Ensure students grasp the relevance of econometrics with Introduction to Econometrics -- the text that connects modern theory and practice with motivating, engaging applications. The 4th Edition, Global Edition, maintains a focus on currency, while building on the philosophy that applications should drive the theory, not the other way around. The text incorporates real-world questions and data, and methods that are immediately relevant to the applications. With very large data sets increasingly being used in economics and related fields, a new chapter dedicated to Big Data helps students learn about this growing and exciting area. This coverage and approach make the subject come alive for students and helps them to become sophisticated consumers of econometrics. Pearson MyLab (TM) Economics is not included. Students, if Pearson MyLab Economics is a recommended/mandatory component of the course, please ask your instructor for the correct ISBN. Pearson MyLab Economics should only be purchased when required by an instructor. Instructors, contact your Pearson representative for more information. Reach every student by pairing this text with Pearson MyLab Economics MyLab (TM) is the teaching and learning platform that empowers you to reach every student. By combining trusted author content with digital tools and a flexible platform, MyLab personalizes the learning experience and improves results for each student. The 4th Edition features expanded exercise sets in Pearson MyLab Economics, offering more flexibility to instructors as they build assignments.
Doubt over the trustworthiness of published empirical results is not unwarranted and is often a result of statistical mis-specification: invalid probabilistic assumptions imposed on data. Now in its second edition, this bestselling textbook offers a comprehensive course in empirical research methods, teaching the probabilistic and statistical foundations that enable the specification and validation of statistical models, providing the basis for an informed implementation of statistical procedures to secure the trustworthiness of evidence. Each chapter has been thoroughly updated, accounting for developments in the field and the author's own research. The comprehensive scope of the textbook has been expanded by the addition of a new chapter on the linear regression model and related statistical models. This new edition is now more accessible to students of disciplines beyond economics and includes more pedagogical features, with an increased number of examples as well as review questions and exercises at the end of each chapter.
This pithy and engaging volume shows that economists may be better equipped to predict the future than science fiction writers. Economists' ideas, based on both theory and practice, reflect their knowledge of the laws of human interactions as well as years of experimentation and reflection. Although perhaps not as screenplay-ready as a work of fiction, these economists' predictions are ready for their close-ups. In this book, ten prominent economists -- including Nobel laureates and several likely laureates -- offer their ideas about the world of the twenty-second century. In scenarios that range from the optimistic to the guardedly gloomy, these thinkers consider such topics as the transformation of work and wages, the continuing increase in inequality, the economic rise of China and India, the endlessly repeating cycle of crisis and (projected) recovery, the benefits of technology, the economic consequences of political extremism, and the long-range effects of climate change. For example, Daron Acemoglu offers a thoughtful discussion of how trends of the last century -- including uneven growth, technological integration, and resource scarcity -- might translate into the next; 2013 Nobelist Robert Shiller provides an innovative view of future risk management methods using information technology; 2012 Nobelist Alvin Roth projects his theory of Matching Markets into the next century, focusing on schools, jobs, marriage and family, and medicine; 1987 Nobelist Robert Solow considers the shift away from remunerated labor, among other subjects; and Martin Weitzman raises the intriguing but alarming possibility of using geoengineering techniques to mitigate the inevitable effects of climate change. In a 1930 essay mentioned by several contributors, "Economic Possibilities for Our Grandchildren," John Maynard Keynes offered predictions that, read today, range from absolutely correct to spectacularly wrong.
This book follows in Keynes's path, hoping, perhaps, to better his average.
In this revised and expanded second edition of the bestselling Encyclopedia of Chart Patterns, Thomas Bulkowski updates the classic with new performance statistics for both bull and bear markets and 23 new patterns, including a second section devoted to ten event patterns. Bulkowski tells you how to trade the significant events -- such as quarterly earnings announcements, retail sales, stock upgrades and downgrades -- that shape today's trading and uses statistics to back up his approach. This comprehensive new edition is a must-have reference if you're a technical investor or trader. Place your order today.
"The most complete reference to chart patterns available. It goes where no one has gone before. Bulkowski gives hard data on how good and bad the patterns are. A must-read for anyone that's ever looked at a chart and wondered what was happening."
This book discusses equi-quantile values and their use in generating decision alternatives under the twofold complexities of uncertainty and dependence, offering scope for surrogating between two alternative portfolios when they are correlated. The book begins with a discussion on components of rationality and learning models as indispensable concepts in decision-making processes. It identifies three-fold complexities in such processes: uncertainty, dependence and dynamism. The book is a novel attempt to seek tangible solutions for such decision problems. To do so, four hundred tables of bi-quantile pairs are presented for carefully chosen grids. In fact, it is a two-variable generalization of the inverse normal integral table, which is used in obtaining bivariate normal quantile pairs for the given values of probability and correlation. When making decisions, only two of them have to be taken at a time. These tables are essential tools for decision-making under risk and dependence, and offer scope for delving up to a single step of dynamism. The book subsequently addresses arguments dealing with applications and advantages. The content is useful to empirical scientists and risk-oriented decision makers who are often required to make choices on the basis of pairs of variables. The book also helps simulators seeking valid confidence intervals for their estimates, and particle physicists looking for condensed confidence intervals for the Higgs boson utilizing the Bose-Einstein correlation, given the magnitude of such correlations. Entrepreneurs and investors as well as students of management, statistics, economics and econometrics, psychology, psychometrics and psychographics, social sciences, geographic information systems, geology, agricultural and veterinary sciences, medical sciences and diagnostics, and remote sensing will also find the book very useful.