An observational study is an empiric investigation of effects caused by treatments when randomized experimentation is unethical or infeasible. Observational studies are common in most fields that study the effects of treatments on people, including medicine, economics, epidemiology, education, psychology, political science and sociology. The quality and strength of evidence provided by an observational study is determined largely by its design. Design of Observational Studies is both an introduction to statistical inference in observational studies and a detailed discussion of the principles that guide the design of observational studies.
Design of Observational Studies is divided into four parts. Chapters 2, 3, and 5 of Part I cover concisely, in about one hundred pages, many of the ideas discussed in Rosenbaum's Observational Studies (also published by Springer) but in a less technical fashion. Part II discusses the practical aspects of using propensity scores and other tools to create a matched comparison that balances many covariates. Part II includes a chapter on matching in R. In Part III, the concept of design sensitivity is used to appraise the relative ability of competing designs to distinguish treatment effects from biases due to unmeasured covariates. Part IV discusses planning the analysis of an observational study, with particular reference to Sir Ronald Fisher's striking advice for observational studies, "make your theories elaborate."
The second edition of Rosenbaum's book, Observational Studies, was published by Springer in 2002.
The demand for high-quality, detailed public finance statistics covering a globally representative sample of countries has increased dramatically during the recent financial crisis. Due to the complexity of public finance statistics, however, such data tend to be either available only in oversimplified, high-level aggregates that lack methodological transparency, or available in great detail and with a unified methodological approach yet overly complicated to understand. The IMF's Government Finance Statistics Yearbook presents fiscal data for around 140 countries following the Government Finance Statistics Manual 2001 framework. The associated database includes data series covering an almost 40-year period. The IMF's Statistics Department has embarked on several initiatives to improve its accessibility.
The GVAR is a global vector autoregression model of the world economy. The model was initially developed in the early 2000s by Professor Pesaran and co-authors, mainly for the purpose of analysing credit risk in a globalised economy. Starting from the mid-2000s, the model was substantially enlarged in the context of a project financed by the ECB to comprise all major economies and the Euro area as a whole. The purpose of this version was to exploit the rich modelling of international linkages in order to simulate and analyse global macro scenarios of high policy interest. The rich, yet manageable, specification of international linkages has stimulated a vast literature on the GVAR. Since early 2011, the basic model - and its database - has also been available on a dedicated GVAR-Toolbox website with an easy-to-use interface allowing practical applications by an extended audience, as well as more complex analysis by expert users. The book provides an overview of the extensions and applications of the GVAR that have been developed in recent years. These applications are grouped in three main categories: 1) international transmission and forecasting; 2) finance applications; and 3) regional applications. By using language that is accessible to non-econometricians, the book reaches out to the extended audience of practitioners and policy makers interested in understanding the channels and impacts of international linkages.
Many economic theories depend on the presence or absence of a unit root for their validity, and econometric and statistical theory undergo considerable changes when unit roots are present. Knowledge of unit roots has thus become so important as to call for an extensive, compact, and nontechnical book on the subject. Resting on this motivation, this book introduces the literature on unit roots in a comprehensive manner to both empirical and theoretical researchers in economics and other areas. By providing a clear, complete, and critical discussion of the unit root literature, In Choi covers a wide range of topics, including uniform confidence interval construction, unit root tests allowing structural breaks, mildly explosive processes, exuberance testing, fractionally integrated processes, seasonal unit roots and panel unit root testing. Extensive, up-to-date, and readily accessible, this book is a comprehensive reference source on unit roots for both students and applied workers.
In the last 20 years, econometric theory on panel data has developed rapidly, particularly for analyzing common behaviors among individuals over time. Meanwhile, the statistical methods employed by applied researchers have not kept pace. This book attempts to fill this gap by teaching researchers how to use the latest panel estimation methods correctly. Almost all applied economics articles use panel data or panel regressions, yet many empirical results from typical panel data analyses are not correctly executed. This book aims to help applied researchers run panel regressions correctly and avoid common mistakes. It explains how to model cross-sectional dependence, how to estimate a few key common variables, and how to identify them. It also provides guidance on how to separate out the long-run relationship, common dynamic relationships, and idiosyncratic dynamic relationships from a set of panel data. Aimed at applied researchers who want to learn about panel data econometrics by running statistical software, the book provides clear guidance and is supported by a full range of online teaching and learning materials. It includes practice sections on MATLAB, STATA, and GAUSS throughout, along with short and simple econometric theory on basic panel regressions for those unfamiliar with the theory of traditional panel regressions.
"Palgrave Handbooks of Econometrics" comprises 'landmark' essays by the world's leading scholars and provides authoritative guidance in key areas of econometrics. With definitive contributions on the subject, the Handbook is an essential reference source for professional econometricians, economists, researchers and students.
Economists address key challenges facing the EU, including financial instability, welfare state reform, inadequate institutional framework, and global economic integration. The European Union began with efforts in the Cold War era to foster economic integration among a few Western European countries. Today's EU constitutes an upper tier of government that affects almost every level of policymaking in each of its twenty-seven member states. The recent financial and economic crises have tested this still-evolving institutional framework, and this book surveys key economic challenges faced by the EU. Prominent European economists examine such topics as the stability of the financial markets and possible policy options to reduce future vulnerability to crises, including Glass-Steagall-style narrow banking; the effect of emerging economies such as China and India on Europe's economic position; the protection of national interests in industrial policy; reforming and preserving the welfare state in the face of unemployment, population aging, and worker mobility within the EU; and improving the EU's institutional framework by reassigning responsibilities among supranational, national, and local governments. Among the conclusions that emerge from these analyses are the necessity for banking regulation as well as budgetary discipline; the need to consider global as well as European integration; and the idea that an environment that fosters internal competition will increase Europe's competitiveness internationally.
This book is a readable, digestible introduction to exponential families, encompassing statistical models based on the most useful distributions in statistical theory, including the normal, gamma, binomial, Poisson, and negative binomial. Strongly motivated by applications, it presents the essential theory and then demonstrates the theory's practical potential by connecting it with developments in areas like item response analysis, social network models, conditional independence and latent variable structures, and point process models. Extensions to incomplete data models and generalized linear models are also included. In addition, the author gives a concise account of the philosophy of Per Martin-Löf in order to connect statistical modelling with ideas in statistical physics, including Boltzmann's law. Written for graduate students and researchers with a background in basic statistical inference, the book includes a vast set of examples demonstrating models for applications and exercises embedded within the text as well as at the ends of chapters.
From Robin Sickles: As I indicated to you some months ago, Professor William Horrace and I would like Springer to publish a Festschrift in honor of Peter Schmidt, our professor. Peter's accomplishments are legendary among his students and the profession. I have a bit of that student perspective in my introductory and closing remarks on the website for the conference we held in his honor this last July. I have attached the conference program, from which selected papers will come (as well as from students who were unable to attend). You will also find the names of his students (40) on the website. A top-twenty economics department could be started up from those 40 students. Papers in some Festschrifts share a thematic link based on subject material. What I think is unique to this Festschrift is that the theme running through the papers will be Peter's remarkable legacy to his students: to frame a problem and then analyze and examine it in depth using rigorous techniques, but rarely just for the purpose of showcasing technical refinements per se. I think this would be a book that graduate students would find invaluable in their early research careers and seasoned scholars would find invaluable in both their own and their students' research.
This book presents recent research on probabilistic methods in economics, from machine learning to statistical analysis. Economics is a very important - and at the same time a very difficult - discipline. It is not easy to predict how an economy will evolve or to identify the measures needed to make an economy prosper. One of the main reasons for this is the high level of uncertainty: various difficult-to-predict events can influence future economic behavior. To make good predictions and reasonable recommendations, this uncertainty has to be taken into account. In the past, most related research results were based on traditional techniques from probability and statistics, such as p-value-based hypothesis testing. These techniques led to numerous successful applications, but in recent decades, several examples have emerged showing that they often lead to unreliable and inaccurate predictions. It is therefore necessary to come up with new techniques for processing the corresponding uncertainty that go beyond the traditional probabilistic techniques. This book focuses on such techniques, their economic applications and the remaining challenges, presenting both related theoretical developments and their practical applications.
This outstanding text by a foremost econometrician combines instruction in probability and statistics with econometrics in a rigorous but relatively nontechnical manner. Unlike many statistics texts, it discusses regression analysis in depth. And unlike many econometrics texts, it offers a thorough treatment of statistics. Although its only mathematical requirement is multivariate calculus, it challenges the student to think deeply about basic concepts.
The coverage of probability and statistics includes best prediction and best linear prediction, the joint distribution of a continuous and discrete random variable, large sample theory, and the properties of the maximum likelihood estimator. Exercises at the end of each chapter reinforce the many illustrative examples and diagrams. Believing that students should acquire the habit of questioning conventional statistical techniques, Takeshi Amemiya discusses the problem of choosing estimators and compares various criteria for ranking them. He also evaluates classical hypothesis testing critically, giving the realistic case of testing a composite null against a composite alternative. He frequently adopts a Bayesian approach because it provides a useful pedagogical framework for discussing many fundamental issues in statistical inference.
Turning to regression, Amemiya presents the classical bivariate model in the conventional summation notation. He follows with a brief introduction to matrix analysis and multiple regression in matrix notation. Finally, he describes various generalizations of the classical regression model and certain other statistical models extensively used in econometrics and other applications in social science.
In recent years nonlinearities have gained increasing importance in economic and econometric research, particularly after the financial crisis and the economic downturn after 2007. This book contains theoretical, computational and empirical papers that incorporate nonlinearities in econometric models and apply them to real economic problems. It is intended to inspire researchers to take potential nonlinearities into account. Researchers should beware of spuriously applying linear model types to problems that include nonlinear features; using the correct model type is indispensable in order to avoid biased recommendations for economic policy.
This volume uses state-of-the-art models from the frontier of macroeconomics to answer key questions about how the economy functions and how policy should be conducted. The contributions cover a wide range of issues in macroeconomics and macroeconomic policy. They combine high-level mathematics with economic analysis, and highlight the need to update our mathematical toolbox in order to understand the increased complexity of the macroeconomic environment. The volume represents hard evidence of high research intensity in many fields of macroeconomics, and warns against interpreting the scope of macroeconomics too narrowly. Mainstream business cycle analysis, based on dynamic stochastic general equilibrium (DSGE) modelling of a particular type, has been criticised for its inability to predict or resolve the recent financial crisis. However, macroeconomic research on financial, information, and learning imperfections had not yet made its way into many of the pre-crisis DSGE models because practical econometric versions of those models were mainly designed to fit data periods that did not include financial crises. A major response to the limitations of those older DSGE models is an active research program to bring big financial shocks and various kinds of financial, learning, and labour market frictions into a new generation of DSGE models for guiding policy. The contributors to this book utilise models and modelling assumptions that go beyond particular modelling conventions. By using alternative yet plausible assumptions, they seek to enrich our knowledge and ability to explain macroeconomic phenomena. They contribute to expanding the frontier of macroeconomic knowledge in ways that will prove useful for macroeconomic policy.
It is increasingly common for analysts to seek out the opinions of individuals and organizations using attitudinal scales such as degree of satisfaction or importance attached to an issue. Examples include levels of obesity, seriousness of a health condition, attitudes towards service levels, opinions on products, voting intentions, and the degree of clarity of contracts. Ordered choice models provide a relevant methodology for capturing the sources of influence that explain the choice made amongst a set of ordered alternatives. The methods have evolved to a level of sophistication that can allow for heterogeneity in the threshold parameters, in the explanatory variables (through random parameters), and in the decomposition of the residual variance. This book brings together contributions in ordered choice modeling from a number of disciplines, synthesizing developments over the last fifty years, and suggests useful extensions to account for the wide range of sources of influence on choice.
Herbert Scarf is a highly esteemed, distinguished American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained world recognition for his classic study on the stability of the Walrasian price adjustment processes and his fundamental analysis of the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study on increasing returns and models of production in the presence of indivisibilities. All in all, however, the name of Scarf is always remembered as a synonym for the computation of economic equilibria and fixed points. In the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics termed Applied General Equilibrium Analysis and a corresponding area in operations research known as Simplicial Fixed Point Methods. This book comprises all his research articles and consists of four volumes. This volume collects Herbert Scarf's papers in the area of Economics and Game Theory.
Herbert Scarf is a highly esteemed, distinguished American economist. He is internationally famous for his early epoch-making work on optimal inventory policies and his highly influential study with Andrew Clark on optimal policies for a multi-echelon inventory problem, which initiated the important and flourishing field of supply chain management. Equally, he has gained world recognition for his classic study on the stability of the Walrasian price adjustment processes and his fundamental analysis of the relationship between the core and the set of competitive equilibria (the so-called Edgeworth conjecture). Further achievements include his remarkable sufficient condition for the existence of a core in non-transferable utility games and general exchange economies, his seminal paper with Lloyd Shapley on housing markets, and his pioneering study on increasing returns and models of production in the presence of indivisibilities. All in all, however, the name of Scarf is always remembered as a synonym for the computation of economic equilibria and fixed points. In the early 1960s he invented a path-breaking technique for computing equilibrium prices. This work has generated a major research field in economics termed Applied General Equilibrium Analysis and a corresponding area in operations research known as Simplicial Fixed Point Methods. This book comprises all his research articles and consists of four volumes. This volume collects Herbert Scarf's papers in the area of Operations Research and Management.
The complexity, diversity, and random nature of transportation problems necessitate a broad analytical toolbox. Describing tools commonly used in the field, Statistical and Econometric Methods for Transportation Data Analysis, Second Edition provides an understanding of the broad range of analytical tools required to solve transportation problems. It includes a wide breadth of examples and case studies covering applications in various aspects of transportation planning, engineering, safety, and economics.
After a solid refresher on statistical fundamentals, the book focuses on continuous dependent variable models and count and discrete dependent variable models. Along with an entirely new section on other statistical methods, this edition offers a wealth of new material.
Each chapter clearly presents fundamental concepts and principles and includes numerous references for those seeking additional technical details and applications. To reinforce a practical understanding of the modeling techniques, the data sets used in the text are offered on the book's CRC Press web page. PowerPoint and Word presentations for each chapter are also available for download.
'Experiments in Organizational Economics' highlights the importance of replicating previous economic experiments. Replication enables experimental findings to be subjected to rigorous scrutiny. Despite this obvious advantage, direct replication remains relatively scant in economics. One possible explanation for this situation is that publication outlets favor novel work over tests of robustness. Readers will gain a better understanding of the role that replication plays in economic discovery as well as valuable insights into the robustness of previously reported findings.
In the light of better and more detailed administrative databases, this open access book provides statistical tools for evaluating the effects of public policies advocated by governments and public institutions. Experts from academia, national statistics offices and various research centers present modern econometric methods for an efficient data-driven policy evaluation and monitoring, assess the causal effects of policy measures and report on best practices of successful data management and usage. Topics include data confidentiality, data linkage, and national practices in policy areas such as public health, education and employment. It offers scholars as well as practitioners from public administrations, consultancy firms and nongovernmental organizations insights into counterfactual impact evaluation methods and the potential of data-based policy and program evaluation.
One of the most urgent challenges in African economic development is to devise a strategy for improving statistical capacity. Reliable statistics, including estimates of economic growth rates and per-capita income, are basic to the operation of governments in developing countries and vital to nongovernmental organizations and other entities that provide financial aid to them. Rich countries and international financial institutions such as the World Bank allocate their development resources on the basis of such data. The paucity of accurate statistics is not merely a technical problem; it has a massive impact on the welfare of citizens in developing countries.
Where do these statistics originate? How accurate are they? Poor Numbers is the first analysis of the production and use of African economic development statistics. Morten Jerven's research shows how the statistical capacities of sub-Saharan African economies have fallen into disarray. The numbers substantially misstate the actual state of affairs. As a result, scarce resources are misapplied. Development policy does not deliver the benefits expected. Policymakers' attempts to improve the lot of the citizenry are frustrated. Donors have no accurate sense of the impact of the aid they supply. Jerven's findings from sub-Saharan Africa have far-reaching implications for aid and development policy. As Jerven notes, the current catchphrase in the development community is "evidence-based policy," and scholars are applying increasingly sophisticated econometric methods, but no statistical technique can compensate for partial and unreliable data.
This book provides a rigorous introduction to the principles of econometrics and gives students and practitioners the tools they need to effectively and accurately analyze real data. Thoroughly updated to address the developments in the field that have occurred since the original publication of this classic text, the second edition has been expanded to include two chapters on time series analysis and one on nonparametric methods. Discussions on covariance (including GMM), partial identification, and empirical likelihood have also been added. The selection of topics and the level of discourse give sufficient variety so that the book can serve as the basis for several types of courses. This book is intended for upper undergraduate and first year graduate courses in economics and statistics and also has applications in mathematics and some social sciences where a reasonable knowledge of matrix algebra and probability theory is common. It is also ideally suited for practicing professionals who want to deepen their understanding of the methods they employ. Also available for the new edition is a solutions manual, containing answers to the end-of-chapter exercises.
From an internationally acclaimed economist, a provocative call to jump-start economic growth by aggressively overhauling liberal democracy
Around the world, people who are angry at stagnant wages and growing inequality have rebelled against established governments and turned to political extremes. Liberal democracy, history's greatest engine of growth, now struggles to overcome unprecedented economic headwinds--from aging populations to scarce resources to unsustainable debt burdens. Hobbled by short-term thinking and ideological dogma, democracies risk falling prey to nationalism and protectionism that will deliver declining living standards.
In Edge of Chaos, Dambisa Moyo shows why economic growth is essential to global stability, and why liberal democracies are failing to produce it today. Rather than turning away from democracy, she argues, we must fundamentally reform it. Edge of Chaos presents a radical blueprint for change in order to galvanize growth and ensure the survival of democracy in the twenty-first century.