Business Statistics of the United States is a comprehensive and practical collection of data from as early as 1913 that reflects the nation's economic performance. It provides over 80 years of annual, quarterly, and monthly data in industrial and demographic detail, including key indicators such as gross domestic product, personal income, spending, saving, employment, unemployment, the capital stock, and more. Business Statistics of the United States is the best place to find historical perspectives on the U.S. economy. Of equal importance to the data are the introductory highlights, extensive notes, and figures for each chapter that help users to understand the data, use them appropriately, and, if desired, seek additional information from the source agencies. Business Statistics of the United States provides a rich and deep picture of the American economy and contains approximately 3,500 time series in all. The data are predominantly from federal government sources, including:
*Board of Governors of the Federal Reserve System
*Bureau of Economic Analysis
*Bureau of Labor Statistics
*Census Bureau
*Employment and Training Administration
*Energy Information Administration
*Federal Housing Finance Agency
*U.S. Department of the Treasury
"[Taleb is] Wall Street's principal dissident. . . . [Fooled by Randomness] is to conventional Wall Street wisdom approximately what Martin Luther's ninety-five theses were to the Catholic Church."
Markets are a crucial component of how people survive. Understanding how markets function and are disrupted in emergencies is critical to any analysis of hunger, and of food and livelihood security. The Emergency Market Mapping and Analysis (EMMA) techniques described in this toolkit assist frontline staff in undertaking rapid assessments of market systems in the first few weeks of a crisis. Their purpose is to improve early response planning so that resources are used effectively, and to ensure that opportunities to bolster future recovery in the local economy are not missed. This toolkit can help prevent lasting damage to the livelihoods, jobs and businesses on which people's long-term security depends. EMMA assumes limited previous experience of economic or market analysis. Instead, the focus is on simple visual, graphical and largely qualitative ways of describing the impact of emergencies on people and on the critical market systems upon which they most rely. The Emergency Market Mapping and Analysis Toolkit is designed for generalists as well as specialist staff working in the food security, shelter, water and sanitation sectors. The author takes readers through ten practical steps so that they can both understand the important market aspects of an emergency situation and communicate this knowledge promptly and effectively to decision-makers. The toolkit contains a CD-ROM with an electronic version of the toolkit, supplementary reading, and associated training materials.
Economic forecasting is a key ingredient of decision making both in the public and in the private sector. Because economic outcomes are the result of a vast, complex, dynamic and stochastic system, forecasting is very difficult and forecast errors are unavoidable. Because forecast precision and reliability can be enhanced by the use of proper econometric models and methods, this innovative book provides an overview of both theory and applications. Undergraduate and graduate students learning basic and advanced forecasting techniques will be able to build from strong foundations, and researchers in public and private institutions will have access to the most recent tools and insights. Readers will gain from the frequent examples that enhance understanding of how to apply techniques, first by using stylized settings and then by real data applications, focusing on macroeconomic and financial topics. This is first and foremost a book aimed at applying time series methods to solve real-world forecasting problems. Applied Economic Forecasting using Time Series Methods starts with a brief review of basic regression analysis with a focus on specific regression topics relevant for forecasting, such as model specification errors, dynamic models and their predictive properties, as well as forecast evaluation and combination. Several chapters cover univariate time series models, vector autoregressive models, cointegration and error correction models, and Bayesian methods for estimating vector autoregressive models. A collection of special topics chapters study Threshold and Smooth Transition Autoregressive (TAR and STAR) models, Markov switching regime models, state space models and the Kalman filter, mixed frequency data models, nowcasting, forecasting using large datasets and, finally, volatility models. There are plenty of practical applications in the book, and both EViews and R code are available online.
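The book's own code is supplied in EViews and R; purely as a flavor of the simplest model family it covers, here is a minimal sketch (not taken from the book; function names are mine) of fitting a univariate AR(1) by ordinary least squares and iterating it forward to produce multi-step forecasts:

```python
import numpy as np

def fit_ar1(y):
    """Estimate y_t = c + phi * y_{t-1} + e_t by ordinary least squares."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    return c, phi

def forecast_ar1(y, steps):
    """Iterate the fitted recursion forward from the last observation."""
    c, phi = fit_ar1(y)
    preds, last = [], y[-1]
    for _ in range(steps):
        last = c + phi * last
        preds.append(last)
    return np.array(preds)

# Simulate an AR(1) with c = 1, phi = 0.5 and recover the parameters.
rng = np.random.default_rng(0)
y = np.empty(500)
y[0] = 2.0
for t in range(1, 500):
    y[t] = 1.0 + 0.5 * y[t - 1] + rng.normal(scale=0.1)
c_hat, phi_hat = fit_ar1(y)
```

As the horizon grows, the forecasts converge to the estimated unconditional mean c/(1 - phi), which is one of the predictive properties of dynamic models the book's opening chapters discuss.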
The estimation and validation of the Basel II risk parameters PD (default probability), LGD (loss given default), and EAD (exposure at default) is an important problem in banking practice. These parameters are used on the one hand as inputs to credit portfolio models and loan pricing frameworks, and on the other hand to compute regulatory capital according to the new Basel rules. This book covers the state of the art in designing and validating rating systems and default probability estimations. Furthermore, it presents techniques to estimate LGD and EAD and includes a chapter on stress testing of the Basel II risk parameters. The second edition is extended by three chapters explaining how the Basel II risk parameters can be used for building a framework for risk-adjusted pricing and risk management of loans.
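How the three parameters fit together can be seen in the standard expected-loss identity EL = PD x LGD x EAD, which underlies both loan pricing and capital calculations. A minimal sketch with purely illustrative figures (the numbers and the helper function are not from the book):

```python
def expected_loss(pd_, lgd, ead):
    """Expected loss of a single exposure: default probability (PD)
    times loss given default (LGD, fraction of exposure lost)
    times exposure at default (EAD, in currency units)."""
    return pd_ * lgd * ead

# Illustrative loan: 2% one-year PD, 45% LGD, EUR 1,000,000 EAD.
el = expected_loss(0.02, 0.45, 1_000_000)  # EUR 9,000 expected loss
```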
This book discusses equi-quantile values and their use in generating decision alternatives under the twofold complexities of uncertainty and dependence, offering scope for surrogating between two alternative portfolios when they are correlated. The book begins with a discussion of components of rationality and learning models as indispensable concepts in decision-making processes. It identifies three-fold complexities in such processes: uncertainty, dependence and dynamism. The book is a novel attempt to seek tangible solutions for such decision problems. To do so, four hundred tables of bi-quantile pairs are presented for carefully chosen grids. In fact, it is a two-variable generalization of the inverse normal integral table, used to obtain bivariate normal quantile pairs for given values of probability and correlation. When making decisions, only two variables have to be considered at a time. These tables are essential tools for decision-making under risk and dependence, and offer scope for delving up to a single step of dynamism. The book subsequently addresses averments dealing with applications and advantages. The content is useful to empirical scientists and risk-oriented decision-makers who are often required to make choices on the basis of pairs of variables. The book also helps simulators seeking valid confidence intervals for their estimates, and particle physicists looking for condensed confidence intervals for the Higgs boson utilizing the Bose-Einstein correlation, given the magnitude of such correlations. Entrepreneurs and investors as well as students of management, statistics, economics and econometrics, psychology, psychometrics and psychographics, social sciences, geographic information systems, geology, agricultural and veterinary sciences, medical sciences and diagnostics, and remote sensing will also find the book very useful.
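To illustrate the kind of quantity such tables record: an equi-quantile value q solves P(X <= q, Y <= q) = p for a standard bivariate normal pair with correlation rho. The sketch below (my own illustration, not the book's tables; function names are mine) computes q numerically using the conditional-CDF representation of the joint probability and bisection:

```python
import numpy as np
from math import erf, sqrt, pi

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def bvn_equal_cdf(q, rho, n=4000):
    """P(X <= q, Y <= q) for a standard bivariate normal with correlation
    rho, using P = integral over x <= q of
    phi(x) * Phi((q - rho*x) / sqrt(1 - rho^2)) dx  (Y | X=x is normal)."""
    x = np.linspace(-8.0, q, n)
    pdf = np.exp(-x ** 2 / 2.0) / sqrt(2.0 * pi)
    inner = np.array([norm_cdf((q - rho * xi) / sqrt(1.0 - rho ** 2)) for xi in x])
    f = pdf * inner
    h = x[1] - x[0]
    return h * (f.sum() - 0.5 * (f[0] + f[-1]))  # trapezoid rule

def equi_quantile(p, rho):
    """Bisect for q with P(X <= q, Y <= q) = p (0 < p < 1, |rho| < 1)."""
    lo, hi = -8.0, 8.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if bvn_equal_cdf(mid, rho) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For rho = 0 the joint probability factorizes into Phi(q) squared, so the equi-quantile for p = 0.25 is 0; positive correlation raises the joint probability at any q, pulling the equi-quantile below zero for the same p.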
Bayesian Econometrics introduces the reader to the use of Bayesian methods in the field of econometrics at the advanced undergraduate or graduate level. The book is self-contained and does not require previous training in econometrics. The focus is on models used by applied economists and the computational techniques necessary to implement Bayesian methods when doing empirical work. It includes numerous numerical examples throughout.
A website containing computer programs and data sets to help the student develop the computational skills of modern Bayesian econometrics can be found at: www.wiley.co.uk/koopbayesian
Doubt over the trustworthiness of published empirical results is not unwarranted and is often a result of statistical mis-specification: invalid probabilistic assumptions imposed on data. Now in its second edition, this bestselling textbook offers a comprehensive course in empirical research methods, teaching the probabilistic and statistical foundations that enable the specification and validation of statistical models, providing the basis for an informed implementation of statistical procedures to secure the trustworthiness of evidence. Each chapter has been thoroughly updated, accounting for developments in the field and the author's own research. The comprehensive scope of the textbook has been expanded by the addition of a new chapter on the Linear Regression model and related statistical models. This new edition is now more accessible to students of disciplines beyond economics and includes more pedagogical features, with an increased number of examples as well as review questions and exercises at the end of each chapter.
A modern practical guide to building and using actuarial models. Loss Models: From Data to Decisions is organized around the principle that actuaries build models in order to analyze risks and make decisions about managing those risks based on conclusions drawn from the analysis. In practice, one begins with data and ends with a business decision. The book flows logically from this principle. It begins with a framework for model building and a description of the frequency and severity loss data typically available to actuaries. Parametric models are emphasized throughout. The frequency and severity models are used in building aggregate loss models, in credibility-based pricing models, and in loss analysis over multiple time periods. Designed as both an educational text and a professional reference, Loss Models:
*Assumes little prior knowledge of insurance systems
*Features many fascinating examples taken from insurance files
*Contains a major instructive case study continued through each chapter
*Covers the classical areas of risk theory and loss distributions
*Gives a practical but rigorous treatment of modern credibility theory
*Uses standard statistical concepts, methods, and notation
*Provides modern computational algorithms for implementing methods
*Includes free companion software available from an FTP site
*Deals with many topics on CAS 4B and SOA 151 and 152 actuarial exams
*Includes many exercises based on past CAS and SOA exams
In this Element and its accompanying second Element, A Practical Introduction to Regression Discontinuity Designs: Extensions, Matias Cattaneo, Nicolas Idrobo, and Rocío Titiunik provide an accessible and practical guide for the analysis and interpretation of regression discontinuity (RD) designs that encourages the use of a common set of practices and facilitates the accumulation of RD-based empirical evidence. In this Element, the authors discuss the foundations of the canonical Sharp RD design, which has the following features: (i) the score is continuously distributed and has only one dimension, (ii) there is only one cutoff, and (iii) compliance with the treatment assignment is perfect. In the second Element, the authors discuss practical and conceptual extensions to this basic RD setup.
For courses in introductory econometrics. Engaging applications bring the theory and practice of modern econometrics to life. Ensure students grasp the relevance of econometrics with Introduction to Econometrics, the text that connects modern theory and practice with motivating, engaging applications. The 4th Edition, Global Edition, maintains a focus on currency, while building on the philosophy that applications should drive the theory, not the other way around. The text incorporates real-world questions and data, and methods that are immediately relevant to the applications. With very large data sets increasingly being used in economics and related fields, a new chapter dedicated to Big Data helps students learn about this growing and exciting area. This coverage and approach make the subject come alive for students and help them to become sophisticated consumers of econometrics. Pearson MyLab(TM) Economics is not included. Students, if Pearson MyLab Economics is a recommended/mandatory component of the course, please ask your instructor for the correct ISBN. Pearson MyLab Economics should only be purchased when required by an instructor. Instructors, contact your Pearson representative for more information. Reach every student by pairing this text with Pearson MyLab Economics. MyLab(TM) is the teaching and learning platform that empowers you to reach every student. By combining trusted author content with digital tools and a flexible platform, MyLab personalizes the learning experience and improves results for each student. The 4th Edition features expanded exercise sets in Pearson MyLab Economics, offering more flexibility to instructors as they build assignments.
For more than 25 years, Pocket World in Figures has been informing and entertaining readers around the world with its blend of the serious, the quirky and the downright surprising.
Where else would you find out, in a single volume, that 98% of Suriname is forest, that Switzerland sells the most expensive Big Macs or that Norway spends the most per person on music downloads?
The 2018 edition includes data from over 180 countries, presented in a series of rankings and country profiles. The rankings cover subjects as diverse as geography and demographics, business, economics and finance, health and welfare, culture and entertainment. Updated, revised and expanded each year to include new rankings and features, it also includes detailed statistical profiles of more than 65 of the world's major economies, the euro area and the world itself.
And, once again, the 2018 edition will showcase The Economist's strength in data journalism by including charts and graphs, and will invite readers to test their knowledge with its world rankings quiz, making the book an indispensable - and entertaining - guide to the world in figures.
Quantitative portfolio management has become a highly specialized discipline. Computing power and software improvements have advanced the field to a level that would not have been thinkable when Harry Markowitz began the modern era of quantitative portfolio management in 1952. In addition to raw computing power, major advances in financial economics and econometrics have shaped academia and the financial industry over the last 60 years. While the idea of a general theory of finance is still only a distant hope, asset managers now have tools in the financial engineering kit that address specific problems in their industry. The Oxford Handbook of Quantitative Asset Management consists of seven sections that explore major themes in current theoretical and practical use. These themes span all aspects of a modern quantitative investment organization. Contributions from academics and practitioners working in leading investment management organizations bring together the key theoretical and practical aspects of the field to provide a comprehensive overview of the major developments in the area.
A rigorous but nontechnical treatment of major topics in urban economics. Lectures on Urban Economics offers a rigorous but nontechnical treatment of major topics in urban economics. To make the book accessible to a broad range of readers, the analysis is diagrammatic rather than mathematical. Although nontechnical, the book relies on rigorous economic reasoning. In contrast to the cursory theoretical development often found in other textbooks, Lectures on Urban Economics offers thorough and exhaustive treatments of models relevant to each topic, with the goal of revealing the logic of economic reasoning while also teaching urban economics. Topics covered include reasons for the existence of cities, urban spatial structure, urban sprawl and land-use controls, freeway congestion, housing demand and tenure choice, housing policies, local public goods and services, pollution, crime, and quality of life. Footnotes throughout the book point to relevant exercises, which appear at the back of the book. These 22 extended exercises (containing 125 individual parts) develop numerical examples based on the models analyzed in the chapters. Lectures on Urban Economics is suitable for undergraduate use, as background reading for graduate students, or as a professional reference for economists and scholars interested in the urban economics perspective.
The worlds of Wall Street and The City have always held a certain allure, but in recent years have left an indelible mark on the wider public consciousness and there has been a need to become more financially literate. The quantitative nature of complex financial transactions makes them a fascinating subject area for mathematicians of all types, whether for general interest or because of the enormous monetary rewards on offer. An Introduction to Quantitative Finance concerns financial derivatives - a derivative being a contract between two entities whose value derives from the price of an underlying financial asset - and the probabilistic tools that were developed to analyse them. The theory in the text is motivated by a desire to provide a suitably rigorous yet accessible foundation to tackle problems the author encountered whilst trading derivatives on Wall Street. The book combines an unusual blend of real-world derivatives trading experience and rigorous academic background. Probability provides the key tools for analysing and valuing derivatives. The price of a derivative is closely linked to the expected value of its pay-out, and suitably scaled derivative prices are martingales, fundamentally important objects in probability theory. The prerequisite for mastering the material is an introductory undergraduate course in probability. The book is otherwise self-contained and in particular requires no additional preparation or exposure to finance. It is suitable for a one-semester course, quickly exposing readers to powerful theory and substantive problems. The book may also appeal to students who have enjoyed probability and have a desire to see how it can be applied. Signposts are given throughout the text to more advanced topics and to different approaches for those looking to take the subject further.
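The martingale idea described above is easiest to see in a one-period binomial model, a standard construction in introductory treatments of the subject (this sketch is illustrative and not code from the book): choose the risk-neutral probability q so that the discounted stock price is a martingale, then the derivative price is the discounted expected payoff under q.

```python
def binomial_call_price(s0, k, u, d, r):
    """One-period binomial model: the stock moves s0 -> s0*u or s0*d,
    with gross risk-free rate r (d < r < u). The risk-neutral
    probability q makes the discounted stock price a martingale:
    s0 = (q*s0*u + (1 - q)*s0*d) / r."""
    q = (r - d) / (u - d)            # risk-neutral up-probability
    payoff_up = max(s0 * u - k, 0.0)   # call payoff in the up state
    payoff_down = max(s0 * d - k, 0.0) # call payoff in the down state
    return (q * payoff_up + (1 - q) * payoff_down) / r

# Example: stock at 100 moves to 120 or 80, zero interest (r = 1),
# strike 100. Then q = 0.5 and the call is worth 10.
price = binomial_call_price(100.0, 100.0, 1.2, 0.8, 1.0)
```

Note that the real-world probabilities of the up and down moves never appear: the price depends only on the martingale (risk-neutral) measure, which is the core insight the probabilistic machinery formalizes.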
Panel data is a data type increasingly used in research in economics, social sciences, and medicine. Its primary characteristic is that the data variation goes jointly over space (across individuals, firms, countries, etc.) and time (over years, months, etc.). Panel data allow examination of problems that cannot be handled by cross-section data or time-series data. Panel data analysis is a core field in modern econometrics and multivariate statistics, and studies based on such data occupy a growing part of the field in many other disciplines. The book is intended as a text for master's and advanced undergraduate courses. It may also be useful for PhD students writing theses in empirical and applied economics and readers conducting empirical work on their own. The book attempts to take the reader gradually from simple models and methods in scalar (simple vector) notation to more complex models in matrix notation. A distinctive feature is that more attention is given to unbalanced panel data, the measurement error problem, random coefficient approaches, the interface between panel data and aggregation, and the interface between unbalanced panels and truncated and censored data sets. The 12 chapters are intended to be largely self-contained, although there is also a natural progression. Most of the chapters contain commented examples based on genuine data, mainly taken from panel data applications to economics. Although the book, inter alia, through its use of examples, is aimed primarily at students of economics and econometrics, it may also be useful for readers in social sciences, psychology, and medicine, provided they have a sufficient background in statistics, notably basic regression analysis and elementary linear algebra.
This book aims to contribute to perfecting China's national governance system and improving its national governance capability. It evaluates the balance sheets of the state and residents, non-financial corporations, financial institutions and the central bank, the central government, local governments and external sectors, the goal being to provide a systematic analysis of the characteristics and trajectory of China's economic expansion and structural adjustment, as well as objective assessments of short- and long-term economic operations, debt risks and financial risks with regard to the institutional and structural characteristics of economic development under market-oriented reform. It puts forward a preliminary analysis of China's national and sectoral balance sheets on the basis of scientific estimates of various kinds of data, analyzes from a new perspective the major issues currently troubling China, including development sustainability, government transformation, local government debt, welfare reform, and financial opening-up and stability, and explores corresponding policies, measures, and institutional arrangements.
Handbook of Empirical Economics and Finance explores the latest developments in the analysis and modeling of economic and financial data. Well-recognized econometric experts discuss the rapidly growing research in economics and finance and offer insight on the future direction of these fields. Focusing on micro models, the first group of chapters describes the statistical issues involved in the analysis of econometric models with cross-sectional data often arising in microeconomics. The book then illustrates time series models that are extensively used in empirical macroeconomics and finance. The last set of chapters explores the types of panel data and spatial models that are becoming increasingly significant in analyzing complex economic behavior and policy evaluations. This handbook brings together both background material and new methodological and applied results that are extremely important to the current and future frontiers in empirical economics and finance. It emphasizes inferential issues that transpire in the analysis of cross-sectional, time series, and panel data-based empirical models in economics, finance, and related disciplines.
This book contains a set of notes prepared by Ragnar Frisch for a lecture series that he delivered at Yale University in 1930. The lecture notes provide not only a valuable source document for the history of econometrics, but also a more systematic introduction to some of Frisch's key methodological ideas than his other works so far published in various media for the econometrics community. In particular, these notes contain a number of prescient ideas precursory to some of the most important notions developed in econometrics during the 1970s and 1980s. More remarkably, Frisch demonstrated a deep understanding of what econometric or statistical analysis could achieve in situations where correct theoretical models were not known. This volume has been rigorously edited and comes with an introductory essay by Olav Bjerkholt and Duo Qin placing the notes in their historical context.
The development of economics changed dramatically during the twentieth century with the emergence of econometrics, macroeconomics and a more scientific approach in general. One of the key individuals in the transformation of economics was Ragnar Frisch, professor at the University of Oslo and the first Nobel Laureate in economics in 1969. He was a co-founder of the Econometric Society in 1930 (after having coined the word econometrics in 1926) and edited the journal Econometrica for twenty-two years. The discovery of the manuscripts of a series of eight lectures given by Frisch at the Henri Poincaré Institute in March and April 1933 on "The Problems and Methods of Econometrics" will enable economists to more fully understand his overall vision of econometrics.
This book is a rare exhibition of Frisch's overview of econometrics and is published here in English for the first time. Edited and with an introduction by Olav Bjerkholt and Ariane Dupont-Kieffer, Frisch's eight lectures provide an accessible and astute discussion of econometric issues, from philosophical foundations to practical procedures.
Covering the development of economics in the twentieth century and Ragnar Frisch's broader visions of economic science in general and econometrics in particular, this book will appeal to anyone with an interest in the history of economics and econometrics.
This book provides an introduction to index numbers for statisticians, economists and numerate members of the public. It covers the essential basics, mixing theoretical aspects with practical techniques to give a balanced and accessible introduction to the subject. The concepts are illustrated by exploring the construction and use of the Consumer Prices Index which is arguably the most important of all official statistics in the UK. The book also considers current issues and developments in the field including the use of large-scale price transaction data. A Practical Introduction to Index Numbers will be the ideal accompaniment for students taking the index number components of the Royal Statistical Society Ordinary and Higher Certificate exams; it provides suggested routes through the book for students, and sets of exercises with solutions.
The global financial crisis highlighted the impact on macroeconomic outcomes of recurrent events like business and financial cycles, highs and lows in volatility, and crashes and recessions. At the most basic level, such recurrent events can be summarized using binary indicators showing if the event will occur or not. These indicators are constructed either directly from data or indirectly through models. Because they are constructed, they have different properties than those arising in microeconometrics, and how one is to use them depends a lot on the method of construction. This book presents the econometric methods necessary for the successful modeling of recurrent events, providing valuable insights for policymakers, empirical researchers, and theorists. It explains why it is inherently difficult to forecast the onset of a recession in a way that provides useful guidance for active stabilization policy, with the consequence that policymakers should place more emphasis on making the economy robust to recessions. The book offers a range of econometric tools and techniques that researchers can use to measure recurrent events, summarize their properties, and evaluate how effectively economic and statistical models capture them. These methods also offer insights for developing models that are consistent with observed financial and real cycles. This book is an essential resource for students, academics, and researchers at central banks and institutions such as the International Monetary Fund.
This monthly compendium of statistics and articles on the UK economy contains data on UK economic accounts, prices, labour market, output and demand indicators, selected financial statistics, gross domestic product, consumer and wholesale price indices, households' final consumption expenditure, final expenditure prices index, visible and invisible trade balance, earnings, and regional and international economic indicators. It includes articles on national accounting, trade, wider economic issues, research and development statistics and international comparisons. From October 1998 the national accounts data in Economic Trends is consistent with the European System of Accounts (ESA 95). This issue includes:
*Public Service Productivity: health - estimates of the change in the productivity of expenditure on health using National Accounts data from 1995 to 2003, by Phillip Lee
*Health Expenditure by Charities - a description of the functional and provider breakdown of charitable expenditure on health in the UK for 2002, by Gavin Wallis
This monthly compendium of statistics and articles on the UK economy contains data on UK economic accounts, prices, labour market, output and demand indicators, selected financial statistics, gross domestic product, consumer and wholesale price indices, households' final consumption expenditure, final expenditure prices index, visible and invisible trade balance, earnings, and regional and international economic indicators. It includes articles on national accounting, trade, wider economic issues, research and development statistics and international comparisons. From October 1998 the national accounts data in Economic Trends is consistent with the European System of Accounts (ESA 95). This issue includes:
*The impact of UK households on the environment - a presentation of a regional breakdown of greenhouse gas emissions directly and indirectly generated by UK households, by Perry Frances
*Input-Output Analysis: Creative sector, 1992-2002 - detailed information and statistics covering the UK Creative Sector, based on the Input-Output Annual Supply and Use Tables, by Sanjiv Manhajan