All of the Department Discussion Papers are submitted to RePEc. The EconPapers or IDEAS sites allow you to search by author, title, keyword, JEL category and abstract contents.
Papers from 1998 onwards are available on-line as .PDF files.
14/05 Stephen Pollock
This essay was written to accompany a lecture to beginning students of the course in Economic Analytics, which is taught in the Institute of Econometrics of the University of Lodz in Poland. It provides, within a few pages, a broad historical account of the development of econometrics. It begins by describing the origin of regression analysis and concludes with an account of cointegration analysis. The purpose of the essay is to provide a context in which the students can locate various aspects of econometric analysis. A distinction must be made between the means by which new ideas were propagated and the manner and circumstances in which they originated. This account is concerned primarily with the propagation of the ideas.
14/04 Stephen Pollock
Alternative methods of trend extraction and of seasonal adjustment are described that operate in the time domain and in the frequency domain. The time-domain methods that are implemented in the TRAMO–SEATS and STAMP programs are described and compared. An abbreviated time-domain method of seasonal adjustment that is implemented in the IDEOLOG program is also described. Finite-sample versions of the Wiener–Kolmogorov filter are described that can be used to implement the methods in a common way. The frequency-domain method, which is also implemented in the IDEOLOG program, employs an ideal frequency-selective filter that depends on identifying the ordinates of the Fourier transform of a detrended data sequence that should lie in the pass band of the filter and those that should lie in its stop band. Filters of this nature can be used both for extracting a low-frequency cyclical component of the data and for extracting the seasonal component.
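The ideal frequency-selective filter described in this abstract can be sketched in a few lines (a minimal illustration, not the IDEOLOG implementation): take the Fourier transform of a detrended sequence, zero the ordinates outside a chosen pass band, and transform back. The function name, the synthetic series, and the pass-band limits below are illustrative assumptions.

```python
import numpy as np

def ideal_bandpass(x, low, high):
    """Extract the component of x whose frequencies (in cycles per sample)
    lie in [low, high], by zeroing all other Fourier ordinates."""
    n = len(x)
    freqs = np.abs(np.fft.fftfreq(n))          # frequency of each ordinate
    mask = (freqs >= low) & (freqs <= high)    # pass-band indicator
    X = np.fft.fft(x)
    return np.real(np.fft.ifft(X * mask))

# Synthetic example: a slow cycle of period 60 plus a seasonal
# component of period 12 (both aligned with the Fourier grid).
t = np.arange(240)
slow = np.cos(2 * np.pi * t / 60)
seasonal = np.cos(2 * np.pi * t / 12)
x = slow + seasonal

cycle = ideal_bandpass(x, 0.0, 1 / 24)     # keeps the low-frequency cycle
season = ideal_bandpass(x, 1 / 14, 1 / 10) # keeps the seasonal component
```

Because the filter simply selects Fourier ordinates, and the two components here sit exactly on the frequency grid, each band recovers its component exactly in this synthetic case; with real data the choice of pass band is the substantive modelling decision.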
14/03 Stephen Pollock
The claim that linear filters are liable to induce spurious fluctuations has been repeated many times of late. However, there are good reasons for asserting that this cannot be the case for the filters that, nowadays, are commonly employed by econometricians. If these filters cannot have the effects that have been attributed to them, then one must ask what effects the filters do have that could have led to the aspersions that have been cast on them.
14/02 Stephen Pollock
The algebra of the Kronecker products of matrices is recapitulated using a notation that reveals the tensor structures of the matrices. It is claimed that many of the difficulties that are encountered in working with the algebra can be alleviated by paying close attention to the indices that are concealed beneath the conventional matrix notation. The vectorisation operations and the commutation transformations that are common in multivariate statistical analysis alter the positional relationship of the matrix elements. These elements correspond to numbers that are liable to be stored in contiguous memory cells of a computer, which should remain undisturbed. It is suggested that, in the absence of an adequate index notation that enables the manipulations to be performed without disturbing the data, even the most clear-headed of computer programmers is liable to perform wholly unnecessary and time-wasting operations that shift data between memory cells.
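The point about vectorisation and the commutation transformation can be illustrated with a short sketch (an illustrative example, not the author's notation): the commutation matrix maps vec(A) to vec(A'), yet the same rearrangement of indices is obtained by taking a transposed view of the array, which leaves the data undisturbed in their memory cells. The function names and matrices below are assumptions made for the illustration.

```python
import numpy as np

def vec(A):
    """Stack the columns of A into a single vector (column-major order)."""
    return A.reshape(-1, order="F")

def commutation_matrix(m, n):
    """The permutation K with K @ vec(A) == vec(A.T) for any m-by-n A."""
    K = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(n):
            # vec(A) places A[i, j] at position j*m + i;
            # vec(A.T) places it at position i*n + j.
            K[i * n + j, j * m + i] = 1.0
    return K

A = np.arange(6.0).reshape(2, 3)
K = commutation_matrix(2, 3)
assert np.array_equal(K @ vec(A), vec(A.T))

# The same permutation of indices without moving any data: the transpose
# is merely a different view onto the same memory cells.
assert np.shares_memory(A, A.T)

# The standard identity vec(AXB) = (B' (kron) A) vec(X), with column-major vec.
A2 = np.arange(4.0).reshape(2, 2)
B2 = np.arange(9.0).reshape(3, 3)
X = np.arange(6.0).reshape(2, 3)
assert np.allclose(vec(A2 @ X @ B2), np.kron(B2.T, A2) @ vec(X))
```

The contrast between the explicit permutation matrix and the zero-copy transposed view is precisely the economy that an adequate index notation is meant to secure.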
14/01 Wojciech Charemza, Carlos Diaz and Svetlana Makarova
The empirical evaluation of macroeconomic uncertainties and their use for probabilistic forecasting are investigated. A new weighted skew normal distribution, whose parameters are interpretable in relation to monetary policy outcomes and actions, is proposed. This distribution is fitted to recursively obtained forecast errors of monthly and annual inflation for 38 countries. It is found that this distribution fits inflation forecast errors better than the two-piece normal distribution, which is often used for inflation forecasting. A new type of ‘fan chart’, net of the epistemic (potentially predictable) element, is proposed and applied to the UK and Poland.
13/27 Ali al-Nowaihi and Sanjit Dhami
A critical element in all discounted utility models is the specification of a discount function. We extend the standard model to allow for reference points for both outcomes and time. We consider the axiomatic foundations and properties of two main classes of discount functions. The first, the Loewenstein-Prelec discount function, accounts for declining impatience but cannot account for the evidence on subadditivity. A second class, the Read-Scholten discount function, accounts for both declining impatience and subadditivity. We derive restrictions on an individual’s preferences to expedite or to delay an outcome that give rise to the discount functions under consideration. As an application of our framework, we consider the explanation of the common difference effect.
13/26 Ali al-Nowaihi and Sanjit Dhami
We consider a discounted utility model that has two components. (1) The instantaneous utility is of the prospect theory form, thus allowing for reference-dependent outcomes. (2) The discount function embodies a ‘reference time’ to which all future outcomes are discounted back; hence the name, reference time theory. We allow the discount function to exhibit declining impatience, as in hyperbolic discounting models, subadditivity, or both. We show that if the discount function is non-additive, then the presence of a reference time has important effects on intertemporal choices. For instance, this helps to explain apparently intransitive choices over time. We also show how several recent approaches to time discounting can be incorporated within our proposed framework; these include attribute models and models of uncertainty.
13/25 Sanjit Dhami and Ali al-Nowaihi
Standard equilibrium concepts in game theory find it difficult to explain the empirical evidence in a large number of static games, such as the prisoners’ dilemma, voting, public goods, and oligopoly games. Under uncertainty about what others will do in one-shot games of complete and incomplete information, evidence suggests that people often use evidential reasoning (ER), i.e., they assign diagnostic significance to their own actions in forming beliefs about the actions of other like-minded players. This is best viewed as a heuristic or bias relative to the standard approach. We provide a formal theoretical framework that incorporates ER into static games by proposing evidential games and the relevant solution concept, evidential equilibrium (EE). We derive the relation between a Nash equilibrium and an EE. We also apply EE to several common games, including the prisoners’ dilemma and oligopoly games.
13/24 Matthew Polisson and John K.-H. Quah
Consider a finite data set where each observation consists of a bundle of contingent consumption chosen from a constraint set of contingent consumption bundles. We develop a general procedure for testing the consistency of such a data set with a broad class of models of choice under risk or uncertainty. Unlike previous tests, we do not require that the agent has a concave Bernoulli utility function.
13/23 Asako Ohinata and Jan C. van Ours
We analyze how the share of immigrant children in the classroom affects the educational attainment of native Dutch children in terms of their language and math performance at the end of primary school. Our paper studies the spillover effects at different parts of the test score distribution of native Dutch students using a quantile regression approach. We find no evidence of negative spillover effects of the classroom presence of immigrant children at the median of the test score distribution. In addition, there is no indication that these spillover effects are present at other parts of the distribution.