The Concentration of Economic Power since the Pandemic
Frédéric Marty
Prisme N°41 October 2020
This text is available in French only.
Space Surveillance and Risk Assessment: The Economics of Debris
George Papanicolaou
In 2007, China destroyed one of its own satellites in an anti-satellite missile test. The collision produced the largest amount of space debris in history, capturing the attention of the global community. This incident changed the perception of the dangers of such debris, which has become a serious concern for governments with a space strategy. The awareness, prevention and elimination of such debris have been the object of intense planning and economic activity over the last 12 years. This Prisme presents an overview of these developments from the point of view of a scientist in the field of space debris.
Fractional and Statistical Randomness
Charles Tapiero & Pierre Vallois
Abstraction vs. Application: Itô’s Calculus, Wiener’s Chaos and Poincaré’s Tangle
Daniel Goroff
The Reconciliation of Betting and Measure
Glenn Shafer
Three and a half centuries ago, Blaise Pascal and Pierre Fermat proposed competing solutions to a classical problem in probability theory, the problem of points. Pascal looked at the paths that play in the game might take. Fermat counted the combinations. The interplay between betting and measure has been intrinsic to probability ever since.
In the mid-twentieth century, this duality could be seen beneath the contrasting styles of Paul Lévy and Joseph Doob. Lévy's vision was intrinsically and sometimes explicitly game-theoretic. Intuitively, his expectations were those of a gambler; his paths were formed by outcomes of successive bets. Doob confronted Lévy's intuition with the cold rigor of measure.
Kiyosi Itô was able to reconcile their visions, clothing Lévy's pathwise thinking in measure-theoretic rigor.
The reconciliation is now understood in terms of measure. But the game-theoretic intuition has been resurgent in applications to finance, and recent work shows that the game-theoretic picture can be made as rigorous as the measure-theoretic picture. In this game-theoretic picture, martingales regain their identity as capital processes and are used to define probability one and develop a purely game-theoretic version of Itô's calculus. Details are provided in my forthcoming book with Vladimir Vovk, Game-Theoretic Foundations for Probability and Finance.
Big Data: A Game Changer for Financial Mathematics?
Mathieu Rosenbaum
The increasingly widespread availability of massive volumes of digital data is challenging the mathematical sciences. While the inflow of big data modifies our ways of accessing and managing data, quantitative methods themselves remain largely unchanged, even though they must now be applied on a very large scale and over very short time periods.
The aim of this Prisme is to evaluate the reformulations currently imposed by big data in the various fields of mathematics and the developments they are making possible in finance.
This text is available in French only.
Saturation and Growth Over Time: When Demand for Minerals Peaks
Raimund Bleischwitz & Victor Nechifor Prisme N°34 November 2016 (2.97 MB)
Decoupling is at the core of the contemporary debate about economic growth and natural resources: will the delinking of economic growth and resource use happen at all given the dynamics in developing countries? Will it occur through an invisible hand of progress and improvements in resource efficiency? What lessons can be learned from a long-term international perspective?
This Prisme combines the analytical strands of resource economics and material flow analysis to answer those questions. It looks at material-specific demand and stock build-up trends over an extended time horizon of a century. Four materials (steel, cement, aluminium and copper) are analysed for a group of four industrialized countries (Germany, Japan, the UK and the USA) together with China, as the pre-eminent emerging economy. In analysing a new set of per capita and gross domestic product indicators, our research confirms the relevance of a saturation effect with a number of specifications. We cautiously expect decoupling processes to occur in China over the next few decades, and most likely in other developing countries as well. Forecasts and modelling efforts should take such saturation into account.
A Time-Frequency Analysis of Oil Price Data
Josselin Garnier & Knut Sølna Prisme N°33 October 2017 (3.7 MiB)
Oil price data have a complicated multi-scale structure that may vary over time. We use time-frequency analysis to identify the main features of these variations and, in particular, the regime shifts. The analysis is based on a wavelet-based decomposition and a study of the associated scale spectrum. The joint estimation of the local Hurst coefficient and volatility is the key to detecting and identifying regime shifts and switches in the crude oil price from the mid-1980s until today.
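The following minimal sketch (my illustration, not the authors' code) shows the kind of computation involved: a rolling wavelet decomposition of a log-price series, with a local Hurst exponent read off the slope of the log scale spectrum and a crude volatility proxy alongside. It assumes the PyWavelets package, the standard Abry-Veitch-type relation log2 Var(d_j) ≈ (2H + 1)·j for a fractional-Brownian-like signal, and illustrative choices of wavelet and window length.

```python
# A minimal sketch, not the authors' procedure: rolling wavelet estimate of a
# local Hurst exponent H(t) and of volatility for a 1-D (log-)price series.
# Assumes log2 Var(d_j) ~ (2H + 1) * j + c for the level-j detail coefficients
# of a fractional-Brownian-like signal; wavelet and window length are illustrative.
import numpy as np
import pywt  # PyWavelets

def local_hurst_and_vol(x, window=512, wavelet="db4", levels=5):
    """Estimate (H, volatility) on sliding windows of the series x."""
    out = []
    for start in range(0, len(x) - window + 1, window // 2):
        seg = x[start:start + window]
        coeffs = pywt.wavedec(seg, wavelet, level=levels)
        details = coeffs[1:]                 # [d_levels, ..., d_1], coarse to fine
        js = np.arange(levels, 0, -1)        # scale index of each detail vector
        log_var = np.array([np.log2(np.var(d)) for d in details])
        slope, _ = np.polyfit(js, log_var, 1)
        H = 0.5 * (slope - 1.0)              # slope ~ 2H + 1
        vol = np.std(np.diff(seg))           # crude local volatility proxy
        out.append((start, H, vol))
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-in for log oil prices: a random walk whose volatility jumps
    # halfway through, the kind of regime shift the analysis aims to detect.
    increments = np.concatenate([rng.normal(0, 1.0, 4096), rng.normal(0, 2.5, 4096)])
    log_price = np.cumsum(increments)
    for start, H, vol in local_hurst_and_vol(log_price):
        print(f"t = {start:5d}   H = {H:5.2f}   vol = {vol:5.2f}")
```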
How to Flee on a Straight Line
Tracking Self-Avoiding Random Walks
Laure Dumaz
This text focuses on the path of a particular random walk: that, for example, of a fugitive who is trying to escape her pursuers but can only move along the real number line. The aim of the fugitive is to find the continuous path that leaves behind the least information possible. This problem is addressed within the framework of the theory of random walks. Different kinds of random walks are presented, starting with the well-known Brownian motion. The scaling limit of the self-avoiding random walk is then explained before examining the self-repelling process, which provides an optimal strategy.
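As a toy illustration of the objects discussed here (not the scaling limits themselves), the sketch below simulates a simple random walk next to a crude self-repelling walk that is biased toward the less-visited neighbouring site; the bias parameter beta and the local rule are my own simplifications.

```python
# A toy sketch only: a 1-D simple random walk next to a crude "self-repelling"
# walk whose next step is biased toward the less-visited neighbouring site.
# This illustrates the idea of penalizing one's own past trajectory; it is not
# the continuous self-repelling process discussed in the text.
import numpy as np

def simple_walk(n, rng):
    return np.cumsum(rng.choice([-1, 1], size=n))

def self_repelling_walk(n, rng, beta=1.0):
    pos, path, visits = 0, [0], {0: 1}
    for _ in range(n):
        left = visits.get(pos - 1, 0)
        right = visits.get(pos + 1, 0)
        # Prefer the neighbour visited less often; beta sets the bias strength.
        p_right = np.exp(-beta * right) / (np.exp(-beta * left) + np.exp(-beta * right))
        pos += 1 if rng.random() < p_right else -1
        visits[pos] = visits.get(pos, 0) + 1
        path.append(pos)
    return np.array(path)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 10_000
    srw = simple_walk(n, rng)
    rep = self_repelling_walk(n, rng)
    # Brownian motion spreads like n**0.5; the true self-repelling motion of the
    # text is superdiffusive (its scaling limit spreads like t**(2/3)).
    print("range of simple random walk: ", srw.max() - srw.min())
    print("range of toy repelling walk: ", rep.max() - rep.min())
```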
The Evolving Connection between Probability and Statistics.
Do Statisticians Need a Probability Theory?
Noureddine El Karoui
Prisme N°30 December 2014 (449 KiB)
Told from the perspective of the daily life and teaching of an academic statistician, the aim of this text is to show how the field of statistics has evolved and continues to evolve, especially in relation to probability theory. The text will use two examples to illustrate that purpose. In the first case, we will look at housing data and ask whether it is possible to predict a house’s future sale price based on its characteristics. In the second case, we will examine the possibility of building a SPAM filter for an e-mail account. The text will explore, at a high level, various classical and progressively more modern statistical techniques that could be used to analyse these data, examining the role of probability theory in the development and use of these ideas, and thus illustrating the evolving connection between probability theory and statistics.
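For readers who want to see the two running examples in code, here is an illustrative scikit-learn sketch; the housing features, prices and the tiny e-mail corpus are invented placeholders, and the models (linear and logistic regression) simply stand in for the classical techniques the text starts from.

```python
# Illustrative sketch only (not from the text): the two running examples in a
# few lines of scikit-learn. The housing features, prices, e-mails and labels
# below are invented placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.feature_extraction.text import CountVectorizer

# 1) Predicting a house's sale price from its characteristics.
X_houses = np.array([[120, 3, 1995],   # surface (m2), bedrooms, year built
                     [80, 2, 1978],
                     [200, 5, 2005],
                     [95, 3, 1990]])
prices = np.array([350_000, 220_000, 640_000, 280_000])
price_model = LinearRegression().fit(X_houses, prices)
print("predicted price:", price_model.predict([[150, 4, 2000]])[0])

# 2) A minimal SPAM filter: bag-of-words features plus logistic regression.
emails = ["win a free prize now", "meeting agenda for monday",
          "cheap loans click here", "lunch tomorrow at noon"]
labels = [1, 0, 1, 0]                  # 1 = spam, 0 = legitimate
vectorizer = CountVectorizer()
X_mail = vectorizer.fit_transform(emails)
spam_model = LogisticRegression().fit(X_mail, labels)
new_mail = vectorizer.transform(["claim your free prize"])
print("spam probability:", spam_model.predict_proba(new_mail)[0, 1])
```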
How Quantum Can a Computer Be?
Elham Kashefi Prisme N°29 September 2014 (1.2 MiB)
This text briefly reviews the history of quantum computing to provide a backdrop to the new emerging field of quantum technology, which is raising new challenges. In particular, quantum computing has an acute verification and validation problem: on the one hand, since classical computations cannot scale up to the computational power of quantum mechanics, verifying the correctness of a quantum-mediated computation is challenging; on the other hand, the underlying quantum structure resists classical certification analysis. The text concludes with recent progress on how to evaluate today’s quantum computing devices so that we can effectively exploit tomorrow’s.
The Dynamics of Capitalism and Worker Participation: A Long-term Analysis
Bernard Gazier & Olivier Boylaud Prisme N°28 December 2013 (1.7 MiB)
This text is currently being translated into English.
What Makes Public-Private Partnerships Work? An Economic Analysis
Jean Bensaïd & Frédéric Marty Prisme N°27 June 2014 (2.8 MiB)
Public-private partnerships are long-term, global, administrative contracts by which a public authority entrusts a private contractor with some or all of the missions of design, construction, funding, operation and maintenance of an infrastructure or the provision of a public service. The private contractor recovers its initial investment and collects revenue for the service provided by means of tolls paid by users (depending on the traffic) or rent paid by the public authority (depending on the availability of the required service and the satisfaction of criteria of quality and performance).
Criticized for their cost, rigidity and lack of transparency, condemned on the basis of a number of failures or difficulties in their implementation, public-private partnerships are nevertheless an appropriate instrument for the realization of certain projects and for the efficient exploitation of public assets and infrastructures. This Prisme presents a dispassionate analysis of these contracts, highlighting the economic and financial parameters that can lead public authorities to choose this solution within the context of the search for transparency and the need to make efficient use of public moneys.
Private funding may prove to be indispensable, given the constraints currently imposed on public finances, to meet the needs for infrastructure investment. Likewise, the public-private partnership may create an efficient incentive framework to protect the public authority from spiralling costs or delays and to guarantee a service of quality throughout the duration of the contract.
Having said that, these contracts are no magic solution that can be applied to every project or in every situation. This Prisme explains how far and under what conditions the public-private partnership can fulfil its promise. It places particular emphasis on the financial dimension, which is the cornerstone of these contracts in terms of both efficiency and budgetary sustainability. And lastly, it examines the changes undergone by this model, especially those related to funding conditions.
This text was inspired by the presentation, 30 Years of Public-Private Partnerships: A Macroeconomic Assessment, given by Frédéric Marty with comments by Jean Bensaïd on 20 June 2013 at the Cournot Seminar.
Why France Should Give Co-determination a Chance
Jean-Louis Beffa & Christophe Clerc Prisme N°26 January 2013 (1.6 MiB)
This text is based on the presentation Co-determination à la française given by Jean-Louis Beffa and Christophe Clerc on 17 October 2012 at the Cournot Seminar.
Sifting Noise: The Role of Probability in Imaging
Josselin Garnier Prisme N°25 November 2012 (686.6 KiB)
Imaging techniques using waves to probe unknown media have long existed. Classically, these techniques can be divided into a phase of data gathering and a phase of data processing. During the data-gathering phase, waves are emitted by a source or source array, propagated through the medium being studied, and are then recorded by a receiver array. The processing phase consists in extracting information about the medium from the data recorded by the receivers. Recently, new ideas have emerged driven by observations made during time-reversal experiments. Based on these observations, new imaging methods have been developed using cross correlations of the signals recorded by sensor arrays. Mathematical analysis has shown that the cross correlation of signals recorded by two passive sensors essentially contains as much information about the medium as the signal that would have been recorded if one of the sensors were active (emitter) and the other passive (receiver). The important point demonstrated by this analysis is that uncontrolled sources of ambient noise can be used instead of controlled sources to compute cross correlations and use them for imaging. This possibility has attracted the attention of researchers in mathematics, particularly in probability theory, for profound theoretical reasons, because the idea of useful noise overturns the customary distinction between signal and noise. It has also attracted attention in seismology, for obvious practical reasons concerning the sparsity of sources (earthquakes) and the impossibility of controlling them. The aim of this paper is to describe how the idea of exploiting ambient noise to address problems of imaging took shape.
This text is based on the presentation, Noise from a Stochastic Perspective, given by Josselin Garnier on 20 October 2011 at the Cournot Centre’s seminar “The Probabilism Sessions”.
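A minimal numerical sketch of the basic observation (my toy example, not the paper's method): ambient noise recorded by two passive sensors in a homogeneous 1-D medium is cross-correlated, and the correlation peaks near the inter-sensor travel time, which is the information an active source-receiver pair would provide. The wave speed, sampling step and geometry are illustrative.

```python
# Toy 1-D illustration, not the paper's method: noise from a distant source
# reaches two passive sensors with a relative delay equal to the inter-sensor
# travel time, and that delay reappears as the peak of their cross correlation.
import numpy as np

rng = np.random.default_rng(0)
c = 1500.0            # wave speed (m/s), illustrative
dt = 1e-3             # sampling step (s)
n = 10_000            # number of samples
x1, x2 = 0.0, 300.0   # sensor positions (m); travel time = 0.2 s

# Ambient noise coming from far to the left of both sensors: sensor 2 records
# the same trace as sensor 1, delayed by (x2 - x1) / c.
noise = rng.normal(size=n)
delay = int(round((x2 - x1) / c / dt))
u1 = noise
u2 = np.concatenate([np.zeros(delay), noise[:-delay]])

# Empirical cross correlation C(tau) = sum_t u1(t) * u2(t + tau).
corr = np.correlate(u2, u1, mode="full")
lags = np.arange(-n + 1, n) * dt
tau_hat = lags[np.argmax(corr)]
print(f"correlation peak at {tau_hat:.3f} s (expected {(x2 - x1) / c:.3f} s)")
```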
What Probabilities Measure
Mikaël Cozic & Bernard Walliser Prisme N°24 September 2012 (349.1 KiB)
Probability is one of the fundamental tools that modellers use to describe and explain. It can represent both the properties of all kinds of events (social, psychological or natural) and agents’ degrees of belief. Probability raises formidable conceptual challenges, which are the object of the philosophy of probability. The definition of probability is based on an often implicit ontology, and its evaluation raises specific epistemological problems. The purpose of this article is to outline a conceptual framework within which the fundamental categories of philosophers of probability and probabilists can communicate.
This text is based on the presentation, Probabilities of Probabilities, given by Bernard Walliser with comments by Mikaël Cozic on 21 January 2010 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Dead Ends and Ways Out of the Crisis from a Macroeconomic Perspective
Xavier Timbeau Prisme N°23 June 2012 (516.1 KiB)
This text is based on the presentation, A State of Macroeconomics: Is Economics already Probabilistic?, given by Xavier Timbeau on 29 February 2012 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Where Do Booms and Busts Come From?
Paul De Grauwe Prisme N°22 October 2011 (481.0 KiB)
Capitalism is characterized by booms and busts. Periods of strong growth in output alternate with periods of declines in economic growth. Every macroeconomic theory should attempt to explain these endemic business cycle movements. In this text, I present two paradigms that attempt to explain these booms and busts. One is the Dynamic Stochastic General Equilibrium (DSGE) paradigm, in which agents have unlimited cognitive abilities. The other paradigm is a behavioural one, in which agents are assumed to have limited cognitive abilities. These two types of models produce radically different macroeconomic dynamics. I analyse these differences. I also study the different policy implications of these two paradigms.
Tracking the Random Race
Michel Armatte Prisme N°21 November 2012 (873.4 KiB)
The historical trajectory of randomness in scientific practices has not been smooth. What have been the different stages in its ascent, and how has it been interpreted? The classical view of 19th century probability, followed by the emergence of objective chance and the many different roles attributed to it from the 1830s on, led to the development of the theory of processes in the 20th century. The mathematics of chance has been marked out by the milestones of randomness as it has gradually penetrated the disciplines that owe it so much: physics, biology, economics and finance.
This text was inspired by the presentation, Three Sources of Probability Calculations in the 18th Century, given by Michel Armatte on 28 October 2009 at the Cournot Seminar.
Is Everything Stochastic?
Glenn Shafer Prisme N°20 December 2010 (241.1 KiB)
Kolmogorov said no, Popper said yes. My sympathies lie with Kolmogorov, the old-fashioned empiricist.
In the on-line setting, where we see previous outcomes before making the next probability forecast, we can give probabilities that have objective value because they pass statistical tests. This accounts for the success of many adaptive methods, and it is related to Leonid Levin’s notion of a universal prior probability distribution. It tells us that yes, everything is stochastic, but in a sense that is empirically empty because it is not falsifiable.
When we understand that the success of adaptive methods does not depend on the world being stochastic in a falsifiable sense, we may want to be more parsimonious in causal modelling and more open to non-standard methods of probability judgement.
This text is based on the presentation, Is Everything Stochastic?, given by Glenn Shafer on 13 October 2010 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Can Statistics Do without Artefacts?
Jean-Bernard Chatelain Prisme N°19 December 2010 (637.1 KiB)
This article presents a particular case of spurious regression, where a dependent variable has a simple correlation coefficient close to zero with two other variables, which are on the contrary highly correlated with each other. In this type of regression, the parameters measuring the magnitude of effects on the dependent variable are very high. They can be “statistically significant”. The tendency of scientific journals to favour the publication of statistically significant results is one reason why spurious regressions are so numerous, especially since it is easy to build them with variables that are lagged, squared or interacting with another variable. Such regressions can enhance the reputation of researchers by simulating the appearance of strong effects between variables. These often surprising effects are not robust. They often depend on a limited number of observations, and this fact has fuelled scientific controversies. The resulting meta-analyses, based on a statistical synthesis of the literature evaluating a given effect between two variables, confirm the absence of any effect. The text provides an example of this phenomenon from the empirical literature evaluating the impact of development aid on economic growth.
This text was the basis for the presentation, Spurious Regressions, given by Jean-Bernard Chatelain on 27 May 2010 at the Cournot Centre’s seminar “The Probabilism Sessions”.
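The mechanism is easy to reproduce numerically. In the hedged sketch below (illustrative numbers, not the paper's data), y is nearly uncorrelated with each of two almost collinear regressors, yet the multiple regression yields large, highly "significant" coefficients; NumPy and statsmodels are assumed available.

```python
# Minimal sketch of the phenomenon (illustrative numbers, not the paper's data):
# y is almost uncorrelated with x1 and with x2, the two regressors are almost
# collinear, yet regressing y on both gives large, highly "significant" coefficients.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(0.0, 1.0, n)
x2 = x1 + rng.normal(0.0, 0.05, n)             # x2 is almost identical to x1
y = 5.0 * (x1 - x2) + rng.normal(0.0, 0.1, n)  # driven only by the tiny gap

print("corr(y, x1)  =", round(np.corrcoef(y, x1)[0, 1], 3))
print("corr(y, x2)  =", round(np.corrcoef(y, x2)[0, 1], 3))
print("corr(x1, x2) =", round(np.corrcoef(x1, x2)[0, 1], 3))

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.OLS(y, X).fit()
print(fit.summary().tables[1])  # coefficients near +5 and -5 with tiny p-values
```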
The Impossible Evaluation of Risk
André Orléan Prisme N°18 April 2010 (647.9 KiB)
The current financial crisis stems from a massive under-estimation of mortgage risks, particularly of the subprime kind. This essay seeks to understand the origins of such an error. Economists most often advance the perverse incentive structure as the cause. This is a valid point, but it only provides a partial explanation. This text explores another hypothesis: the difficulty inherent in predicting the future when agents face uncertainty of a Knightian or Keynesian type. It seeks to show that economic uncertainty is of this type. Probability calculus cannot be applied to it. For that reason, economic uncertainty evades the only available method of prediction: statistical inference. Consequently, in a Knightian world, there is no such thing as an objective evaluation of risk. This point is illustrated by examining the case of the US presidential elections of 2000.
This text is based on the presentation, The Keynesian Concept of Uncertainty, given by André Orléan on 25 November 2009 at the Cournot Centre’s seminar “The Probabilism Sessions”.
A Moment of the Probabilistic Experience: The Theory of Stochastic Processes and their Role in the Financial Markets
Nicole El Karoui & Michel Armatte Prisme N°17 February 2010 (410.3 KiB)
Probabilists are often interested in the history of their discipline, and more rarely in the fundamental questions that they could ask about the facts they model. Works like Augustin Cournot: Modelling Economics, and especially the chapter by Glenn Shafer, throw light on some of my experience in the domain of probability over the last 40 years, which began in 1968, at the end of the first year of my Ph.D. They have prompted me to present my own point of view here.
I had the good fortune to participate in an extraordinary moment in the development of probability, more precisely the theory of stochastic processes. This was an unforgettable period for me. At the time, I had the feeling that I was witnessing science — probability theory — in the making. Subsequently (rather by chance, it must be said), I switched over to the side of probability users about 20 years ago, by focusing my research on one particular sector of finance. In the present text, I shall try to explain what interested me in the approach and in this aspect of finance on which I am still working today. To begin with, my account will be that of a pure probability theorist, and then that of an applied probabilist.
This text is based on the presentation, The Autonomization of Probability as a Science: The Experience of a Probabilist, given by Nicole El Karoui on 18 September 2008 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Should You Take a Risk When You Do Not Know for Sure? From Judging to Acting since Condorcet
Pierre-Charles Pradier Prisme N°16 March 2010 (331.3 KiB)
Condorcet proposed a principle of reasonable probability: actions entailing a prohibitive risk with a non-negligible probability should not be taken. This principle guides the development of knowledge as much as it guides the action itself. The mathematics developed by Laplace has allowed for the effective application of this principle in mathematical statistics (point estimates combined with a high confidence level) or in the management of insurance companies (calculating the loading rate to ensure the solvency of the company). During the same period, Tetens was developing related ideas – though with less mathematical efficacy. These ideas from the 18th century still apply today, both in (the interpretation of) certain modern decision models and in the informational and legal requirements that should be enforced to ensure that financial decisions are rational.
This text is based on the presentation, The Probabilization of Risk, given by Pierre-Charles Pradier on 30 September 2009 at the Cournot Centre’s seminar “The Probabilism Sessions”.
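As a back-of-the-envelope illustration of the solvency calculation mentioned above (my numbers, not Condorcet's or the text's), the sketch below computes the premium loading an insurer would need so that aggregate claims exceed collected premiums only with a small probability, using a normal approximation for the sum of independent claims.

```python
# Back-of-the-envelope sketch with invented numbers: the relative premium
# loading needed so that total claims exceed total premiums only with a small
# probability, using a normal approximation for the sum of n i.i.d. claims.
from math import sqrt
from statistics import NormalDist

def loading_rate(n_policies, mean_claim, sd_claim, ruin_prob=0.001):
    """Safety loading theta with P(total claims > (1 + theta) * n * mean) <= ruin_prob."""
    z = NormalDist().inv_cdf(1.0 - ruin_prob)   # standard normal quantile
    total_mean = n_policies * mean_claim
    total_sd = sqrt(n_policies) * sd_claim
    return z * total_sd / total_mean

if __name__ == "__main__":
    # 10,000 policies, mean claim 1,000, claim standard deviation 3,000.
    theta = loading_rate(10_000, 1_000.0, 3_000.0)
    print(f"required loading: {100 * theta:.1f}% above the pure premium")
```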
An Economic Analysis of Fair Value: Accounting as a Vector of Crisis
Vincent Bignon, Yuri Biondi & Xavier Ragot Prisme N°15 August 2009 (412.9 KiB)
European legislation took its essential inspiration from the logic of historical cost: the valuation of balance-sheet assets was grounded in the depreciated historical cost of their acquisition. In July 2002, the European Parliament’s adoption of new accounting standards for quoted companies, which took effect 1 January 2005, oriented European accounting towards the new principle of fair value. Its introduction has imposed the determination of the value of assets by the present value of the expected profits that these assets can generate. It has involved establishing the value of each asset according to its future contribution to the profit of the business.
Contemporary research, however, does not have as its ultimate goal the replacement of historical cost by fair value. Recent work analysing business production processes pleads, on the contrary, for limitation of its usage. Three concepts summarize this work: asymmetry of information, complementarities, and specificities of assets employed. Firms create wealth by making assets complementary, because they add to these assets characteristics specific to the production process deployed. These supplementary characteristics have no market value, and thus the value of each asset for a firm is always greater than its resale value. Consequently, the specificity of an asset is defined by the difference between its value for the firm and its market value. In order to preserve the competitive advantage flowing from this combination of specific assets, it is necessary to keep this type of information secret: hence, there exists an asymmetry of information between the firm and its environment.
In this context, the criterion of fair value poses important problems of asset valuation: the specificity and complementarity of assets force accountants to use valuation models in order to determine asset values. Financial analysts have recourse to such models in order to value businesses. The use of these models for accounting purposes does not, however, ensure the reliability of accounts; in effect, small changes in the assumptions can lead to large variations in the results. The purpose of accounting is rather to constitute a source of independent information, in a form that is relevant to valuation by financial markets.
In addition to the valuation problem, the principle of fair value may introduce the problem of financial volatility into accounting. The existence of excessive financial market volatility, which is demonstrable theoretically and empirically, creates superfluous risk and tends to reduce the investment capacity of firms. Lastly, fair value reinforces financial criteria to the detriment of the other valuation criteria of management teams. All stakeholders in the business, including shareholders and institutional investors, risk being its victims.
The financial crisis that began in the summer of 2007 confirms the intrinsic flaws of the fair-value accounting model. It did not help to prevent the crisis, but rather deepened it. Accounting must be an instrument of control and regulation, independent of the market and centred on the firm as an enterprise entity, without following daily market values. Accounting must thus establish itself as a central institution of market economies, essential to the functioning of the markets and in accordance with the public interest.
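A tiny numerical sketch (invented figures) of the valuation contrast underlying both the principle and the critique: an asset carried at depreciated historical cost versus the same asset carried at the present value of its expected profits, with the discount-rate sensitivity that makes model-based fair values fragile.

```python
# Tiny numerical sketch with invented figures: depreciated historical cost
# versus fair value taken as the present value of expected future profits,
# and the sensitivity of the latter to the discount rate.
def present_value(expected_profits, discount_rate):
    """Discounted sum of a list of expected future profits (one per year)."""
    return sum(p / (1 + discount_rate) ** t
               for t, p in enumerate(expected_profits, start=1))

if __name__ == "__main__":
    acquisition_cost = 100.0
    annual_depreciation = 10.0
    expected_profits = [18.0, 18.0, 16.0, 15.0, 12.0]  # hypothetical forecasts

    historical_cost_after_3_years = acquisition_cost - 3 * annual_depreciation
    print(f"historical cost after 3 years:     {historical_cost_after_3_years:.1f}")
    print(f"fair value (PV of profits) at 8%:  {present_value(expected_profits, 0.08):.1f}")
    # A modest change in the discount rate moves the book value noticeably,
    # which is the sensitivity problem the text raises.
    print(f"fair value (PV of profits) at 12%: {present_value(expected_profits, 0.12):.1f}")
```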
Why Contemporary Capitalism Needs the Working Poor
Bernard Gazier Prisme N°14 December 2008 (1.1 MiB)
This short essay explores the apparent paradox of the “working poor” – persons remaining in poverty despite their working status. While it seems that the existence of the working poor is an inescapable by-product of capitalism, the size and modalities of this phenomenon vary considerably among countries.
The first section examines the various definitions of the working poor. Although great efforts have been made to gain a better statistical understanding and measurement of the working poor, researchers and governments are far from agreeing on one single definition. On the contrary, a set of different approximations, mixing low earnings, family composition and tax effects, are necessary for capturing what is a hybrid reality. The second section is devoted to a critical assessment of some selected empirical and comparative studies on Europe. They confirm the strong diversity in possible definitions, as well as in national situations and developments. They also suggest that a major role is played by institutions, not only transfers, but also the segmentation and organization of the labour market. The last section presents different theoretical perspectives on the working poor. It insists on the functional role played by low wages and the activation of social policies in jointly controlling the labour market and the workforce. Some public policy issues could contribute to mitigating this functional role.
History Repeating for Economists: An Anticipated Financial Crisis
Robert Boyer Prisme N°13 November 2008 (811.4 KiB)
Finance can contribute to growth through various mechanisms: the transfer of savings from lenders to borrowers, the smoothing of investment and consumption profiles over time, or the transfer of risk. Financial innovations have their own characteristics: the result of private profit-seeking strategies, new financial products can spread very fast, because their production process is immaterial. This rapid diffusion can have a significant impact on macroeconomic stability. Financial history shows that the effects of financial innovation, ultimately favourable to growth, materialize through a succession of crises and efforts at regulation to avoid their repetition. Historical analysis, unlike the theories that postulate the stability and efficiency of financial markets, also allows us to detect the emergence of financial crises. The crisis triggered by the subprime mortgage meltdown is no exception.
The sequence: “private financial innovation, diffusion, entry into a zone of financial fragility, open crisis” does not stem from the irrationality of agents’ behaviour. Is it then possible to avoid a financial crisis? Why not apply the same sort of certification procedures to financial innovations as we impose on food products, drugs, cars, public transport, banking and insurance? Up until now, the omnipotence of finance has prohibited any such public intervention.
Towards a Probabilistic Theory of Life
Thomas Heams Prisme N°12 October 2008 (707.2 KiB)
Biology has long been dominated by a deterministic approach. The existence of a genetic code, even a “genetic programme”, has often led to descriptions of biological processes resembling finely regulated, precise events written in advance in our DNA. This approach has been very helpful in understanding the broad outlines of the processes at work within each cell. However, a large number of experimental arguments are challenging the deterministic approach in biology.
One of the surprises of recent years has been the discovery that gene expression is fundamentally random: the problem now is to describe and understand it. Here I present the molecular and topological causes that at least partly explain it. I shall show that it is a widespread, controllable phenomenon that can be transmitted from one gene to another and even from one cell generation to the next. It remains to be determined whether this random gene expression is a “background noise” or a biological parameter. I shall argue for the second hypothesis by seeking to explain how this elementary disorder can give rise to order. In doing so, I hope to play a part in bringing probability theory to the heart of the study of life. Lastly, I shall discuss the possibility of moving beyond the apparent antagonism between determinism and probabilism in biology.
This text is based on the presentation, The Random Expression of Genes: Probability or Biological Parameter?, given by Thomas Heams on 20 March 2008 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Does Fiscal Stimulus Work?
Edouard Challe Prisme N°11 November 2007 (410.7 KiB)
This article examines the way in which fiscal policy impulses (variations in government spending and tax cuts) affect aggregate variables such as GDP, consumption, investment and employment.
Economic theory distinguishes three potential channels of transmission for these impulses, according to whether they affect the equilibrium through their wealth effects, their aggregate demand effects, or their liquidity effects.
We therefore intend to evaluate the extent to which these theoretical channels are consistent with the empirically observed impacts of fiscal stimulus. Although economists have traditionally focused their attention on wealth effects and aggregate demand effects, associated respectively with the “classical” and “Keynesian” paradigms, recent work on the subject shows that liquidity effects also play an important role. Finally, in the presence of aggregate demand effects and liquidity effects, fiscal stimulus is all the more effective over the short term when it is financed by government debt issue. The gains achieved through debt-financed stimulus can, however, conflict with the social costs resulting from high levels of long-term public debt, and this raises a specific problem concerning the dynamic consistency of fiscal policy.
The Japanese Economy after the Flux Decade: Where Will Changes in Company Structure Lead?
Masahiko Aoki Prisme N°10 September 2007 (233.5 KiB)
How should one interpret the changes in Japan’s company structure that have been affecting the Japanese economy since the early 1980s? This text proposes a conceptual framework from the firm’s point of view, after examining empirical evidence. Has Japan’s corporate governance made a substantive institutional transformation, and, if so, in which direction? Four stylized analytical models of corporate governance are presented, and the conditions in which each would be viable are identified.
Using this theoretical background, the text examines the driving forces, as well as the historical constraints, of the changes taking place in Japan. The nature of the on-going institutional changes in Japan’s corporate governance can be interpreted as a possible transition from the traditional bank-oriented model to a hybrid model, built on the combination of managerial choice of business model, employees’ human assets, and stock-market evaluations. No single mechanism has emerged as dominant, but a variety of patterns seems to be evolving.
Building a Fiscal Territory in Europe: Prohibiting, Harmonizing, Approximating, Guaranteeing, Informing
Évelyne Serverin Prisme N°9 March 2007 (223.4 KiB)
Community law has created a fiscal territory by using a series of legal techniques founded on different parts of the Treaty. By dividing Europe’s complex legal structures into five categories of action, we seek to elucidate the foundations and nature of EU tax policy. We can separate these legal techniques into two groups, depending on whether they involve taxation, in the strict sense of the word, or other rights and liberties.
Three actions are directly associated with taxation: prohibition, harmonization and approximation. Two other actions do not affect substantial tax law, but influence its application: guaranteeing the exercise of fundamental liberties and informing the Member States. We shall evaluate the relative weight of these two groups of actions through an analysis of the instruments available to the Community and their associated jurisprudence.
Political Goals, Legal Norms and Public Goods: The Building Blocks of Europe?
Robert Boyer & Mario Dehove Prisme N°8 November 2006 (770.5 KiB)
In the introduction of technical norms and the free circulation of goods and people, as in the harmonization of indirect taxes or the portability of social rights, the principle of competition dominates over all other principles in the building of Europe. This primacy of competition has aroused the distrust of many citizens regarding the Union and is now obstructing the emergence of public goods in Europe. While economic theory provides satisfactory explanations of public goods management, it has great difficulty in analysing their genesis. This helps to explain the discrepancies between the theory’s predictions and the empirically observable distribution of powers.
Theories of justice maintain that the persistence of strong national traditions in areas such as professional relations or the expression of solidarity makes the construction of a social Europe more difficult. Legal analysis highlights the decisive role played in all member states by judges and courts, whose jurisprudence continuously and practically delimits the role and prerogatives of all the players. By so doing, they create the conditions for a review of the allocation of these powers by the political authorities. The necessary reconstruction of European institutions must then anticipate the formation of new public goods as diverse as security and justice, science and energy security.
From Cournot to Public Policy Evaluation: Paradoxes and Controversies of Quantification
Alain Desrosières Prisme N°7 April 2006 (323.6 KiB)
The French mathematician, economist and thinker Augustin Cournot inaugurated the philosophical treatment of the new probabilistic and quantitative modes of reasoning that emerged in the first half of the 19th century. The text reviews the legacy and implementation of Cournot’s intuitions concerning the distinction between so-called objective and subjective probabilities, and the interpretation of the categories constructed by statisticians according to “equivalence conventions”. Suggestive clues emerge for the empirical study of current statistical practices, in particular, those transactions that take place in the “contact zone”, where quantified assertions recorded in more or less formal models replace unquantified assertions formulated in natural language. Examples of these exchanges are illustrated in the cases of risk management, macroeconomic policy and public service performance evaluation.
In conclusion, the paper highlights how the ambivalence of Cournot’s thought is echoed in the controversies raised in some recent sociology of science, polarized between diverse forms of “realism” and “constructivism”. Questions suggested by Cournot are the starting point for an exploration of the sense in which quantification can be said to create objectivity.
Patent Fever in Developed Countries and its Fallout on the Developing World
Claude Henry Prisme N°6 May 2005 (282.9 KiB)
This paper examines the relationship between innovation and intellectual property rights. Over the past 25 years, the traditional balance between patent legislation and knowledge as public good has started to shift in favour of the former. The global, uniform, but flawed approach to patenting systems, driven by the United States and reflected in the TRIPS agreement, will cause negative externalities for developing countries. The paper suggests that these effects might be mitigated through appropriate instruments and prudent transposition of the TRIPS agreement into national legislations. It argues that the legal and economic foundations that have underpinned traditional intellectual property rights remain valid. Recent trends in approaches to intellectual property rights, including patent proliferation and geographical spread, are critically examined against the background of US-sponsored linkage of patent protection with free trade agreements. Examples from the life sciences and biotechnology illustrate the problems of unwarranted patents and excessive patent breadth, reinforcing doubts about the current uniformization of intellectual property rights protection, and highlighting the risk to innovation and development policy. In the final section, the paper explains how two developing countries have invoked the remedial measures embedded in the TRIPS agreement. These mechanisms include interpretative freedom, opposition procedures and compulsory licences. The paper concludes that from a Schumpeterian viewpoint, “open source” makes the main factors governing innovation more compatible than patent-based protection.
From Financial Capitalism to a Renewal of Social-Democracy
Michel Aglietta & Antoine Rebérioux Prisme N°5 October 2004 (574.4 KiB)
Recent corporate governance scandals have brought to the fore the inherent contradictions of a capitalism dominated by financial markets. This text argues that capitalism’s basic premise – that companies be managed in the sole interest of their shareholders – is incongruent with the current environment of liquid markets, profit-hungry investors and chronic financial instability. In this context, this text also analyses the financial scandals of the Enron era, going beyond the malfunctioning of the gatekeepers (auditors, financial analysts, ratings agencies) to stress the failure of shareholder value and the inadequacy of measures intended to prevent such scandals.
A company should be managed as an institution where common objectives are developed for all stakeholders, and this democratic principle should be extended to the management of collective savings to reduce macro-financial instability. These two conditions could make contemporary capitalism a vehicle for social progress, giving shape to a new kind of social democracy.
This Prisme presents the conclusions of Corporate Governance Adrift: A Critique of Shareholder Value, published by Edward Elgar Publishing in 2005.
An Economic Analysis of Fair Value: The Evaluation of Accounting Principles in European Legislation
Vincent Bignon, Yuri Biondi & Xavier Ragot Prisme N°4 March 2004 (288.5 KiB)
In July 2002, the European Parliament’s adoption of new accounting standards for quoted companies, to take effect from January 1, 2005, oriented European accounting towards a new principle, that of fair value. Hitherto, European legislation took its essential inspiration from the logic of historical cost: the valuation of balance sheet assets was grounded in the depreciated historical cost of their acquisition. The introduction of the principle of fair value will impose the determination of the value of assets by the present value of the expected profits that these assets can generate. It involves establishing the value of each asset according to its future contribution to the profit of the business.
Contemporary research, however, does not have as its ultimate goal the replacement of historical cost by fair value. Recent work analysing business production processes pleads, on the contrary, for limitation of its usage. Three concepts summarize this work: asymmetry of information, complementarities, and specificities of assets employed. Firms create wealth by making assets complementary, because they add to these assets characteristics specific to the production process deployed.
These supplementary characteristics have no market value, and thus the value of each asset for a firm is always greater than its resale value. Consequently, the specificity of an asset is defined by the difference between its value for the firm and its market value. In order to preserve the competitive advantage flowing from this combination of specific assets, it is necessary to keep this type of information secret: hence, there exists an asymmetry of information between the firm and its environment.
In this context, the criterion of fair value poses important problems of asset valuation: the specificity and complementarity of assets force accountants to use valuation models in order to determine asset values. Financial analysts have recourse to such models in order to value businesses. The use of these models for accounting purposes does not, however, ensure the reliability of accounts; in effect, small changes in the assumptions can lead to large variations in the results. The purpose of accounting is rather to constitute a source of independent information, in a form that is relevant to valuation by financial markets.
In addition to the valuation problem, the principle of fair value may introduce the problem of financial volatility into accounting. The existence of excessive financial market volatility, which is demonstrable theoretically and empirically, creates superfluous risk and tends to reduce the investment capacity of firms. Lastly, fair value reinforces financial criteria to the detriment of the other valuation criteria of management teams. All stakeholders in the business, including shareholders and institutional investors, risk being its victims.
It is difficult to affirm that the net contribution of fair value to the improvement of accounting standards is positive. Though far from ideal, the logic of historical cost appears to be the least bad option in the presence of informational asymmetries, complementarities and specificities.
Reforming Europe: Is the Third Way the Only Way?
Bruno Amable Prisme N°3 January 2004 (261.5 KiB)
The subject of reform is at the heart of current economic debate in Europe. The “Sapir Report” is the latest example. It denounces the institutions of the European model for keeping the European Union from growing at a sufficient pace. These institutions, it claims, are creating roadblocks to structural changes, changes that have been made vital by the important role of innovation in today’s world. The Report claims that the answer lies in implementing reforms to increase “microeconomic” efficiency.
This text critically examines this argument. If Europe were to adopt these reforms, European countries would have to switch to a different model of capitalism. That would mean abandoning the European model – characterized by a high degree of social security and employment protection – for the neo-liberal model, with its reduced social security and flexible labour markets.
This booklet compares the growth and innovation performances of France and Germany with those of the U.K. and the U.S., as well as with those of Sweden and Finland. These comparisons reveal the need to question, at the very least, the current rhetoric of the uncontested superiority of the neo-liberal model. I underline that even if the different models are capable of providing comparable overall performances, they do not have the same consequences in terms of income distribution and coverage of social risks.
Consequently, choosing a model, by its very nature, is a political choice. Therefore, choosing the reform means choosing to forge ahead with changes that took place during the Conservative Revolution in the U.S. and the U.K. (i.e. the Margaret Thatcher and Ronald Reagan years). A new dimension of this debate is that centre-left parties are adopting the political project of converting to the neo-liberal model, which is usually only associated with conservative parties.
This text concludes by examining two scenarios of structural change. The first scenario envisages the completion of the reform and the transformation of the European model to a neo-liberal one. The second scenario involves a transition towards a social-democratic model of capitalism. Neither scenario is without significant political consequences.
Lessons Learned from U.S. Welfare Reform
Robert Solow Prisme N°2 November 2003 (155.4 KiB)
The 1996 U.S. Welfare Reform Act concentrates, almost solely, on getting people to work and off socially assisted programs. The reform has produced changes in the structure of benefits, introduced time limits, strengthened requirements for mandatory participation in work-related activities and changed various administrative procedures. The implementation of this federal Act has been largely left to the discretion of the individual states.
The law has been in effect for seven years and is up for reauthorization. It is thus time to assess its mechanisms and outcomes. Welfare reform is responsible for a portion of the increase in beneficiaries’ work and earnings. Most evidence from econometric studies points in this direction. Many of these studies, however, overlook the fact that employment and the demand for welfare assistance are heavily influenced by macroeconomic factors among other things.
In this booklet, Robert Solow evaluates these analyses and provides direction for future reforms.
Ramifications of the European Commission’s Directive on Takeover Bids
Jean-Louis Beffa, Leah Langenlach & Jean-Philippe Touffut Prisme N°1 September 2003 (326.0 KiB)
Launched in 1974, the idea of harmonizing public takeover bid legislation found its first expression in 1985 in a draft Directive. This early draft was rightly rejected in July 2001. Bolstered by 30 amendments, a second version of the Directive was adopted on December 16, 2003.
The initial objective of the Directive was to promote a common framework for cross-border takeovers, to facilitate corporate restructuring and to protect minority shareholders. In the interim between the rejection of the early draft and the adoption of the second proposal, three contentious articles generated extreme tension: the neutrality of the board of directors in the event of a takeover bid, restrictions on transfers of securities and multiple voting rights, and consultation with workforce representatives. The amendments adopted on these questions by the legal affairs committee of the European Parliament weaken the content of the Directive. It is left to EU member states to decide whether or not to apply the articles on the neutrality of the board of directors and on the exercise of multiple voting rights in the event of a public bid. With this optional feature comes an unprecedented “reciprocity” clause. Nevertheless, the spirit of the Directive is unaltered: no article was withdrawn.
One question has not received adequate consideration in this debate: should takeover bids be encouraged? Takeover bids are one of the constitutive principles of a mode of capitalism propelled by the dynamics of financial markets. In economics, theoretical studies of public bids have been complemented by econometric analyses and field research. These show that public bids do not contribute to economic growth. Over the last 30 years, more than two-thirds of public bids have led to a decrease in business productivity and have contributed to a reduction in the overall economic growth rate. In light of this fact, should a Directive on Takeover Bids comply with financial logic, to the detriment of industrial logic? Research indicates that, on the contrary, safeguards necessary to protect firms from the instability of finance should be constructed.