Prisme

The Cournot Centre's Prisme series

Prisme N°35

Big Data: A Game Changer for Financial Mathematics?

Mathieu Rosenbaum

Prisme N°35 March 2017.pdf

The increasingly widespread availability of massive volumes of digital data is challenging the mathematical sciences. While the inflow of big data modifies our ways of accessing and managing data, quantitative methods themselves remain largely unchanged, even if they must henceforth be applied on a very large scale over very short time periods.
The aim of this Prisme is to evaluate the reformulations currently imposed by big data in the various fields of mathematics and the developments they are making possible in finance.


This text is currently being translated into English.
Prisme N°34

Saturation and Growth Over Time: When Demand for Minerals Peaks

Raimund Bleischwitz & Victor Nechifor

Prisme N°34 November 2016 (2.97 MB).pdf

Decoupling is at the core of the contemporary debate about economic growth and natural resources: will the delinking of economic growth and resource use happen at all given the dynamics in developing countries? Will it occur through an invisible hand of progress and improvements in resource efficiency? What lessons can be learned from a long-term international perspective?
This Prisme combines the analytical strands of resource economics and material flow analysis to answer those questions. It looks at material-specific demand and stock build-up trends over an extended time horizon of a century. Four materials (steel, cement, aluminium and copper) are analysed for a group of four industrialized countries (Germany, Japan, the UK and the USA) together with China, as the pre-eminent emerging economy. In analysing a new set of per capita and gross domestic product indicators, our research confirms the relevance of a saturation effect with a number of specifications. We cautiously expect decoupling processes to occur in China over the next few decades, and most likely in other developing countries as well. Forecasts and modelling efforts should take such saturation into account.
Prisme N°33

A Time-Frequency Analysis of Oil Price Data

Josselin Garnier & Knut Sølna

Prisme N°33 October 2017 (3.7 MiB)

Oil price data have a complicated multi-scale structure that may vary over time. We use time-frequency analysis to identify the main features of these variations and, in particular, the regime shifts. The analysis is based on a wavelet-based decomposition and a study of the associated scale spectrum. The joint estimation of the local Hurst coefficient and volatility is the key to detecting and identifying regime shifts and switches in the crude oil price from the mid-1980s until today.
Prisme N°31

How to Flee on a Straight Line
Tracking Self-Avoiding Random Walks

Laure Dumaz
This text has been translated into English.

Prisme N°31 - English.pdf

This text focuses on the path of a particular random walk: that, for example, of a fugitive who is trying to escape her pursuers, but who can only move along the real number line. The aim of the fugitive is to find the continuous path that leaves behind the least information possible. This problem is addressed within the framework of the theory of random walks. Different kinds of random walks are presented, starting with the well-known Brownian motion. The scaling limit of the self-avoiding random walk is then explained before examining the self-repelling process, which provides an optimal strategy.
Prisme N°30

The Evolving Connection between Probability and Statistics:
Do Statisticians Need a Probability Theory?

Noureddine El Karoui

Prisme N°30 December 2014 (449 KiB)

Told from the perspective of the daily life and teaching of an academic statistician, the aim of this text is to show how the field of statistics has evolved and continues to evolve, especially in relation to probability theory. The text will use two examples to illustrate that purpose. In the first case, we will look at housing data and ask whether it is possible to predict a house’s future sale price based on its characteristics. In the second case, we will examine the possibility of building a spam filter for an e-mail account. The text will explore, at a high level, various classical and progressively more modern statistical techniques that could be used to analyse these data, examining the role of probability theory in the development and use of these ideas, and thus illustrating the evolving connection between probability theory and statistics.
Prisme N°29

How Should the Advent of the Quantum Be Announced?

Elham Kashefi

Prisme N°29 September 2014 (1.2 MiB)

Translation in progress.
Prisme N°28

The Dynamism of Capitalism and Worker Participation in the Developed Countries: A Long-Term Analysis

Bernard Gazier & Olivier Boylaud

Prisme N°28 December 2013 (1.7 MiB)

Annex to Prisme 28 (138.5 KiB)

This text is currently being translated into English.
Prisme N°27

The Legitimacy and Limits of Public-Private Partnerships: An Economic Analysis

Jean Bensaïd & Frédéric Marty

Prisme N°27 June 2014 (2.8 MiB)

Public-private partnerships are long-term, global, administrative contracts by which a public authority entrusts a private contractor with some or all of the missions of design, construction, funding, operation and maintenance of an infrastructure or the provision of a public service. The private contractor recovers its initial investment and collects revenue for the service provided by means of tolls paid by users (depending on the traffic) or rent paid by the public authority (depending on the availability of the required service and the satisfaction of criteria of quality and performance).

Criticized for their cost, rigidity and lack of transparency, condemned on the basis of a number of failures or difficulties in their implementation, public-private partnerships are nevertheless an appropriate instrument for the realization of certain projects and for the efficient exploitation of public assets and infrastructures. This Prisme presents a dispassionate analysis of these contracts, highlighting the economic and financial parameters that can lead public authorities to choose this solution within the context of the search for transparency and the need to make efficient use of public moneys.

Private funding may prove to be indispensable, given the constraints currently imposed on public finances, to meet the needs for infrastructure investment. Likewise, the public-private partnership may create an efficient incentive framework to protect the public authority from spiralling costs or delays and to guarantee a service of quality throughout the duration of the contract.

Having said that, these contracts are no magic solution that can be applied to every project or in every situation. This Prisme explains how far and under what conditions the public-private partnership can fulfil its promise. It places particular emphasis on the financial dimension, which is the cornerstone of these contracts in terms of both efficiency and budgetary sustainability. And lastly, it examines the changes undergone by this model, especially those related to funding conditions.

This text was inspired by the presentation, 30 Years of Public-Private Partnerships: A Macroeconomic Assessment, given by Frédéric Marty and commented on by Jean Bensaïd on 20 June 2013 at the Cournot Seminar.
Prisme N°26

The Chances of Co-determination à la française

Jean-Louis Beffa & Christophe Clerc

Prisme N°26 January 2013 (English) (1.6 MiB)

This text is based on the presentation Co-determination à la française given by Jean-Louis Beffa and Christophe Clerc on 17 October 2012 at the Cournot Seminar.
Prisme N°25

Noise Through the Sieve of Probability Theory

Josselin Garnier

Prisme N°25 November 2012 (English) (686.6 KiB)

Imaging techniques using waves to probe unknown media have long existed. Classically, these techniques can be divided into a phase of data gathering and a phase of data processing. During the data-gathering phase, waves are emitted by a source or source array, propagated through the medium being studied, and are then recorded by a receiver array. The processing phase consists in extracting information about the medium from the data recorded by the receivers. Recently, new ideas have emerged driven by observations made during time-reversal experiments. Based on these observations, new imaging methods have been developed using cross correlations of the signals recorded by sensor arrays. Mathematical analysis has shown that the cross correlation of signals recorded by two passive sensors essentially contains as much information about the medium as the signal that would have been recorded if one of the sensors were active (emitter) and the other passive (receiver). The important point demonstrated by this analysis is that uncontrolled sources of ambient noise can be used instead of controlled sources to compute cross correlations and use them for imaging. This possibility has attracted the attention of researchers in mathematics, in the domain of probabilities, for profound theoretical reasons, because the idea of useful noise overturns the customary distinction between signal and noise. This has also been the case in seismology for obvious practical reasons concerning the sparsity of sources (earthquakes) and the impossibility of controlling them. The aim of this paper is to describe how the idea of exploiting ambient noise to address problems of imaging took shape.

This text is based on the presentation, Noise from a Stochastic Perspective, given by Josselin Garnier on 20 October 2011 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Prisme N°24

What Probability Measures

Mikaël Cozic & Bernard Walliser

Prisme N°24 September 2012 (489.1 KiB)

Probability is one of the fundamental tools that modellers use to describe and explain. It can represent both the properties of all kinds of events (social, psychological or natural) and agents’ degrees of belief. Probability raises formidable conceptual challenges, which are the object of the philosophy of probability. The definition of probability is based on an often implicit ontology, and its evaluation raises specific epistemological problems. The purpose of this article is to outline a conceptual framework within which the fundamental categories of philosophers of probability and probabilists can communicate.

This text is based on the presentation, Probabilities of Probabilities, given by Bernard Walliser with comments by Mikaël Cozic on 21 January 2010 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Prisme N°23

Exiting the Crisis: Paths and Dead Ends for Macroeconomics

Xavier Timbeau

Prisme N°23 May 2012 (993.8 KiB)

This text is based on the presentation, The State of Macroeconomics: Is Economics Fundamentally a Probabilistic Discussion?, given by Xavier Timbeau on 29 February 2012 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Prisme N°22

What Is the Relationship between Growth and Crises?

Paul De Grauwe

Prisme N°22 November 2011 (481.0 KiB)

Capitalism is characterized by alternating periods of expansion and contraction. Phases of strong output growth give way to phases of declining growth. Macroeconomic theory struggles to elucidate the mechanism behind this perpetual back-and-forth movement. This text presents two paradigms that seek to explain these phases of expansion and contraction. One is the dynamic stochastic general equilibrium (DSGE) model, in which agents have unlimited cognitive abilities; the other is a behavioural economics model, in which agents are endowed with limited cognitive abilities. These two types of model give rise to fundamentally different macroeconomic dynamics. This Prisme analyses those differences and presents the implications of the two paradigms for government action.
Prisme N°21

How Does Randomness Proceed?

Michel Armatte

Prisme N°21 November 2012 (873.4 KiB)

The historical trajectory of randomness in scientific practices has not been smooth. What have been the different stages in its ascent, and how has it been interpreted? The classical view of 19th century probability, followed by the emergence of objective chance and the many different roles attributed to it from the 1830s on, led to the development of the theory of processes in the 20th century. The mathematics of chance has been marked out by the milestones of randomness as it has gradually penetrated the disciplines that owe it so much: physics, biology, economics and finance.

This text was inspired by the presentation, Three Sources of Probability Calculations in the 18th Century, given by Michel Armatte on 28 October 2009 at the Cournot Seminar.
Prisme N°20

Is Everything Stochastic?

Glenn Shafer

Prisme N°20 December 2010 (241.1 KiB)

Is everything due to chance? Kolmogorov answered no; Popper said yes. For my own part, as a long-standing adherent of empiricism, I side with Kolmogorov. When we know which outcomes have already occurred before betting on a throw of the dice, we can formulate probabilities that carry a certain objective meaning, because probabilities are knowledge whose accuracy has been demonstrated by statistical testing. This explains why many methods are applicable in practice, and it reveals a connection with Levin’s notion of a supposedly universal measure: that notion shows that everything is chance, but that in a certain sense this fact has no empirical value. Once it is understood that the success of resistible methods does not, in any falsifiable sense, commit us to a world of chance, it becomes clear that our world can be better measured by modelling causal relations, and that it is more open to methods that do not fit the mould of probability judgements.

This text is based on the presentation, Is Everything Stochastic?, given by Glenn Shafer on 13 October 2010 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Prisme N°19

Are Statistics Possible without Manipulation?

Jean-Bernard Chatelain

Prisme N°19 October 2010 (French) (1005.5 KiB)

This paper deals with a particular case of spurious regression: one in which the dependent variable has a near-zero simple correlation with each of two other variables that are, in turn, strongly correlated with each other. In this type of regression, the parameters measuring the size of the effects on the dependent variable are extremely large, and they can be “statistically significant”. Because academic journals give priority to publishing results that display such significance, spurious regressions proliferate, all the more so as they are easy to construct by combining lagged variables, squared variables or variables interacting with other variables. By producing the appearance of large effects between variables, such regressions can help enhance a researcher’s renown. These surprising effects are often fragile, however, and frequently rest on only a few observations. Such empirical claims can turn out to show no effect at all once a meta-analysis, a statistical synthesis of the literature evaluating the effect between two variables, is carried out. We present an example from the empirical literature that aims to evaluate the effect of aid on economic growth.
Prisme N°18

Unassessable Risk

André Orléan

Prisme N°18 April 2010 (647.9 KiB)

The current financial crisis stems from a massive under-estimation of mortgage risks, particularly of the subprime kind. This essay seeks to understand the origins of such an error. Economists most often advance the perverse incentive structure as the cause. This is a valid point, but it only provides a partial explanation. This text explores another hypothesis: the difficulty inherent in predicting the future when agents face uncertainty of a Knightian or Keynesian type. It seeks to show that economic uncertainty is of this type. Probability calculus cannot be applied to it. For that reason, economic uncertainty evades the only available method of prediction: statistical inference. Consequently, in a Knightian world, there is no such thing as an objective evaluation of risk. This point is illustrated by examining the case of the US presidential elections of 2000.

This text is based on the presentation, The Keynesian Concept of Uncertainty, given by André Orléan on 25 November 2009 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Prisme N°17

A Time of Probabilistic Experience: The Theory of Stochastic Processes and Its Role in Financial Markets

Nicole El Karoui & Michel Armatte

Prisme N°17 February 2010 (English) (560.6 KiB)

Probabilists are often interested in the history of their discipline, and more rarely in the fundamental questions that they could ask about the facts they model. Works like Augustin Cournot: Modelling Economics, and especially the chapter by Glenn Shafer, throw light on some of my experience in the domain of probability over the last 40 years, which began in 1968, at the end of the first year of my Ph.D. They have prompted me to present my own point of view here.

I had the good fortune to participate in an extraordinary moment in the development of probability, more precisely the theory of stochastic processes. This was an unforgettable period for me. At the time, I had the feeling that I was witnessing science — probability theory — in the making. Subsequently, (rather by chance, it must be said) I switched over to the side of probability users, about 20 years ago, by focusing my research on one particular sector of finance. In the present text, I shall try to explain what interested me in the approach and in this aspect of finance on which I am still working today. To begin with, my account will be that of a pure probability theorist, and then that of an applied probabilist.

This text is based on the presentation, The Autonomization of Probability as a Science: The Experience of a Probabilist, given by Nicole El Karoui on 18 September 2008 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Prisme N°16

Can One Take Risks in the Absence of Reliable Information? On Decision-Making since Condorcet

Pierre-Charles Pradier

Prisme N°16 March 2010 (331.3 KiB)

Condorcet proposed a principle of reasonable probability: actions entailing a prohibitive risk with a non-negligible probability should not be taken. This principle guides the development of knowledge as much as it guides the action itself. The mathematics developed by Laplace has allowed for the effective application of this principle in mathematical statistics (point estimates combined with a high confidence level) or in the management of insurance companies (calculating the loading rate to ensure the solvency of the company). During the same period, Tetens was developing related ideas – though with less mathematical efficacy. These ideas from the 18th century still apply today, both in (the interpretation of) certain modern decision models and in the informational and legal requirements that should be enforced to ensure that financial decisions are rational.

This text is based on the presentation, The Probabilization of Risk, given by Pierre-Charles Pradier on 30 September 2009 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Prisme N°15

An Economic Analysis of Fair Value: Accounting as a Vector of Crisis

Vincent Bignon, Yuri Biondi & Xavier Ragot

Prisme N°15 August 2009 (English) (537.1 KiB)

European legislation took its essential inspiration from the logic of historical cost: the valuation of balance-sheet assets was grounded in the depreciated historical cost of their acquisition. In July 2002, the European Parliament’s adoption of new accounting standards for quoted companies, which took effect 1 January 2005, oriented European accounting towards the new principle of fair value. Its introduction has imposed the determination of the value of assets by the present value of the expected profits that these assets can generate. It has involved establishing the value of each asset according to its future contribution to the profit of the business.

Contemporary research, however, does not have as its ultimate goal the replacement of historical cost by fair value. Recent work analysing business production processes pleads, on the contrary, for limitation of its usage. Three concepts summarize this work: asymmetry of information, complementarities, and specificities of assets employed. Firms create wealth by making assets complementary, because they add to these assets characteristics specific to the production process deployed. These supplementary characteristics have no market value, and thus the value of each asset for a firm is always greater than its resale value. Consequently, the specificity of an asset is defined by the difference between its value for the firm and its market value. In order to preserve the competitive advantage flowing from this combination of specific assets, it is necessary to keep this type of information secret: hence, there exists an asymmetry of information between the firm and its environment.

In this context, the criterion of fair value poses important problems of asset valuation: the specificity and complementarity of assets force accountants to use valuation models in order to determine asset values. Financial analysts have recourse to such models in order to value businesses. The use of these models for accounting purposes does not, however, ensure the reliability of accounts; in effect, small changes in the assumptions can lead to large variations in the results. The purpose of accounting is rather to constitute a source of independent information, in a form that is relevant to valuation by financial markets.

In addition to the valuation problem, the principle of fair value may introduce the problem of financial volatility into accounting. The existence of excessive financial market volatility, which is demonstrable theoretically and empirically, creates superfluous risk and tends to reduce the investment capacity of firms. Lastly, fair value reinforces financial criteria to the detriment of the other valuation criteria of management teams. All stakeholders in the business, including shareholders and institutional investors, risk being its victims.

The financial crisis that began in the summer of 2007 confirms the intrinsic flaws of the fair-value accounting model. It did not help to prevent the crisis, but rather deepened it. Accounting must be an instrument of control and regulation, independent of the market and centred on the firm as an enterprise entity, without following daily market values. Accounting must thus establish itself as a central institution of market economies, essential to the functioning of the markets and in accordance with the public interest.
Prisme N°14

Why Does Modern Capitalism Need the Working Poor?

Bernard Gazier

Prisme N°14 December 2008 (English) (426.5 KiB)

This short essay explores the apparent paradox of the “working poor” – persons remaining in poverty despite their working status. While it seems that the existence of the working poor is an inescapable by-product of capitalism, the size and modalities of this phenomenon vary considerably among countries.

The first section examines the various definitions of the working poor. Although great efforts have been made to gain a better statistical understanding and measurement of the working poor, researchers and governments are far from agreeing on one single definition. On the contrary, a set of different approximations, mixing low earnings, family composition and tax effects, are necessary for capturing what is a hybrid reality. The second section is devoted to a critical assessment of some selected empirical and comparative studies on Europe. They confirm the strong diversity in possible definitions, as well as in national situations and developments. They also suggest that a major role is played by institutions, not only transfers, but also the segmentation and organization of the labour market. The last section presents different theoretical perspectives on the working poor. It insists on the functional role played by low wages and the activation of social policies in jointly controlling the labour market and the workforce. Some public policy issues could contribute to mitigating this functional role.
Prisme N°13

History Repeating Itself before Economists: A Financial Crisis Foretold

Robert Boyer

Prisme N°13 November 2008 (English) (1.4 MiB)

Finance can contribute to growth through various mechanisms: the transfer of savings from lenders to borrowers, the smoothing of investment and consumption profiles over time, or the transfer of risk. Financial innovations have their own characteristics: the result of private profit-seeking strategies, new financial products can spread very fast, because their production process is immaterial. This rapid diffusion can have a significant impact on macroeconomic stability. Financial history shows that the effects of financial innovation, ultimately favourable to growth, materialize through a succession of crises and efforts at regulation to avoid their repetition. Historical analysis, unlike the theories that postulate the stability and efficiency of financial markets, also allows us to detect the emergence of financial crises. The crisis triggered by the subprime mortgage meltdown is no exception.

The sequence: “private financial innovation, diffusion, entry into a zone of financial fragility, open crisis” does not stem from the irrationality of agents’ behaviour. Is it then possible to avoid a financial crisis? Why not apply the same sort of certification procedures to financial innovations as we impose on food products, drugs, cars, public transport, banking and insurance? Up until now, the omnipotence of finance has prohibited any such public intervention.
Prisme N°12

Towards a Probabilist Theory of Living Organisms

Thomas Heams

Prisme N°12 September 2008 (English) (551.8 KiB)

Biology has long been dominated by a deterministic approach. The existence of a genetic code, even a “genetic programme”, has often led to descriptions of biological processes resembling finely regulated, precise events written in advance in our DNA. This approach has been very helpful in understanding the broad outlines of the processes at work within each cell. However, a large number of experimental arguments are challenging the deterministic approach in biology. One of the surprises of recent years has been the discovery that gene expression is fundamentally random: the problem now is to describe and understand that. Here I present the molecular and topological causes that at least partly explain it. I shall show that it is a widespread, controllable phenomenon that can be transmitted from one gene to another and even from one cell generation to the next. It remains to be determined whether this random gene expression is a “background noise” or a biological parameter. I shall argue for the second hypothesis by seeking to explain how this elementary disorder can give rise to order. In doing so, I hope to play a part in bringing probability theory to the heart of the study of life. Lastly, I shall discuss the possibility of moving beyond the apparent antagonism between determinism and probabilism in biology.

This text is based on the presentation, The Random Expression of Genes: Probability or Biological Parameter?, given by Thomas Heams on 20 March 2008 at the Cournot Centre’s seminar “The Probabilism Sessions”.
Prisme N°11

Is Fiscal Stimulus Effective?

Edouard Challe

Prisme N°11 November 2007 (English) (254.3 KiB)

This article examines the way in which fiscal policy impulses (variations in government spending and tax cuts) affect aggregate variables such as GDP, consumption, investment and employment.

Economic theory distinguishes three potential channels of transmission for these impulses, according to whether they affect the equilibrium through their wealth effects, their aggregate demand effects, or their liquidity effects.

We therefore intend to evaluate the extent to which these theoretical channels are consistent with the empirically observed impacts of fiscal stimulus. Although economists have traditionally focused their attention on wealth effects and aggregate demand effects, traditionally associated with the “classical” and “Keynesian” paradigms, recent works on the subject show that liquidity effects also play an important role. Finally, in the presence of aggregate demand effects and liquidity effects, fiscal stimulus is all the more effective over the short term when it is financed by government debt issue. The gains achieved through debt-financed stimulus can, however, conflict with the social costs resulting from high levels of long-term public debt, and this raises a specific problem concerning the dynamic consistency of fiscal policy.
Prisme N°10

The Distinctive Character of Japanese Corporate Transformation

Masahiko Aoki

Prisme N°10 September 2007 (English) (186.3 KiB)

How should one interpret the changes in Japan’s company structure that have been affecting the Japanese economy since the early 1980s? This text proposes a conceptual framework from the firm’s point of view, after examining empirical evidence. Has Japan’s corporate governance made a substantive institutional transformation, and, if so, in which direction? Four stylized analytical models of corporate governance are presented, and the conditions in which each would be viable are identified.

Using this theoretical background, the text examines the driving forces, as well as the historical constraints, of the changes taking place in Japan. The nature of the on-going institutional changes in Japan’s corporate governance can be interpreted as a possible transition from the traditional bank-oriented model to a hybrid model, built on the combination of managerial choice of business model, employees’ human assets, and stock-market evaluations. No single mechanism has emerged as dominant, but a variety of patterns seems to be evolving.
Prisme N°9

Constructing a Fiscal Territory in Europe: Prohibition, Harmonization, Approximation, Guarantee and Information

Évelyne Serverin

Prisme N°9 March 2007 (English) (176.5 KiB)

Community law has created a fiscal territory by using a series of legal techniques founded on different parts of the Treaty. By dividing Europe’s complex legal structures into five categories of action, we seek to elucidate the foundations, and nature, of EU tax policy. We can separate these legal techniques into two groups, depending on whether they involve taxation, in the strict sense of the word, or other rights and liberties.

Three actions are directly associated with taxation: prohibition, harmonization and approximation. Two other actions do not affect substantial tax law, but influence its application: guaranteeing the exercise of fundamental liberties and informing the Member States. We shall evaluate the relative weight of these two groups of actions through an analysis of the instruments available to the Community and their associated jurisprudence.
Prisme N°8

Is the Construction of Europe Possible? Policy Objectives, Legal Norms and Public Goods

Robert Boyer & Mario Dehove

Prisme N°8 November 2006 (English) (671.2 KiB)

In the introduction of technical norms and the free circulation of goods and people, as in the harmonization of indirect taxes or the portability of social rights, the principle of competition dominates over all other principles in the building of Europe. This primacy of competition has aroused the distrust of many citizens regarding the Union and is now obstructing the emergence of public goods in Europe. While economic theory provides satisfactory explanations of public goods management, it has great difficulty in analysing their genesis. This helps to explain the discrepancies between the theory’s predictions and the empirically observable distribution of powers.

Theories of justice maintain that the persistence of strong national traditions in areas such as professional relations or the expression of solidarity make the construction of a social Europe more difficult. Legal analysis highlights the decisive role played in all member states by judges and courts, whose jurisprudence continuously and practically delimits the role and prerogatives of all the players. By so doing, they create the conditions for a review of the allocation of these powers by the political authorities. The necessary reconstruction of European institutions must then anticipate the formation of new public goods as diverse as security and justice, science and energy security.
Prisme N°7

From Cournot to Public Policy Evaluation: Contradictions and Controversies over Quantification

Alain Desrosières

Prisme N°7 April 2006 (English) (258.2 KiB)

The French mathematician, economist and thinker Augustin Cournot inaugurated the philosophical treatment of the new probabilistic and quantitative modes of reasoning that emerged in the first half of the 19th century. The text reviews the legacy and implementation of Cournot’s intuitions concerning the distinction between so-called objective and subjective probabilities, and the interpretation of the categories constructed by statisticians according to “equivalence conventions”. Suggestive clues emerge for the empirical study of current statistical practices, in particular, those transactions that take place in the “contact zone”, where quantified assertions recorded in more or less formal models replace unquantified assertions formulated in natural language. Examples of these exchanges are illustrated in the cases of risk management, macroeconomic policy and public service performance evaluation.

In conclusion, the paper highlights how the ambivalence of Cournot’s thought is echoed in the controversies raised in some recent sociology of science, polarized between diverse forms of “realism” and “constructivism”. Questions suggested by Cournot are the starting point for an exploration of the sense in which quantification can be said to create objectivity.
Prisme N°6

Patent Fever in Developed Countries and Its Fallout on Developing Countries

Claude Henry

Prisme N°6 May 2005 (English) (135.4 KiB)

Read more
This paper examines the relationship between innovation and intellectual property rights. Over the past 25 years, the traditional balance between patent legislation and knowledge as a public good has started to shift in favour of the former. The global, uniform, but flawed approach to patenting systems, driven by the United States and reflected in the TRIPS agreement, will cause negative externalities for developing countries. The paper suggests that these effects might be mitigated through appropriate instruments and prudent transposition of the TRIPS agreement into national legislation. It argues that the legal and economic foundations that have underpinned traditional intellectual property rights remain valid. Recent trends in approaches to intellectual property rights, including patent proliferation and geographical spread, are critically examined against the background of US-sponsored linkage of patent protection with free trade agreements. Examples from the life sciences and biotechnology illustrate the problems of unwarranted patents and excessive patent breadth, reinforcing doubts about the current uniformization of intellectual property rights protection, and highlighting the risk to innovation and development policy. In the final section, the paper explains how two developing countries have invoked the remedial measures embedded in the TRIPS agreement. These mechanisms include interpretative freedom, opposition procedures and compulsory licences. The paper concludes that, from a Schumpeterian viewpoint, the "open source" approach reconciles the main factors governing innovation better than patent-based protection does.
Prisme N°5

From Financial Capitalism to a Renewal of Social Democracy

Michel Aglietta & Antoine Rebérioux

Prisme N°5 October 2004 (English) (206.6 KiB)

Read more
Recent corporate governance scandals have brought to the fore the inherent contradictions of a capitalism dominated by financial markets. This text argues that capitalism’s basic premise – that companies be managed in the sole interest of their shareholders – is incongruent with the current environment of liquid markets, profit-hungry investors and chronic financial instability. In this context, this text also analyses the financial scandals of the Enron era, going beyond the malfunctioning of the gatekeepers (auditors, financial analysts, ratings agencies) to stress the failure of shareholder value and the inadequacy of measures intended to prevent such scandals.

A company should be managed as an institution where common objectives are developed for all stakeholders, and this democratic principle should be extended to the management of collective savings to reduce macro-financial instability. These two conditions could make contemporary capitalism a vehicle for social progress, giving shape to a new kind of social democracy.

This Prisme presents the conclusions of Corporate Governance Adrift: A Critique of Shareholder Value, published by Edward Elgar Publishing in 2005.
Prisme N°4

An Economic Analysis of Fair Value

Vincent Bignon, Yuri Biondi & Xavier Ragot

Prisme N°4 March 2004 (English) (234.0 KiB)

Read more
In July 2002, the European Parliament’s adoption of new accounting standards for quoted companies, to take effect from January 1, 2005, oriented European accounting towards a new principle, that of fair value. Hitherto, European legislation took its essential inspiration from the logic of historical cost: the valuation of balance sheet assets was grounded in the depreciated historical cost of their acquisition. The introduction of the principle of fair value will impose the determination of the value of assets by the present value of the expected profits that these assets can generate. It involves establishing the value of each asset according to its future contribution to the profit of the business.
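The contrast between the two accounting principles can be sketched numerically. The snippet below is a minimal illustration, not taken from the Prisme itself: it compares a fair-value estimate (the discounted sum of expected future profits) with a straight-line depreciated historical cost, using invented figures purely for exposition.

```python
# Illustrative sketch of the two valuation principles discussed above.
# All numbers and function names are hypothetical, chosen for exposition.

def fair_value(expected_profits, discount_rate):
    """Present value of expected future profits: sum of E[pi_t] / (1+r)^t."""
    return sum(profit / (1 + discount_rate) ** t
               for t, profit in enumerate(expected_profits, start=1))

def historical_cost(acquisition_cost, annual_depreciation, years_held):
    """Straight-line depreciated acquisition cost, floored at zero."""
    return max(acquisition_cost - annual_depreciation * years_held, 0.0)

# An asset expected to yield 30 per year for four years, discounted at 5%:
fv = fair_value([30, 30, 30, 30], 0.05)   # about 106.4
# The same asset bought for 100, depreciated by 20 per year, held two years:
hc = historical_cost(100.0, 20.0, 2)      # 60.0
```

The gap between the two figures is the crux of the debate: fair value depends on forecast profits and a chosen discount rate, both of which are model assumptions, while historical cost depends only on the recorded transaction.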

Contemporary research, however, does not have as its ultimate goal the replacement of historical cost by fair value. Recent work analysing business production processes pleads, on the contrary, for limiting its use. Three concepts summarize this work: asymmetry of information, complementarities, and the specificity of the assets employed. Firms create wealth by making assets complementary, because they add to these assets characteristics specific to the production process deployed. These supplementary characteristics have no market value, and thus the value of each asset for a firm is always greater than its resale value. Consequently, the specificity of an asset is defined by the difference between its value for the firm and its market value. In order to preserve the competitive advantage flowing from this combination of specific assets, it is necessary to keep this type of information secret: hence, there exists an asymmetry of information between the firm and its environment.

In this context, the criterion of fair value poses important problems of asset valuation: the specificity and complementarity of assets force accountants to use valuation models in order to determine asset values. Financial analysts have recourse to such models in order to value businesses. The use of these models for accounting purposes does not, however, ensure the reliability of accounts; in effect, small changes in the assumptions can lead to large variations in the results. The purpose of accounting is rather to constitute a source of independent information, in a form that is relevant to valuation by financial markets.

In addition to the valuation problem, the principle of fair value may introduce the problem of financial volatility into accounting. The existence of excessive financial market volatility, which is demonstrable theoretically and empirically, creates superfluous risk and tends to reduce the investment capacity of firms. Lastly, fair value reinforces financial criteria to the detriment of the other valuation criteria of management teams. All stakeholders in the business, including shareholders and institutional investors, risk being its victims.

It is difficult to affirm that the net contribution of fair value to the improvement of accounting standards is positive. Though far from ideal, the logic of historical cost appears to be the least bad option in the presence of informational asymmetries, complementarities and specificities.
Prisme N°3

Is the European Union Condemned to Conservative Reform?

Bruno Amable

Prisme N°3 January 2004 (English) (387.4 KiB)

Read more
The subject of reform is at the heart of current economic debate in Europe. The “Sapir Report” is the latest example. It denounces the institutions of the European model for keeping the European Union from growing at a sufficient pace. These institutions, it claims, are creating roadblocks to structural changes, changes that have been made vital by the important role of innovation in today’s world. The Report claims that the answer lies in implementing reforms to increase “microeconomic” efficiency.

This text critically examines this argument. If Europe were to adopt these reforms, European countries would have to switch to a different model of capitalism. That would mean abandoning the European model – characterized by a high degree of social security and employment protection – for the neo-liberal model, with its reduced social security and flexible labour markets.

This booklet compares the growth and innovation performances of France and Germany with those of the U.K. and the U.S., as well as with those of Sweden and Finland. These comparisons reveal the need to question, at the very least, the current rhetoric of the uncontested superiority of the neo-liberal model. I underline that even if the different models are capable of providing comparable overall performances, they do not have the same consequences in terms of income distribution and coverage of social risks.

Consequently, choosing a model, by its very nature, is a political choice. Therefore, choosing the reform means choosing to forge ahead with changes that took place during the Conservative Revolution in the U.S. and the U.K. (i.e. the Margaret Thatcher and Ronald Reagan years). A new dimension of this debate is that centre-left parties are adopting the political project of converting to the neo-liberal model, which is usually only associated with conservative parties.

This text concludes by examining two scenarios of structural change. The first scenario envisages the completion of the reform and the transformation of the European model to a neo-liberal one. The second scenario involves a transition towards a social-democratic model of capitalism. Neither scenario is without significant political consequences.
Prisme N°2

Lessons from Welfare Reform in the United States

Robert Solow

Prisme N°2 November 2003 (English) (108.4 KiB)

Read more
The 1996 U.S. Welfare Reform Act concentrates, almost solely, on getting people to work and off socially assisted programs. The reform has produced changes in the structure of benefits, introduced time limits, strengthened requirements for mandatory participation in work-related activities and changed various administrative procedures. The implementation of this federal Act has been largely left to the discretion of the individual states.

The law has been in effect for seven years and is up for reauthorization. It is thus time to assess its mechanisms and outcomes. Welfare reform is responsible for a portion of the increase in beneficiaries' work and earnings. Most evidence from econometric studies points in this direction. Many of these studies, however, overlook the fact that employment and the demand for welfare assistance are heavily influenced by macroeconomic factors, among other things.

In this booklet, Robert Solow evaluates these analyses and provides direction for future reforms.
Prisme N°1

How Should the European Commission's Takeover Bid Directive Be Interpreted?

Jean-Louis Beffa, Leah Langenlach & Jean-Philippe Touffut

Prisme N°1 September 2003 (English) (262.3 KiB)

Read more
Launched in 1974, the idea of harmonizing public takeover bid legislation found its first expression in 1985 in a draft Directive. This early draft was rightly rejected in July 2001. Bolstered by 30 amendments, a second version of the Directive was adopted on December 16, 2003.

The initial objective of the Directive was to promote a common framework for cross-border takeovers, to facilitate corporate restructuring and to protect minority shareholders. In the interim between the rejection of the early draft and the adoption of the second proposal, three contentious articles generated extreme tension: the neutrality of the board of directors in the event of a takeover bid, restrictions on transfers of securities and multiple voting rights, and consultation with workforce representatives. The amendments adopted on these questions by the legal affairs committee of the European Parliament weaken the content of the Directive. It is left to EU member states to decide whether or not to apply the articles on the neutrality of the board of directors and on the exercise of multiple voting rights in the event of a public bid. With this optional feature comes an unprecedented "reciprocity" clause. Nevertheless, the spirit of the Directive is unaltered: no article was withdrawn.

One question has not received adequate consideration in this debate: should takeover bids be encouraged? Takeover bids are one of the constitutive principles of a mode of capitalism propelled by the dynamics of financial markets. In economics, theoretical studies of public bids have been complemented by econometric analyses and field research. These show that public bids do not contribute to economic growth. Over the last 30 years, more than two-thirds of public bids have led to a decrease in business productivity and have contributed to a reduction in the overall economic growth rate. In light of this fact, should a Directive on Takeover Bids comply with financial logic, to the detriment of industrial logic? Research indicates that, on the contrary, safeguards necessary to protect firms from the instability of finance should be constructed.