October 20, 2014

Nobel to Jean Tirole, the French Economist ….

As I was anxiously waiting for the announcement of this year’s Nobel in economics, hoping to hear a name more pleasing to my ears (sure, you can easily guess my stupidity: the name I was looking forward to), on October 13th the Academy awarded this year’s Sveriges Riksbank Prize in Economics to Jean Marcel Tirole, the French professor from the Toulouse School of Economics, in recognition of his important theoretical contributions clarifying “how to understand and regulate industries with a few powerful firms.”

The significance of his contributions to the field of economics can well be gauged from the mere fact that the Academy chose a non-American for the award after more than a decade. Jean Tirole, who obtained his first degree in engineering from one of the most prestigious Grandes Écoles in France and later a PhD in economics from MIT under the supervision of Eric Maskin, himself a Nobel Laureate, is the third French economist to win the Nobel in economics.

Traditionally, economists engaged in regulatory economics based much of their work on two benchmark cases: a monopoly firm, or the frictionless, hypothetical world of perfect competition in which all firms are identical and enjoy no market power. Accordingly, they designed simple policies that governments could adopt to regulate all industries.

Moving away from this traditional approach, Tirole argued that a regulatory mechanism that works well in one industry may not work well in another, for, in the real world, markets are dominated not by a single firm but by a small number of big firms with market power: in other words, oligopolies. He built simple and elegant mathematical models based on the fundamental features of various industries and their specific characteristics. Because these models incorporate strategic behavior and information economics, they afford a better understanding of the interaction between firms, paving the way for governments to design optimal regulatory policies.

Thus, making use of modern microeconomic techniques such as game theory, asymmetric information and contract theory, Tirole brought a new standard of rigor to the theory of regulation, capturing specific economic environments while giving an orderly shape to an otherwise unwieldy literature. Using the game theory framework, Tirole explained how the consequences of regulation depend not only on what action the regulator takes, but also on how the few big firms interact with each other. He stressed that ‘information asymmetry’ keeps regulators unaware of firms’ actual costs of production, because of which regulations launched by governments can at times produce unintended consequences.
  
For instance, a government, fearing that, say, pharmaceutical firms are likely to exploit consumers by charging high prices for drugs, may impose price caps. But such price caps tend to induce firms to cut costs so aggressively that only a few firms remain in the market, which might unwittingly allow them to make excess profits. And that is certainly not in the long-term interest of consumers, argues Tirole.

Therefore, his prescription is: offer firms a portfolio of cleverly designed regulatory contracts so that each firm automatically selects the one that best suits its business model. Now the question is: what makes a contract cleverly designed in a given situation? It is these aspects that Tirole studied along with his co-authors, Laffont and others, and, borrowing ideas from auction design and game theory, he suggested that in such situations regulators offer a menu of cost-plus and fixed-price contracts. A high-cost firm with little scope for cutting costs will opt for the cost-plus contract, while an innovative, cost-cutting firm will opt for the fixed-price contract and keep the savings. The firm’s choice automatically reveals to the regulator what sort of firm it is dealing with, which in turn enables the regulator to strike the best deal for consumers.
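A toy numerical sketch of how such a menu makes firms reveal their type (all payments and costs here are invented for illustration; this is not Laffont and Tirole’s formal model):

```python
# Toy illustration of self-selection from a menu of regulatory contracts:
# a fixed-price contract pays a lump sum F (the firm keeps any cost savings),
# while a cost-plus contract reimburses cost plus a small margin m.

def best_contract(achievable_cost, fixed_payment=100.0, cost_plus_margin=5.0):
    """Return the contract a profit-maximizing firm picks, given the lowest
    cost it can achieve through effort or innovation."""
    profit_fixed = fixed_payment - achievable_cost   # firm pockets the savings
    profit_cost_plus = cost_plus_margin              # cost is fully reimbursed
    if profit_fixed > profit_cost_plus:
        return ("fixed-price", profit_fixed)
    return ("cost-plus", profit_cost_plus)

# An innovative firm that can cut costs to 80 keeps the savings: fixed price.
print(best_contract(80.0))   # -> ('fixed-price', 20.0)
# A high-cost firm (98) earns more under cost-plus, revealing its type.
print(best_contract(98.0))   # -> ('cost-plus', 5.0)
```

The regulator never learns the firms’ costs directly; the choices themselves do the screening.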

Going beyond the regulation of monopolies, Tirole also studied the impact of ‘over-investment’ in new technology on competition, and observed that such investment can at times eliminate competition in the market, for rival companies see no business sense in competing in such an environment. Interestingly, in one of his recent publications with Jean-Charles Rochet, he observed that the interconnectedness of modern financial systems is such that big banks cannot be allowed to fail, and the obvious fallout of such a situation is reckless behavior by banks, for they are sure of being bailed out by the exchequer. His suggested check on such moral hazard is to cap their leverage.

It is in the light of these challenges that regulators face in managing competition that Tirole’s work becomes indispensable, and as the Academy observed, his work certainly enables governments to understand the peculiarities of each firm and to “encourage powerful firms to become more productive and, at the same time, prevent them from harming competitors and customers.”

Besides his theoretical contributions to the discipline of economics, Tirole also wrote two textbooks, The Theory of Industrial Organization and, with his classmate Drew Fudenberg, Game Theory, both of which have become iconic in their own right. The former is a distillation of the research in the field of industrial organization, importantly presented in one unified framework that facilitates easy comprehension for graduate students. His book on game theory introduces the principles of non-cooperative game theory, covering strategic-form games, Nash equilibria, subgame perfection, repeated games and games of incomplete information in an easily understandable style, duly accompanied by many applications, examples and exercises.

October 08, 2014

Banking & Research - V

Monetary policy is known to have both short- and long-term effects. Traditionally, monetary policies have aimed at promoting growth, achieving full employment, smoothing the business cycle, averting financial crises, and stabilizing long-term interest rates and real exchange rates. Ideally, it would make greater sense for central banks to have a single overriding objective: price stability.

It is generally believed that monetary policy actions affect the real sector with long and variable lags, while their effects on financial markets usually have short-run implications. The lag after which the impact of monetary policy is felt on the real sector varies from country to country, depending on the transmission mechanism in existence. Secondly, there are always uncertainties associated with transmission channels, depending on the maturity of the financial markets. Central banks, therefore, usually design their monetary strategies and tactics keeping in view the transmission mechanisms in operation.

The most frequently used channels for transmission of monetary policy actions are the quantum channel, the interest rate channel, the exchange rate channel, and the asset prices channel. Policy impulses transmitted through the interest rate, exchange rate, and asset prices channels affect the price level indirectly, while those transmitted through the quantum channel affect it directly.

The exact transmission of monetary policy through these channels is always considered difficult, given the uncertainties in the economic system regarding the responsiveness of economic agents to monetary policy signals and the ability of the monetary authority to assess their impact. This phenomenon is much more complex in developing economies, since their financial systems are constantly evolving.

In view of these uncertainties associated with the transmission of monetary policy initiatives, central banks in industrialized countries often use market-oriented instruments to influence short-term interest rates. In the process, they supply liquidity to the money market at a price, which they set as a target interest rate, in the hope that it will ‘pass through’ the system to influence the array of short-term money market rates set by banks and other financial intermediaries. In other words, if monetary policy actions are to be effective, official rate changes should be completely passed through to market and retail rates over a reasonably short period. Against these theoretical underpinnings, practical experience indicates that official rate changes do not always transmit instantaneously to retail rates. Earlier studies by Paisley (1994) and Heffernan (1997), using conventional linear methods to investigate these relationships, have given mixed evidence on the pass-through.

Incidentally, the RBI Annual Report 2003-04 echoes a similar observation pertaining to India: “The key issue confronting the conduct of monetary policy through the LAF operations is to transmit short-term interest rate signals to the long end of the yield curve. This process will determine the efficacy of monetary policy in pursuit of macroeconomic stabilization. Some preliminary results from work in progress indicate that during April 2000 to March 2003, a 100 basis points reduction in the Bank Rate led to about 40 basis points moderation in the Prime Lending Rate (PLR) in the same month and by another 38 basis points in the subsequent months. As regards the ‘pass through’ to yields on government securities, a 100 basis points reduction in the Bank Rate induced a moderation of 66 basis points in the yield on a 10-year government paper in the same month. Thus, policy signals have an impact on the yield curve and the PLR in the desired direction albeit at less than the desired pace”.
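The RBI figures just quoted can be restated as simple pass-through ratios; a back-of-the-envelope calculation using only the numbers in the Report:

```python
# Back-of-the-envelope pass-through ratios implied by the RBI figures above.
bank_rate_cut_bps = 100          # policy action: 100 bps Bank Rate reduction

plr_same_month_bps = 40          # PLR moderation in the same month
plr_subsequent_bps = 38          # further PLR moderation in subsequent months
gsec_10y_bps = 66                # 10-year G-sec yield moderation, same month

plr_total_bps = plr_same_month_bps + plr_subsequent_bps
print(f"PLR pass-through (cumulative): {plr_total_bps / bank_rate_cut_bps:.0%}")
print(f"10-year G-sec pass-through (same month): {gsec_10y_bps / bank_rate_cut_bps:.0%}")
# Cumulative PLR pass-through works out to 78%, the G-sec figure to 66%:
# in the desired direction, but incomplete, as the Report notes.
```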

The paper entitled “Base Rate Pass-through: Evidence from Banks’ and Building Societies’ Retail Rates” by Prof. Paul Mizen and Prof. Boris Hofmann provides a theoretical and econometric framework for assessing the commonly held belief that base rates ‘pass through’ to retail rates, using 14 years of monthly data on interest rates for deposit and mortgage products offered by UK banks and building societies. The study reveals “complete ‘pass through’ to be the norm in the long run for deposit rates, but not for the mortgage rates”. It also finds that “endogenous and exogenous non-linearities increase the adjustment speeds of retail rates to base rates quite dramatically when the ‘gap’ between the retail rate and the base rate is widening but slow down the adjustment when base rates are moving in a direction that will ‘automatically’ close the gap”.
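The asymmetry the authors report can be caricatured in a few lines. This is a toy partial-adjustment rule with invented adjustment speeds, not the paper’s econometric specification: retail rates chase the base rate quickly when the gap is widening, but sluggishly when a base-rate move is already closing it.

```python
def retail_adjustment(retail, prev_base, new_base, fast=0.8, slow=0.2):
    """One-period update of a retail rate toward the base rate.

    Toy rule: if the base-rate move runs against the existing gap (closing
    it 'automatically'), the retail rate adjusts slowly; if the move widens
    the gap, the retail rate adjusts fast.
    """
    prev_gap = prev_base - retail
    base_move = new_base - prev_base
    speed = slow if prev_gap * base_move < 0 else fast
    return retail + speed * (new_base - retail)

# Base rate jumps from 5.0% to 6.0%: the gap widens, so adjustment is fast.
r1 = retail_adjustment(5.0, 5.0, 6.0)   # 5.0 + 0.8 * 1.0 = 5.8
# Base rate then eases back toward the retail rate: sluggish adjustment.
r2 = retail_adjustment(r1, 6.0, 5.9)    # 5.8 + 0.2 * 0.1 = 5.82
print(r1, r2)
```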

The need for a sound financial system has perhaps led to more regulatory intervention in the banking industry than in any other. Traditionally, this intervention has mostly been of the ‘command and control’ type, but owing to embedded informational asymmetry, such intervention has been found to create moral hazard problems, shifting the emphasis to the “incentive compatible” approach. Though this approach sounds theoretically superior, it has its own share of practical problems, such as the very limited scope for back-testing models that predict the very large risks which are potentially the most dangerous for the solvency of banks. In this context, “command and control” regulation continues to be the normal style of regulatory intervention, under which “capital requirements” have emerged as the most important regulatory instrument. Some studies have revealed that capital requirements affect the profitability of efficient banks more than that of inefficient banks. This finding has now been questioned by Peter JG Vlaar in his article, “Capital Requirements and Competition in the Banking Industry”, with a firm conclusion that the distortionary effects of capital requirements are mild.

As a sequel to this, the article, “Issues of Political Economy in Banking Regulation: A Survey”, presents an overview of the current status of the regulatory intervention by the government agencies in the banking industry across the globe under the heads: Rationale for banking regulation, instruments of banking regulation, endogenous regulation, failure of government in regulating banking, and banking regulation as a dialectic process.

The article “Efficiency and Profitability in Indian Public Sector Banks” analyzes operating efficiency and its relationship with profitability in the Indian public sector banking industry using the statistical tool known as Data Envelopment Analysis. The study reveals that the banks affiliated to the SBI group are more efficient than the nationalized banks, and that profitability significantly influences operating efficiency.

Keeping in view the growing momentum of the wireless revolution and the mCommerce explosion, the last article of this issue, “Reconsidering the Challenges of mPayments: A Roadmap to Plotting the Potential of the Future mCommerce Market”, discusses the current status of mCommerce and the expected growth potential for mPayments in the coming decade. The paper traces the key challenges to successful mPayments, particularly in the absence of common standards. To sum up, the article provides “an insight into the key challenges surrounding the mPayments’ environment, with a view to providing a framework for reassessing the future directions for a more successful market”.



IUPJBM Vol. II No.4 Nov, 2003
