Sunday, August 28, 2011

Mid-Autumn Festival (Trung thu): Capitalism and Socialism


Every year as the eighth lunar month draws near, the market welcomes a product that appears for only about three months: the mooncake. Nobody knows how many cakes the manufacturers turn out each year, but it is certainly not a small number. [Kinh Đô alone produced 1,900 tonnes of cakes in 2010; at an average weight of 300 g per cake, that is roughly 6,300,000 cakes. Bibica turned out 500 tonnes, equivalent to about 1,700,000 cakes. Add in other brands such as Đồng Khánh and the total is probably somewhere around 10,000,000 cakes.] Why can we say the quantity is excessive even without official figures? One peculiarity of the mooncake is that it is, in the end, a cake, which means it cannot be kept for more than four or five months. Producers are therefore forced to dump any leftover stock at fire-sale prices [since it cannot sit in a warehouse]. There is no need to wait until the full moon of the eighth month: a week beforehand, when demand is at its very peak, mooncake prices fall across the board [in various guises, such as buy-one-get-one-free]. If supply matched demand even approximately, this could not happen. Since it happens on a large scale [prices are cut deeply], we can conclude that there is a surplus of cakes. Unsold mooncakes will certainly be destroyed, and that means wasted resources.
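The bracketed arithmetic above is mechanical enough to check in a few lines. A quick sketch, taking the tonnages from the text and treating the 300 g average cake weight as the working assumption:

```python
# Back-of-the-envelope check of the mooncake figures quoted in the text.
# Tonnages are from the text; 300 g per cake is the text's assumption.
def cakes_from_tonnes(tonnes, grams_per_cake=300):
    """Convert a production tonnage into an approximate cake count."""
    return int(tonnes * 1_000_000 / grams_per_cake)

kinh_do = cakes_from_tonnes(1900)   # Kinh Đô, 1,900 tonnes in 2010
bibica = cakes_from_tonnes(500)     # Bibica, 500 tonnes

print(f"Kinh Do: ~{kinh_do:,} cakes")   # ~6,333,333
print(f"Bibica:  ~{bibica:,} cakes")    # ~1,666,666
```

The two figures together already put the big brands near eight million cakes, which is why a total of about ten million across all producers is a plausible guess.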

One thing worth noting is that this phenomenon is not limited to mooncakes. Many goods governed by the market mechanism can fall into the same state. When it occurs on a larger scale, we get what is called a crisis of overproduction, one of the causes of crises in economies that follow capitalism.

Marx offered us a solution: socialism, the centrally planned economy. Instead of letting the market decide output, which often leads to wasteful overproduction, the state should set everything. Ninety million Vietnamese consuming 9,000,000 cakes sounds reasonable. The cakes produced are then delivered to everyone with genuine need [sent down to grassroots units such as the ward for distribution]. Whoever needs a cake gets a cake, and society attains full efficiency.

However, Marx perhaps forgot, or deliberately ignored, a few problems. In this mooncake example, the state might obtain an accurate count of total demand, but it cannot know exactly which varieties and flavours are wanted. So instead of mixed fillings with roast chicken, roast pork, char siu, shark fin and so on, we would receive either mung bean cakes [who knows how many egg yolks] or some generic mixed filling. Those on diets, or vegetarians, would simply have to go without, since their demand is too small compared with everyone else's. Everyone would still get a cake, just not one matching 100 per cent of their wishes. Second, with output fixed in this way, producers lose the incentive to find ways to cut costs. The economy would rarely suffer a crisis, but in exchange it would mostly move sideways or grow slowly. Capitalism, by contrast, may often lead to crises, yet the economy still enjoys long stretches of prosperous growth, and, more importantly, behind those crises come new breakthroughs in theory, science and technology.

That is one of the reasons why capitalism still survives and thrives to this day.

Kz
Zed
August, 2011

Friday, August 26, 2011

Economics: Rituals of rigour

 http://www.ft.com/cms/s/0/faba8834-cf09-11e0-86c5-00144feabdc0.html#ixzz1WBOG83VG


After mistaken claims made ahead of the global crisis won much academic support, long-held assumptions were called into question – but the real world often remains overlooked or ignored
The reputation of economists, never high, has been a casualty of the global crisis. Ever since the world’s financial system teetered on the abyss following the collapse of Lehman Brothers three years ago next month, critics from Queen Elizabeth II downwards have posed one uncomfortable yet highly pertinent question: are economists of any use at all?
Some of this criticism is misconceived. Specific predictions of economic growth or levels of the stock market – gross domestic product will rise by 1.8 per cent; the FTSE 100 index will stand at 6,500 by year-end – assert knowledge that those making such predictions cannot have. Economic systems are typically dynamic and non-linear. This means that outcomes are likely to be very sensitive to small changes in the parameters that determine their evolution. These systems are also reflexive, in the sense that beliefs about what will happen influence what does happen.
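To make the dynamic, non-linear point concrete, here is a minimal sketch using the logistic map, a standard textbook chaotic system (deliberately not an economic model): two trajectories whose starting points differ by one part in a million soon bear no resemblance to each other.

```python
# Sensitivity to small parameter/initial-condition changes in a simple
# non-linear system: the logistic map x_{t+1} = r * x_t * (1 - x_t),
# which is chaotic at r = 4.
def trajectory(x0, r=4.0, steps=30):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.400000)
b = trajectory(0.400001)  # starting point shifted by one part in a million

print(f"gap at t=0:  {abs(a[0] - b[0]):.1e}")
print(f"gap at t=30: {abs(a[30] - b[30]):.3f}")  # grown by many orders of magnitude
```

The same fragility afflicts any point forecast of such a system, which is why specific predictions assert knowledge the forecaster cannot have.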
If you ask why economists persist in making predictions despite these difficulties, the answer is that few do. Yet that still leaves a vocal minority who have responded cynically to the insatiable public demand for forecasts. Mostly they are employed in the financial sector – for their entertainment value rather than their advice.
Economists often make unrealistic assumptions but so do physicists, and for good reasons. Physicists will describe motion on frictionless planes or gravity in a world without air resistance. Not because anyone believes that the world is frictionless and airless, but because it is too difficult to study everything at once. A simplifying model eliminates confounding factors and focuses on a particular issue of interest. This is as legitimate a method in economics as in physics.
Since there are easy responses to these common criticisms of bad predictions and unrealistic assumptions, attacks on the profession are ignored by professional academic economists, who complain that the critics do not understand what economists really do. But if the critics did understand what economists really do, public criticism might be more severe yet.
Even if sharp predictions of individual economic outcomes are rarely possible, it should be possible to describe the general character of economic events, the ways in which these events are likely to develop, the broad nature of policy options and their consequences. It should be possible to call on a broad consensus on the interpretation of empirical data to support such analysis. This is very far from being the case.
The two branches of economics most relevant to the recent crisis are macroeconomics and financial economics. Macroeconomics deals with growth and business cycles. Its dominant paradigm is known as “dynamic stochastic general equilibrium” (thankfully abbreviated to DSGE) – a complex model structure that seeks to incorporate, in a single framework, time, risk and the need to take account of the behaviour of many different companies and households.
The study of financial markets revolves meanwhile around the “efficient market hypothesis” – that all available information is incorporated into market prices, so that these prices at all times reflect the best possible estimate of the underlying value of assets – and the “capital asset pricing model”. This latter notion asserts that what we see is the outcome of decisions made by a marketplace of rational players acting on the belief in efficient markets.
. . .
A close relationship exists between these three theories. But the account of recent events given by proponents of these models was comprehensively false. They proclaimed stability where there was impending crisis, and market efficiency where there was gross asset mispricing.
Regulators such as Alan Greenspan, former chairman of the US Federal Reserve, asserted that the growth of trade in complex financial investments represented new and more effective tools of risk management that made the economy more stable. As late as 2007, the International Monetary Fund would justify its optimism about the macroeconomic outlook with the claim that “developments in the global financial system have played an important role, including the ability of the United States to generate assets with attractive liquidity and risk management features”.
These mistaken claims found substantial professional support. In his presidential lecture to the American Economic Association in 2003, Robert Lucas of the University of Chicago, the Nobel prizewinning doyen of modern macroeconomics, claimed that “macroeconomics has succeeded: its central problem of depression prevention has been solved”. Prof Lucas based his assertion on the institutional innovations noted by Mr Greenspan and the IMF authors, and the deeper theoretical insights that he and his colleagues claimed to have derived from models based on DSGE and the capital asset pricing model.
The serious criticism of modern macroeconomics is not that its practitioners did not anticipate that Lehman would fall apart on September 15 2008, but that they failed to understand the mechanisms that had put the global economy at grave risk.
Subsequent policy decisions have been pragmatic and owe little to any economic theory. The recent economic policy debate strikingly replays that after 1929. The central issue is budgetary austerity versus fiscal stimulus, and – as in the 1930s – the positions of the protagonists are entirely predictable from their political allegiances.
Why did the theories put forward to deal with these issues prove so misleading? The academic debate on austerity versus stimulus centres around a property observed in models based on the DSGE programme. If government engages in fiscal stimulus by spending more or by reducing taxes, people will recognise that such a policy means higher taxes or lower spending in the future. Even if they seem to be better off today, they will later be poorer, and by a similar amount. Anticipating this, they will cut back and government spending will crowd out private spending. This property – sometimes called Ricardian equivalence – implies that fiscal policy is ineffective as a means of responding to economic dislocation.
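The Ricardian-equivalence mechanism described above can be illustrated with a stylised two-period calculation (all numbers invented for the sketch): a debt-financed tax cut leaves the household's lifetime resources unchanged, so a consumption-smoothing household saves the whole cut.

```python
# Stylised two-period Ricardian equivalence (illustrative numbers only).
# The government cuts taxes by 100 today and borrows, repaying the debt
# with interest next period via higher taxes.
r = 0.05        # interest rate, assumed equal to the household's discount rate
tax_cut = 100.0

extra_income_today = tax_cut
extra_tax_tomorrow = tax_cut * (1 + r)   # the debt comes due, with interest

# Present value of the whole policy from the household's point of view:
pv_change = extra_income_today - extra_tax_tomorrow / (1 + r)
print(f"change in lifetime resources: {pv_change:.2f}")  # 0.00
```

Since lifetime resources do not move, consumption does not move either and the "stimulus" is fully offset. The whole result hinges on the assumptions in the comments; relax any of them and the offset is partial at best.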
John Cochrane, Prof Lucas’s Chicago colleague, put forward this “policy ineffectiveness” thesis in a response to an attack by Paul Krugman, Nobel laureate economist, on the influence of the DSGE school. (In an essay in the New York Times Prof Krugman described comments from the Chicago economists as “the product of the Dark Age of macroeconomics in which hard-won knowledge has been forgotten”.) Prof Cochrane at once acknowledged that the assumptions that give rise to policy ineffectiveness “are, as usual, obviously not true”. For most, that might seem to be the end of the matter. But it is not. Prof Cochrane goes on to say that “if you want to understand the effects of government spending, you have to specify why the assumptions leading to Ricardian equivalence are false”.
That is a reasonable demand. But the underlying assumptions are plainly not true. No one, including Prof Cochrane himself, really believes that the whole population calibrates its long-term savings in line with forecasts of public debt and spending levels decades into the future.
. . .
But Prof Cochrane will not give up so easily. “Economists”, he goes on, “have spent a generation tossing and turning the Ricardian equivalence theory, and assessing the likely effects of fiscal stimulus in its light, generalising the ‘ifs’ and figuring out the likely ‘therefores’. This is exactly the right way to do things.” The programme he describes modifies the core model in ways that make it more complex, but not necessarily more realistic, by introducing parameters to represent failures of the model assumptions that are frequently described as frictions, or “transactions costs”.
Why is this procedure “exactly the right way to do things”? There are at least two alternatives. You could build a different analogue economy. For example, Joseph Stiglitz – another Nobel laureate – and his followers favour a model that retains many of the Lucas assumptions but attaches great importance to imperfections of information. After all, Ricardian equivalence requires that households have a great deal of information about future budgetary options, or at least behave as if they did.
Another possibility is to assume that households respond mechanically to events according to specific behavioural rules, rather like rats in a maze – an approach often called agent-based modelling. Such models can – to quote Prof Lucas – also “be put on a computer and run”. It is not obvious whether the assumptions or conclusions of these models are more, or less, plausible than those of the kind of model favoured by Profs Lucas and Cochrane.
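Here is a deliberately toy sketch of what "put on a computer and run" looks like for an agent-based model (every rule and parameter invented for illustration): households follow a fixed, mechanical spending rule, and aggregate behaviour emerges from simulation rather than from solving an optimisation problem.

```python
import random

# Minimal agent-based sketch: N households each follow a mechanical rule
# ("spend a fixed fraction of cash on a randomly met trading partner"),
# with no optimisation anywhere. All parameters are illustrative.
random.seed(0)

N_AGENTS, STEPS, SPEND_FRACTION = 100, 200, 0.1
cash = [100.0] * N_AGENTS

for _ in range(STEPS):
    for i in range(N_AGENTS):
        spent = SPEND_FRACTION * cash[i]           # the behavioural rule
        cash[i] -= spent
        cash[random.randrange(N_AGENTS)] += spent  # becomes someone's income

# Money is conserved, yet an unequal wealth distribution emerges
# from identical agents following identical rules.
print(f"total cash: {sum(cash):.0f}")
print(f"richest / poorest: {max(cash):.0f} / {min(cash):.0f}")
```

Nothing in the run depends on rational expectations or equilibrium; whether its conclusions are more plausible than a DSGE model's is, as the article says, not obvious.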
Another line of attack would discard altogether the idea that the economic world can be described by any universal model in which all key relationships are predetermined. Economic behaviour is influenced by technologies and cultures, which evolve in ways that are certainly not random but that cannot be fully, or perhaps at all, described by the kinds of variables and equations with which economists are familiar. The future is radically uncertain and models, when employed, must be context specific.
In that eclectic world Ricardian equivalence is no more than a suggestive hypothesis. It is possible that some such effect exists. One might be sceptical about whether it is very large, and suspect its size depends on a range of confounding and contingent factors – the nature of the stimulus, the overall political situation, the nature of financial markets and welfare systems. The generation of economists who followed John Maynard Keynes engaged in this ad hoc estimation when they tried to quantify one of the central concepts of his General Theory – the consumption function, which related aggregate spending in a period to current national income. Thus they tried to measure how much of a fiscal stimulus was spent – and the “multiplier” that resulted.
But you would not nowadays be able to publish similar work in a good economics journal. You would be told that your model was theoretically inadequate – it lacked rigour, failed to demonstrate consistency. To be “ad hoc” is a cardinal sin. Rigour and consistency are the two most powerful words in economics today.
. . .
Consistency and rigour are features of a deductive approach, which draws conclusions from a group of axioms – and whose empirical relevance depends entirely on the universal validity of the axioms. The only descriptions that fully meet the requirements of consistency and rigour are completely artificial worlds, such as the “plug-and-play” environments of DSGE – or the Grand Theft Auto computer game.
For many people, deductive reasoning is the mark of science: induction – in which the argument is derived from the subject matter – is the characteristic method of history or literary criticism. But this is an artificial, exaggerated distinction. Scientific progress – not just in applied subjects such as engineering and medicine but also in more theoretical subjects including physics – is frequently the result of observation that something does work, which runs far ahead of any understanding of why it works.
Not within the economics profession. There, deductive reasoning based on logical inference from a specific set of a priori assumptions is “exactly the right way to do things”. What is absurd is not the use of the deductive method but the claim to exclusivity made for it. This debate is not simply about mathematics versus poetry. Deductive reasoning necessarily draws on mathematics and formal logic: inductive reasoning, based on experience and above all careful observation, will often make use of statistics and mathematics.
Economics is not a technique in search of problems but a set of problems in need of solution. Such problems are varied and the solutions will inevitably be eclectic. Such pragmatic thinking requires not just deductive logic but an understanding of the processes of belief formation, of anthropology, psychology and organisational behaviour, and meticulous observation of what people, businesses and governments do.
The belief that models are not just useful tools but are capable of yielding comprehensive and universal descriptions of the world blinded proponents to realities that had been staring them in the face. That blindness made a big contribution to our present crisis, and conditions our confused responses to it. Economists – in government agencies as well as universities – were obsessively playing Grand Theft Auto while the world around them was falling apart.
The writer, an FT columnist, is a visiting professor at the London School of Economics and a fellow of St John’s College, Oxford
Macroeconomic modelling: Ways to simplify that are very different from those of a physicist
Robert Lucas, aged 73, is John Dewey professor of economics at the University of Chicago. Prof Lucas and Chicago colleagues, along with others such as Edward Prescott and Thomas Sargent, are the founders of the programme known as “dynamic stochastic general equilibrium” (DSGE), which dominates teaching and research in macroeconomics.
The programme has been described as “freshwater economics”, because the leading proponents have been based at locations such as Chicago, Rochester and Minnesota, in contrast to those in the seaboard strongholds of “saltwater economics” at Harvard, MIT and Stanford, who followed a more Keynesian tradition.
In 1995 Prof Lucas was awarded the Nobel prize for economics, and in his prize lecture he provides a succinct summary of his central model. He makes a number of assumptions. Individuals are rational, calculating welfare maximisers. They live through two periods: work in the first and retirement in the second. There is only one good, which cannot be stored, or invested in capital projects. There is only one kind of work, and older and younger generations do not support each other.
This simplification method is very different from the physicist’s simplification, which abstracts to focus on a single element of a problem. Prof Lucas has described his objective as “the construction of a mechanical artificial world populated by interacting robots”. An economic theory is something that “can be put on a computer and run”.
Structures such as these are “analogue economies”, complete systems that loosely resemble the world, but a world so pared down that everything is either known or can be made up.
Such models are akin to a computer game. If game compilers are good at their job, events and outcomes loosely resemble those observed in the real world – they can, in a phrase that Prof Lucas and colleagues popularised, be calibrated against observation.
But it obviously cannot be inferred that policies that work in a computer game are appropriate for governments and businesses. It is in the nature of these self-contained systems that successful strategies are the product of assumptions made by the authors.

Sunday, August 14, 2011

Football and a national sorrow


One of the finest things sport gives humanity can be distilled into a single saying: "The strong are the ones who win." Who would ever want to lose a race?

One thing is certain: winning always feels wonderful. But it is also fairly certain that winning as the weaker side always feels better than winning as the stronger one. The feeling of standing shoulder to shoulder with your teammates, absorbing wave after wave of the opponent's attacks, and then winning with a single break after all that endurance is surely more explosive than standing still on the pitch watching your teammates win. The strong may be the ones who win, but the winner is not necessarily the happiest.

[Lately everyone [the press, mainly] has been debating whether Fabregas would leave Arsenal for Barcelona. In the end he did. Many people, mostly Arsenal fans, will probably reproach Fabregas a little for his heartlessness. But perhaps they have not played enough football to understand how it feels to be the strong one. Who wouldn't enjoy that? Who doesn't want to be the strong one, the winner?

But this writer believes that most of the rest are simply disappointed in F4 for not having had enough patience and enough faith.]


Speaking of football brings Vietnam to mind. People are often disappointed with the Vietnamese national football team. They wonder why Vietnam once beat Japan and now loses. Why, after all this time, is there no progress? They find fault with this and that, with all sorts of reasons. Those reasons are true, partly, and for certain players. But most of those people have probably not played enough football to know that football is, in the end, a heavily physical sport. It demands a good physique. It is not chess. It is a sport for fast legs and big bodies, together with a good enough head. Vietnamese players may have good heads, but they certainly do not have the physical advantages of other countries. You have to play to know the feeling of being knocked flat after running into a moving wall on the pitch, or the helplessness of legs that cannot match the opponent's stride. And those things do not come simply from training. Training develops the body, of course, but not to the level of foreign players, and training is not inherited. Those advantageous traits are written in the genes, and genes copy themselves almost perfectly across generations. They say Ashley Cole of the England team, and now Theo Walcott, run the 100 m faster than the Vietnamese athletics record. How, then, are we supposed to beat them? Play to our strengths? What strengths do we have? Technique? Spirit? The point is that a good head and high spirits alone cannot win. A good head is necessary. So is spirit. But they are not sufficient. Vietnamese tend to think "with enough determination anything is possible", that "with human effort even gravel and stones become rice", a mindset heavy with wishful idealism. In an age of science and technology, anyone who still clings to that mindset will indeed only ever eat "rice" and never "meat".

Many people, whether by luck or genuine expertise, see that the Vietnamese team's problem lies in physique, and so propose bringing in naturalised foreign players. A reasonable solution. But it is also a solution that exposes the second Vietnamese disease: the obsession with achievement. What is sport? Sport is a contest for expressing a nation's spirit and mettle, not a place to collect medals. If one day Vietnam reached the World Cup with half its lineup made up of naturalised foreign players, not many people would feel much pride, and foreigners would get a distorted picture of the Vietnamese. On the subject of achievement, the obvious example is Quang Liêm in chess. He is one of Vietnam's chess specialists; after Tiến Minh he is probably the second most famous athlete we have. At the recent tournament in Germany he played ten games, drew nine and won one. Strategy? Or chasing results again? Why not, just once, show a fierce, resolute Vietnamese spirit? What good is that runner-up finish when he won only a single game and spent the rest clinging to draws?

A national sorrow. A sorrow that comes from an entire generation of Vietnamese being physically disadvantaged. A sorrow that comes from a group of people unwittingly falling into a trap of their own thinking.

Few people will read this entry, even though "national sorrow" is in its title. And that is truly the nation's greatest sorrow: indifference.

Karl and Zed
August, 2011

Friday, August 5, 2011

Paper and plastic cups

A friend asked me a question: "Why do plastic and paper cups exist side by side in fast-food restaurants? Surely you know that by producing and using just one material, a business could cut costs. So why do they use plastic and paper cups at the same time?"

To answer the question, we first have to establish which drinks are served in plastic cups and which in paper ones.

I did a small survey of several fast-food stores in Ho Chi Minh City, and here is the result:
1/ In paper cups: coke [iced, of course], hot drinks [such as tea, coffee, milk], lemonade [iced]
2/ In plastic cups: milk [iced], ice cream, milkshakes, cream shakes, coloured drinks.

As you can see, nearly all hot drinks are served in paper cups. It seems that heat may have some effect on plastic, or at least sellers believe their customers think so. However, I doubt that 70-80 degrees Celsius is enough to make plastic release harmful chemicals. Another heat-related reason is that people find it uncomfortable to hold a plastic cup of hot coffee. There is no doubt that paper insulates against heat better than plastic does. Therefore, if fast-food owners want to satisfy their customers, they have to use paper cups for hot drinks.

While the paper cup has some physical advantages, the plastic cup is used mostly for economic purposes. One of paper's weaknesses is that it cannot be seen through. Looking at the survey again, we can see that most of the drinks associated with kids come in plastic cups. From iced milk to ice cream and cream shakes, kids' drinks are rarely poured into paper cups. There is a reason food colourings were invented: those unhealthy chemicals exist mainly to catch the eye, especially a child's. Childhood is filled with rainbows and flowers; kids love colourful things. We adults can stop ourselves from succumbing to the temptation of colour, but kids cannot. When a kid sees another kid drinking an iced red cream shake from a plastic cup, he will surely demand that his parents buy him one. To kids, those red and green drinks full of ice are irresistible. If they were served in paper cups, which cannot be seen through, how could they attract kids?

One might object that iced coke also attracts plenty of customers, so why is it served in paper cups? A fair point. In fact, some fast-food restaurants still serve coke in glasses. But because glasses demand a high fixed cost at the beginning, they are no longer widely used; moreover, more and more people now take their food away rather than eat in the restaurant. It is therefore wiser to hand customers paper cups than glasses: although customers prefer glass to paper, the seller chooses paper alone to cut costs. That still leaves the question of why coke is not served in plastic cups. As noted, the plastic cup's main job is to attract kids, which gives it three characteristics: it is transparent, it is small, and it has a tapered, inverted-cone shape. The first has been explained above. As for size, I have never seen a plastic cup in a fast-food restaurant larger than 500 ml.
I am not sure of the reasons, but my guess is that milk, cream shakes and other kids' drinks are relatively more costly than coke. Their prices cannot be set too high, though, because kids do not pay for them; their parents do, and parents will not pay much for such drinks. So producers see little need to make large plastic cups for fast-food restaurants. Adults, on the contrary, will pay for a little more coke, which is why coke comes in paper cups of various sizes. As for the tapered shape, I believe the design is useful both for holding certain drinks, such as milkshakes and cream shakes, and for appealing to kids. Coke needs neither.

And finally, plastic earns one more point on the environmental front. Many people believe paper is more environmentally friendly than plastic, but that is not quite true. According to a study ["Paper Vs. Plastic Bags?"] by Rachel Decker and Anders Graff of Lawrence University, making paper can consume many thousands of gallons of water, as can recycling it. The human and mechanical effort and cost are high, not to mention the physical toll on loggers and on those who work around the many chemicals involved. Plastic is, by comparison, efficient and low-energy to produce, and easily and efficiently recycled. [More information at http://karlzed.wordpress.com/. Link to the study: https://docs.google.com/leaf?id=0Bz2cvdwxEoKRNWM1YzNhZDMtOTYyMC00N2EzLWI3MTYtNGY0NDU4M2RiNDJk&hl=en_US]. So producers have one more incentive to choose plastic.

In conclusion, both plastic and paper cups have advantages and disadvantages. Because neither one's advantages can fully offset its disadvantages, neither can dominate the market. Until someone invents a new material that combines the strengths and eliminates the weaknesses of both, we will keep seeing them side by side in fast-food restaurants.

Kz
August, 2011

Tuesday, August 2, 2011

What’s With All the Bernanke Bashing?

He left a comfortable professorship at Princeton to run the Federal Reserve — and this is what he gets.
Mr. Bernanke has worked tirelessly to shepherd the economy through the worst financial crisis since the Great Depression, and yet, for all his efforts, seems vastly underappreciated.
CNBC recently asked people, “Do you have confidence in the way Ben Bernanke is handling the economy?” Ninety-five percent of the respondents said no.
Yes, the CNBC survey was hardly scientific. Nonetheless, it reflected the deep unease that many Americans feel about our central bank and its policies. Critics on both the left and right see much to dislike in how Mr. Bernanke and his Fed colleagues have been doing their jobs.
Let’s review the complaints.
Critics on the left look at the depth of the recent recession and the meager economic recovery we are experiencing and argue that the Fed should have done more. They fear that the United States might slip into a long malaise akin to Japan’s lost decade, in which unemployment remains high and the risks of deflation deter people from borrowing, investing and returning the economy to its potential.
Critics on the right, meanwhile, worry that the Fed has increased the nation’s monetary base at a historically unprecedented pace while keeping interest rates near zero — an approach that they say will eventually ignite inflation. Some in this camp have gone so far as to propose repealing the Fed’s dual mandate of simultaneously maintaining price stability — that is, holding inflation at bay — while maximizing sustainable employment. Better, these people say, to replace those twin goals with a single-minded focus on inflation.
Yet Mr. Bernanke’s record shows that the fears of both sides have been exaggerated.
Mr. Bernanke became the Fed chairman in February 2006. Since then, the inflation measure favored by the Fed — the price index for personal consumption, excluding food and energy — has averaged 1.9 percent, annualized. A broader price index that includes food and energy has averaged 2.1 percent.
Either way, the outcome is remarkably close to the Fed’s unofficial inflation target of 2 percent. So, despite the economic turmoil of the last five years, the Fed has kept inflation on track.
Of course, this record could come undone in future years. Yet the signals in the financial markets are reassuring. The interest rate on a 10-year Treasury bond, for instance, is now about 2.8 percent. A 10-year inflation-protected Treasury bond yields about 0.4 percent.
The difference between those yields, the so-called “break-even inflation rate,” is the inflation rate at which the two bonds earn the same return. That figure is now a bit over 2 percent, a sign that the market does not expect inflation in the coming decade to differ much from that experienced over the last five years. Inflation expectations are anchored at close to their target rate.
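The break-even calculation in the last two paragraphs is just the spread between the two quoted yields. A sketch using the article's numbers (the simple-difference market convention, rather than exact Fisher compounding):

```python
# Break-even inflation: the inflation rate at which a nominal Treasury
# and an inflation-protected Treasury of the same maturity earn the
# same return, approximated here as the simple yield spread.
nominal_yield = 0.028   # 10-year Treasury, ~2.8% as quoted above
real_yield = 0.004      # 10-year inflation-protected Treasury, ~0.4%

breakeven = nominal_yield - real_yield
print(f"break-even inflation: {breakeven:.1%}")  # 2.4% -- "a bit over 2 percent"
```

A spread near 2.4 per cent is what anchors the claim that the market expects the coming decade's inflation to look much like the last five years'.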
Could the Fed have done substantially more to avoid the recession and promote recovery? Probably not. The Fed used its main weapon against recession — cuts in short-term interest rates — aggressively as the depth of the downturn became apparent. And it turned to various unconventional weapons as well, including two rounds of quantitative easing — essentially buying bonds — in an attempt to lower long-term interest rates.
A few economists have argued, with some logic, that the employment picture would be brighter if the Fed raised its target for inflation above 2 percent. They say higher expected inflation would lower real interest rates, thus encouraging borrowing. That, in turn, would expand the aggregate demand for goods and services. With more demand for their products, companies would increase hiring.
Even if that were true, a higher inflation target is a political nonstarter. Economists are divided about whether a higher target makes sense, and the public would likely oppose a more rapidly rising cost of living. If Chairman Bernanke ever suggested increasing inflation to, say, 4 percent, he would quickly return to being Professor Bernanke.
What the Fed could do, however, is codify its projected price path of 2 percent. That is, the Fed could announce that, hereafter, it would aim for a price level that rises 2 percent a year. And it would promise to pursue policies to get back to the target price path if shocks to the economy ever pushed the actual price level away from it.
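The operational difference between that rule and a plain 2 per cent inflation target can be seen in a toy calculation (numbers invented): after a year of zero inflation, an inflation targeter simply resumes 2 per cent, while a price-level targeter must run inflation above 2 per cent until the lost ground is made up.

```python
# Toy contrast: a 2%-a-year price-level path versus actual prices after
# a one-off disinflation shock (all numbers illustrative).
TARGET = 0.02
BASE = 100.0

# Announced path: the price level rises 2% a year, no matter what.
path = [BASE * (1 + TARGET) ** t for t in range(5)]

# Actual inflation: 2%, then a shock year of 0%, then 2% again.
actual = [BASE]
for inflation in (0.02, 0.00, 0.02):
    actual.append(actual[-1] * (1 + inflation))

shortfall = path[3] - actual[3]
# Inflation needed in year 4 to get back onto the announced path:
required = path[4] / actual[3] - 1
print(f"price level {shortfall:.2f} below path after the shock")
print(f"inflation needed next year to regain it: {required:.1%}")  # ~4.0%
```

Under an inflation-only target the shortfall is never recovered; announcing the price path is what creates the commitment to claw it back.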
Such an announcement could help mollify critics on both the left and right. If we started to see the Japanese-style deflation that the left fears, the Fed would maintain a loose monetary policy and even allow a bit of extra inflation to make up for past tracking errors. If we faced the high inflation that worries the right, the Fed would be committed to raising interest rates aggressively to bring inflation back on target.
MORE important, an announced target path for inflation would add more certainty to the economy. Americans planning their retirement would have a better sense about the cost of living a decade or two hence. Companies borrowing in the bond market could more accurately pin down the real cost of financing their investment projects.
Mr. Bernanke cannot remove all of the uncertainty that households and businesses face, but he can eliminate one small piece of it. Less uncertainty would, other things being equal, encourage spending and promote more rapid recovery. It might even raise Mr. Bernanke’s approval ratings a bit.
N. Gregory Mankiw is a professor of economics at Harvard. He is advising Mitt Romney, the former governor of Massachusetts, in the campaign for the Republican presidential nomination.