Donald Kohn's "Taylor Rule" Speech: Full Text (English)
Vice Chairman Donald L. Kohn
At the Conference on John Taylor's Contributions to Monetary Theory and Policy, Federal Reserve Bank of Dallas, Dallas, Texas
October 12, 2007

John Taylor Rules

The Role of Simple Rules in Monetary Policymaking
It is a pleasure and an honor to speak at this conference honoring John Taylor and his contributions to monetary theory and policy. As you have already heard from Chairman Bernanke and the other speakers today, John has made a number of seminal contributions to the field of macroeconomics. What has distinguished John's work, in my view, is that he takes policymaking in the real world seriously.1

Taking policymaking seriously involves understanding the constraints imposed on our decisions by partial information and incomplete knowledge of economic relationships. It also implies the use of empirically valid models that acknowledge the efforts of households and businesses to anticipate the future and maximize their welfare over time. In the late 1980s and early 1990s, macroeconomics was focused mainly on real business cycles and endogenous growth theory. During this period, John was one of a very small number of academic economists who continued to pursue research aimed at informing the conduct of monetary policy. John's Carnegie Rochester conference paper published in 1993 is an excellent example of this research.

Importantly, John's legacy to the Federal Reserve has not been confined to enhancing our understanding of monetary policy. In addition, he has turned out legions of students who have followed in his footsteps in their interest in policy. Many of them have spent time in the Federal Reserve, producing a rich array of contributions to policymaking and research.

John and I have spent countless hours discussing how the Federal Reserve arrives at decisions about monetary policy and how it should arrive at decisions. Those conversations began in earnest in the late 1980s, when John was on the Council of Economic Advisers, and they have continued to the present day. They have occurred not only in offices and classrooms in Washington and Stanford and at numerous conferences around the globe, but also around dinner tables in Washington and Palo Alto and on hiking trails from Vermont to Wyoming. Those conversations made me a better policy adviser and then policymaker, and they have had the added and very special bonus of allowing Gail and me to count John and Allyn among our friends. I can't think of a better way to honor John's contributions than to continue that discussion around the dinner tables of Dallas by reflecting on the role of simple rules in informing policymaking.

Three Benefits of Simple Rules in Monetary Policymaking
In his Carnegie Rochester conference paper, John considered a simple policy rule under which the nominal federal funds rate is adjusted in response to both the gap between real and trend gross domestic product (GDP) and the gap between the inflation rate and policymakers' target. Based on data for the previous few years, John calibrated the long-run target for inflation and the two parameters that determine the responsiveness of the federal funds rate to the two gaps. The equilibrium real interest rate was based on a longer history of actual real interest rates. In the handout, Figure 1A depicts the actual nominal funds rate and the Taylor rule prescriptions between 1987 and 1992, as presented in John's paper. Despite its simplicity, this policy rule fits the data remarkably well; it described a period of generally successful policymaking; and it adhered to the Taylor principle of adjusting the nominal rate more than one-for-one with changes in the inflation rate, so it provided a plausible template for future success. It is no wonder that John has been such a dedicated salesman and that his efforts have been so well received in academia and policy councils.
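The rule John proposed can be sketched in a few lines of Python. The coefficients of 0.5 on each gap, the 2 percent inflation target, and the 2 percent equilibrium real rate are the values from the 1993 paper; the function itself is an illustrative sketch, not the Federal Reserve's implementation.

```python
# A minimal sketch of the 1993 Taylor rule: the prescribed nominal funds
# rate responds to the gap between inflation and its target and to the
# gap between real GDP and trend. Coefficients are Taylor's (1993) values.

def taylor_rule(inflation, output_gap, r_star=2.0, pi_star=2.0,
                a_pi=0.5, a_y=0.5):
    """Return the prescribed nominal federal funds rate (percent).

    inflation  -- inflation rate over the previous four quarters (percent)
    output_gap -- percent deviation of real GDP from trend
    """
    return r_star + inflation + a_pi * (inflation - pi_star) + a_y * output_gap

# With inflation at target and a closed output gap, the rule prescribes
# the neutral nominal rate of 4 percent.
print(taylor_rule(2.0, 0.0))  # 4.0
```

Note that the nominal rate moves with inflation by a total weight of 1 + 0.5 = 1.5, which is the Taylor principle of a more than one-for-one response.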



Following John's seminal contribution, many other economists have engaged in research on similar policy rules and, together with John, have identified several benefits of such rules in conducting monetary policy. I will elaborate on three of them.

The first benefit of looking at a simple rule like John's is that it can provide a useful benchmark for policymakers. It relates policy setting systematically to the state of the economy in a way that, over time, will produce reasonably good outcomes on average. Importantly, the emphasis is on levels and gaps, not growth rates, as inputs to the policy process. This emphasis can be a problem when a level, say of potential GDP, is in question, but in many respects it is also a virtue. For the United States, the two gaps relate directly to the legislative mandate of the Federal Reserve to achieve stable prices and maximum employment. Moreover, those two gaps fit directly into most modern macroeconomic theories, which tell us something about their relationship and how that relationship can be affected by the type of shock hitting the economy.

Model uncertainties make the simplicity of the rule particularly important for the policymaker because research suggests that the prescriptions from simple rules can be more robust than optimal-control policies. Optimal-control policies can depend critically on the exact specification of the model, and clearly there is no consensus about which model best describes the U.S. economy.

Federal Reserve policymakers are shown several versions of Taylor rules in the material we receive before each meeting of the Federal Open Market Committee (FOMC). I always look at those charts and tables and ask myself whether I am comfortable with any significant deviation of my policy prescription from those of the rules.

A second benefit of simple rules is that they help financial market participants form a baseline for expectations regarding the future course of monetary policy. Even if the actual policy process is far more sophisticated than any simple rule could completely describe, the rule often provides a reasonably good approximation of what policymakers decide and a framework for thinking about policy actions. Indeed, many financial market participants have used the Taylor rule to understand U.S. monetary policy over the past fifteen years. Investors and other market participants are going to form expectations about policy and act on those expectations. The more accurate and informed those expectations are, the more likely are their actions to reinforce the intended effects of policy.

A third benefit is that simple rules can be helpful in the central bank's communication with the general public, fostering an understanding of how policy responds to economic conditions. Such an understanding is important for the transmission mechanism of monetary policy. Giving the public some sense of how the central bank sees the output and inflation gaps and how they are expected to evolve will help it understand the central bank's objectives and how policymakers are likely to respond to surprises in incoming data.

Four Limitations of Simple Rules
Simple rules have limitations, of course, as benchmarks for monetary policy. To quote from John's Carnegie Rochester paper, "a policy rule can be implemented and operated more informally by policymakers who recognize the general instrument responses that underlie the policy rule, but who also recognize that operating the rule requires judgment and cannot be done by computer" (p. 198). In that context, four limitations of simple rules are important.

The first limitation is that the use of a Taylor rule requires that a single measure of inflation be used to obtain the rule prescriptions. The price index used by John in the Carnegie Rochester paper was the GDP price deflator. Other researchers have used the inflation measure based on the consumer price index (CPI). Over the past fifteen years, the Federal Reserve has emphasized the inflation rate as measured by changes in the price index for personal consumption expenditures (PCE). Many researchers have also explored the use of core price indexes, which exclude the volatile food and energy components, as better predictors of future inflation or as more robust indicators of the sticky prices that some theories say should be the targets of policy. To be sure, over long periods, most of these measures behave very similarly. But policy is made in the here and now, and the various indexes can diverge significantly for long stretches, potentially providing different signals for the appropriate course of monetary policy.

Second, the implementation of the Taylor rule and other related rules requires determining the level of the equilibrium real interest rate and the level of potential output; neither of them is an observable variable, and both must be inferred from other information. John used 2 percent as a rough guess as to the real federal funds rate that would be consistent with the economy producing at its potential. But the equilibrium level of the real federal funds rate probably varies over time because it depends on factors such as the growth rate of potential output, fiscal policy, and the willingness of savers to supply credit to households and businesses. Inaccurate estimates of this rate will mislead policymakers about the policy stance required to achieve full employment. In a similar vein, real-time estimates of potential output can be derived in a number of ways and--as shown by Orphanides (2003) and others--they are subject to large and persistent errors. If policymakers inadvertently rely on flawed estimates, they will encounter persistent problems in achieving their inflation objective.
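The practical force of this point can be illustrated with the rule itself: an error in the assumed equilibrium real rate passes through to the prescription one-for-one, and an error in the output-gap estimate passes through with the gap coefficient. The coefficients and the particular gap numbers below are illustrative assumptions, not official estimates.

```python
# Sketch of how mismeasurement shifts the Taylor-rule prescription.
# An r* error moves the prescription one-for-one; an output-gap error
# moves it by a_y per percentage point of gap.

def taylor_rule(inflation, output_gap, r_star=2.0, pi_star=2.0,
                a_pi=0.5, a_y=0.5):
    return r_star + inflation + a_pi * (inflation - pi_star) + a_y * output_gap

baseline  = taylor_rule(2.0, -1.0)               # real-time gap estimate
revised   = taylor_rule(2.0, -3.0)               # ex post gap, 2 points wider
low_rstar = taylor_rule(2.0, -1.0, r_star=1.0)   # r* one point lower

print(baseline - revised)    # 1.0: a 2-point-wider gap lowers the prescription by a_y * 2
print(baseline - low_rstar)  # 1.0: the r* error shifts the prescription one-for-one
```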

The third limitation of using simple rules for monetary policymaking stems from the fact that, by their nature, simple rules involve only a small number of variables. However, the state of a complex economy like that of the United States cannot be fully captured by any small set of summary statistics. Moreover, policy is best made looking forward, that is, on the basis of projections of how inflation and economic activity may evolve. Lagged or current values of the small set of variables used in a given simple rule may not provide a sufficient guide to future economic developments, especially in periods of rapid or unusual change. For these reasons, central banks monitor a wide range of indicators in conducting monetary policy. In his Carnegie Rochester paper, John mentioned the stock market crash of October 1987 as an example of how other variables can and should influence the course of monetary policy in some situations.

The final limitation I want to highlight is that simple policy rules may not capture risk-management considerations. In some circumstances, the risks to the outlook or the perceived costs of missing an objective on a particular side may be sufficiently skewed that policymakers will choose to respond by adjusting policy in a way that would not be justified solely by the current state of the economy or the modal outlook for output and inflation gaps.

Policy Rules around 2003
Some of the ambiguities and potential pitfalls in the use of simple policy rules are highlighted by considering their prescriptions for a period earlier in this decade. Turning to Figure 1B, the solid line indicates the actual federal funds rate between the first quarter of 1993 and the second quarter of 2007, and the dashed line shows the prescriptions of the Taylor rule using the same methodology that John used in his Jackson Hole remarks this year.2 For the earlier part of the sample, the prescription from this simple rule tracks the actual funds rate relatively well. As John pointed out, a notable deviation happened beginning in 2002, and I would like to discuss that period to illustrate the limitations I noted earlier.



Inflation Measure
The first limitation is related to the measure used for the inflation variable included in the rules. The rule prescriptions depicted by the dashed line in Figure 1B are based on the headline CPI. But as you know, the FOMC often looks at core inflation, stripping out the effects of energy and food prices, as a better indicator of future price behavior. The dotted line represents the rule prescriptions based on the chain-weighted core CPI, which the Bureau of Labor Statistics has produced since 2000. Using this measure lowers the prescribed funds rate by about 2 percentage points during 2003, bringing the rule prescriptions much closer to the actual path of policy. The reason for the improvement is evident from Figure 2A, on the other side of the handout: Even though the headline and core CPI measures were broadly similar in the mid- to late 1990s, these measures diverged substantially between 2003 and 2005.
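The mechanics behind this shift are straightforward: because inflation enters the rule with a total weight of 1 + 0.5 = 1.5, a divergence of roughly 1.3 percentage points between headline and core measures moves the prescription by about 2 percentage points. The specific readings below are illustrative, not the actual 2003 data.

```python
# Sketch of how the choice of inflation measure moves the rule
# prescription. Coefficients are the illustrative Taylor (1993) values;
# the inflation readings are hypothetical.

def taylor_rule(inflation, output_gap, r_star=2.0, pi_star=2.0,
                a_pi=0.5, a_y=0.5):
    return r_star + inflation + a_pi * (inflation - pi_star) + a_y * output_gap

headline = taylor_rule(2.8, -1.0)   # headline CPI running higher
core     = taylor_rule(1.5, -1.0)   # core CPI, food and energy stripped out
print(round(headline - core, 2))    # 1.95: close to the 2-point gap discussed above
```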


Potential Output
The second limitation relates to the challenge of judging the level of potential output in real time. To illustrate this point, Figure 2B plots three measures of the output gap. The solid line is the real-time estimate by the Congressional Budget Office (CBO) that was used in the Taylor rule prescriptions in Figure 1B, while the dashed line depicts the CBO's ex post estimate of the output gap as of the third quarter of 2007. Back in 2003, the CBO estimated that output at that time was below potential by only 1 percent. With the benefit of four more years of data, the CBO currently estimates that the output gap for the first half of 2003 was considerably wider--about 3 percent. In addition, the dotted line represents an alternative measure of resource utilization derived from the unemployment rate and an estimate of the natural rate of unemployment (NAIRU) taken from the Board staff's FRB/US model. In fact, the unemployment rate was rising through the middle of 2003, so the FOMC had every reason to believe that the output gap was widening at that time. Using this unemployment-based measure rather than the real-time CBO measure would reduce the prescriptions of simple policy rules by roughly 1/2 percentage point in early 2003.
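An unemployment-based utilization measure of the kind just described can be sketched with an Okun's-law relation: the output gap is roughly a negative multiple of the gap between unemployment and the NAIRU. The coefficient of 2 and the rates below are illustrative assumptions, not the FRB/US model's values.

```python
# Okun's-law sketch: converts an unemployment gap into an approximate
# output gap. Coefficient and rates are illustrative assumptions.

def okun_output_gap(unemployment, nairu, c=2.0):
    """Approximate output gap (percent) from the unemployment gap."""
    return -c * (unemployment - nairu)

# Rising unemployment implies a widening (more negative) output gap even
# when real-time GDP-based estimates suggest a narrower one.
print(round(okun_output_gap(6.1, 5.0), 1))  # -2.2
```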


Other Variables
The third limitation in my list was that the small set of economic measures included in simple rules may not fully reflect the state of the economy. Around 2003, financial market conditions may not have been adequately summarized by the assumed 2 percent equilibrium federal funds rate. Accounting scandals caused economic agents to lose confidence in published financial statements and in bond ratings. The result was higher uncertainty about the financial health of firms, and credit spreads widened substantially. Figure 2C shows that risk spreads on corporate bonds were elevated in this period. Other things equal, such spreads would reduce the federal funds rate needed to achieve full employment, perhaps explaining a portion of the gap between the actual federal funds rate and the outcome from the policy rule during this period.


Risk Management
The last item on my list of limitations was that simple rules do not take account of risk-management considerations. As shown in Figure 2A, the core CPI inflation rate for 2003 was falling toward 1 percent. The real-time reading of the core PCE inflation rate (not shown) was on average even lower than the comparable CPI figure. Given these rates, the possibility of deflation could not be ruled out. We had carefully analyzed the Japanese experience of the early 1990s; our conclusion was that aggressively moving against the risk of deflation would pay dividends by reducing the odds of needing to deal with the zero bound on nominal interest rates should the economy be hit with another negative shock. This factor is not captured by simple policy rules.

A Final Note
I have offered this analysis in the spirit of so many of the discussions I have had with John. His framework has been enormously important to policymaking in the Federal Reserve, and it has yielded many benefits. Nevertheless, it's important to keep in mind that some significant practical limitations also are associated with the application of such rules in real time. In other words, it's not so simple to use simple rules!

References
Orphanides, Athanasios (2003). "The Quest for Prosperity without Inflation," Journal of Monetary Economics, vol. 50 (April), pp. 633-63.

Poole, William (2007). "Understanding the Fed," Federal Reserve Bank of St. Louis Review, vol. 89 (January/February), pp. 3-14, http://research.stlouisfed.org/publications/review/past/2007.

Taylor, John B. (1993). "Discretion versus Policy Rules in Practice," Carnegie-Rochester Conference Series on Public Policy, vol. 39, pp. 195-214, http://econpapers.repec.org/article/eeecrcspp/default1993.htm.

_________ (2007). "Housing and Monetary Policy," speech delivered at "Housing, Housing Finance, and Monetary Policy," a symposium sponsored by the Federal Reserve Bank of Kansas City, held in Jackson Hole, Wyo., August 30-September 1, www.kansascityfed.org/publicat/sympos/2007/pdf/2007.09.04.Taylor.pdf.

Footnotes

1. I am sure my colleagues join me in honoring John. However, my thoughts on policy rules are my own and not necessarily those of my colleagues on the Federal Open Market Committee. Jinill Kim and Andrew Levin, of the Board's staff, contributed to the preparation of these remarks.

2. Following John, the rule specification and the data used for the prescriptions closely follow the implementation of the Taylor rule in Bill Poole's speech in August 2006 (Poole, 2007). The inflation measure used for this rule is the four-quarter average headline CPI inflation rate, with the benchmark value set to 2 percent. Through 2001, the gap between real GDP and its potential is the value measured in real time by the staff of the Board of Governors. Because subsequent staff estimates of the output gap are not yet publicly available, the rule prescriptions for the post-2001 period are computed with the real-time output gap as constructed by the Congressional Budget Office.
