(1) HOME COMPUTERS AND HUMAN CAPITAL
Ofer Malamud and Cristian Pop-Eleches
Providing home computers to low-income children in Romania lowered academic achievement even while it improved their computer skills and cognitive ability. ----------------------------------------------------------------------
Several nations -- including Brazil, Uruguay, Peru, and Colombia -- have used subsidized programs to get personal computers into poor households. Governments have promulgated such programs despite little credible evidence that the technology improves children's academic performance or their behavior. Euro 200, a program administered by the Romanian Ministry of Education, gave out approximately 35,000 vouchers toward the purchase of a home computer in 2008.
The Euro 200 program met with mixed results, according to NBER researchers Ofer Malamud and Cristian Pop-Eleches. The voucher program boosted the likelihood of households owning a home computer by more than 50 percentage points and led to increased computer use. On one hand, children in families that received a voucher scored significantly higher on tests of computer skills and cognitive ability than their counterparts without a voucher. On the other hand, children in families that received a voucher had significantly lower school grades in math, English, and Romanian than their counterparts without vouchers. The authors conclude that "providing home computers to low-income children in Romania lowered academic achievement even while it improved their computer skills and cognitive ability."
In Home Computer Use and the Development of Human Capital (NBER Working Paper No. 15814), the authors include some evidence that winning a computer voucher also reduced the time spent doing homework, watching TV, and reading. "These results may not be so surprising given that few parents or children report having educational software installed on their computer, and few children report using the computer for homework or other educational purposes," the authors write. "Instead, most computers had games installed and children reported that most of the computer time was spent playing games."
The main study examines students roughly one year after their families could have purchased a computer with the voucher. However, the authors also look at a smaller sample of households (647 versus 3,354 in the main study) who participated in the same Euro 200 program four years earlier. They find that these families, too, had significantly higher levels of computer ownership than non-voucher families. In light of the small numbers in this second sample, "we do not wish to draw any strong conclusions," the authors write. "Nevertheless, taken as a whole, these results are consistent with the persistence of long-term negative effects on academic achievement, and positive long-term effects on cognitive ability and computer skills."
Parental rules can help offset some of the negative effects of a home computer, the study finds, especially if they target the right activity. For example, the authors find that a rule limiting computer use tends to reduce the gains in computer skills without boosting grades. By contrast, a rule enforcing homework time tends to mitigate the computer's negative effect on grades without diminishing the gains in computer skills and cognitive ability.
-- Laurent Belsie
http://papers.nber.org/papers/W15814
(2) FORECASTING FLU DIFFUSION TAKING ACCOUNT OF AVOIDANCE TACTICS
Byung-Kwang Yoo, Megumi Kasajima, and Jay Bhattacharya
More lives are saved and infection rates are reduced when both avoidance behaviors and a vaccination campaign begin before the flu has spread. ----------------------------------------------------------------------
Novel influenza A, or nH1N1, also known as the swine flu, appeared in the spring of 2009. Its first fatality was reported in Oaxaca, Mexico, that April, and two months later the World Health Organization declared that it had reached pandemic status worldwide. Unlike seasonal influenza, nH1N1 disproportionately afflicts children, pregnant women, and young adults rather than newborns and the elderly. Infected people do not necessarily present with fevers or coughs, frustrating both diagnosis and infection control. According to estimates from the Centers for Disease Control and Prevention (CDC), nearly 10,000 people around the world had died from nH1N1 infection by December 2009.
The public and the public health community responded to the spread of the virus immediately and globally with voluntary avoidance of public spaces, school closings, and the ubiquitous placement of hand-sanitizer pumps. The widespread avoidance response in 2009 was both unprecedented and significant, but at that time, officials had no flu forecasting models that took avoidance response into account.
In Public Avoidance and the Epidemiology of novel H1N1 Influenza A (NBER Working Paper No. 15752), co-authors Byung-Kwang Yoo, Megumi Kasajima, and Jay Bhattacharya use data on avoidance behaviors in the United States and Australia to build nH1N1 forecasting models that explicitly account for the impact of avoidance on the epidemic's severity. The researchers compare their forecasting models to a model showing the cumulative path of confirmed nH1N1 infected cases -- the pandemic's actual course -- based on data from the CDC. They find that including a population's avoidance behavior results in a substantially better forecast of the actual spread of the flu.
When 196 million doses of an nH1N1 vaccine became available in the United States between October and December 2009, this nation's avoidance response to the epidemic broadened. The researchers include data on the new vaccine in their forecasting models. They consider a worst-case scenario, with a 50 percent effective vaccine and 50 percent of the population receiving it, and a better scenario, with 50 percent vaccine effectiveness and 90 percent of the population receiving it. In both cases, the vaccine's effectiveness is time-sensitive: the earlier the vaccination campaign begins, the lower the proportion of the population that will be infected. The authors find that more lives are saved and infection rates are reduced when both avoidance behaviors and a vaccination campaign begin before the flu has spread.
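The paper's exact model specification is not reproduced in this summary, but the logic of such forecasts can be sketched with a standard SIR (susceptible-infected-recovered) framework in which avoidance lowers the effective contact rate and a vaccination campaign removes susceptibles from the at-risk pool. Every parameter value below is hypothetical and chosen only for illustration, not an estimate from the paper.

    # Minimal SIR-style sketch: avoidance scales down transmission, vaccination
    # removes susceptibles. All parameters are hypothetical illustrations.
    def simulate(days=180, beta=0.30, gamma=0.20, avoidance=0.25,
                 vax_start=60, vax_rate=0.005, vax_effectiveness=0.5):
        s, i, r = 0.999, 0.001, 0.0          # population shares
        cumulative_infected = i
        for day in range(days):
            new_infections = beta * (1.0 - avoidance) * s * i
            recoveries = gamma * i
            vaccinated = vax_rate * s * vax_effectiveness if day >= vax_start else 0.0
            s -= new_infections + vaccinated
            i += new_infections - recoveries
            r += recoveries + vaccinated
            cumulative_infected += new_infections
        return cumulative_infected

    # An earlier campaign (smaller vax_start) yields a lower cumulative attack rate.
    print(simulate(vax_start=30), simulate(vax_start=90))
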
Their research also explores how the prevalence of the flu influences individuals' decisions about whether to practice or postpone avoidance behaviors. In the first stages of the epidemic, governments and the mass media in both the United States and Australia released information on nH1N1 to the general public. Despite early vaccine shortages in the United States, widespread avoidance behaviors quickly slowed the nH1N1 infection rate. Yet, as the flu grew less prevalent and was discussed less in the media, the number of people receiving nH1N1 vaccinations declined, even among health care workers and those who had received seasonal flu shots.
-- Sarah H. Wright
http://papers.nber.org/papers/W15752
(3) FEDERAL MORTGAGE MODIFICATION PROGRAMS
Casey B. Mulligan
[There is] ... a tradeoff between the number of foreclosures prevented in the short term and the durability of foreclosure prevention efforts. ----------------------------------------------------------------------
There have been multiple efforts since 2008 to reduce home mortgage foreclosures through federal loan modification or debt forgiveness programs. According to NBER Research Associate Casey Mulligan, writing in Foreclosures, Enforcement, and Collections under the Federal Mortgage Modification Guidelines (NBER Working Paper No. 15777), despite these programs more than five million homes in the United States were either in foreclosure as a result of non-payment or delinquent and potentially facing foreclosure in early 2009. Roughly 14 million more mortgages were "underwater" -- the amount owed exceeded the market value of the collateral, a situation that frequently leads to loan default and foreclosure.
Through the third quarter of 2009, fewer than two million mortgages had been modified or had their payments otherwise adjusted under the federal modification programs -- including those of the Federal Deposit Insurance Corp., the Federal National Mortgage Association (Fannie), the Federal Home Loan Mortgage Corp. (Freddie), and a more recent Treasury Department effort called the Home Affordable Modification Program (HAMP). HAMP, which replaced the Fannie and Freddie programs, includes $75 billion in potential subsidies. All of these programs were created to prevent foreclosures on a large class of mortgages. Typically, a loan modification would result in lower monthly payments over a five-year period, achieved primarily by trimming interest costs, but with payments after the five-year period unchanged.
Most of the loan modifications, which are voluntary on the part of borrowers, seek to reduce the monthly mortgage payment so that no more than a target fraction of the borrower's average monthly income is devoted to total housing expense, which includes principal, interest, taxes, and insurance. Depending on the federal program, that percentage can range from 31 percent to 38 percent of gross income reported on the mortgagor's federal income tax return.
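As a rough illustration of the payment target, consider a hypothetical borrower; the 31 percent figure is the low end of the range just described, and the income is invented for the example.

    # Hypothetical borrower; not a reproduction of any program's actual worksheet.
    monthly_gross_income = 4000.00      # invented figure for illustration
    target_share = 0.31                 # low end of the 31-38 percent range
    max_housing_payment = target_share * monthly_gross_income
    print(max_housing_payment)          # 1240.0 -- principal, interest, taxes, insurance
    # A modification must bring total monthly housing expense down to this level,
    # typically by cutting the interest rate for the five-year modification period.
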
A further requirement of these programs is that the post-modification loan must have a value at least as great as the value of the collateral to the lender. Mulligan finds that the income target and collateral value tests combine to "create a tradeoff between the number of foreclosures prevented in the short term and the durability of foreclosure prevention efforts, because they make it impossible to both write down principal and offer modification to a wide range of borrowers." Another consequence of this tradeoff, he continues, "is to reduce collections, increase foreclosures and their costs, and reduce efficiency as compared to alternative means tested mortgage modification rules."
In many instances, the programs' guidelines also result in implicit marginal income tax rates in excess of 100 percent, and sometimes as large as 400 percent when principal reduction is included in the modified payment. That creates a stark incentive for borrowers to hide income in order to get lower mortgage payments. Mulligan suggests that alternative means-tested modification rules, based on a framework of optimal income taxation, might simultaneously reduce the number of foreclosures while improving collections and the efficacy of the process.
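A stylized calculation, not Mulligan's own, shows how pegging payments to income for a five-year modification can produce an implicit marginal tax rate above 100 percent: an extra dollar of income verified at the time of modification raises the required payment in each of the five modification years.

    # Stylized illustration with hypothetical figures, before ordinary income taxes.
    extra_income_at_modification = 1000.0
    payment_share = 0.31                  # low end of the 31-38 percent target
    modification_years = 5
    extra_payments = extra_income_at_modification * payment_share * modification_years
    print(extra_payments / extra_income_at_modification)   # 1.55 -- roughly a 155 percent rate
    # Reporting more income raises required payments by more than the reported
    # income itself, which is the incentive to hide income described above.
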
-- Frank Byrt
http://papers.nber.org/papers/W15777
(4) EXPLAINING THE RISE IN EDUCATIONAL GRADIENTS IN MORTALITY
David M. Cutler, Fabian Lange, Ellen Meara, Seth Richards, and Christopher J. Ruhm
The mortality gap between males with and without a college degree rose 21 percentage points during [the 1971-2000 period]. ----------------------------------------------------------------------
The long-standing inverse relationship between education and mortality strengthened substantially late in the twentieth century. In 2000, college-educated 25-year-olds could expect to live seven years longer than their peers with less schooling. In Explaining the Rise in Educational Gradients in Mortality (NBER Working Paper No. 15678), co-authors David Cutler, Fabian Lange, Ellen Meara, Seth Richards, and Christopher Ruhm investigate the extent to which behavioral risk factors, such as smoking and obesity, may explain how and why education-related disparities in mortality rates changed between the early 1970s and the end of the twentieth century.
The authors report several important facts about recent mortality trends. First, even after one controls for smoking and body weight, the college-educated have lower expected mortality rates than their less educated peers. This differential increased from 12 percent to 25 percent for men and from 9 percent to 20 percent for women between 1971 and 2000. Second, current smoking is associated with a much larger increase in mortality rates than other risk factors, and the adverse effects of smoking on mortality may have actually increased over time. Severe obesity raises mortality risk, too.
Third, although higher levels of mortality among the less educated are due in part to higher rates of smoking and obesity, the trends in smoking and obesity explain little if any of the relative increase in mortality for the less educated over the last three decades. The mortality gap between males with and without a college degree rose 21 percentage points during that time, while the authors estimate that differential changes in smoking and obesity would have led to a 4 or 5 point decrease. For women, patterns of smoking and obesity can explain only about 3 points of the 42 percentage point increase.
Finally, the authors examine deaths from cardiovascular diseases (CVD) and cancer, the two most important sources of mortality in the United States. Both are influenced by behavioral risks such as smoking and obesity. The authors find that changes in cancer mortality play a key role in explaining the trends in the educational gradients in total mortality, but they find little evidence that changing risk profiles explain the widening of the education mortality gradient. Instead, the returns to education and to various behavioral risk factors have shifted in ways that favor the more educated. Thus, it is the return to education, conditional on health behaviors, that matters: the mortality returns to risk factors, and the return to education conditional on those risk factors, have grown over time for reasons that are not yet understood.
There are several possible explanations for these findings. One is that the highly educated have better access to medical care and better adherence to prescribed regimens. Another is that environmental and geographically based risks may have declined more over time for the highly educated. These results do not imply that improvements in the health-related lifestyles of the less educated would yield no benefits -- they suggest that reducing smoking, obesity, hypertension, and elevated cholesterol would lower mortality. They do suggest, however, that even the complete elimination of disparities in behavioral risks across education groups would be unlikely to eliminate the educational differentials in mortality.
-- Claire Brunel
http://papers.nber.org/papers/W15678
(5) EVIDENCE FROM A TAX AUDIT EXPERIMENT IN DENMARK
Henrik J. Kleven, Martin B. Knudsen, Claus T. Kreiner, Soren Pedersen, and Emmanuel Saez
[Announcing high probabilities of future audits generated] substantial future tax revenue through behavioral responses to a higher perceived probability of detection. ----------------------------------------------------------------------
In Unwilling or Unable to Cheat? Evidence From a Randomized Tax Audit Experiment in Denmark (NBER Working Paper No. 15769), co-authors Henrik Kleven, Martin Knudsen, Claus Kreiner, Soren Pedersen, and Emmanuel Saez conclude that the low overall rate of tax evasion in advanced economies owes more to an information environment in which third-party reporting makes it difficult to cheat than to any moral reluctance to cheat on the part of taxpayers.
Observing the behavior of a random sample of 42,784 Danish taxpayers in 2007 and 2008, they find that income categories in which both taxpayers and third parties report payments have evasion rates that fall "between 0.2 percent and 0.9 percent" of each type of income. Income categories that are self-reported and not subject to third-party reporting have evasion rates of roughly 37 percent. Although self-reported income constitutes only about 5 percent of total income among the sample in this study, it is responsible for 87 percent of the detected tax evasion.
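A back-of-the-envelope decomposition, using rounded versions of these figures, shows why the small self-reported slice accounts for most of the detected evasion; the paper's exact figure of 87 percent differs because the underlying shares and rates are not round numbers.

    # Rounded figures from the text; purely a consistency check.
    self_reported_share = 0.05      # share of total income that is self-reported
    third_party_share = 0.95
    evasion_rate_self = 0.37        # roughly 37 percent of self-reported income
    evasion_rate_third = 0.005      # midpoint of the 0.2-0.9 percent range
    evaded_self = self_reported_share * evasion_rate_self
    evaded_third = third_party_share * evasion_rate_third
    print(round(evaded_self / (evaded_self + evaded_third), 2))   # about 0.8
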
The authors also report on the results of an experiment in which taxpayers were randomly divided into two groups, one group in which all were audited and the other in which none were audited. The people in the 100-percent-audit group had comprehensive unannounced tax audits of their 2007 tax returns. None of the returns of the other (zero-percent-audit) group were audited. The following year, people from each group were randomly selected into three sub-groups with different audit probabilities (zero, 50 percent, and 100 percent, respectively) and informed in advance about the probability that their 2008 returns would be audited.
The combination of pre- and post-audit data makes it possible for the authors to examine taxpayers' behavioral responses to actual audits and to the threat of audits, and to study the effect of high marginal tax rates on tax evasion decisions. The results show that audits had a strong and significant effect on subsequent reporting of self-reported income; the authors conclude that audits generate "substantial future tax revenue through behavioral responses to a higher perceived probability of detection." Similarly, audit threats led to upward adjustments in self-reported income.
Many Danish taxpayers bunch at the kink points of the income tax schedule where marginal tax rates jump from 49 percent to 62 percent, which is evidence of behavioral responses to marginal tax rates. However, the audits show that the vast majority of those bunching taxpayers do not evade taxes. This implies that behavioral responses to marginal tax rates are due mostly to labor supply or tax avoidance rather than tax evasion. Overall, this evidence shows that broadening information reporting requirements can have a substantial impact on tax compliance.
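The bunching evidence mentioned above can be illustrated with a minimal sketch that compares the density of reported incomes in a narrow window around the kink with the density in neighboring windows; the threshold and window width below are hypothetical, not the Danish tax schedule or the paper's data.

    # Minimal bunching diagnostic; kink and window are hypothetical.
    def excess_mass(incomes, kink, window=2000):
        near = sum(1 for y in incomes if abs(y - kink) <= window)
        left = sum(1 for y in incomes if -3 * window <= y - kink < -window)
        right = sum(1 for y in incomes if window < y - kink <= 3 * window)
        expected = (left + right) / 2        # counterfactual count for the kink window
        return near / expected if expected else float("inf")

    # A ratio well above 1 indicates taxpayers holding reported income at the
    # kink -- a behavioral response to the jump in the marginal tax rate.
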
-- Linda Gorman
http://papers.nber.org/papers/W15769
(6) EVIDENCE ON THE LONG-RUN ELASTICITY OF LABOR SUPPLY
Orley C. Ashenfelter, Kirk B. Doran, and Bruce Schaller
Taxi drivers appear to have worked just a little bit less in response to an increase in the fare structure. ----------------------------------------------------------------------
Many public policies regarding taxation, social safety nets, and the redistribution of income are designed around assumptions about the long-run elasticity of labor supply -- the responsiveness of hours worked to after-tax wage rates. For men, many estimates of this elasticity are near zero, implying that permanent wage increases have relatively small effects on labor supplied. However, all of these studies face the problem that most workers cannot alter their hours of work without changing jobs, and that it is difficult to measure the actual changes in net-of-tax wages that workers face.
In A Shred of Credible Evidence on the Long-Run Elasticity of Labor Supply (NBER Working Paper No. 15746), co-authors Orley Ashenfelter, Kirk Doran, and Bruce Schaller introduce a simple, natural experiment to deal with these problems. They rely on a dataset of New York City taxi drivers who choose their own hours, and who experienced two permanent fare increases instituted by the New York City Taxi and Limousine Commission in March 1996 and May 2004. While this approach has the obvious advantage of transparency, the authors note that it may not be appropriate to generalize the findings here to other workers.
The data indicate that the effect of fare increases on hours worked is small and negative: taxi drivers appear to have worked just a little bit less in response to an increase in the fare structure. Indeed, the fare increases analyzed here resulted in higher total revenue per hour -- the number of passengers hailing cabs did not drop off enough to offset the additional revenue from higher fares per trip. Miles driven did decline, however, by 4.2 percent on average following a fare increase.
The wages of the cab drivers, on the other hand, were strongly affected by the increases in the fare structure. The fare increases appear to be associated with an average 19 percent increase in revenue per mile.
Taken together, the evidence implies that the long-run uncompensated elasticity of labor supply is -0.23, and that the income effects of a fare increase dominate the substitution effects. This result is consistent with a broad variety of historical evidence that suggests that the massive increases in real wages seen in the United States and Europe since 1879 have been accompanied by significant declines in annual hours worked per worker.
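A rough consistency check, which treats miles driven as a proxy for labor supplied and revenue per mile as the driver's wage, recovers an elasticity close to the reported figure; the paper's -0.23 estimate comes from a more careful regression analysis, not this simple ratio.

    # Back-of-the-envelope check using the figures quoted above.
    pct_change_miles = -0.042    # miles driven fell about 4.2 percent
    pct_change_wage = 0.19       # revenue per mile rose about 19 percent
    print(round(pct_change_miles / pct_change_wage, 2))   # -0.22
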
-- Claire Brunel
http://papers.nber.org/papers/W15746