Social Welfare Ch 8
Mark J. Stern; June Axinn
In 1971, President Richard Nixon vetoed the amendments to the Economic Opportunity Act. The veto was not that surprising. After all, this legislation carried on many War on Poverty programs that Nixon and his party had opposed. What was startling was the tone of his objections. Nixon abandoned traditional mainstream conservative objections to the cost and potential waste associated with social welfare programs and instead focused on their potential for undermining the American family. In his veto message, Nixon asserted that “for the Federal Government to plunge headlong financially into supporting child development would commit the vast moral authority of the National Government to the side of communal approaches to child-rearing over [and] against the family-centered approach.”1 Congressional passage of the Equal Rights Amendment to the U.S. Constitution in 1972 set off a symbolic battle over the nature of gender and the relationship of the public and private spheres, and the battle over the family widened in 1973 with the Supreme Court’s decision in Roe v. Wade, which held that American women had a constitutional right to abortion services.
Two major forces framed the history of social welfare between 1968 and 1992. First, in contrast to the previous two decades, the economy grew at a much slower pace. The transition to a postindustrial, information economy combined with a deliberate effort to control inflation to keep economic growth slow and unemployment high. As a result, public welfare faced a host of new demands for services and benefits at a time when there was little economic growth to pay for them.
Second, the 1971 veto message marked the beginning of a sustained ideological attack on social welfare programs. The conservative New Right was deeply suspicious of all government programs and saw social welfare as connected to the declining competitiveness of the American economy, the weakening of its family structure, and the general decline of American moral character.
For a time, the economic conservatism of the battle against inflation and the social conservatism of the battle over the family combined to form a powerful social movement that sought to fundamentally reduce the public commitment to social welfare. President Ronald Reagan used the early years of his presidency to lay out a new departure for social welfare, one that would shift power from the federal government to the states, from the public sector to voluntary associations, and from open-ended federal “entitlements” to limited block grants.
For the most part, “Reaganism” did not accomplish its goals. The structure of social welfare in 1992 looked more or less the way it had a decade earlier. However, the large tax cuts implemented by Congress and the president in 1981 combined with slow economic growth to limit the reach and effectiveness of existing programs. Rather than marking a new departure in social welfare, the 1980s were a period of declining effectiveness and the neglect of new, pressing social problems such as homelessness and the AIDS epidemic.
The United States was not unique. The welfare state came under attack throughout the Western world. One response to the poor performance of the economy was the abandonment of social welfare programs and of efforts to mitigate the inequalities of the marketplace. In constant dollars, per capita expenditures for social welfare had been growing at an annual rate of 6.5 percent as of 1970. They continued to increase during the 1970s, but at a slower rate. By the late 1980s, the rate of increase had fallen to 1 percent. The impact of conservatism was not the elimination of programs, but incremental declines in their funding.2
Economic and Social Trends
A Struggling Economy
Unemployment during the 1970s averaged 5.4 percent in the first half of the decade and jumped to 8.3 percent in the recession of 1975, before falling to 5.8 percent in 1979. Consumer prices, driven by the sharp rise in oil prices, went up throughout the decade. Inflation remained out of control until the recession of 1982–1983. The recession slowed the inflation rate, but at a heavy price for American workers. Unemployment rose to 9.5 percent and declined only slowly in the years following. At the end of the 1980s, another recession increased unemployment again.
Nowhere were the slowdown of economic growth and the failure of its gains to reach American families between 1970 and 1990 clearer than in the data on gross national product (GNP) and on family income. GNP and median family income grew during the period, but at a disappointing pace. After adjusting for inflation, the rise in GNP between 1970 and 1990 was just over 40 percent. Median family income in constant dollars rose and fell during the period in response to changes in levels of unemployment and to price increases. But in 1990, it was roughly what it had been in 1979 and only slightly higher than it had been in 1970.3 While median income remained level, the gap between the very rich and the very poor grew markedly. The number of billionaires quadrupled during the 1980s; the number of people below the poverty line increased by 35 percent.4
During the 1970s, the Organization of Petroleum Exporting Countries (OPEC) twice restricted output to increase the price of oil. As millions of Americans waited in line at gas stations, it became clear that the United States was one part of a global economy. Domestic inflation had disturbing implications, especially for those who lived on fixed incomes, as well as for workers who found their ability to purchase goods and services declining. Rising prices, combined with rising unemployment, hit most heavily those least able to withstand wage and job loss.
Economic security had grown more elusive for many Americans. From World War II until the 1970s, Americans had assumed that economic growth, consumerism, and affluence were the answers to the nation’s social and economic problems. During the 1970s and 1980s, however, a new set of concerns—the environment, urban decline, and globalization—framed unregulated economic growth as part of the problem, not the solution. Where the War on Poverty had assumed that economic growth could increase the incomes of the poor without harming the rest of society, a “zero sum” economy increasingly pitted social groups against one another. Unions demanded wage and benefit increases to offset the erosion of inflation. Employers sought a more flexible labor market and lower taxes to stimulate investment.
Changing Employment Patterns
The transition to an information economy was complicated by government monetary policy and its implications for unemployment. Since 1946, the federal government had committed itself to pursuing maximum employment policies, a pledge it reiterated with the passage of the Humphrey–Hawkins Full Employment and Balanced Growth Act of 1978. Yet, soon after, the Federal Reserve Board began using high interest rates as a means of slowing inflation. As a result, unemployment began a steady rise, reaching 11 percent in 1982, the highest level since the Great Depression. How could government commit itself both to “full employment” and to using unemployment to fight inflation? Economists provided an answer with one acronym: NAIRU. The “nonaccelerating-inflation rate of unemployment” was defined as the lowest unemployment rate consistent with stable inflation. This rate—estimated at around 6 percent during the 1980s—was redefined as the “natural” rate of full employment. Thus, for more than a decade, government policy defined a stagnant economy with more than 6 million unemployed workers as full employment.
The fight against inflation during the 1970s and 1980s harmed not only those who were unemployed; the loss of production because of slow economic growth harmed all Americans. Indeed, a conservative estimate of the lost production between 1974 and 1995 was equal to $1.6 trillion—nearly three months of the gross domestic product (GDP) in 1995.5 The years between 1968 and 1992 were characterized by a shift from an industrial economic base to a service economy. From 1970 to 1990, employment increased by 47 percent, but manufacturing employment increased by only 2 percent, while the number of jobs in service industries rose by 92 percent. Of the 18 million new jobs created between 1980 and 1990, 88 percent were in services, finance, and sales. Factory employment declined during this period, although manufacturing productivity and output increased.6
For displaced workers of the industrial labor force, the result was unemployment and increased economic uncertainty. The expanding service sector had historically been a major source of employment for women and ethnic minorities, albeit one with generally low wages and poor working conditions. In 1990, 45 percent of the workforce was female, but 61 percent of those employed in services were women; 10 percent of those in the workforce were African Americans, 12 percent of those in services. In the lowest-paying sector of the service industry, personal services and hospitals, about 75 percent of the employees were female and 15 percent black; less than 10 percent were white men. The new service jobs were likely to carry lower wages and be subject to involuntary temporary, part-time, and part-year employment. These workers were less likely to receive pensions or health care through their employer. As a result, the proportion of workers who did not receive employer-provided health care, but could not afford individual policies, increased.
The Changing Family
Conservatives were right that family life was changing in important ways during the 1970s and 1980s, although they erred in seeing all of the trends as negative.7 At the start of the 1990s, family life was very different from the typical domestic unit for which our social programs had been designed 55 years earlier. In 1935, at the time of the passage of the Social Security Act, a majority of families consisted of a mother who was a homemaker, a father who went to work, and perhaps three children. In contrast, the average household in 1990 was no longer a two-parent home with several children. This description applied to less than 30 percent of American households and 38 percent of families. Only 13 percent of families included two parents, one wage earner, one homemaker, and one or more children. The median age of first marriage rose during the 1980s, and single-person households increased as young adults were likely to spend a number of years living on their own before they married. The continuing popularity of marriage was countered by rising divorce rates.
The number of female-headed households rose sharply, with the fastest increase during the 1970s. In the 1980s, the increase in families headed by a single woman or by a grandmother continued, but at a slower pace. By 1990, 51 percent of all black children under age 18, 27 percent of Hispanic children, and 16 percent of white children were living with just one parent. In 20 years, the proportion of families that were headed by women had doubled. The number of such families increased by 75 percent among African Americans and by 106 percent among whites. Because absent parents rarely provided sufficient child support, the children in these households were being supported largely by their mothers.
The average size of the U.S. household decreased about 21 percent from 1970 to 1990. However, this drop was not distributed equally throughout the population. There were more people who never married, more families that had no children, and more families that had just one child. In 1990, half of all American families had no current child-rearing responsibilities. There were also families that continued to have three, four, or five children. The result was that 80 percent of children were supported by only 30 percent of the adult population. This unequal distribution of the task of supporting children had a racial dimension. The birthrate was significantly higher in the black community than in the white community.
The distribution of children had political implications. The sentimental view of mothers and children that had fueled maternalism earlier in the century was in decline. Single mothers were no longer objects of pity. Rather, they were often seen as at fault and were expected both to work and to rear their children. As a result, it became harder to obtain money for programs to support women and children, whose needs were now seen as less universal than in the past. One indication of this effect was an increasing reluctance to support public education in some parts of the country.
The battle over the government’s role in helping families was seen in the effort to guarantee family leave for new parents. The Democratic-controlled Congress passed the Family and Medical Leave Act in 1990, but George H. W. Bush vetoed the bill, and Congress failed to override his veto. That bill would have required employers with 50 or more employees to provide unpaid leave to new parents as well as to workers who needed to care for a sick family member. At the time, about 15 states had legislation protecting jobs for workers who needed extended time away for family or medical reasons, but for most American families, there was no parental leave. The Family and Medical Leave Act finally became the first legislation signed into law by President Clinton in 1993.8
Many Americans had fond memories of the “traditional” breadwinner family of the 1950s. Even though American families were better off because of women’s earnings, and men and women shared the responsibilities of parenthood more equally, nostalgia prompted many Americans to decry the changes in family life. As a result, efforts by the Carter administration to reach a consensus on “family policy” foundered. Debates over abortion, equal rights, and the proper role of the federal government found no middle ground during these years.
In 1935, when the Social Security Act was passed, persons 65 years of age and over represented 6 percent of the total population. Between 1970 and 1990, this percentage increased from 9.7 to more than 12. The proportion of the working population—persons between 18 and 64 years of age—declined. Falling birthrates, combined with increased life expectancy, led to a new population pattern and a new set of social welfare needs. They also led to many concerns about the dependency ratio—the ratio between workers and nonworkers in society. Although increasing productivity more than compensated for the increasing dependency ratios, many academic and political leaders called for radical changes in retirement and Social Security policies to respond to the new demographic realities.
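The dependency ratio discussed above can be made concrete with a small calculation. The sketch below (Python, with hypothetical population shares rather than census figures; only the 9.7 and 12 percent figures for the aged come from the text) shows how a growing share of older Americans raises the number of dependents each worker must support.

```python
# A minimal sketch (hypothetical population shares, not census figures) of the
# dependency ratio discussed above: dependents per 100 persons of working age (18-64).

def dependency_ratio(under_18: float, working_age: float, over_64: float) -> float:
    """Nonworkers (young plus old) per 100 working-age persons."""
    return 100 * (under_18 + over_64) / working_age

# Illustrative only: if the share over 64 rises from 9.7 to 12 percent (the
# figures cited above) while the share under 18 holds at an assumed 30 percent,
# the dependency ratio rises even though total population is unchanged.
print(round(dependency_ratio(under_18=30.0, working_age=60.3, over_64=9.7), 1))   # 65.8
print(round(dependency_ratio(under_18=30.0, working_age=58.0, over_64=12.0), 1))  # 72.4
```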
With increased numbers, the aging became more politically effective. They needed and demanded new programs and expanded benefits, even as the working population may have felt less able to provide them.9 This pressure was offset in part by the increase of women in the labor force. The proportion of women working rose sharply in the 20th century. In 1900, 20 percent of adult women worked; the proportion of adult women in the labor force increased from 49 percent in 1970 to 69 percent in 1990.10 The reality of women in the workforce and the increased expectation that women hold jobs outside the home had an impact not only on family social and economic status, but also on many social welfare programs. The social insurance system came under scrutiny for its treatment of working women. Issues of child care moved to the fore of public attention, while programs for poor women “not expected to work” became more suspect than in the past.
The decade of the 1970s brought a current of economic, social, and political malaise to American life. As in other eras leading to periods of reform, old values and old beliefs were scrutinized. But the frustrations and disappointments of the 1970s made “traditional” values more attractive to many Americans. Indeed, as distrust of government and of political leaders deepened and the economic situation worsened, the general citizenry seemed to lose hope that positive change would occur for the country as a whole; individuals and interest groups looked increasingly to their own concerns. As faith in collective action faded, the traditional American confidence in the efficacy of individual effort prevailed. Ronald Reagan won the presidential election of 1980 on the promise of a return to conservative values that would restore the country’s greatness.
The cost of high unemployment during the late 1970s and early 1980s fell disproportionately on low-income families. Their situation was further undermined by the budget policies of the Reagan administration. Although President Reagan failed to secure the large cuts in domestic programs he sought early in his administration, he did succeed in reducing federal income taxes, especially for those in high-income brackets. As a result, throughout the 1980s, there was great pressure on the federal government to reduce spending. The resulting scramble to protect existing programs meant that new social needs—such as the increased visibility of the homeless and the emergence of the AIDS epidemic—had a hard time securing the funding they needed. Increasingly, government returned to a “residual” approach to the problems of the poor. Public programs acted only as a tattered safety net for those who could not fend for themselves.
Poverty and Income Distribution
The American economy had hit the equivalent of the trifecta between World War II and 1970. Rapid economic growth and the expansion of social welfare programs had reduced both the poverty rate and overall income inequality across the nation. During the early 1970s, a number of commentators suggested that “the goal of eliminating income poverty as stated by President Johnson in 1964 had been virtually achieved before the onset of the 1974–75 recession.”11 If Americans felt like they had hit the jackpot during the early postwar years, the 1970s and 1980s made many Americans feel snakebitten. Suddenly, in the late 1970s, economic inequality began to increase, a trend that would persist for most of the next four decades.
The War on Poverty had led to a dramatic decrease in poverty in the late 1960s. During the 1970s, the poverty rate remained stable at about 11 to 12 percent, but rose to 15.2 percent during the Reagan recession of 1982–1983. Indeed, by 1990, 33.6 million people were counted poor—32 percent more than in 1970.
Moreover, the distribution of poverty was far from random. On the contrary, it was structured by race, ethnicity, sex, family situation, age, and employment status. For whites, the official poverty rate was about 11 percent in 1990; for Hispanics, 25 percent; for African Americans, 30 percent; and for Native Americans, 31 percent. The rate for households where the head worked full time was 4 percent; the rate for nonworking households, 25 percent.
Poverty among two-parent families decreased sharply while that of single-parent families rose. By the end of the period, 34 percent of one-parent, female-headed households were poor. During much of the 1970s, increased job opportunities and wage hikes helped. But much of the financial success was dependent upon the higher labor-force participation of women, of working wives and mothers. Obviously, two-parent (and possibly two-earner) families were in better financial shape, but the implications of having both parents working and out of the home remained unexplored. Major questions about marriage, family life, and child care remained unanswered.
The biggest decline in poverty was among the aging. Although Social Security had constructed a floor under retirement income, that floor still left nearly a quarter of older Americans living in poverty. While the national poverty rate remained stable during the 1970s and 1980s, the rate for older Americans was cut in half, reaching 12 percent in 1990. This dramatic decline in poverty was due largely to the expansion of Social Security and other programs for the aging during the early 1970s. Old Age, Survivors, and Disability Insurance (OASDI) benefits had increased by only 27 percent between 1959 and 1968; in the next three years, they increased by 52 percent. Starting in 1975, benefit increases were automatically indexed so that they kept pace with increases in prices. In 1972, Congress passed legislation that combined Old Age Assistance, Aid to the Blind, and Aid to the Permanently and Totally Disabled to create Supplemental Security Income (SSI), a means-tested income transfer program guaranteeing a minimum income for the elderly and disabled, which began operation in January 1974. Where benefits and eligibility had been left to the states in the old programs, SSI was federalized, with a uniform level of benefits and eligibility criteria across the nation, although some states decided to add a supplement to federal benefits. Furthermore, SSI benefits, like social insurance benefits, were indexed to keep pace with inflation. Automatic increases in transfer payments continued to help this group when food, heat, and housing costs did not outstrip gains. Nevertheless, the aging continued to be vulnerable to poverty. Many, if not “in poverty,” lived close to poverty. More than 20 percent of the aged lived below 125 percent of the poverty line.
At the same time, changes in family structure placed more children at risk of poverty. In 1959, the poverty rate for children under the age of 18 was 26 percent, about one-quarter higher than that for the population as a whole. During the 1960s, the poverty rate for children fell faster than that for the entire population, to 15 percent in 1970. However, over the next 20 years, much of this improvement was lost. By 1990, the child poverty rate stood at 20 percent, nearly 1½ times the total population figure of 13.5 percent.
A number of social realities increased the risk of poverty for American children. Changes in fertility rates meant that poorer groups in the population were having a larger share of all children. In addition, increased rates of divorce and single parenthood meant that a larger proportion of children relied on the income of one parent. Finally, beginning in the middle of the 1970s, states began a concerted effort to restrict Aid to Families with Dependent Children (AFDC), using a variety of administrative barriers to restrict the growth of the welfare rolls and failing to adjust benefits for inflation, which reduced the real value of benefits by more than a third. African American and Hispanic children suffered the most from all of these changes. By 1990, nearly 40 percent of them were living in poverty.
For Native Americans, the years of Presidents Nixon, Ford, and Carter were a mix of progress and retreat. There were improvements in health services and education. There was even some political representation in Western states. But the loss of land was still in progress, with only minimal repayments.
Reagan administration policies hit Native Americans very hard. Unemployment on the reservations reached 80 percent during the early 1980s. Partly in response to the leasing of tribal reservation land to white families, partly in response to the high unemployment rates and lack of opportunities on the reservations, many Native Americans migrated to urban areas. They still fared very poorly. High unemployment rates and very low incomes were typical. Native Americans were not eligible for state social services or support. They had to rely on federal programs for job training, housing assistance, and child welfare services. Thus, the social service cutbacks of the Reagan years affected this population with particular force. Poverty rates reached as high as 57 percent in some parts of the West.12
Frustration over the failure to eradicate poverty was aggravated by disagreement on how to define it. Measuring poverty as an absolute dollar amount, as with the annually computed poverty index, was increasingly questioned. In 1959, the poverty line for a family of four was about one-half the median family income ($2,793 versus $5,417). As the growth of family incomes outstripped inflation, the gap between the two increased. In 1990, the median family income of $35,353 was nearly three times the poverty line for a family of four ($12,293). The deprivation of the poor resulted both from their lack of income and from the fact that they were falling behind the experience of average families.
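The figures just cited can be checked directly. The short Python sketch below uses only the dollar amounts quoted in the text to show how the poverty line slipped from about half of median family income in 1959 to roughly a third in 1990.

```python
# A quick check of the arithmetic above, using the dollar figures cited in the
# text: the poverty line for a family of four as a fraction of median family
# income in 1959 versus 1990.

figures = {1959: (2_793, 5_417), 1990: (12_293, 35_353)}  # (poverty line, median income)
for year, (poverty_line, median_income) in figures.items():
    ratio = poverty_line / median_income
    print(f"{year}: poverty line = {ratio:.0%} of median family income")
# 1959: 52% of the median; 1990: 35% -- the median family earned nearly three
# times the poverty line, so the poor had fallen further behind the average.
```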
In the decades after World War II, income distribution in the United States became less equal. As Table 8.1 shows, most of this change occurred after 1980. It happened despite economic growth, a War on Poverty, an increase in the labor-force participation of women, and an expansion of social insurance benefits and social services. Control of income and of wealth in the United States became more concentrated at the top. The growth of wealth in the top income quintile, increased inequality of wages, rising regressive Social Security and state and local sales taxes, and a decrease in the progressivity of federal income taxes all played their part. A combination of market factors and government policy led to a situation in which the rich became richer and the poor, poorer.13
Table 8.1 Distribution of Income (Percentage of National Income)

Income Class          1947    1980    1990
Lowest quintile        5.1     5.1     3.9
Second quintile       11.8    11.6     9.6
Third quintile        16.7    17.5    16.0
Fourth quintile       23.2    24.3    24.1
Highest quintile      43.2    41.5    46.4

Source: Author’s calculation from Current Population Survey, various years.
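One way to summarize the shift visible in Table 8.1 is with a single inequality statistic. The following sketch (not from the text) computes an approximate Gini coefficient from the quintile shares; because it ignores inequality within quintiles it understates the true figure, but it makes the timing of the change plain.

```python
# A minimal sketch summarizing Table 8.1 with an approximate Gini coefficient,
# computed from the quintile shares via a trapezoidal approximation of the
# Lorenz curve. Within-quintile inequality is ignored, so the level is an
# underestimate; the post-1980 trend is the point.

def gini_from_quintiles(shares):
    """Approximate Gini from five quintile income shares given in percent."""
    cum = [0.0]
    for s in shares:                     # build cumulative income shares
        cum.append(cum[-1] + s / 100.0)
    # Trapezoidal area under the Lorenz curve; each quintile spans 0.2 of the population.
    area = sum(0.2 * (cum[i] + cum[i + 1]) / 2 for i in range(5))
    return 1 - 2 * area

for year, shares in [(1947, [5.1, 11.8, 16.7, 23.2, 43.2]),
                     (1980, [5.1, 11.6, 17.5, 24.3, 41.5]),
                     (1990, [3.9, 9.6, 16.0, 24.1, 46.4])]:
    print(year, round(gini_from_quintiles(shares), 3))
# 1947: ~0.35; 1980: ~0.34; 1990: ~0.40 -- nearly all of the rise comes after 1980.
```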
Property and other assets were even less equally distributed than income. Whereas the top 1 percent of families controlled 4 percent of all income in 1988, they controlled 20 percent of all net financial assets, a larger share of the nation’s wealth than the bottom 80 percent of the population. The economic gap between the races was more dramatic for wealth than income. African American married couples had a family income in 1988 that was about 80 percent that of white couples, but the net worth of the average black couple ($17,437) was just 27 percent that of white couples ($65,024).14
Innovations in Social Welfare
Expenditures for Social Welfare
Total federal expenditures for social welfare continued to expand during the 1970s and 1980s, but the increase was not steady across the period. Government expenditures for health, education, and welfare rose from $3.9 billion in the pre-Depression year of 1929 to $145.6 billion in 1970, and then almost doubled to $290 billion by 1975. By 1978, expenditures stood at $394 billion. The difference between 1970 and 1978 was almost $250 billion. Some of the increase was due to inflation and some due to population growth. But, in 1987 prices, per capita welfare expenditures rose from $698 in 1950 to $1,952 in 1970 to $3,364 in 1978. In 1990, as in 1975, social welfare expenditures were 19.1 percent of GDP and 56.6 percent of total government outlays. The early 1970s represented a high point in the expansion of federal spending on social welfare, as Social Security, public assistance, food, and community development programs all gained widespread support in Congress. After the middle of the 1970s, however, public support for welfare expansion declined. Only programs that included automatic cost-of-living increases were able to keep pace with the high inflation of the period.
The dualism of the American social welfare system—the split between generous programs for the worthy and punitive ones for the undeserving—expanded during these years. About half of the $1,045 billion spent for social welfare purposes in 1990 went for non–means-tested social insurance and government retirement programs (programs for all income groups, not only the poor), and 25 percent went for education. Less than 16 percent went to programs particularly targeted to the poor: public assistance, Medicaid, food stamps, and some service and employment training programs.
The conservative resurgence did reverse the trend toward the federal government accounting for a larger and larger share of public expenditures. In 1929, the federal government contributed 21 percent of total governmental outlays for social welfare. By 1970, the federal share had risen to more than 50 percent. It peaked at 63 percent in 1984 and, by 1990, had dropped to just under 60 percent. With expenditures for education excluded, the federal government’s share of the cost of public social welfare in 1990 was 76 percent.
The pressure to limit the federal budget highlighted the difference between entitlement programs—for which the government committed itself to providing funding for all those eligible—and discretionary programs—for which there were no such guarantees. In large measure, the increases in federal expenditures for social welfare were due to outlays that were not subject to administrative discretion. Outlays for the social insurances, other governmental health and retirement programs, and public assistance programs were mandated by law.
The persistent effort to limit social welfare expenditure also meant that government was less ready to meet new social problems as they emerged. Homelessness, although not an entirely new problem, became more common and visible during the 1970s and 1980s as cuts in general assistance, the increasing cost of housing, and deinstitutionalization all increased the number of persons without a fixed residence. Efforts to address the problem typically involved voluntary agencies and local government until the passage of the McKinney–Vento Homeless Assistance Act of 1987, which provided funding for a variety of services. The halting commitments of the 1980s focused on providing temporary shelter for homeless individuals and families, and shelters remained the dominant response to homelessness for the next three decades.15
The health emergency associated with the human immunodeficiency virus (HIV), too, suffered from the budget pressures of the period. First identified among intravenous drug users and gay men in the early 1980s, HIV/AIDS, without any known treatment or cure, quickly spread to other groups in the population. Still, its association with “undeserving” populations, combined with the antigovernment rhetoric of the 1980s, served to slow the government’s response to the epidemic.16
Although “cutting welfare” remained a popular goal, identifying who and where to cut remained a challenge. The efforts of Richard Nixon in the budgets of 1974 and 1975 to eliminate social welfare programs that he deemed unsuccessful and wasteful met with limited success.17 The campaign to abolish the Office of Economic Opportunity (OEO) was instructive. OEO itself was abolished, but its major programs—the community action programs, community legal services, and Head Start—were assigned to other agencies and continued with budgets essentially uncut. In the case of Head Start, the budget was even increased. Gerald Ford also discovered that efforts to cut programs often failed. Notwithstanding the president’s opposition, Congress refused to postpone salary increments for federal employees and resisted limiting the increase in veterans’ benefits to the levels suggested by the Ford administration, even when these measures were touted as anti-inflationary. President Carter found it necessary to yield to congressional pressures against withholding funds for a variety of local public works and to respond to the demands of black organizations and their leaders that funds for the employment of black youth be restored. The Reagan administration’s efforts to restructure and cut back some Social Security benefits were unsuccessful, as were its protracted efforts to eliminate community legal services. President Bush continued the effort to reduce health, education, and welfare benefits, but constituencies for social welfare measures were organized and acted effectively.
The Food Stamp program did not suffer from efforts to limit its growth. Starting with a relatively small appropriation of $550 million in 1970, it reached $17.7 billion in 1990.18 The program was funded entirely by the federal government. Its focus on reducing hunger, its administrative home in the Department of Agriculture, and the continued support of agribusiness and farm-state legislators assured its popularity. As a fully federally funded program, food stamps—in contrast to AFDC—did not provoke opposition from governors and state legislatures.
Challenging the Welfare State: Welfare Reform
In 1935, when provision was made in the Social Security Act for aid to dependent children, the conviction held that mothers, despite their poverty, should remain at home with their children. The 1967 amendments to the Social Security Act officially reversed this historic policy, reflecting the extent to which mothers of young children had moved, in public thinking, from unemployable to employable. Congress increased appropriations for day-care programs and instituted the Work Incentive Program (WIN). WIN required that an assessment be made of the employability not only of unemployed fathers and out-of-school older children, but also of mothers. The transfer of administrative responsibility for OEO “work experience” programs from the Department of Health, Education, and Welfare to the Department of Labor further clarified the new congressional direction. In 1971, Congress moved again to increase the work incentives for welfare recipients. Ideological support for this new approach came from the House Ways and Means Committee, which extolled as a primary virtue of day care its ability to free mothers for work:
Your committee is convinced that . . . the child in a family eligible under these programs will benefit from the combination of quality child care and the example of an adult in the family taking financial responsibility for him.19
Efforts to reform welfare during the 1970s and 1980s were caught in social and policy crosscurrents. For a time, the concept of a negative income tax (NIT)—in which low-income families would be guaranteed a minimum annual income—commanded broad support across the political spectrum. Yet, arguments about how high to set the minimum effectively blocked legislation from moving forward. Although both liberals and conservatives supported expanding the work opportunities of welfare recipients, conservatives were more interested in using “workfare” as a disincentive to keep people off the rolls, whereas liberals favored the expansion of job training and education as a means of making recipients more employable.20
The best opportunity for a shift toward a national NIT system occurred during the early 1970s, after President Nixon’s address to the nation about his welfare reform plan in August 1969.21 For those “American families who cannot care for themselves in whichever state they live,” a new program, the Family Assistance Plan (FAP), was proposed as a substitute for AFDC. “Workfare” would replace welfare. An NIT mechanism was designed to set a floor on income while still encouraging people to work. The FAP included a national minimum income, although it was set so low that only states with extremely low AFDC payments would benefit. Where working adults were involved, the subsidy was to be reduced as earned income went up, until a “breakeven point” in total income was reached. The program would, therefore, be available to the working as well as the nonworking poor. Mothers with very young children would be permitted to remain at home. In immediately employable families, however—that is, in two-parent families or in single-parent families with school-age children—financial aid would be conditioned on the willingness of at least one adult to accept training or employment. In the president’s words, FAP coupled “basic benefits to low-income families with children with incentives for employment and training to improve the capacity for employment of members of such families.” In essence, workfare became the ideological base for different levels and plans of support for different groups of recipients. For poor families judged employable, inadequate grants were to be the incentive to work.
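The NIT mechanism at the heart of FAP can be expressed as a simple formula: the benefit equals a guaranteed minimum minus some fraction of earnings, and it phases out entirely at the breakeven point (the guarantee divided by the benefit-reduction rate). The sketch below uses illustrative numbers, not FAP’s actual schedule, which also included earnings disregards.

```python
# A minimal sketch (illustrative numbers, not FAP's actual schedule) of the
# negative income tax mechanism described above: a guaranteed minimum that is
# reduced as earnings rise, phasing out at the "breakeven point."

GUARANTEE = 1600.0        # assumed annual floor for a family with no earnings
REDUCTION_RATE = 0.5      # assumed benefit-reduction rate on earned income

def nit_benefit(earnings: float) -> float:
    """Benefit payable under the sketch: the guarantee less a share of earnings."""
    return max(0.0, GUARANTEE - REDUCTION_RATE * earnings)

breakeven = GUARANTEE / REDUCTION_RATE   # earnings level where the benefit hits zero
for earnings in (0, 1000, 2000, breakeven, 4000):
    total = earnings + nit_benefit(earnings)
    print(f"earnings ${earnings:>6.0f}  benefit ${nit_benefit(earnings):>6.0f}  total ${total:>6.0f}")
```

Because the benefit falls by only a fraction of each earned dollar, total income always rises with work, which is how the design set a floor on income while still encouraging employment.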
The Commission on Income Maintenance Programs, appointed by President Johnson, released its report in November 1969, four months after President Nixon’s message on reform in welfare. Like the Nixon plan, the commission’s recommendations involved an NIT, although benefits were more generous. Although its policy recommendations were similar to Nixon’s, the commission relied on a more structural explanation of poverty to justify them. The commission wrote:
It is often argued that the poor are to blame for their own circumstances and should be expected to lift themselves from poverty. The Commission has concluded that these are incorrect. Our economic and social structure virtually guarantees poverty for millions of Americans. Unemployment and underemployment are basic facts of American life. The risks of poverty are common to millions more who depend on earnings for their income. . . . The simple fact is that most of the poor remain poor because access to income through work is currently beyond their reach.22
The legislative battles over Nixon’s Family Assistance Plan marked either a missed opportunity or a critical turning point in the history of welfare policy. FAP was debated throughout 1970, 1971, and much of 1972. Although both parties were dissatisfied with the existing AFDC program, they could not agree on the nature of its problems or the contours of a solution. Ultimately, while moderate Democrats and Republicans sought to create a compromise that would become law, conservatives and liberals refused to support the package. Conservatives like Russell Long, senator from Louisiana and chair of the Senate Finance Committee, pushed for tighter work requirements and lower benefits, while liberals sought more generous payments. The National Welfare Rights Organization—a grassroots organization of welfare mothers and social workers—pushed liberals to resist a compromise that would attract moderate support. Eventually, Nixon himself abandoned welfare reform. In its place, a number of piecemeal measures became law, the most important of which was the creation of Supplemental Security Income (SSI) to replace the categorical programs for the aged, blind, and disabled. The new federal program became law in 1972 and began operation in 1974.
SSI marked a major policy success in addressing continuing poverty among older Americans and people with disabilities. In place of a patchwork of state programs with different benefit levels and eligibility requirements that combined state and federal funding, SSI provided a uniform program that was federally funded. The program was designed to supplement old age and disability insurance. Indeed, beneficiaries who were eligible for both Social Security and SSI typically escaped poverty, although many SSI recipients—including children and those who immigrated as older adults—did not qualify for the social insurance program.23
However, public interest in poor families receded once the most favored of the poor had been cared for. Pressure from the states for reform in welfare lessened as the elimination of the Old Age Assistance, Aid to the Blind, and Aid to the Disabled categories of public assistance lowered demands on state treasuries. Pressure from the elderly and disabled diminished with SSI’s guarantee of income. Additionally, the introduction of “income disregards” as a work incentive, and the expansion of AFDC–UP and of in-kind programs that reduced inequalities between the working and nonworking poor, lowered the push for change. In the absence of comprehensive reform, smaller changes like these reshaped the policy landscape.
President Carter introduced a more elaborate NIT plan, the Better Jobs and Income Program, in 1977. It was similar to the plan proposed earlier by Nixon: a federal guarantee of minimum income, benefit scales that provided work incentives, and planned job training and child care. The Carter proposal moved beyond the Nixon FAP in that its coverage would have included individuals as well as families, and in its provision of jobs. But, like the Nixon proposal, Carter’s plan failed to receive congressional approval.
The failure to reach consensus on welfare reform during the 1970s was attributed to several factors. One argument concerned whether to pursue an incremental or a comprehensive strategy. Incrementalism was meant to improve the current system without changing its categorical nature and its divided administrative structure. Proponents of incrementalism pointed not only to the reality of a system already in place but also to the greater likelihood that piecemeal change could be achieved. Proponents of comprehensive reform took the many criticisms of public welfare to mean that a new package of programs, integrated toward common purposes, had to be devised.24
The Carter administration had hoped to achieve reform at no increase in costs beyond those of current public welfare programs. This goal had to be abandoned almost immediately. The jobs proposal became a source of difficulty, too. Some criticized the plan for establishing the federal government as a true “employer of last resort.” And as with the Nixon plan, there was sharp disagreement about the level of grants. Political agreement among interest groups was impossible to come by.25
The historical opportunity for creating an NIT system had passed by the end of the 1970s. Welfare had become caught in the vortex of the battle over the “traditional family.” Public support of single mothers through AFDC became a convenient target for conservatives who objected to the new values around gender equality and sexuality that were spreading through American society. At the same time, the findings from income maintenance experiments in New Jersey, Denver, and Seattle failed to put to rest concerns over the impact of higher welfare payments on family stability and work effort. Finally, the new reality of married women’s involvement in the workplace undermined a welfare system based on the assumption that women’s place was in the home.26
Still, the Family Support Act of 1988—passed after arduous negotiations between Congress and the Reagan administration—should have laid the basis for a new approach to welfare. Senator Daniel Patrick Moynihan of New York, who earlier in his career had sounded the alarm over the problems of African American families as part of the Johnson administration and had designed the FAP as a domestic policy advisor to Nixon, championed the Family Support Act. For social conservatives, the 1988 law included increased requirements for paternity establishment and child support enforcement. All welfare recipients except those with very young children were required either to find employment or to make efforts to increase their self-sufficiency through education or training. To back up the work requirements of the Family Support Act, Congress made new funding available for training and education through the Job Opportunities and Basic Skills (JOBS) program. In addition, states were required to provide transitional child care, transportation, and health care benefits to women who had moved from welfare to work.27
Yet, historical timing was not on the side of the Family Support Act. Within a year of its passage, the United States had fallen into recession. As unemployment increased, the number of welfare recipients began to rise for the first time since the early 1980s. The states found themselves having to contend with a 40 percent increase in welfare recipients as the total number of individuals on AFDC and general assistance increased from 10.4 million in 1987 to 14.9 million in 1992. Faced with the increased welfare rolls, the states largely failed to comply with mandates around transitional transportation and child care and refused to commit the matching funds that were necessary to make use of the JOBS funding. By the early 1990s, the rise in recipients set off a backlash at the state level—tied to reducing the cost of the program—that shifted the welfare reform debate in a new, more conservative direction.
Changes in family life and in the labor-force participation of women raised questions about the social insurance program, just as they had about the AFDC program. The social insurance program—old age insurance—was designed for a nuclear family with two parents and one (male) wage earner. In an era of widespread marital instability, problems arose for nonemployed women in a system where benefits derived from the income of the employed spouse and were not portable. Employed women, too, appeared dissatisfied with the way in which benefits were calculated, feeling that there were often inequities.28
Yet, there was no way to improve the treatment of women without increasing the cost of the program. A full recognition of women’s earnings through a system of “earnings sharing,” which gave women credit for both their own and their spouse’s contributions, would benefit many working wives; but to keep the system revenue neutral, the benefits of stay-at-home spouses would have to be cut. With most of the attention during the 1980s focused on the fiscal soundness of the Social Security funds, increasing the system’s gender equity went unaddressed.29
Child Welfare and the Aging
The rejection of welfare reform set federal policies for children and older Americans on divergent trajectories. In place of a system of income support that would reduce child poverty, Congress showed increased interest in the treatment of abused and neglected children. At the same time, the elderly benefited from increased funding for income support programs—even in the face of public concern about their cost—and expanded services.
The perceived success of Head Start during the War on Poverty provided momentum for the expansion of federal involvement in early childhood education. The Comprehensive Child Development Act of 1971, which was passed by both houses of Congress, would have provided significant funding for child care and set national standards for child-care workers. Although the bill would have made it easier for poor mothers to seek employment, it became a lightning rod for conservative concerns about an overreaching federal government. When President Nixon vetoed the bill, he suggested that it would cause the “Sovietization” of child-rearing in the United States.30
Still, the years between 1968 and 1992 were a period of increased federal involvement in child welfare. The identification of the “battered child syndrome” by Henry Kempe in 1962 had sparked interest in child abuse and led to the passage of mandatory reporting legislation in all 50 states by 1968. In 1974, Congress passed the Child Abuse Prevention and Treatment Act (CAPTA), which provided the first federal funding for child protective services. These efforts were expanded by the Adoption Assistance and Child Welfare Act of 1980, through which the federal government took a more active role in shaping child protective services and in preventing “foster care drift,” in which children found themselves without a plan for either a return to their parents or another permanent home.31
The increases in Social Security benefits legislated in the late 1960s and early 1970s transformed old age insurance into a true “retirement wage” that rapidly reduced poverty among older Americans. Yet, the combination of higher benefits indexed to prices, high inflation, and slow wage growth (which reduced the flow of taxes into the system) generated an incipient fiscal crisis for Social Security.
The 1979 Advisory Council on Social Security concluded that our social insurances were basically sound.
The Council is unanimous in finding that the social security system is the government’s most successful social program. It provides basic retirement, disability and survivorship protection, which American workers can supplement with their own savings and private pensions, and it will continue to provide this protection for as far ahead as anyone can see.
After reviewing the evidence, the Council is unanimously convinced that all current and future social security beneficiaries can count on receiving all the benefits to which they are entitled.32
Yet, the fiscal problems of the system continued to mount. Despite some alteration in benefit calculations during the Carter administration, inflation and rising unemployment led to deficits in the OASDI funds in 1978, 1979, and 1980. In June 1980, the trustees recommended that the retirement trust fund be permitted to borrow from the other trust funds. For the longer run, there was widespread concern about the distribution of payroll tax receipts among the health funds, the disability fund, and the retirement system, and about a potential need to finance the system in part from general revenues if benefits were not to be cut below planned levels. The Advisory Council recommended that health insurance be financed from general revenues, thus preserving for OAI the image of “contributions” and “earned entitlements” to retirement income. Conservatives proposed cuts in benefits and increases in the retirement age as ways of “saving” the Social Security system.
President Reagan’s initial reaction was to recommend major cuts in Social Security, including reductions in benefit payments for early retirement, elimination of the minimum benefit, and a postponement of the annual cost-of-living adjustment. In contrast to its acceptance of other Reagan welfare cuts, Congress almost unanimously rejected his Social Security proposals. To defuse the controversy, Reagan appointed a bipartisan National Commission on Social Security Reform in 1981 to examine the entire social insurance system and make recommendations that would lead to “long-term solvency” and ensure both “financial integrity” and “the provision of appropriate benefits.”33
The commission considered a variety of approaches to the system’s problems, but it ultimately recommended a set of incremental proposals that did not alter the basic elements of the retirement program. It recommended, first of all, that “Congress . . . not alter the fundamental structure of the . . . program or undermine its fundamental principles.” It was to remain a compulsory, non–means-tested, contributory program, which gave low-income workers a better return on their contributions than those who were higher up on the wage scale. To meet fiscal need, it proposed a partial taxation of higher-income retirees’ benefits, a gradual increase in the retirement age (to 67), and increases in the contribution rate.34
The 1983 Social Security reform law sought to eliminate the short-term financial problems with the system and to shift tax policy to build up the trust funds with an eye toward the retirement of the baby-boom generation early in the 21st century. The balance went up immediately and the assets of the Old Age and Survivors Insurance (OASI) trust fund grew rapidly. At the time, projections were that the trust fund would rise to $12 trillion by 2030, the height of the impact of the retirement of baby boomers.
The Social Security Act, as originally legislated, provided work-related benefits much in the way such benefits would have been provided through private insurance. The notion of equity lay in the tie among wages, tax contributions, and benefits. Amendments to the act steadily altered this approach to reflect need and adequacy as well as equity. The passage of the SSI program in 1972 provided a means-tested income transfer program that guaranteed a minimum income for the elderly and disabled. Furthermore, SSI benefits, like OAI benefits, were indexed to keep pace with inflation. This change made it possible to rethink the extent to which the insurance programs should be used to redress inequities in wage and employment patterns. Increasingly, policy focused more on the “health” of the plan’s finances and less on the income adequacy of the program.
While Social Security was attracting widespread attention from policy makers and the public, a quieter revolution was expanding the public costs of so-called “private” pensions. Economic slowdowns sent many companies into bankruptcy, often taking their employees’ pension plans with them. These “defined-benefit” plans were generally funded and controlled by the companies themselves. Although companies received generous tax benefits for creating the programs, there was little government oversight of the plans’ financial security.
In response to several well-publicized company bankruptcies, including that of the Studebaker automobile company, Congress in 1974 passed the Employee Retirement Income Security Act (ERISA), which set out new regulations for pension plans and established a quasi-governmental insurance system—the Pension Benefit Guaranty Corporation—to cover losses associated with the termination of defined-benefit plans. However, the real impact of the events of the early 1970s was to reorient pensions from defined-benefit plans—in which employers promised specific pensions for workers based on their years of service—to defined-contribution plans—in which workers contributed to individual accounts, sometimes with employer matches. Subsequent revenue acts developed a variety of special accounts—of which Individual Retirement Accounts and 401(k) accounts were the most popular—to facilitate this shift toward private pension accounts. Although these plans were attractive to workers, they required workers to assume most of the risk associated with retirement planning. Furthermore, except for the highest-paid workers, these plans rarely assured adequate retirement income.
The implications of these changes were not immediately apparent. In contrast to the very public debate over the future of Social Security, these “private” accounts typically passed “below the radar.” Although they involved trillions of dollars in tax breaks, public policy analysts found it nearly impossible to assess their full implications. Over time, however, it became clear that the bulk of the tax breaks associated with these plans flowed to the richest Americans and bypassed those with greater need. Eventually, even without efforts to “privatize” Social Security, American retirement security began to tilt strongly toward private plans, increasing income inequality among older Americans.35
The issues were joined in the discussions about the provision of health benefits and long-term care. Medicare was designed to provide the aged with prepaid hospital insurance financed by payroll taxes and a medical insurance program to pay doctors’ bills, outpatient hospital care, and some additional health services, financed by a combination of recipient payments and general tax revenues. It was intended to be a non–means-tested program.
In order to pass Medicare and Medicaid in the 1960s, Congress had set rules that gave hospitals and health providers few incentives to contain costs. Between 1965 and 1990, health care expenses as a percentage of GDP more than doubled, reaching 12 percent in 1990. With the rise in health care costs, efforts to contain expenses increased. In 1983, Congress placed a lid on Medicare reimbursements for hospital stays with the introduction of the diagnosis-related groups (DRG) billing system; patients were being discharged from hospitals “quicker but sicker.” The concept of approved reimbursement rates was extended to physicians’ services, and in 1988, 37 percent of all Medicare-approved physicians were accepting Medicare-approved rates for fees. Unfortunately, there was evidence that patients might be experiencing cost shifting and restricted access to medical care. Premium increases and benefit cuts combined to increase older Americans’ out-of-pocket health care costs. In 1980, the aged used 13 percent of their income for health care; that figure rose to 18.5 percent by 1990.36
In addition to cutting services, the effort to control public payment for health care for the aging meant placing a larger share of the cost on middle- and upper-income workers and beneficiaries. Thus, in 1990, while the maximum wage taxed for retirement, survivors, and disability insurance was set at $51,300 annually, workers with incomes up to $125,000 paid a Medicare tax of 1.45 percent. In 1988, Congress passed the Medicare Catastrophic Coverage Act, which extended hospital care for all under Medicare, to be paid for by a surtax on the incomes of middle- and upper-income aged. The law was repealed just one year later, when older Americans realized that it did not cover long-term custodial care and placed a substantial tax on a small part of the population.
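The two-cap structure can be made concrete with a bit of arithmetic. The sketch below is a minimal illustration, not an official formula: the $51,300 and $125,000 wage caps and the 1.45 percent Medicare rate come from the figures above, while the 6.2 percent OASDI employee rate is our assumption about the period and is not stated in the text.

```python
# Minimal sketch of the 1990-era payroll tax structure described above.
# The wage caps and the 1.45% Medicare rate are from the text; the 6.2%
# OASDI employee rate is an assumption, not a figure given in the text.
OASDI_RATE, OASDI_CAP = 0.062, 51_300         # retirement, survivors, disability
MEDICARE_RATE, MEDICARE_CAP = 0.0145, 125_000

def employee_payroll_tax(wages: float) -> float:
    """Each component of the tax applies only up to its own wage cap."""
    oasdi = OASDI_RATE * min(wages, OASDI_CAP)
    medicare = MEDICARE_RATE * min(wages, MEDICARE_CAP)
    return oasdi + medicare

# A worker at the OASDI cap and a worker at the Medicare cap pay the same
# OASDI tax, but the higher earner pays Medicare tax on every dollar up to
# $125,000 -- shifting more of the health care cost to upper incomes.
print(round(employee_payroll_tax(51_300)))   # ~3,924
print(round(employee_payroll_tax(125_000)))  # ~4,993
```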
The increased number of Americans over the age of 80 made long-term care a pressing issue. Yet, in the absence of public insurance for nursing home care, the federal government backed into its policy. Increasingly, Medicaid—the health care program for low-income Americans—became the chief funder of nursing home care. However, to qualify for Medicaid, the elderly were required to “spend down” their assets. Often, senior citizens would see their life savings quickly evaporate before they became poor enough to receive Medicaid. In reality, we had a policy for the public funding of nursing home care, but it required some wrenching economic choices for older Americans and their families.
The Unemployed
Although the debate over welfare and the fiscal crisis of Social Security attracted the greatest headlines, some of the sharpest policy changes of the 1970s and 1980s involved provisions for the unemployed. Unemployment remained a problem throughout the period, largely because of the deliberate use of unemployment as a means of fighting inflation. Yet, even though government policy increased the number of unemployed Americans, the government took less responsibility for their economic security during the 1970s and 1980s than it had in earlier decades.
Changes in the workplace contributed to the decreasing effectiveness of unemployment insurance. The economic transition underway during the 1970s and 1980s meant that fewer of the unemployed experienced temporary layoffs, and more were displaced from their jobs. In addition, a greater share of Americans were working in part-time or temporary jobs. Many who held these jobs failed to qualify for unemployment insurance when they lost them.
The sharpest changes in the unemployment system occurred in the early years of the Reagan administration. Before 1981, when the national unemployment rate went above a certain level, all of the eligible unemployed were entitled to collect unemployment for an additional 13 weeks after their 26 weeks of basic coverage expired. As part of his “economic recovery” plan in 1981, President Reagan proposed and Congress enacted limits on the availability of extended benefits except in individual states that were experiencing high unemployment.
The combination of a changing labor market and deliberate government policy took its toll on the unemployed. In 1980, 44 percent of the unemployed received unemployment compensation. By 1985, the percentage had fallen to 32 percent. When the economy again fell into recession at the end of the decade, less than a third of the unemployed were eligible for unemployment compensation.37
At the same time, public employment—another means of reducing unemployment—was shrinking. During the early 1970s, Congress passed the Comprehensive Employment and Training Act (CETA), which, in addition to providing funding for job training, funded public service employment for city governments. During the recessions of 1974–1975 and 1980, CETA public service employment provided many jobs and helped to cushion their impact in many metropolitan areas. Yet, the program was widely unpopular in Congress and was terminated in 1981. Cuts in federal funding for public service employment combined with the shrinking tax base of many American cities to severely reduce the availability of public employment. Because public employment was a significant source of white-collar jobs for African Americans in American cities, the cuts fell disproportionately on the black community.
The last line of defense for the unemployed—general assistance—was also in eclipse during these years. States eliminated or severely limited their programs. For example, in the early 1980s, Pennsylvania restricted general assistance benefits to a few months out of the year, a reform that many commentators tied to the increase in homelessness in the commonwealth. In 1990, 22 states had statewide programs; 17 more had programs in some counties; and in 6 states, there was a program of small emergency grants in some counties. In 7 states, there was no program at all. General assistance was concentrated in large cities and provided financial and medical assistance to low-income individuals and families who were not eligible for federally funded assistance programs. Maximum cash benefits ranged from a low of 5 percent of the poverty line in Charleston County, South Carolina, to a high of only 77 percent of the poverty line in Portland, Maine.38 Most of the programs emphasized work requirements for clients deemed employable, but the meaning of employability was unclear. First, were those classified as employable truly capable of sustaining employment? Second, even if they were employable, were they getting realistic help in a job search? Finally, for those who were not immediately employable, what kinds of training programs were to be provided, and who would be admitted?
Veterans
It was hoped that the coming together of women, black and white, rich and poor, on problems of discrimination might result in a common demand for change reflected in the welfare of poor families. Thirty-five years earlier, some observers thought veterans would play such a catalytic role. The 1956 report of the President’s Commission on Veterans’ Pensions analyzed the meaning of the special status accorded veterans:
Veterans and their families will eventually be a majority of the population of the United States. Veterans in modern times are better off economically than nonveterans in similar age groups.39
The 1956 commission pointed to the extent to which basic needs of “all citizens, veterans and non-veterans alike, for economic security are being increasingly met through federal, state and private programs.” In summary, the commission concluded, “Military service in time of war or peace is the obligation of citizenship and should not be considered inherently a basis for future Government benefits,” and “all veterans’ benefits should be meshed with the nation’s general security system.”
The commission’s report of 1956 wielded little influence. But the benefits awarded in 1966 to veterans of the Korean War (retroactively) and the Vietnam War suggested some policy shifts. Certainly, the educational benefits for these groups did not compare favorably with those provided for veterans in 1944. Veterans of Vietnam in particular paid the price of our unhappiness with that war.
Personal Social Services
The history of social services was a harbinger of the pressure to cut social welfare spending. Although social services had been included in Social Security since the 1950s, they had remained a small part of welfare spending through the 1960s. During the War on Poverty, however, a combination of looser federal regulation and states’ interest in shifting costs to the federal government led to an explosion in federal spending on social services. Between 1965 and 1972, for example, total outlays for social services increased from $159 million to $4 billion—a 19-fold increase when controlled for inflation. Much of this increase resulted from a broader interpretation of regulations that allowed spending on those “likely to become” dependent on welfare. In response, Congress added a new title to the Social Security Act in 1974—Title XX—that imposed a cap on federal spending on social services.40
The Reagan administration used its first budget in 1981 to change Title XX from a “capped entitlement” to a block grant (Social Service Block Grant) subject to the congressional appropriation process. By 2016, the appropriation for Title XX had fallen to $1.58 billion. When corrected for inflation, the federal government’s spending on social services had fallen by 93 percent between 1972 and 2016.41 The block grant mechanism became a means of limiting public spending on a variety of functions by making the federal commitment subject to the vagaries of budget debate.
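The 93 percent figure is straightforward deflation arithmetic. The sketch below reproduces it; the dollar amounts come from the text, while the CPI-U annual averages (roughly 41.8 for 1972 and 240 for 2016) are approximations we supply for illustration.

```python
# Rough check of the inflation-adjusted decline in federal social service
# spending. Dollar figures are from the text; the CPI-U averages (~41.8 in
# 1972, ~240.0 in 2016) are approximate values supplied for illustration.
CPI_1972, CPI_2016 = 41.8, 240.0

outlays_1972 = 4.00e9    # total social service outlays, 1972 (nominal $)
outlays_2016 = 1.58e9    # Title XX appropriation, 2016 (nominal $)

# Restate 1972 spending in 2016 dollars, then compare the two years.
outlays_1972_in_2016_dollars = outlays_1972 * (CPI_2016 / CPI_1972)  # ~$23B
decline = 1 - outlays_2016 / outlays_1972_in_2016_dollars
print(f"Real decline, 1972-2016: {decline:.0%}")  # ~93%
```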
Social Movements
The New Right
Conservative movements drew little support during the early postwar years, but the social and political changes of the 1960s and 1970s mobilized new constituencies and organizations. The success of the civil rights movement provoked a white backlash in both the North and the South. The changing status of women and the challenge to traditional sexual mores and gender roles provoked a powerful antifeminist movement. Underlying these social and cultural reactions was an insistent economic conservatism that argued that the welfare state and “excessive regulation” were responsible for the stagnation of the economy. Meanwhile, antitax revolts mobilized property owners in a movement against rising property taxes and government expenditures. Poor urban residents—who had been the focus of policy innovations during the 1970s—were now recast as an “urban underclass” that would require “tough love” to rejoin the mainstream.
Just as the drive to expand the civil rights of black Americans had become the model for other civil rights movements, the rearguard efforts of segregationists provided a vocabulary for newer conservative causes. Segregationists had objected to the trampling of “states’ rights” and the “imposition” of new requirements at odds with the Southern status quo. In the wake of the Supreme Court’s guaranteeing a woman’s right to an abortion and the success of feminists in gaining passage of the Equal Rights Amendment by Congress in 1972, antifeminists saw a plot to undermine women’s traditional role. Although their claims that the ERA would require unisex bathrooms and women’s service as combat soldiers were specious, the charge that women’s purity would be compromised by equality proved powerful (Figure 8.1).
Figure 8.1
Anti-ERA demonstration in front of the White House, 1977. Efforts to write a ban on gender discrimination into the U.S. Constitution provoked a strong reaction from women and men who claimed that the Equal Rights Amendment would require unisex bathrooms, the abolition of Social Security survivors’ benefits, and the registration of women for the draft and their service as combat soldiers.
Warren K. Leffler, Prints & Photographs Division, Library of Congress, lc-u9-33889A-31/31A.
Conservatives had long been suspicious of government action, but by the early 1970s, they had been joined by many traditionally liberal constituencies. The war in Vietnam, government surveillance of protesters, and the Watergate affair had led many Americans to view the government in a less favorable light. Ronald Reagan, in his inaugural address in 1981, seemed to capture many Americans’ cynicism when he concluded:
In this present crisis, government is not the solution to our problem; government is the problem.42
From the start of the 1980 presidential campaign, the Republican candidate Reagan argued that the Democratic Party had been unable to meet new and serious economic and social conditions. As president, Reagan acted to reduce government expenditures and taxes. Arguing for “supply-side” economics, deregulation of business, and tax cuts as a means of stimulating economic growth, President Reagan attacked domestic social welfare programs. President George H. W. Bush (1989–1993), while couching his arguments in a “kinder and gentler” manner, continued to allow his concern for interest rates and budget stability to take precedence over considerations of individual well-being. Changes in the economy, political structure, family organization, and the age structure of the U.S. population combined to make large social welfare changes necessary. However, necessary social interventions took a back seat to tax cuts and budget restraint.
The Expansion of Civil Rights
The state of the economy was only one factor shaping social welfare. The concern of the 1960s with the rights of oppressed groups, paced by the militancy of the black community, led to advances in rights to privacy, due process, and equal protection. The emergence of many groups—women, students, children, the aging, Native Americans, prisoners, homosexuals, and others—demanding change resulted in their gaining at least some basic recognition. The historical value of individualism broadened to a demand for group-determined rights.
The American Indian Movement organized for militant defense of the rights of Native Americans. The 1970s saw the formation of the Council of Energy Resource Tribes, which took to the courts to fight for rights to the development of the coal, oil, and other resources needed to promote industries on the reservations. There were setbacks, but there were also victories. Some land claims were won, some land was put in trust for development for the benefit of Native Americans, and additional sums of money were appropriated to the Bureau of Indian Affairs for education. Nonetheless, Native Americans remained the poorest of the poor in the United States.
The period from 1970 to 1990 included conflicting trends in civil rights. In the 1970s, successes in extending civil liberties were scored in affirmative action programs and in efforts to replace institutionalization with community-based programs. Judicial decisions required the payment of reparations to groups who had been shown to suffer from discriminatory pay and promotion differentials. In July 1980, the Supreme Court, in Fullilove v. Klutznick, upheld the use of quotas for minority contractors when it decided that Congress could award federal funds on the basis of race to redress past racial discrimination.43 The courts seesawed on the issue of affirmative action throughout the decade. In 1984, the Supreme Court decided, in a 6–3 decision, that seniority took precedence over affirmative action concerns with regard to layoffs.44 A May 1986 decision confirmed that position but found that the adoption of hiring goals favoring minorities would be a permissible way for a government employer to redress its past discrimination.45 In July 1986, in two cases, one involving New York City sheet metal workers and one involving firefighters in Cleveland, the court endorsed the use of affirmative action in the workplace to address past discrimination when less drastic approaches would not work. In February 1987, in a 5–4 decision, the court endorsed “catch-up quotas” to counter severe past discrimination against blacks, and the following month, it extended preferential hiring to women.46
By 1989, however, as its composition changed and conservatives achieved a majority, the Supreme Court began to back away from its support of affirmative action. In January, the court invalidated a Richmond law that set aside 30 percent of public works funds for minority-owned construction companies.47 This was followed by a series of decisions that rejected the use of racial or sex preferences to remedy past discrimination and made it increasingly difficult for plaintiffs to prove employment discrimination. In 1990, Congress passed a bill to reverse the Supreme Court’s decisions and shift the burden of proof on discrimination issues back to employers. Employees would not have to demonstrate discrimination; employers would have to show that a practice that hurt women or minorities was necessary to the success of the business. The bill facilitated opportunities for employees to challenge court orders, granted victims of intentional discrimination the right to recover compensatory and punitive damages, and reaffirmed that any intentional job bias was illegal. President Bush vetoed the bill on October 22, 1990, on the grounds that it would force employers to adopt hiring and promotion quotas to avoid discrimination suits. The Senate failed by one vote to override the veto, but the following year, Congress passed and the president signed into law the Civil Rights Act of 1991.
The same pattern of expansion and then retrenchment applied to immigration policy. In 1965, the tie between the admission of immigrants and their ethnic origin was broken, and for over a decade, it appeared that the U.S. fear of alien cultures was fading. By the early 1980s, however, political, medical, and economic doubts had arisen about the entry of immigrants, who were seen by American labor as competitors for jobs, and about the participation of “foreigners” in education, health, and welfare services.
Refugees became a significant share of the increase in the foreign-born population during the 1970s and 1980s. Refugees from Indochina, refugees from Haiti, and, in 1980, a new Cuban refugee population arrived in large numbers. Indeed, the anticipated ceilings on immigration of about 290,000 people annually were only a small part of the picture. Many more immigrants came into the country as close relatives of U.S. citizens or as refugees than were entering under the hemispheric quotas. Of the 1.5 million legal immigrants in 1990, fewer than 300,000 came under the immigration quotas; the remainder, 1.2 million, were exempt from numerical limitations.48 Added to this were large numbers of illegal entrants, many from Mexico.
Both the refugees from Cuba and those from Haiti were greeted with less than open arms. It was rumored that President Fidel Castro of Cuba had been releasing prisoners and mental patients to come to the United States. The presence of the AIDS virus among both Cuban and Haitian refugees was seen as another barrier to their admission. In the end, after much bureaucratic maneuvering, the political appeal of embarrassing the communist regime overcame other objections, and the Cuban refugees were admitted and given generous support by the U.S. government. The Haitian refugees were sent back to Haiti or placed in long-term detention because they were judged to be seeking refuge only from poverty, not from political persecution.49
Immigration policy became dominated by two economic concerns. One was the fear for jobs. Advocates of immigration reform in 1965 imagined that European immigration would resume after Congress passed the law and that highly skilled workers would dominate the new immigrant wave. As it turned out, Latin America and Asia were the sources of most new immigrants, and they were more likely to compete for less-skilled jobs. The other concern was that immigrants would avail themselves too extensively of education services and public aid—cash welfare programs, food stamps, and Medicaid. In 1982, the Supreme Court ruled in Plyler v. Doe that all children, regardless of immigration status, had the right to attend public schools. Local and state officials continued to raise the issue of immigrant eligibility, often noting that while the federal government controlled immigration, local governments often absorbed the cost of newcomers. Earlier in our history, immigrants had been suspect for their religious and political ideas, feared as anarchists or communists or for their possible allegiance to an enemy country. By 1990, however, economic issues had replaced political ones as the center of concern.
For children, too, there was a retreat from rights during the 1980s. The promise of the Gault decision of 1967, which protected the rights of children in trouble with the law, was compromised. There was a visible return to giving priority to rehabilitation over justice in juvenile proceedings, despite a paucity of resources that undermined the quality of services.
On a state level, the intent of laws such as Pennsylvania’s Act 148, designed to support deinstitutionalization and community/own-home planning for children, was countered by a lack of supportive community services and by active efforts to legally cut the tie between dependent children and their mothers. The Office of Child Development’s thrust toward “permanency planning,” though overtly intended to protect children from the uncertainties of long-term foster care placements, unnecessarily risked the permanent separation of some children from their natural parents. The notion of going “beyond the best interest of the child”50 to forge new permanent ties could, in the 1990s, prove to be an updated version of “binding out.”
People suffering from mental illness and mental retardation also saw their rights expand during the 1970s, but then fail to be fully realized. Since the early 19th century, institutions—first called “insane asylums” and later called “state hospitals”—had been the major policy response for this population. By the 1960s, innovations in therapy and research had led to the beginning of a community mental health system that would treat people with mental illnesses in their own communities.
But the institutions persisted, typically filled with individuals without the resources or support to explore alternatives. In 1972, however, a federal court ruled in Wyatt v. Stickney that states could not continue to “warehouse” the mentally ill and mentally retarded. The court recognized a “right to treatment” that placed the burden on states to create noninstitutional alternatives that would reintegrate these populations into the community.
During the 1970s, lawsuits and public policy emptied many institutions. In 1955, more than 500,000 Americans lived in state psychiatric hospitals; by the 1990s, fewer than 60,000 did. In some states, massive efforts were made to provide services to reintegrate the mentally ill and mentally retarded, while in other localities, “dumping” was a more accurate description of the policy implementation. Concerns about cost, the hostility of current residents to the creation of community-based group homes (the acronym NIMBY—not in my backyard—came into common usage), and the largely untested strategies for community integration hampered efforts and led to mixed results.
During the 1980s, even successful programs found themselves scrambling for funding. On the one hand, as with many voluntary social welfare efforts stretching back more than a century, advocates of community action were shown to be overly optimistic about the ability of voluntary efforts to reintegrate these populations. On the other hand, government found itself quite adept at first limiting and then cutting funding available for programs. Eventually, many mentally ill individuals found themselves re-institutionalized, this time in prisons and jails instead of state hospitals.
The retreat on civil rights hit welfare families most severely. During the 1960s, the Supreme Court had strengthened the rights of welfare recipients by supporting their claims to due process, privacy, and the right to migrate. Between 1970 and 1990, a combination of legislation, court decisions, and administrative actions weakened these new, insecure rights. The federal government weakened its prohibitions on house searches while the New York state legislature used the failure to find standard housing as a means of denying welfare payments to new migrants.
Congress and the general public expanded their support for work requirements for welfare recipients during these years. With the passage of the Family Support Act in 1988, in every state, AFDC recipients with children age six or older were required to work or attend a job training program or school. A state could require participation in work or education programs even for recipients with children as young as three years. If a recipient did not meet the state’s behavioral rules, he or she might be “sanctioned”—that is, his or her grant might be reduced or even eliminated.51
Perhaps the most visible civil rights battle of the period was over access to abortion. In two 1973 cases (Roe v. Wade and Doe v. Bolton), the Supreme Court established women’s control over their own bodies by supporting their right to abortion during the first trimester of pregnancy. Since then, “pro-life” groups have repeatedly argued the viability and rights of the fetus over the rights of the mother. While abortion remained legal in the United States, proponents of abortion rights, the “pro-choice” groups, feared the implications for women of the changing composition of the Supreme Court.
For poor women, the battle had already been lost, since the court ruled that the states were not required to make Medicaid funds available for abortions except in pregnancies resulting from rape or directly threatening their health.52 By 1990, an uneasy balance had been reached on abortion. The public overwhelmingly supported abortion as part of the right to privacy, but was unwilling to pay the cost of this right for women who could not afford it. In reality, an increasing proportion of women were denied this reproductive right, either because they could not afford it or because services were not accessible.
One group—people with physical or mental disabilities—fared better in Congress and the courts. The Americans with Disabilities Act of 1990 established comprehensive civil rights and aimed to reverse the stigmatization of people with disabilities. The law prohibited discrimination against people with physical or mental impairments in employment, transportation, and public accommodations. The regulations implementing the legislation required that new construction for public accommodations—schools, libraries, restaurants, and hospital and nursing home rooms, for example—be accessible. Existing businesses were required to make alterations to buildings to accommodate special needs unless they could show that expenses would be too great. The intent was to integrate those who have disabilities into all aspects of life.
Unfortunately, concern for the welfare of disabled people did not extend to their pension and public welfare rights. Despite repeated court rulings that disability insurance benefits could not be arbitrarily cut off, the Social Security Administration was slow to revise its eligibility procedures. The government used a variety of administrative procedures to limit the number of people receiving Disability Insurance (DI) or SSI. For adults, the General Accounting Office (GAO) reported that over one-half of the applicants who had been denied benefits between 1984 and 1987 were not working, and many reported that they did not anticipate ever working again. Over two-thirds of those denied DI and not able to work reported serious income inadequacy, as did those denied benefits but working at least part time.53
The right of children with disabilities to income support received a boost in 1990 when the Supreme Court ruled that the adult standard for establishing a disability applied to young people as well. As a result, eligibility for SSI for children began to expand after 1990. Yet, the results were mixed. No sooner did the court decision take effect than budget-cutters in Congress looked to the incomes of children with disabilities as a source of “savings.” The 1996 welfare reform law repealed this new entitlement.
Women
A staff report of the U.S. Commission on Civil Rights charged that federal and state welfare programs, federal job training programs, and social insurance and private pension plans all discriminated against women. AFDC and its work programs were subject to special attack. Not only were low AFDC benefits keeping women and their families in poverty, but WIN, when it did succeed in placing women in jobs, had done so at discriminatory, low entry-level wages in jobs that offered little chance for advancement. Significantly, a staff member of the U.S. Commission on Civil Rights had this to say about the commission’s first hearings on women’s rights:
The value of having low-income women in these hearings is that they educate us and tell us problems. They also find . . . there are laws that cover them. . . . And these hearings get action . . . because they draw attention.54
That hearing took place in 1974. A Department of Justice Task Force on Sex Discrimination issued a report in 1979 with a broad overview of sexual discrimination in our pension system.55
Many studies and reports on the changing position of women were issued in the late 1970s. The Department of Labor held a major conference analyzing “Women’s Changing Roles at Home and on the Job” in 1977.56 That same year, the Social Security amendments directed the secretary of health, education, and welfare, in consultation with the Task Force on Sex Discrimination in the Department of Justice, to make a detailed study of the unequal treatment of men and women under the Social Security Act and to explore ways of eliminating dependency as a factor in the determination of spouses’ benefits.57 The Task Force on the Treatment of Women was appointed, and its report was issued early in 1978.58 Secretary Joseph A. Califano Jr. also requested that the Advisory Council on Social Security “consider the criticism that the present benefit structure does not recognize the changing role of women in our society.” Reactions to the 1978 Task Force report and letters sent to the Advisory Council became basic data for the 1979 HEW report, Social Security and the Changing Roles of Men and Women. Although the 1979 report analyzed options for change and did not make definite recommendations, it did serve a basic purpose: “to focus public debate on concerns about the way social security relates to the present complex and diversified structure of American society.” Later in 1979, the Advisory Council stated the issue as follows:
Two new objectives [for Social Security] are commanding increasing attention. First, from the recognition that women are important contributors to the economic well being of the family, whether they work inside or outside the home, comes the desire that women be entitled to benefits in their own right, not simply or primarily as economic dependents of their spouses. Second, in addition to individual equity, equity is now also tested by whether couples with the same total earnings receive the same protection, regardless of which partner earned what share.59
Issues of women’s rights were raised on many fronts: the courts, Congress, the administration, the women’s movement, and the public generally. But change for women generally, and for poor women in particular, did not occur with the predicted swiftness. Ambivalence about ways to help poor women combined with a broader ambivalence about the proper role of all women in our society. The failure to achieve ratification of the Equal Rights Amendment by the required three-fourths of the states, and the rescinding of approval by several states, indicated the indecision of the period.
Conclusion
In 1970, social welfare seemed open to new endeavors, new efforts to help those in need. But by 1990, the attention to federal budget deficits dampened the interest not only in social services but also in any expansion of programs for retirees, for women, or for the unemployed. Welfare reform, too, was put on hold. President Carter had urged Congress to consider the establishment of a national minimum benefit level pegged at 65 percent of the poverty threshold, as well as additional allocations for job development,60 but President Reagan moved toward social welfare cutbacks.
Combined with budget restraints was the desire to narrow the role of the federal government. Revenue sharing, block grants, and the proposed return to the states of responsibility for many job training, education, community development, justice, and health programs were a retreat from the longtime trend toward centralized—that is, federal—responsibility for social welfare. Behind the Reagan drive to return responsibility for many programs to the states was a desire to cut total expenditures for social welfare. Funding for social programs was contained by the device of expanding block grants so that they had to cover more programs; expansion of any one program then came at the expense of a reduction in another. President Bush was inclined to further President Reagan’s move toward block grants, with the states making the allocation decisions.
Documents: Conservative Resurgence and Social Change
President Richard M. Nixon’s Message on Reform in Welfare (1969) is one of two documents used to illustrate social welfare events during the 1970s and 1980s. Although the message was written in 1969, the proposal was still being considered, as H.R. 1, in 1971. The debate that swirled around the message and the bill highlighted the major issues in public welfare.
The Family Assistance Plan (FAP) proposed in the message was the center of controversy. In the initial debate, attacks came from both liberals and conservatives. The merits of the negative income tax (NIT) approach to helping poor families, the sufficiency of the guaranteed basic allowance, the work-incentive features, and the inclusion of benefits for the working poor were all subject to scrutiny by proponents and opponents. Liberals generally approved the plan to federalize aid for poor families. They were dismayed, however, by the seemingly punitive “workfare” aspects of the message and by the low guaranteed basic allowance of $1,600 a year for a family of four. Conservatives were frightened by the possibility that an income transfer program guaranteeing benefits for the working poor would attract people to the welfare rolls and weaken their attachment to the labor force.
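The NIT arithmetic at the heart of the controversy is simple to state. The sketch below is a minimal illustration, not the statutory formula; it combines three parameters that appear in the message reprinted later in this chapter: the $1,600 guarantee for a family of four, the $720 annual earnings disregard, and the 50 percent benefit-reduction rate on earnings above the disregard.

```python
# Minimal sketch of the FAP negative-income-tax schedule for a family of
# four, using the parameters stated in Nixon's message reprinted below:
# a $1,600 guarantee, a $720/year earnings disregard (the first $60 a
# month), and a 50% benefit-reduction rate above the disregard.
GUARANTEE = 1600.0
DISREGARD = 720.0
REDUCTION_RATE = 0.5

def fap_benefit(earnings: float) -> float:
    """Annual FAP payment as a function of annual family earnings."""
    countable = max(0.0, earnings - DISREGARD)
    return max(0.0, GUARANTEE - REDUCTION_RATE * countable)

print(fap_benefit(0))     # 1600.0 -- the guaranteed income floor
print(fap_benefit(2000))  # 960.0  -- the worked example in the message
# The benefit phases out entirely at $720 + $1,600/0.5 = $3,920 of
# earnings, so the plan reached well into the ranks of the working poor.
```

On this schedule a family always gains by earning more, which was precisely the work-incentive claim at the center of the debate.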
What comes through forcefully in the message is the switch from a service approach to welfare to an income-workfare approach. Moreover, the message makes clear that mothers are now to be considered employable—or potentially employable. The implication was a reversal of the “own home” philosophy of child care proclaimed at the first White House Conference on the Care of Dependent Children, convened by President Theodore Roosevelt in 1909.
As criticisms of H.R. 1 grew more heated, the Nixon administration and Congress responded with changes in the original proposal. The basic guaranteed allowance was raised, but supplementation through food stamps was eliminated. Even more serious, from the point of view of welfare recipients in the more generous North, was the threat of a cutback in their grants. The plan, as originally devised, offered protection against such a loss by requiring the states to supplement the guaranteed allowance. This requirement was also eliminated. Work requirements became more stringent, specifying that mothers with children over three years old be available for work. Three-quarters of the minimum wage was stipulated as acceptable pay.
On the one side were those who worried about the inadequacies and injustices of the program. On the other side was a group concerned with mounting costs. Analyses of multiple program benefits and their overlap increased the anxiety of those who feared the erosion of work incentives. Stalemate ensued, and the proposal was quietly abandoned.
President Jimmy Carter’s Better Jobs and Income Program was, in its essential thrust, similar to the FAP. The two plans differed in two major aspects: (1) the Carter plan provided universal coverage, whereas the Nixon proposal was for families only; and (2) the Carter proposal included job creation. Despite the differences, the Carter plan also failed to achieve approval.
The increased interest in workforce participation and a desire to reduce the number of people receiving government income transfer payments led, by the late 1970s, to renewed vigor in removing people from the disability rolls. The administration pursued a disability review process that required claimants to show that their condition had not improved sufficiently to justify stopping their disability payments. Claimants sued, holding that although they had the burden of coming forward with some evidence of disability initially, on review the administration had the burden of showing that they were no longer disabled.
The administration lost many individual cases on this issue. It announced a policy of non-acquiescence: It would obey the court in a particular case but would not generalize the principle to apply it to other claimants. It chose not to seek review of these cases in the Supreme Court.
Congress moved to resolve the matter when it passed new rules for the termination of disability benefits based on medical improvement. The standard established by Congress in 42 U.S.C. § 423(f) requires, before disability benefits may be terminated, substantial evidence demonstrating medical improvement sufficient that the individual is now able to “engage in substantial gainful activity.”61
The text of this statute applied to Title II benefits (DI). A similar statute with identical language covered SSI. The law also provides for continuation of disability benefits during an administrative appeal in a termination case.62 Slowly, benefits were restored to some people with disabilities.
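Stated as logic, the statutory test is conjunctive, and a compact sketch makes that plain. The function below is simply an illustrative restatement of the two prongs of § 423(f)(1); the predicate names are ours, not the statute’s.

```python
# Illustrative restatement of the two-prong termination standard in
# 42 U.S.C. § 423(f)(1). The argument names are ours; each prong must be
# supported by substantial evidence before benefits can be cut off.
def may_terminate_benefits(work_related_medical_improvement: bool,
                           able_to_do_substantial_gainful_activity: bool) -> bool:
    """Both prongs are required: improvement alone is not enough, nor is
    a bare finding of ability to work without medical improvement."""
    return (work_related_medical_improvement
            and able_to_do_substantial_gainful_activity)

# A claimant whose condition improved but who still cannot work keeps
# benefits, as does one judged able to work without medical improvement.
assert not may_terminate_benefits(True, False)
assert not may_terminate_benefits(False, True)
assert may_terminate_benefits(True, True)
```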
Reform in Welfare
Message from President Richard M. Nixon August 11, 1969
TO THE CONGRESS OF THE UNITED STATES:
A measure of the greatness of a powerful nation is the character of the life it creates for those who are powerless to make ends meet.
If we do not find the way to become a working nation that properly cares for the dependent, we shall become a Welfare State that undermines the incentive of the working man.
The present welfare system has failed us—it has fostered family breakup, has provided very little help in many States and has even deepened dependency by all too often making it more attractive to go on welfare than to go to work.
I propose a new approach that will make it more attractive to go to work than to go on welfare, and will establish a nationwide minimum payment to dependent families with children.
I propose that the Federal government pay a basic income to those American families who cannot care for themselves in whichever State they live.
I propose that dependent families receiving such income be given good reason to go to work by making the first sixty dollars a month they earn completely their own, with no deductions from their benefits.
I propose that we make available an addition to the incomes of the “working poor,” to encourage them to go on working and to eliminate the possibility of making more from welfare than from wages.
I propose that these payments be made upon certification of income, with demeaning and costly investigations replaced by simplified reviews and spot checks and with no eligibility requirements that the household be without a father. That present requirement in many States has the effect of breaking up families and contributes to delinquency and violence.
I propose that all employable persons who choose to accept these payments be required to register for work or job training and be required to accept that work or training, provided suitable jobs are available either locally or if transportation is provided. Adequate and convenient day care would be provided children wherever necessary to enable a parent to train or work. The only exception to this work requirement would be mothers of pre-school children.
I propose a major expansion of job training and day care facilities, so that current welfare recipients able to work can be set on the road to self-reliance.
I propose that we also provide uniform Federal payment minimums for the present three categories of welfare aid to adults—the aged, the blind and the disabled.
This would be total welfare reform—the transformation of a system frozen in failure and frustration into a system that would work and would encourage people to work.
Accordingly, we have stopped considering human welfare in isolation. The new plan is part of an overall approach which includes a comprehensive new Manpower Training Act, and a plan for a system of revenue sharing with the States to help provide all of them with necessary budget relief. Messages on manpower training and revenue sharing will follow this message tomorrow and the next day, and the three should be considered as parts of a whole approach to what is clearly a national problem.
Need for New Departures
A welfare system is a success when it takes care of people who cannot take care of themselves and when it helps employable people climb toward independence.
A welfare system is a failure when it takes care of those who can take care of themselves, when it drastically varies payments in different areas, when it breaks up families, when it perpetuates a vicious cycle of dependency, when it strips human beings of their dignity.
America’s welfare system is a failure that grows worse every day.
First, it fails the recipient: In many areas, benefits are so low that we have hardly begun to take care of the dependent. And there has been no light at the end of poverty’s tunnel. After four years of inflation, the poor have generally become poorer.
Second, it fails the taxpayer: Since 1960, welfare costs have doubled and the number on the rolls has risen from 5.8 million to over 9 million, all in a time when unemployment was low. The taxpayer is entitled to expect government to devise a system that will help people lift themselves out of poverty.
Finally, it fails American society: By breaking up homes, the present welfare system has added to social unrest and robbed millions of children of the joy of childhood; by widely varying payments among regions, it has helped to draw millions into the slums of our cities.
The situation has become intolerable. Let us examine the alternatives available:
—We could permit the welfare momentum to continue to gather speed by our inertia; by 1975 this would result in 4 million more Americans on welfare rolls at a cost of close to $11 billion a year, with both recipients and taxpayers shortchanged.
—We could tinker with the system as it is, adding to the patchwork of modifications and exceptions. That has been the approach of the past, and it has failed.
—We could adopt a “guaranteed minimum income for everyone,” which would appear to wipe out poverty overnight. It would also wipe out the basic economic motivation for work, and place an enormous strain on the industrious to pay for the leisure of the lazy.
Or, we could adopt a totally new approach to welfare, designed to assist those left far behind the national norm, and provide all with the motivation to work and a fair share of the opportunity to train.
This administration, after a careful analysis of all the alternatives, is committed to a new departure that will find a solution for the welfare problem. The time for denouncing the old is over; the time for devising the new is now.
Recognizing the Practicalities
People usually follow their self-interest.
This stark fact is distressing to many social planners who like to look at problems from the top down. Let us abandon the ivory towers and consider the real world in all we do.
In most States, welfare is provided only when there is no father at home to provide support. If a man’s children would be better off on welfare than with the low wage he is able to bring home, wouldn’t he be tempted to leave home?
If a person spent a great deal of time and effort to get on the welfare rolls, wouldn’t he think twice about risking his eligibility by taking a job that might not last long?
In each case, welfare policy was intended to limit the spread of dependency; in practice, however, the effect has been to increase dependency and remove the incentive to work.
We fully expect people to follow their self-interest in their business dealings; why should we be surprised when people follow their self-interest in their welfare dealings? That is why we propose a plan in which it is in the interest of every employable person to do his fair share of work.
The Operation of the New Approach
1. We would assure an income foundation throughout every section of America for all parents who cannot adequately support themselves and their children. For a family of four with less than $1,000 income, this payment would be $1,600 a year; for a family of four with $2,000 income, this payment would supplement that income by $960 a year.
Under the present welfare system, each State provides “Aid to Families with Dependent Children,” a program we propose to replace. The Federal government shares the cost, but each State establishes key eligibility rules and determines how much income support will be provided to poor families. The result has been an uneven and unequal system. The 1969 benefits average for a family of four is $171 a month across the nation, but individual State averages range from $263 down to $39 a month.
A new Federal minimum of $1,600 a year cannot claim to provide comfort to a family of four, but the present low of $468 a year cannot claim to provide even the basic necessities.
The new system would do away with the inequity of very low benefits levels in some States, and of State-by-State variations in eligibility tests, by establishing a Federally-financed income floor with a national definition of basic eligibility.
States will continue to carry an important responsibility. In 30 States, the Federal basic payment will be less than the present levels of combined Federal and State payments. These States will be required to maintain the current level of benefits, but in no case will a State be required to spend more than 90% of its present welfare cost. The Federal government will not only provide the “floor,” but it will assume 10% of the benefits now being paid by the States as their part of welfare costs.
In 20 States, the new payment would exceed the present average benefit payments, in some cases by a wide margin. In these States, where benefits are lowest and poverty often the most severe, the payments will raise benefit levels substantially. For 5 years, every State will be required to continue to spend at least half of what they are now spending on welfare, to supplement the Federal base.
For the typical “welfare family”—a mother with dependent children and no outside income—the new system would provide a basic national minimum payment. A mother with three small children would be assured an annual income of at least $1,600.
For the family headed by an employed father or working mother, the same basic benefits would be received, but $60 per month of earnings would be “disregarded” in order to make up the costs of working and provide a strong advantage in holding a job. The wage earner could also keep 50% of his benefits as his earnings rise above that $60 per month. A family of four, in which the father earns $2,000 in a year, would receive payments of $960, for a total income of $2,960.
For the aged, the blind and the disabled, the present system varies benefit levels from $40 per month for an aged person in one State to $145 per month for the blind in another. The new system would establish a minimum payment of $65 per month for all three of these adult categories, with the Federal government contributing the first $50 and sharing in payments above that amount. This will raise the share of the financial burden borne by the Federal government for payments to these adults who cannot support themselves, and should pave the way for benefit increases in many States.
For the single adult who is not handicapped or aged, or for the married couple without children, the new system would not apply. Food stamps would continue to be available up to $300 per year per person, according to the plan I outlined last May in my message to the Congress on the food and nutrition needs of the population in poverty. For dependent families there will be an orderly substitution of food stamps by the new direct monetary payments.
2. The new approach would end the blatant unfairness of the welfare system. In over half the States, families headed by unemployed men do not qualify for public assistance. In no State does a family headed by a father working full-time receive help in the current welfare system, no matter how little he earns. As we have seen, this approach to dependency has itself been a cause of dependency. It results in a policy that tends to force the father out of the home.
The new plan rejects a policy that undermines family life. It would end the substantial financial incentives to desertion. It would extend eligibility to all dependent families with children, without regard to whether the family is headed by a man or woman. The effects of these changes upon human behavior would be an increased will to work, the survival of more marriages, the greater stability of families. We are determined to stop passing the cycle of dependency from generation to generation.
The most glaring inequity in the old welfare system is the exclusion of families who are working to pull themselves out of poverty. Families headed by a non-worker often receive more from welfare than families headed by a husband working full-time at very low wages. This has been rightly resented by the working poor, for the rewards are just the opposite of what they should be.
3. The new plan would create a much stronger incentive to work. For people now on the welfare rolls, the present system discourages the move from welfare to work by cutting benefits too fast and too much as earnings begin. The new system would encourage work by allowing the new worker to retain the first $720 of his yearly earnings without any benefit reduction.
For people already working, but at poverty wages, the present system often encourages nothing but resentment and an incentive to quit and go on relief where that would pay more than work. The new plan, on the contrary, would provide a supplement that will help a low-wage worker—struggling to make ends meet—achieve a higher standard of living.
For an employable person who just chooses not to work, neither the present system nor the one we propose would support him, though both would continue to support other dependent members in his family.
However, a welfare mother with pre-school children should not face benefit reductions if she decides to stay home. It is not our intent that mothers of pre-school children must accept work. Those who can work and desire to do so, however, should have the opportunity for jobs and job training and access to day care centers for their children: this will enable them to support themselves after their children are grown.
A family with a member who gets a job would be permitted to retain all of the first $60 monthly income, amounting to $720 per year for a regular worker, with no reduction of Federal payments. The incentive to work in this provision is obvious. But there is another practical reason: Going to work costs money. Expenses such as clothes, transportation, personal care, Social Security taxes and loss of income from odd jobs amount to substantial costs for the average family. Since a family does not begin to add to its net income until it surpasses the cost of working, in fairness this amount should not be subtracted from the new payment.
After the first $720 of income, the rest of the earnings will result in a systematic reduction in payments.
I believe the vast majority of poor people in the United States prefer to work rather than have the government support their families. In 1968, 600,000 families left the welfare rolls out of an average caseload of 1,400,000 during the year, showing a considerable turnover, much of it voluntary.
However, there may be some who fail to seek or accept work, even with the strong incentives and training opportunities that will be provided. It would not be fair to those who willingly work, or to all taxpayers, to allow others to choose idleness when opportunity is available. Thus, they must accept training opportunities and jobs when offered, or give up their right to the new payments for themselves. No able-bodied person will have a “free ride” in a nation that provides opportunity for training and work.
4. The bridge from welfare to work should be buttressed by training and child care programs. For many, the incentives to work in this plan would be all that is necessary. However, there are other situations where these incentives need to be supported by measures that will overcome other barriers to employment.
I propose that funds be provided for expanded training and job development programs so that an additional 150,000 welfare recipients can become job worthy during the first year.
Manpower training is a basic bridge to work for poor people, especially people with limited education, low skills and limited job experience. Manpower training programs can provide this bridge for many of our poor. In the new Manpower Training proposal to be sent to the Congress this week, the interrelationship with this new approach to welfare will be apparent.
I am also requesting authority, as a part of the new system, to provide child care for the 450,000 children of the 150,000 current welfare recipients to be trained.
The child care I propose is more than custodial. This Administration is committed to a new emphasis on child development in the first five years of life. The day care that would be part of this plan would be of a quality that will help in the development of the child and provide for its health and safety, and would break the poverty cycle for this new generation.
The expanded child care program would bring new opportunities along several lines: opportunities for the further involvement of private enterprise in providing high quality child care service; opportunities for volunteers; and opportunity for training and employment in child care centers of many of the welfare mothers themselves. I am requesting a total of $600 million additional to fund these expanded training programs and child care centers.
5. The new system will lessen welfare red tape and provide administrative cost savings. To cut out the costly investigations so bitterly resented as “welfare snooping,” the Federal payment will be based upon a certification of income, with spot checks sufficient to prevent abuses. The program will be administered on an automated basis, using the information and technical experience of the Social Security Administration, but, of course, will be entirely separate from the administration of the Social Security trust fund.
The States would be given the option of having the Federal government handle the payment of the State supplemental benefits on a reimbursable basis, so that they would be spared their present administrative burdens and so a single check could be sent to the recipient. These simplifications will save money and eliminate indignities; at the same time, welfare fraud will be detected and lawbreakers prosecuted.
6. This new departure would require a substantial initial investment, but will yield future returns to the Nation. This transformation of the welfare system will set in motion forces that will lessen dependency rather than perpetuate and enlarge it. A more productive population adds to real economic growth without inflation. The initial investment is needed now to stop the momentum of work-to-welfare, and to start a new momentum in the opposite direction.
The costs of welfare benefits for families with dependent children have been rising alarmingly the past several years, increasing from $1 billion in 1960 to an estimated $3.3 billion in 1969, of which $1.8 billion is paid by the Federal government, and $1.5 billion is paid by the States. Based on current population and income data, the proposals I am making today will increase Federal costs during the first year by an estimated $4 billion, which includes $600 million for job training and child care centers.
The “start-up costs” of lifting many people out of dependency will ultimately cost the taxpayers far less than the chronic costs—in dollars and in national values—of creating a permanent underclass in America.
From Welfare to Work
Since this Administration took office, members of the Urban Affairs Council, including officials of the Department of Health, Education and Welfare, the Department of Labor, the Office of Economic Opportunity, the Bureau of the Budget, and other key advisers, have been working to develop a coherent, fresh approach to welfare, manpower training and revenue sharing.
I have outlined our conclusions about an important component of this approach in this message; the Secretary of HEW will transmit to the Congress the proposed legislation after the summer recess.
I urge the Congress to begin its study of these proposals promptly so that laws can be enacted and funds authorized to begin the new system as soon as possible. Sound budgetary policy must be maintained in order to put this plan into effect—especially the portion supplementing the wages of the working poor.
With the establishment of the new approach, the Office of Economic Opportunity will concentrate on the important task of finding new ways of opening economic opportunity for those who are able to work. Rather than focusing on income support activities, it must find means of providing opportunities for individuals to contribute to the full extent of their capabilities, and of developing and improving those capabilities.
This would be the effect of the transformation of welfare into “workfare,” a new work-rewarding system:
For the first time, all dependent families with children in America, regardless of where they live, would be assured of minimum standard payments based upon uniform and single eligibility standards.
For the first time, the more than two million families who make up the “working poor” would be helped toward self-sufficiency and away from future welfare dependency.
For the first time, training and work opportunity with effective incentives would be given millions of families who would otherwise be locked into a welfare system for generations.
For the first time, the Federal government would make a strong contribution toward relieving the financial burden of welfare payments from State governments.
For the first time, every dependent family in America would be encouraged to stay together, free from economic pressure to split apart.
These are far-reaching effects. They cannot be purchased cheaply, or by piecemeal efforts. This total reform looks in a new direction; it requires new thinking, a new spirit and a fresh dedication to reverse the downhill course of welfare. In its first year, more than half the families participating in the program will have one member working or training.
We have it in our power to raise the standard of living and the realizable hopes of millions of our fellow citizens. By providing an equal chance at the starting line, we can reinforce the traditional American spirit of self-reliance and self-respect.
***
Standard of Review for Termination of Disability Benefits
A recipient of benefits under this subchapter or subchapter XVIII of this chapter based on the disability of any individual may be determined not to be entitled to such benefits on the basis of a finding that the physical or mental impairment on the basis of which such benefits are provided has ceased, does not exist, or is not disabling only if such finding is supported by—
(1) substantial evidence which demonstrates that—
(A) there has been any medical improvement in the individual’s impairment or combination of impairments (other than medical improvement which is not related to the individual’s ability to work), and
(B)(i) the individual is now able to engage in substantial gainful activity, or
(ii) if the individual is a widow or surviving divorced wife under section 402(e) of this title or a widower or surviving divorced husband under section 402(f) of this title, the severity of his or her impairment or impairments is no longer deemed, under regulations prescribed by the Secretary, sufficient to preclude the individual from engaging in gainful activity; or
(2) substantial evidence which—
(A) consists of new medical evidence and (in a case to which clause (ii)(II) does not apply) a new assessment of the individual’s residual functional capacity, and demonstrates that—
(i) although the individual has not improved medically, he or she is nonetheless a beneficiary of advances in medical or vocational therapy or technology (related to the individual’s ability to work), and
(ii)(I) the individual is now able to engage in substantial gainful activity, or
(II) if the individual is a widow or surviving divorced wife under section 402(e) of this title or a widower or surviving divorced husband under section 402(f) of this title, the severity of his or her impairment or impairments is no longer deemed under regulations prescribed by the Secretary sufficient to preclude the individual from engaging in gainful activity, or
(B) demonstrates that—
(i) although the individual has not improved medically, he or she has undergone vocational therapy (related to the individual’s ability to work), and
(ii) the requirements of subclause (I) or (II) of subparagraph (A)(ii) are met; or
(3) substantial evidence which demonstrates that, as determined on the basis of new or improved diagnostic techniques or evaluations, the individual’s impairment or combination of impairments is not as disabling as it was considered to be at the time of the most recent prior decision that he or she was under a disability or continued to be under a disability, and that therefore—
(A) the individual is able to engage in substantial gainful activity, or
(B) if the individual is a widow or surviving divorced wife under section 402(e) of this title or a widower or surviving divorced husband under section 402(f) of this title, the severity of his or her impairment or impairments is not deemed under regulations prescribed by the Secretary sufficient to preclude the individual from engaging in gainful activity; or
(4) substantial evidence (which may be evidence on the record at the time any prior determination of the entitlement to benefits based on disability was made, or newly obtained evidence which relates to that determination) which demonstrates that a prior determination was in error.
Nothing in this subsection shall be construed to require a determination that a recipient of benefits under this subchapter or subchapter XVIII of this chapter based on an individual’s disability is entitled to such benefits if the prior determination was fraudulently obtained or if the individual is engaged in substantial gainful activity (or gainful activity in the case of a widow, surviving divorced wife, widower, or surviving divorced husband), cannot be located, or fails, without good cause, to cooperate in a review of the entitlement to such benefits or to follow prescribed treatment which would be expected to restore his or her ability to engage in substantial gainful activity (or gainful activity in the case of a widow, surviving divorced wife, widower, or surviving divorced husband).
Any determination under this section shall be made on the basis of all the evidence available in the individual’s case file, including new evidence concerning the individual’s prior or current condition which is presented by the individual or secured by the Secretary.
Any determination made under this section shall be made on the basis of the weight of the evidence and on a neutral basis with regard to the individual’s condition, without any initial inference as to the presence or absence of disability being drawn from the fact that the individual has previously been determined to be disabled.
For purposes of this subsection, a benefit under this subchapter is based on an individual’s disability if it is a disability insurance benefit, a child’s, widow’s, or widower’s insurance benefit based on disability, or a mother’s or father’s insurance benefit based on the disability of the mother’s or father’s child who has attained age 16.
Public Law 98–460, 98th Congress, September 15, 1984