RESEARCH REPORT
IV. STATE DEVELOPMENT OF NEW HIRE PROGRAM
Generally, employers reported new hires using the federal Employee's Withholding Allowance Certificate (W‑4 form), since federal law requires all employees to complete this form when they start a new job. Employers use the W‑4 form to determine employee income tax withholding allowances. They keep it in the employee's personnel file and disclose it to federal or state governments under certain circumstances.[13] The data elements on W‑4 forms are the employee's name, address and SSN, and the company's name, address and federal ID number. Some states use forms that ask for more information, such as the date of hire, the employee's date of birth and medical coverage information.
As explained above, employers wanted many options for sending data. In all states, employers could mail or fax (toll free) a copy of the W‑4 form or the state equivalent. Although more employers are now using electronic transmission, the majority of employers in the early to mid‑1990s preferred paper methods. In Washington, for example, less than 10 percent of reports were transmitted electronically in 1994. On the other hand, four years later, only 24 percent of reports submitted in South Carolina were sent on paper; the rest came on diskette, cartridge or reel. This may be because most of the South Carolina employers reporting voluntarily are also large, technologically savvy businesses.
More recently, many employers, particularly small businesses, have preferred to report over the Internet. But states have to protect the privacy of information sent this way. In Massachusetts, each employer uses a password to gain access to the reporting portion of the Website. Massachusetts' promotional material states that this takes less than one minute per report, after which employers get a message confirming receipt of the report.
Not all new hire programs have sophisticated systems that permit online reporting. For example, until recently, the Oregon new hire program had to print out electronic reports and enter them manually. Smaller programs, like Oregon's, have had the greatest difficulty in meeting the federal PRWORA requirements.
A few states used toll‑free employer reporting lines. A Washington state contact warned that telephone reporting “took two full‑time staff members to translate the information and very often the information given was incomplete and staff would not know how to contact [employers] to get the rest of the data required.”
Other states used a 1‑800 line staffed by trained employees to take the calls, which worked better.
Most of the original new hire programs required employers to report new hires within 15 days to one month of hire or rehire. The state's division of child support, another state department or a private contractor could then match the employee's name and SSN against open child support cases. Many SSNs were inaccurate, either because of errors in completing the form or in data entry, so some states verified these data with a special computer program.
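The matching step can be made concrete with a minimal sketch. The sketch below is ours, not a description of any state's actual system: the record fields, the dictionary keyed by SSN and the simple plausibility check are illustrative assumptions.

```python
# Illustrative sketch only: the field names, the SSN plausibility check
# and the dictionary keyed by SSN are our assumptions, not any state's
# actual matching or verification system.

def is_plausible_ssn(ssn: str) -> bool:
    """Reject obviously malformed SSNs (wrong length, non-digit
    characters, all-zero groups) before attempting a match."""
    digits = ssn.replace("-", "")
    if len(digits) != 9 or not digits.isdigit():
        return False
    return digits[:3] != "000" and digits[3:5] != "00" and digits[5:] != "0000"

def match_new_hires(new_hires, open_cases_by_ssn):
    """Split new hire reports into matches against open child support
    cases and rejects that need follow-up with the employer."""
    matches, rejects = [], []
    for report in new_hires:
        ssn = report["ssn"]
        if not is_plausible_ssn(ssn):
            rejects.append(report)  # bad SSN: follow up with employer
        elif ssn in open_cases_by_ssn:
            matches.append((report, open_cases_by_ssn[ssn]))
    return matches, rejects
```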
Any matches were passed on to child support enforcement agencies. In some states, a staff member would send an employment verification letter to the employer and, if employment was confirmed, take action toward an income deduction order. In Arizona, for example, the new hire appeared on the system within two days of the receipt of the information. In cases where a court order was already in place, the caseworker called the employer, verified that the non-custodial parent was still employed and started the wage assignment process within 10 days. In Washington and Massachusetts, a fully automated system identified non-custodial parents who were in arrears and sent letters to employers before any staff intervention or assessment.
In California, the new hire data were mailed to child support offices rather than transmitted electronically or via fax. Understandably, there was a delay of about a month before the data came to the attention of caseworkers. The California database had duplicate listings, since its records went back six months and were matched every month with child support caseloads. Until the problem was fixed, it increased workload and reduced confidence in the reporting.
Wage withholding accounts for between half and two-thirds of all enforcement collections and, furthermore, produces the highest compliance rates and collections (Bartfield and Meyer, 1994). Under the PRWORA, states must have procedures for withholding the wages of someone in arrears of a support obligation, and these procedures must not require a judicial or administrative hearing (Legler, 1996:542).
If an employee owes child support, the employer is asked to set aside the required amount from the employee's wages.[14] As mentioned above, some states have a fully automated procedure for issuing wage withholding, while others continue to use manual procedures. In Massachusetts, automation decreased child support enforcement costs from $9,174,000 to $777,000, an average decrease per case of $286, from $306 to $20 (Department of Revenue, 1995). However, as with any such automated process, a review process should allow speedy correction when a wage withholding is issued erroneously.[15]
One state respondent identified a further problem: employers with multiple business locations or different payroll departments will report only one business address, which may not be the address to which the state should send the employee's income withholding order. Consequently, enforcement staff cannot automatically issue a wage withholding order until they can be sure that it will reach the appropriate payroll section.
Most state legislation specified a period, often 3 to 12 months, after which unmatched records were to be destroyed. However, Texas legislation specified that the state “shall not create a record regarding the employee and the information...shall be promptly destroyed.”
The state could only retain the data if a support obligation was to be established or enforced. The new hire program has been criticized on the grounds of privacy invasion, and record destruction was originally the main mechanism for forestalling such criticism. The PRWORA, however, has no records destruction provision.
One of the objectives of this research was to determine whether the different new hire models had different success rates. Meaningful across-state comparisons are difficult, as the amount and type of available information varies from state to state, so much so that we cannot be sure whether we are comparing apples and apples, or apples and oranges.
States typically quantified the success of new hire programs by looking at match rates, collections attributable to the program and cost-benefit ratios.
The data in Table 3 illustrate match rates, but the lack of clarity in the documentation means, for example, that we don't know if the “enforcement” category was limited to cases where the payer was in arrears or whether it contained all matches where an order existed. Most documents did not define a match, although the match rate was usually based on the matches with open cases. For example, in Massachusetts, the match rate included those persons who were already paying child support; officials interviewed said this group was important because the state might need to increase the amount of the payments.
Table 3: Match Rates between Child Support Cases and Employer Reporting Data (in Percentages)
| State | Average match rate (all cases) | Most recent match rate (all cases) | Paternity establishment matches | Support establishment matches | Enforcement matches |
|---|---|---|---|---|---|
| *Targeted programs* | | | | | |
| Alaska | 3.9 | 4.0a | 0.2d | n/a | 4.1d |
| Oregon | 8.7 | 7.7b | (0.9)e | (2.8)e | (5.9)e |
| Washington | 8.8 | 10.1c | n/a | n/a | n/a |
| *Voluntary programs* | | | | | |
| Texas | 3.7 | 3.1b | n/a | n/a | n/a |
| Arizona | 8.0 | 6.1a | 1.7a | 1.3a | 3.1a |
| *Mandatory programs* | | | | | |
| Connecticut* | n/a | 0.5b | - | - | 0.5b |
| Iowa | n/a | 11.0b | n/a | n/a | n/a |
| Kentucky | n/a | 9.6a | n/a | n/a | 5.1a |
| Florida** | 5.5 | 6.2c | n/a | n/a | 2.6c |
| Maryland*** | n/a | 4.0a | 0.8a | 0.6a | 2.9a |
| Massachusetts | 3.7 | 3.5a | n/a | n/a | n/a |
| Missouri | n/a | 10.6b | n/a | n/a | n/a |
| New York | n/a | 7.0c | n/a | n/a | 3.9c |
| Virginia | n/a | 7.3be | n/a | n/a | n/a |
Notes:
The rates in Table 3 show that there was little relationship between the type of program and the degree of success in matching the names of non-custodial parents to the data received from employer reporting. One might expect that the targeted programs would have higher rates, but this does not appear to have been the case. The most recent match rates for targeted programs ranged from 4 to 10 percent. The Texas voluntary program had a match rate of about 3 percent, and the Arizona program had a rate of about 6 percent. The mandatory programs reported rates from 3.5 to 11 percent. In Connecticut, where it was clear that matching occurred only for delinquent payers, the match rate was only 0.5 percent. The other programs reported match rates for enforcement cases from 2.6 to 5.9 percent of all new hire reports.
The lack of uniformity in match rates within the program types could be due to many non-program factors that varied from state to state.
It is possible that states were defining matches differently. In addition, the Center for Law and Social Policy said that the caseload figures of child support agencies might be suspect: “there are justifiable concerns about the accuracy of IV‑D caseload data” (Center for Law and Social Policy, 1998:7). States varied in their case definitions and procedures for opening and closing cases, and even within states, practices varied. If the definition of a case varied by state, then so would the match rates.
Several state tracking studies also showed that caution should be exercised in using the match rate as an indicator of program success.
The California parent locator service surveyed county child support agencies in 1993 and 1994 to see whether new hire listings from a newly established registry were effective for child support enforcement. All California employers submitted the names of new hires within 30 days of employment. County child support agencies explained what they did with 299 matches between child support caseloads and new hire listings.
For unknown reasons, the counties reported that only 227 of the sample listings reached case files for processing, taking an average of 33 days to do so once they had been mailed. Of the listings, 66 percent provided information unknown to the county, such as an address, while 29 percent provided no new information and 5 percent involved the wrong person, usually because of inaccurate matching on SSNs.
The analysis found that collections were made in 8 percent of the “workable” listings. Of the 150 listings that provided new information, 80 percent led to a successful employer contact. Of these 120 listings, only 43 (36 percent) involved someone still working for the employer. The state tried to enforce 34 of the 43 cases. Of the 34 cases, 28 resulted in an order for wage assignment or licence hold. The average dollar amount of the wage assignments was $440 per month. At least one collection was made in 18 of the 28 wage assignments (64 percent), and at the time of the survey 11 of the 18 non-custodial parents were still paying child support. Enforcement efforts were made, on average, 65 days after the mailing of the listings.
The Florida program underwent a somewhat similar exercise, using new hire matches for the first six months of 1995 (Florida Advisory Council, 1995). The initial match rate was 5.8 percent. Of the 28,693 matches, 58 percent were “obligated cases” for which a child support obligation had already been established, and 42 percent were “unobligated cases,” apparently related to paternity establishment (only a few dozen of the latter resulted in an obligation being established).
Of the obligated cases, 91 percent required the location information. Of this group, 38 percent were “non-productive” (meaning that the employment had been terminated), 20 percent were pending and 42 percent were “productive.” Wage withholding was implemented in 89 percent of the productive cases. A third of obligated cases therefore resulted in an income deduction order. If these ratios were applied to the initial match rate of 5.8 percent, the “success rate” would fall from 5.8 percent to 1.1 percent of all new hire reports (i.e. the share of reports for which a deduction order was implemented).[16]
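The arithmetic behind this kind of "funnel" calculation, which applies equally to the California figures above, can be made explicit with a short sketch. The helper function and stage labels are ours; the percentages are those quoted from the Florida exercise.

```python
# Reproduces the ratio arithmetic above. The helper is ours; the stage
# ratios are the Florida percentages quoted in the text.

def apply_funnel(initial_rate, stage_ratios):
    """Multiply an initial match rate through successive stage ratios to
    get the share of all new hire reports reaching the final stage."""
    rate = initial_rate
    for ratio in stage_ratios:
        rate *= ratio
    return rate

florida = apply_funnel(0.058, [  # 5.8% initial match rate
    0.58,   # obligated cases
    0.91,   # required the location information
    0.42,   # "productive" cases (employment not terminated or pending)
    0.89,   # wage withholding implemented
])
print(f"{florida:.1%}")  # -> 1.1% of all new hire reports
```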
In Ohio, a review of a random sample of matched cases revealed the following outcomes:
Therefore, 39 percent of matches resulted in some type of enforcement action and about a fifth resulted in income withholding.
An internal Connecticut study found that income withholding was not issued in 56 percent of matched cases, for the following reasons:
In Iowa, an analysis of a sample of child support cases reported the following outcomes after matching:
Therefore, in 31 percent of the matched cases, employer reporting led to payments.
In late 1994 and early 1995, the Virginia program drew a sample of 295 matched cases, to determine the outcomes of the matches. Of the sample, 25.3 percent resulted in collections from wage withholding, and $32,377 of the amount received could be attributed to the program (meaning that the information was available before information from quarterly wage reports). Extrapolating the findings to the first 29 months of the program, the author of the study estimated that child support collections had increased by $20.2 million.[17]
The report also showed why there was no wage withholding for certain Virginia cases: in 34 percent of the cases, wage withholding was “not appropriate”; in 20 percent, employment had been identified by other means; in 11 percent, the person was no longer at the same job; in 4 percent, the information was incomplete and unusable; and in 32 percent, no reason was provided (Virginia Division of Child Support Enforcement, 1995).
These monitoring exercises illustrate that match rates do not necessarily translate into collections. The paragraphs that follow explain why uncritical acceptance of match rates, and extrapolation of them to Canada, may be unwise.
Very often, the collection figures provided in program documents sound impressive—understandably, perhaps, because they are used as public relations and marketing tools to sell the program to employers and the general public. For example, a series of “success stories” is found in publications of the United States Office of Child Support Enforcement (1997).
It is nearly impossible to compare collections across new hire programs or categories of programs because most sources provided total dollars in collections that could be attributed to the program, but did not provide the number of open child support cases or the number of cases in arrears. Ideally, we would need to calculate the average amount collected per case in arrears.
In addition, the collection data were not always consistent. For example, one Arizona annual report estimated collections were $350,000 for fiscal year 1995, but a more recent internal document increased that estimate to $1,636,675.
Something similar happened with the Florida data. The Florida 1995 report cited above estimated that the new hire program had led to orders worth $5.2 million annually, for about one million new hire reports. However, another source estimated that employer reporting produced an obligation amount of $15.2 million, a threefold difference (U.S. Office of Child Support Enforcement, Child Support Report, 1996). The differences might be due to the method of selecting which child support collections could be directly attributed to the new hire program, as opposed to other methods of enforcing child support orders.
New hire databases were also used for purposes other than child support enforcement, such as fraud and overpayment detection for other social programs.
Before the PRWORA, many new hire programs shared their databases with workers' compensation, unemployment security, Medicare and Aid to Families with Dependent Children (AFDC) agencies. We looked at five states with information-sharing relationships—Georgia, Massachusetts, Missouri, Texas and West Virginia—to examine the details of their arrangements.
All the state respondents we called said that they wanted access to the new hire database to find benefit recipients who had jobs more quickly. All agencies had been accessing quarterly employer wage reports to detect unreported employment and income. These data, as noted above, could be four to six months old by the time they were cross-matched with open cases, so some overpayments went on for months before being detected.
The Texas Workforce Commission (TWC) also used the data to run the new hire names against current unemployment benefits recipients, as well as against its old overpayment caseload. If the Commission matched a previous client and a large, recent overpayment, it reactivated the case and pursued an income withholding order.
Before the PRWORA, state policy or legislation sometimes prevented agencies from getting access to the new hire database. Many states, such as Alaska, had explicitly said that the employer reporting data could only be used for child support enforcement. In these cases, and in states that did not mention outside access at all, state legislation had to be modified. In Massachusetts and Missouri, legislation supplied the impetus to approach the IV‑D agency to work out an arrangement. In the other three states, knowledge of child support initiatives came about through regular communication among departments.
Many agencies had to wait before accessing the data because computer systems had to be altered and software written to enable the exchange. For example, the West Virginia IV‑A (public assistance) department waited several years until the system protected the security of the data. This agency shared the same computer system and could have had online access to the database, but this access could have threatened security, since those using the database could change records as well as read them.
In most cases, the department with the new hire database sent the new hire information to the receiving agency by magnetic tape, doing so once a week, once a month or, in one case, every day. Where the new hire reporting program was funded under the same umbrella department as state welfare programs, direct access to the system was usually possible. Online access was preferable because it avoided the costs involved in purchasing and mailing tapes, as well as the occasional problem of lost or wrinkled tapes. Texas tried to work out an arrangement for direct access to new hire data but decided against it because it was too expensive for the other agencies to change their computer systems.
In the five states with which we spoke, two public assistance programs and four employment security agencies had traced the savings they had achieved by using the new hire database. The two welfare agencies identified significant savings. (See Table 4.)
Table 4: Public Assistance Savings Attributable to the New Hire Program (in U.S. Dollars)
| State (reference period) | Monthly AFDC savings (number of cases) | Monthly food stamps savings (number of cases) | Monthly Medicaid savings (number of cases) | Total monthly savings | Projected annual savings |
|---|---|---|---|---|---|
| Virginia (1994 to 1995) | $443,800 (1,882) | $597,859 (4,039) | $175,824 (792) | $1,217,483 | $14,609,796 |
| Massachusetts (fiscal 1994) | n/a (1,110) | n/a (1,948) | n/a | n/a | $15,900,000 |
| Massachusetts (fiscal 1997) | $1,247,206 (3,194) | $200,565 (1,597) | n/a | $1,447,771 | $17,373,252 |
Note: Massachusetts 1997 AFDC cases include the General Assistance program and Emergency Aid to the Elderly, Disabled and Children.
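The totals and projections in Table 4 follow a simple pattern: the monthly program savings are summed, then multiplied by 12 for the projected annual figure. The sketch below (our code; variable names are ours) reproduces the two rows with complete dollar data.

```python
# Reproduces the Table 4 totals: monthly savings summed across programs,
# then multiplied by 12 for the projected annual figure.

rows = {
    "Virginia, 1994 to 1995": (443_800, 597_859, 175_824),
    "Massachusetts, fiscal 1997": (1_247_206, 200_565, 0),  # no Medicaid figure
}
for label, monthly_savings in rows.items():
    total_monthly = sum(monthly_savings)
    print(f"{label}: ${total_monthly:,} per month, "
          f"${total_monthly * 12:,} projected annually")
# Virginia: $1,217,483 and $14,609,796; Massachusetts: $1,447,771
# and $17,373,252, matching the table.
```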
Most monthly savings in Massachusetts came from AFDC cases. The agency estimated that it saved $1.25 million monthly from closing cases or reducing AFDC funding. The Virginia Department of Social Services saved more from closing food stamp cases than from closing AFDC cases. Program staff could not find any differences that could account for this variation.
The Virginia Department of Social Services used the new hire database to match recipients of public assistance, food stamps and Medicaid. An analysis of savings revealed the following:
Precisely how the reductions and case closures came about is not specified.
The results are less impressive for employment security agencies than for welfare agencies. (See Table 5.)
Table 5: Employment Security Savings Attributable to the New Hire Program (in U.S. Dollars)

| State (reference period) | Total Employment Security cases | Quarterly savings | Projected annual savings |
|---|---|---|---|
| Massachusetts (fiscal 1994) | 900 | $500,000 | $2,000,000 |
| West Virginia Bureau of Employment Programs (October 1997 to February 1998) | 107 | $45,207 | $180,828 |
| Florida Division of Unemployment Compensation (April 1995 to August 1995) | 417 | $84,556 | $338,224 |
Massachusetts saved more because it was overpaying more. In Massachusetts, the average overpayment had been approximately $556, whereas in West Virginia it was $423 and in Florida it was $203.
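The average overpayments quoted here are simply the quarterly savings in Table 5 divided by the number of cases, as this small check (our code) shows:

```python
# Average overpayment = quarterly savings / number of cases (Table 5).
table5 = {
    "Massachusetts": (500_000, 900),
    "West Virginia": (45_207, 107),
    "Florida": (84_556, 417),
}
for state, (savings, cases) in table5.items():
    print(f"{state}: ${savings / cases:,.0f} average overpayment")
# Yields roughly $556, $423 and $203, matching the text to within
# a dollar of rounding.
```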
It is important to be cautious when interpreting these savings. Projections of savings to six months or one year may not accurately reflect the movement of cases on and off public assistance. Some clients may return to social assistance in the interim if they lose their jobs. Furthermore, it may have been possible to recoup overpayments through quarterly wage reports. In addition, not all the recorded overpayments have necessarily been recouped. For example, Florida recouped only 33 percent of unemployment compensation overpayments six months after it took action. It would probably have recouped even less from public assistance clients.
To determine the cost effectiveness of using new hire data, one needs both the projected benefits and the costs to the department. Only one agency, the Texas Department of Human Services (DHS), established the cost of using the new hire database to match welfare cases. In a 1996 report, Texas DHS stated that it had saved an estimated $792,000 a year. It cost an estimated $210,000 to match new hires within DHS, so the net annual saving was $582,000, a cost-benefit ratio of $1 : $3.77.
Texas DHS also estimated a cost-benefit ratio for a mandatory program in Texas. It projected savings of $12.7 million and costs of $3.4 million, for a projected annual benefit of $9.3 million. The projected cost-benefit ratio for the mandatory program in Texas was $1 : $3.76, virtually identical to the ratio calculated for the existing voluntary program.
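The Texas arithmetic can be reproduced directly (the helper function is ours; the figures are those cited from the DHS report). Note that the rounded published inputs for the mandatory projection give a ratio of about 3.74, close to the $1 : $3.76 the report states, presumably because the report worked from unrounded figures.

```python
# Reproduces the Texas DHS cost-benefit arithmetic from the 1996 report.

def cost_benefit(annual_savings, annual_costs):
    """Return (net annual saving, dollars saved per dollar spent)."""
    return annual_savings - annual_costs, annual_savings / annual_costs

net, ratio = cost_benefit(792_000, 210_000)       # voluntary program
print(net, round(ratio, 2))   # 582000, 3.77 -> $1 : $3.77

net, ratio = cost_benefit(12_700_000, 3_400_000)  # mandatory projection
print(net, round(ratio, 2))   # 9300000, 3.74 -> close to the reported $1 : $3.76
```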
The author of the Texas report, however, warned readers of the difficulties in making projections of this kind. One cannot accurately project the benefits of mandatory reporting because one doesn't know the differences between those employers who report and those who do not. For example, they may hire DHS clients in different proportions (Texas DHS, 1996:11).
In summary, even before the passage of the PRWORA, other welfare programs were using new hire data to detect overpayments and fraudulent claims. The data were transferred smoothly, although state agencies needed to reformat their computer systems. As with child support enforcement, states reported substantial cost savings, especially for public assistance, with lesser savings for unemployment insurance.
It is difficult to determine program start‑up costs because, in many states, existing departmental budgets absorbed these costs. For example, Vermont's voluntary program did not get any funds, and the Office of Child Support absorbed all fixed costs associated with the program, including the cost of reprogramming the telephone system. Only a few states identified discrete start‑up costs for their initial employer reporting program. Between 1990 and 1992, the fixed start‑up costs of the Washington state program were $43,292 and the initial variable program costs were $351,110. In Iowa, the start‑up costs (in 1993 and 1994) were estimated at $440,424. The Florida figure was $91,300 (in 1995). In 1996, start‑up costs in Minnesota were said to be $94,000.
There is also little information in the documentation on the annual costs of operating new hire programs, although we asked states for cost-related information.
Annual costs ranged from just over $100,000 in Arizona to $500,000 in Minnesota (see Table 6). There is no apparent relationship between the type of program and annual expenditures. When we divided the annual spending by the approximate number of new hires reported in each jurisdiction, we found a large range in the overall cost per report—from $0.27 per report in Florida to $1.45 in Arizona. This range might be due to differences in salaries, overhead, automation and, possibly, privatization. Based on our experience with costing social programs, we assume that many of the differences are due to such accounting questions as how overheads were included in expenditures and how equipment costs were amortized over time.
Table 6: Annual Costs of Operating New Hire Programs (in U.S. Dollars)

| State (reference period) | Annual budget | Approx. number of new hires reported | Cost per report | All industries or targeted industries? | Mandatory or voluntary? |
|---|---|---|---|---|---|
| Alaska (fiscal 1994) | $233,795 | n/a | n/a | Targeted | Mandatory |
| Washington (fiscal 1995) | $451,000 | 324,300 | $1.39 | Targeted | Mandatory |
| Texas (fiscal 1996) | $141,300 | 138,900 | $1.02 | Targeted | Voluntary |
| Arizona (fiscal 1994) | $104,200 | 72,000 | $1.45 | All | Voluntary |
| Florida (1995) | $268,600 | 992,000 | $0.27 | Large employers | Mandatory |
| Iowa (fiscal 1994) | $270,850 | 483,300 | $0.56 | All | Mandatory |
| Minnesota (fiscal 1995) | $499,100 | 1,017,000 | $0.49 | All | Mandatory |
Note: n/a = information not available.
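The cost-per-report column in Table 6 is simply the annual budget divided by the approximate number of reports. The short check below (our code, using the figures as given in the table) reproduces the published range.

```python
# Cost per report = annual budget / approximate number of new hires
# reported (Table 6; Alaska omitted because its report count is n/a).

table6 = {
    "Washington": (451_000, 324_300),
    "Texas":      (141_300, 138_900),
    "Arizona":    (104_200, 72_000),
    "Florida":    (268_600, 992_000),
    "Iowa":       (270_850, 483_300),
    "Minnesota":  (499_100, 1_017_000),
}
for state, (budget, reports) in table6.items():
    print(f"{state}: ${budget / reports:.2f} per report")
# Reproduces the table's range, from $0.27 (Florida) to $1.45 (Arizona).
```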
The resources put into data control and entry might have affected operating costs. These costs for the Washington state program ranged from $84,019 in 1992 to $258,880 in 1997—from 54 to 60 cents per new employee reported. Similarly, the New York program estimated that the per‑record cost was 52 cents in 1997; the anticipated volume in that state was an astonishing 4.8 million records. In Ohio, the cost was 43 cents per new hire and it was 17 cents in Missouri in 1996. It is possible that states placed varying degrees of emphasis on data control and cleaning, which could account for the difference in per-record costs and help explain the varying costs of the program overall.
Cost-benefit ratios should be more accurate measures of overall program performance than total collections, although the discussion above suggests that the expenditures included in the “cost” side of the equation probably differed by state, as did the calculations to determine the collections attributable to the program.
Despite this problem, the ratios are worth presenting for those states that provided them. The state programs that calculated collection dollars received per dollar spent were the targeted industry programs in Alaska and Washington state, the voluntary program in Texas and the Massachusetts mandatory program.
We cannot easily explain the difference between Texas and the other voluntary program, in Arizona. Indeed, the very high ratios for Washington and Texas compared to the other states make us suspect that they have not tallied costs and collections the same way. These data do not permit any conclusions about cost-effectiveness in relation to the “type” of program (voluntary or targeted).