Actually, the other answer isn't strictly correct. It's an estimate, giving a lower bound that gets less accurate as income increases. Consider: U.S. income tax uses a progressive system, where income is divided into brackets taxed at increasing rates.
Example: Given the 2009 U.S. federal tax rates for an individual filing as "single":

10% on income up to $8,350
15% on income from $8,350 to $33,950
25% on income from $33,950 to $82,250
28% on income from $82,250 to $171,550
33% on income from $171,550 to $372,950
35% on income over $372,950
Imagine somebody making $100,000. Assuming no other credits, deductions, or taxes, then income tax based on the above brackets & rates would be calculated as follows:

10% x $8,350 = $835
15% x ($33,950 - $8,350) = $3,840
25% x ($82,250 - $33,950) = $12,075
28% x ($100,000 - $82,250) = $4,970
Total: $21,720
Meaning the average tax rate for the single individual earning $100,000 is 21.72%.
However, a pre-tax deduction from that income actually comes off at the top marginal tax rate. Consider the same calculation but with taxable income reduced to $99,000 instead (i.e. simulating a pre-tax $1,000 deduction):

10% x $8,350 = $835
15% x ($33,950 - $8,350) = $3,840
25% x ($82,250 - $33,950) = $12,075
28% x ($99,000 - $82,250) = $4,690
Total: $21,440
That's a difference of $280, which is more than the $217.20 savings ($1,000 x 21.72%) that would have been estimated using just the average tax rate method.
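The arithmetic above is easy to check with a short script. This is just a sketch using the 2009 single-filer brackets quoted above; `BRACKETS` and `income_tax` are illustrative names, not anything official:

```python
# 2009 federal tax brackets for a single filer, as (upper bound, rate);
# the top bracket has no upper bound.
BRACKETS = [
    (8_350, 0.10),
    (33_950, 0.15),
    (82_250, 0.25),
    (171_550, 0.28),
    (372_950, 0.33),
    (float("inf"), 0.35),
]

def income_tax(taxable_income):
    """Progressive tax: each slice of income is taxed at its bracket's rate."""
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if taxable_income <= lower:
            break  # no income reaches this bracket
        tax += (min(taxable_income, upper) - lower) * rate
        lower = upper
    return round(tax, 2)

print(income_tax(100_000))                       # 21720.0
print(income_tax(100_000) - income_tax(99_000))  # 280.0 (actual marginal savings)
print(round(0.2172 * 1_000, 2))                  # 217.2 (average-rate estimate)
```

Note that only the top slice of income moves when the deduction is applied, which is exactly why the savings land at the 28% marginal rate rather than the 21.72% average rate.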
Consequently, when trying to determine how much money would be saved by a tax deduction, it makes better sense to estimate using the marginal tax rate, which in this case was 28%. It gets a little trickier if the deduction crosses a bracket boundary. (Left as an exercise for the reader :-)
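For the bracket-crossing case, the simplest approach is to compute the tax with and without the deduction and subtract, which handles the crossing automatically. A sketch, again assuming the 2009 single-filer brackets from the example (`deduction_savings` is a made-up helper name):

```python
# 2009 federal tax brackets for a single filer, as (upper bound, rate).
BRACKETS = [
    (8_350, 0.10),
    (33_950, 0.15),
    (82_250, 0.25),
    (171_550, 0.28),
    (372_950, 0.33),
    (float("inf"), 0.35),
]

def income_tax(taxable_income):
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if taxable_income <= lower:
            break
        tax += (min(taxable_income, upper) - lower) * rate
        lower = upper
    return round(tax, 2)

def deduction_savings(income, deduction):
    """Exact tax saved by a pre-tax deduction, even across bracket boundaries."""
    return round(income_tax(income) - income_tax(income - deduction), 2)

# A $20,000 deduction from $100,000 spans the 28% and 25% brackets:
print(deduction_savings(100_000, 20_000))  # 5532.5
print(round(0.28 * 20_000, 2))             # 5600.0 (flat 28% estimate, too high)
```

The flat marginal-rate estimate overshoots here because part of the deduction falls into the 25% bracket.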
Finally, in the case of the deduction being discussed, it also looks like payroll FICA taxes paid by the employee (Social Security's 6.2%, and Medicare's 1.45%) would be avoided as well; so add that to the marginal tax rate savings.
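If the deduction does escape employee-side FICA, the combined marginal savings rate is just the sum of the rates (assuming income below the Social Security wage-base cap, so the 6.2% still applies):

```python
MARGINAL_RATE = 0.28      # federal bracket from the example above
SOCIAL_SECURITY = 0.062   # employee share, applies below the wage-base cap
MEDICARE = 0.0145         # employee share, no cap

combined = MARGINAL_RATE + SOCIAL_SECURITY + MEDICARE
print(round(combined, 4))          # 0.3565
print(round(combined * 1_000, 2))  # 356.5 saved on a $1,000 pre-tax deduction
```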
The surest way to know how much would be saved, though, would be to do one's income tax return calculation without the deduction, and then with, and compare the numbers. Tax software can make this very easy to do.