Foreign Direct Investment in the United States
Executive Summary
Foreign direct investment (FDI), in which foreign companies invest in U.S. businesses, not only creates jobs but creates relatively high-paying jobs, with wages up to 30% higher than comparable positions. Encouraging more FDI and expanding the number of countries that invest in the United States could spur additional economic growth and create even more new, high-paying U.S. jobs.
