Formal infection control programs were not evident in the U.S. until well into the 20th century. During the post-World War II period, the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) and several professional organizations devoted to hospital infection control were founded, and these exerted considerable influence on U.S. hospital practices; this contrasts with European countries, where government agencies exerted the greatest influence on practice. CDC's Study on the Efficacy of Nosocomial Infection Control (SENIC) provided scientific evidence that effective infection surveillance and control programs were strongly associated with lower rates of hospital-acquired infection. Based on the SENIC findings, JCAHO established requirements for adequate infection control personnel. In 1970, CDC also began a surveillance system for monitoring hospital-acquired infections in U.S. hospitals. In the 1980s, CDC published its first guidelines for infection control, which revolutionized practice standards in the U.S. By 2000, advocacy groups were demanding heightened transparency, including publicly disclosed health care-associated infection (HAI) rates, which several U.S. states now require. In 2005, the U.S. government began using explicit financial incentives tied to quality measures, including some HAI rates. Efforts to reduce costs and improve the quality of patient care will intensify the role of U.S. infection control programs in the future.