This article provides a brief history of the development of clinical research training opportunities in the United States. It highlights developments in the clinical research enterprise since World War II, focusing on the involvement of the U.S. government and the academic sector. Clinical research training is a relatively new academic field; curricula in the design and conduct of clinical research have emerged only since the 1980s. The growing complexity of clinical trials and the rise of evidence-based medicine over the last several decades created strong demand for clinicians trained in clinical epidemiology and biostatistics. Amid warnings from physician–scientist leaders that clinical researchers were becoming an "endangered species," numerous proposals to address these workforce and educational problems emerged in the 1990s. Traditionally, physicians wishing to expand their research education earned a master's degree in public health or participated in specialized programs such as the Robert Wood Johnson Clinical Scholars Program. Since the 1990s, the National Institutes of Health, through K awards, the Roadmap Initiative, and other funding mechanisms, has provided substantial support for the development of clinical research training opportunities, ranging from predoctoral immersion programs to degree-granting graduate programs. The author discusses key components of successful clinical research training programs and concludes with recommendations for promoting careers in clinical research.