Significant increases in National Institutes of Health (NIH) spending on medical research have not produced corresponding increases in new treatments and cures. Instead, laboratory discoveries remain in what has been termed the “valley of death,” the gap between bench research and clinical application. Recently, there has been considerable discussion in the literature and scientific community about the causes of this phenomenon and how to bridge the abyss. In this article, the authors examine one possible explanation: Clinician–scientists' declining role in the medical research enterprise has had a deleterious effect on the translation of laboratory breakthroughs into new clinical applications. In recent decades, the percentage of MDs receiving NIH funding has decreased drastically compared with that of PhDs. The growing gap between the research and clinical enterprises has resulted in fewer scientists with a true understanding of clinical problems, as well as fewer scientists who are able or inclined to glean new basic research hypotheses from failed clinical trials. The NIH and many U.S. medical schools have recognized the decline of the clinician–scientist as a major problem and have adopted innovative programs to reverse the trend. However, more radical action may be required, including major changes to the NIH peer-review process, greater funding for translational research, and significantly more resources for the training, debt relief, and early-career support of potential clinician–scientists. Such improvements are required for clinician–scientists to conduct translational research that bridges the valley of death and transforms biomedical research discoveries into tangible clinical treatments and technologies.