The first general-purpose electronic digital computer, ENIAC (the Electronic Numerical Integrator And Computer), was completed in 1946 and contained over 17,000 vacuum tubes. John William Mauchly and J. Presper Eckert are the scientists credited with its invention.
By the early 1940s, Mauchly had become interested in electronic calculating machines. At the Moore School, he gained a better understanding of electronic engineering and the mathematics of ballistics computations. He and Eckert discussed the possibility of building a large electronic computer. In 1942, Mauchly drafted a memo outlining the first large-scale digital electronic computer designed for general numerical computations. An official proposal was submitted in April 1943, and the U.S. Army funded "Project PX," which Mauchly and Eckert undertook together.
The ENIAC was born out of a combination of many different design ideas. Mauchly, who was responsible for much of the overall design, is said to have been influenced by the work of Iowa State College professor John V. Atanasoff, who had designed and built an electronic computing device between 1937 and 1942 with a graduate student, Clifford Berry. Eckert, the project's chief engineer, overcame many difficult technical challenges in getting the machine to work.
The ENIAC was unveiled to the public on February 14, 1946. Though it had been funded as a technology that might aid the war effort, the war was, of course, over by that time. Nevertheless, the military employed the ENIAC to perform calculations for hydrogen-bomb design, weather prediction, cosmic-ray studies, thermal ignition, random-number studies, and wind-tunnel design. It was built from 17,468 vacuum tubes and weighed more than 60,000 pounds. At the time, it was the largest single electronic apparatus in the world. The system could perform 5,000 additions and 300 multiplications per second—slow by today's standards, when processors execute billions of additions per second, but roughly 1,000 times faster than the electromechanical machines of its day. Despite its thousands of tubes, it proved remarkably reliable for its era. It marked the beginning of a long road of computer technology development.