Scientific computing/History

From Wikiversity
[Image: IBM 704 (1957)]

In the early to mid 20th century, scientific computing was performed by human "computers", sometimes assisted by calculating machines, differential analysers, or punch card tabulating machines.[1] Electronic digital computers became available starting in 1946 with ENIAC, and by 1953 at least two dozen major computers were under construction.[2] The first mass-produced computer with floating point hardware, the IBM 704, was introduced in 1954. The FORTRAN programming language was created a short time later to produce efficient code with less effort by the programmer.[3] High speed computing systems, some employing concurrent operation, developed rapidly during the next decade, with notable examples such as the CDC 6600 and ILLIAC IV.[4] By the late 1960s a diverse variety of these specialized systems for scientific computing became known as supercomputers.[5]

In 1974 the CDC STAR-100, a commercial vector processor supercomputer, was introduced.[6]

Further reading

A selection of works from the history of scientific computing.

  • Ivie, Peter; Thain, Douglas (2018). "Reproducibility in Scientific Computing". ACM Computing Surveys 51 (3): 1–36. doi:10.1145/3186266. "Reproducibility is widely considered to be an essential requirement of the scientific process. However, a number of serious concerns have been raised recently, questioning whether today's computational work is adequately reproducible."
  • Winsberg, Eric (Apr 23, 2015). "Computer Simulations in Science". In Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Summer 2015). Metaphysics Research Lab, Stanford University. "As computer simulation methods have gained importance in more and more disciplines, the issue of their trustworthiness for generating new knowledge has grown, especially when simulations are expected to be counted as epistemic peers with experiments and traditional analytic theoretical methods. The relevant question is always whether or not the results of a particular computer simulation are accurate enough for their intended purpose."
  • Winsberg, Eric (2010). Science in the Age of Computer Simulation. Chicago: University of Chicago Press. ISBN 978-0226902029. Retrieved 2018-01-20. Reviewed in Physics Today 64 (8): 50, Bibcode:2011PhT....64h..50S, doi:10.1063/PT.3.1221. Winsberg explores the impact of simulation on such issues as the nature of scientific evidence; the role of values in science; the nature and role of fictions in science; and the relationship between simulation and experiment, theories and data, and theories at different levels of description.
  • Kaufmann, William J.; Smarr, Larry L. (1993). Supercomputing and the Transformation of Science. Scientific American Library. 43. New York: Scientific American Library. ISBN 0-7167-5038-4. The development of the supercomputer has given scientists an awesome new capability: the power to virtually re-create the physical world on the computer screen, with a stunning degree of precision and sophistication. Everything from weather systems to biochemical interactions to car crashes to air pollution to high speed subatomic particle collisions can now be simulated, and observed at the scientist's will. As a result, supercomputers have led to profound levels of insight and understanding. Indeed, they have revolutionized the very process of scientific discovery itself. Scientists no longer have to rely exclusively on either experiment-based or theoretical research methodologies, but rather can utilize both direct observation and mathematical modeling to visualize and simulate complex phenomena.
  • Davis, Dwight (July 1992). "Big-Time Changes". Computer Graphics World 15 (7): 42–52. ISSN 0271-4159. "Massively parallel processing systems alter the face and the pace of supercomputing." 
  • Emmett, Arielle (July 1992). "Something for Everybody". Computer Graphics World 15 (7): 29–41. ISSN 0271-4159. "By incorporating visual programming languages and widget-based graphical toolkits into their latest products, visualization software vendors have begun to provide scientists with the means to create sophisticated representations of their data, minus the drudgery of extensive code-writing."
  • Alder, Bernie J. (1988). Special Purpose Computers. Computational Techniques. 5. Academic Press. ISBN 0-12-049260-1. This book describes computers designed and built to solve specific scientific problems, comparing these computers to general purpose computers in both speed and cost. Discussed as well are the effort involved, the amount of time consumed, and the difficulties encountered in completing such projects. The special purpose computers described include the hypercube, the QCD machine, Navier-Stokes hydrodynamic solvers, classical molecular dynamics machines, and Ising model computers.
  • Karin, Sidney; Smith, Norris Parker (1987). The Supercomputer Era. ISBN 0-15-186787-9. Supercomputers - far more powerful than the largest mainframes - comprise one of the fastest growing segments of the computer industry. Spurred by $200 million in federal funds, new supercomputing centers at American universities are making it possible for much larger numbers of researchers in science and industry to take full advantage of the enormous speed and power of supercomputers and at much lower cost.
  • Hillis, W. Daniel (1985). The Connection Machine. MIT Press Artificial Intelligence Series. 17. Cambridge, Massachusetts: MIT Press. ISBN 0-262-08157-1. Today's conventional computers are characterized by sequential, serial operations proceeding back and forth between a single powerful processor and a separate large-scale memory. The Connection Machine, by contrast, is a general, universal parallel computer, in which thousands or millions of independent, simple processors - each integrated with its own small but sufficient memory - operate concurrently on different aspects or segments of the same overall problem.
  • Feigenbaum, Edward A.; McCorduck, Pamela (1983). The Fifth Generation: Artificial Intelligence and Japan's Computer Challenge to the World. ISBN 0-201-11519-0. The Fifth Generation: a new breed of supercomputers so fast they can surpass today's machines a thousand times over, so smart that they can outthink humans. Science fiction? Not quite. Japan has proclaimed to the world that in ten years it intends to develop and market the Fifth Generation of computers - artificially intelligent machines that can reason, draw conclusions, make judgments, and even understand the written and spoken word. In a crash program, comparable to the U.S. space effort, Japan has gathered the best and brightest under a charismatic leader and backed the enterprise with significant resources.

References

  1. Comrie, L. J. (1944). "Recent Progress in Scientific Computing". Journal of Scientific Instruments 21 (8): 129–135. doi:10.1088/0950-7671/21/8/301. "I should like to make a retrospective survey of the last dozen years or so, and speak about the influence of calculating machines and mathematical tables on our attitude towards computing."
  2. Sheldon, John; Thomas, L. H. (1953). "The Use of Large Scale Computing in Physics". Journal of Applied Physics 24 (3): 235–242. doi:10.1063/1.1721257. "As these projects are completed, more computers will become available to physicists everywhere. We discuss in this article what this new facility may mean to physicists within the next few years." 
  3. Backus, J. W.; Heising, W. P. (1964). "Fortran". IEEE Transactions on Electronic Computers (4): 382–385. doi:10.1109/PGEC.1964.263818. "The 704 was the first commercial computer with built-in floating point, therefore most common operations were fast enough that poor coding of loops would greatly reduce efficiency." 
  4. Flynn, M.J. (1966). "Very high-speed computing systems". Proceedings of the IEEE 54 (12): 1901–1909. doi:10.1109/PROC.1966.5273. "This paper is an attempt to explore large scientific computing equipment, reviewing possible organizations starting with the 'concurrent' organizations which are presently in operation and then examining the other theoretical organizational possibilities." 
  5. Higbie, L. C. (1973). "Tutorial: Supercomputer architecture". Computer 6 (12): 48–58. doi:10.1109/MC.1973.6540219. "In those days the main criterion for qualifying as a supercomputer was the number of instructions handled per unit time. Today, we know a good deal more about this category of machine, but it's getting harder and harder to define the term — mainly because of its growth in number and variety." 
  6. Theis, D. J. (1974). "Vector supercomputers". Computer 7 (4): 52–61. doi:10.1109/MC.1974.6323500.