Scientific computing


Scientific computing is part of the School of Computer Science. This department contains lessons on numerical algorithms, modelling and simulation, and bioinformatics.


Scientific computing is the science of solving problems with computers. The problems themselves usually arise from other disciplines such as mathematics, engineering, biology, physics, chemistry, and other natural sciences. As a consequence, scientific computing is interdisciplinary by nature. The dividing line between scientific computing and the sciences from which its problems originate is best drawn by contrasting what scientific computing is not with what it is.

[Image: Anton, a massively parallel, special-purpose supercomputer for molecular dynamics (MD) simulations.]
  • Computing π to 22.4 trillion digits[1] is not scientific computing. Developing algorithms to efficiently compute π to any precision is scientific computing.
  • Running a Molecular dynamics simulation with 1,000,000 atoms for 100 nanoseconds is not scientific computing. Developing models and algorithms to efficiently simulate large particle systems is scientific computing.
  • Computing the eigenvalues of a 1,000 x 1,000 dense, complex matrix is not scientific computing. Developing efficient and accurate methods to determine the eigenvalues of any large, dense, complex matrix is scientific computing.
  • Running an all-against-all sequence alignment of every genome known is not scientific computing. Developing realistic and efficient models for sequence evolution is scientific computing.
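The π example above can be made concrete. The sketch below is an illustrative implementation (the function names are invented for this example) of Machin's formula, pi = 16·arctan(1/5) − 4·arctan(1/239), using only Python's arbitrary-precision integers; it computes π to any requested number of digits, which is exactly the kind of algorithm-design question the bullet points call scientific computing:

```python
def arctan_inv(x, one):
    """Return arctan(1/x) scaled by `one`, via the Taylor series
    arctan(1/x) = 1/x - 1/(3*x**3) + 1/(5*x**5) - ..."""
    power = one // x          # current power of 1/x, scaled by `one`
    total = power
    k = 1
    while power > 0:          # terms vanish once they underflow the scale
        power //= x * x
        if k % 2:             # odd k: subtract the term
            total -= power // (2 * k + 1)
        else:                 # even k: add the term
            total += power // (2 * k + 1)
        k += 1
    return total

def pi_digits(prec):
    """Return floor(pi * 10**prec) via Machin's formula.
    Ten guard digits absorb the truncation error of the series."""
    guard = 10
    one = 10 ** (prec + guard)
    pi = 16 * arctan_inv(5, one) - 4 * arctan_inv(239, one)
    return pi // 10 ** guard

print(pi_digits(30))          # prints the first 31 digits of pi
```

Record-setting computations such as the one cited above use faster-converging series (e.g. Chudnovsky's), but the analysis involved, convergence rate versus cost per digit, is the same.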

The line between scientific computing and the sciences from which its problems are derived is drawn between interest in the methods used to solve problems and the solution of the problems themselves. In other words, all scientists use computers, but very few do scientific computation.


The following is a suggested list of courses for this department; the actual content will depend on which courses are implemented.

The Courses in scientific computing are divided into three main branches, each containing specific courses. It is recommended that you follow the courses within each main branch in the order presented. The main branches themselves are independent. Should dependencies across branches arise, they will be indicated at the top of each course.

Active Participants

The histories of Wikiversity pages indicate who the active participants are. If you are an active participant in this department, you can list your name here (this can help small departments grow and the participants communicate better; for large departments a list of active participants is not needed).

See also

Schools and Portals



  1. Yee, Alexander J. (2016). "y-cruncher: A Multi-Threaded Pi Program". Retrieved 28 March 2017.

Further reading

  • Winsberg, Eric. (Apr 23, 2015). "Computer Simulations in Science". The Stanford Encyclopedia of Philosophy (Summer 2015). Ed. Edward N. Zalta. Metaphysics Research Lab, Stanford University. “As computer simulation methods have gained importance in more and more disciplines, the issue of their trustworthiness for generating new knowledge has grown, especially when simulations are expected to be counted as epistemic peers with experiments and traditional analytic theoretical methods. The relevant question is always whether or not the results of a particular computer simulation are accurate enough for their intended purpose.”
  • Kaufmann, William J.; Smarr, Larry L. (1993). Supercomputing and the Transformation of Science. Scientific American Library. 43. New York: Scientific American Library. ISBN 0-7167-5038-4. The development of the supercomputer has given scientists an awesome new capability: the power to virtually re-create the physical world on the computer screen, with a stunning degree of precision and sophistication. Everything from weather systems to biochemical interactions to car crashes to air pollution to high speed subatomic particle collisions can now be simulated, and observed at the scientist's will. As a result, supercomputers have led to profound levels of insight and understanding. Indeed, they have revolutionized the very process of scientific discovery itself. Scientists no longer have to rely exclusively on either experiment-based or theoretical research methodologies, but rather can utilize both direct observation and mathematical modeling to visualize and simulate complex phenomena.
  • Davis, Dwight (July 1992). "Big-Time Changes". Computer Graphics World 15 (7): 42-52. ISSN 0271-4159. "Massively parallel processing systems alter the face and the pace of supercomputing." 
  • Emmett, Arielle (July 1992). "Something for Everybody". Computer Graphics World 15 (7): 29-41. ISSN 0271-4159. "By incorporating visual programming languages and widget-based graphical toolkits into their latest products, visualization software vendors have begun to provide scientists with the means to create sophisticated representations of their data, minus the drudgery of extensive code-writing."
  • Alder, Bernie J. (1988). Special Purpose Computers. Computational Techniques. 5. Academic Press. ISBN 0-12-049260-1. This book describes computers designed and built to solve specific scientific problems, comparing these computers to general purpose computers in both speed and cost. Discussed as well are the effort involved, the amount of time consumed, and the difficulties encountered in completing such projects. The special purpose computers described include the hypercube, the QCD machine, Navier-Stokes hydrodynamic solvers, classical molecular dynamics machines, and Ising model computers.
  • Hillis, W. Daniel (1985). The Connection Machine. MIT Press Artificial Intelligence Series. 17. Cambridge, Massachusetts: MIT Press. ISBN 0-262-08157-1. Today's conventional computers are characterized by sequential, serial operations proceeding back and forth between a single powerful processor and a separate large-scale memory. The Connection Machine, by contrast, is a general, universal parallel computer, in which thousands or millions of independent, simple processors - each integrated with its own small but sufficient memory - operate concurrently on different aspects or segments of the same overall problem.