Scientific computing is part of the School of Computer Science. It contains lessons on numerical algorithms, modelling and simulation, and bioinformatics.
Scientific computing is the science of solving problems with computers. The problems themselves usually arise from other disciplines such as mathematics, engineering, biology, physics, chemistry and other natural sciences. As a consequence, scientific computing is interdisciplinary by nature. The dividing line between scientific computing and the sciences from which its problems originate is best described by what scientific computing is not -- and what it is.
- Computing Pi to 22.4 trillion digits is not scientific computing. Developing algorithms to efficiently compute Pi to any precision is scientific computing.
- Running a molecular dynamics simulation with 1,000,000 atoms for 100 nanoseconds is not scientific computing. Developing models and algorithms to efficiently simulate large particle systems is scientific computing.
- Computing the eigenvalues of a 1,000 x 1,000 dense, complex matrix is not scientific computing. Developing efficient and accurate methods to determine the eigenvalues of any large, dense, complex matrix is scientific computing.
- Running an all-against-all sequence alignment of every genome known is not scientific computing. Developing realistic and efficient models for sequence evolution is scientific computing.
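The first bullet above can be made concrete: the interesting object is the algorithm, not the digit count. Below is a minimal sketch of arbitrary-precision computation of Pi using Machin's 1706 formula and Python's standard `decimal` module. The function name, the number of guard digits, and the convergence threshold are choices made for this sketch, not part of any reference implementation.

```python
from decimal import Decimal, getcontext

def machin_pi(digits: int) -> str:
    """Machin's formula: pi/4 = 4*arctan(1/5) - arctan(1/239).
    Returns pi as a string with `digits` decimal places."""
    getcontext().prec = digits + 10  # extra guard digits against rounding error

    def arctan_inv(x: int) -> Decimal:
        # arctan(1/x) = sum_{k>=0} (-1)^k / ((2k+1) * x^(2k+1))
        total = Decimal(0)
        power = Decimal(1) / x          # x^-(2k+1), starting at k = 0
        k = 0
        threshold = Decimal(10) ** -(digits + 5)
        while power > threshold:        # stop once terms fall below precision
            term = power / (2 * k + 1)
            total += term if k % 2 == 0 else -term
            power /= x * x
            k += 1
        return total

    pi = 16 * arctan_inv(5) - 4 * arctan_inv(239)
    return str(+pi)[: digits + 2]       # "3." plus `digits` decimal places
```

The same program computes 10 or 10,000 digits; only the precision parameter changes. That generality, rather than any single digit record, is what the bullet identifies as scientific computing.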
The line between scientific computing and the sciences from which its problems are derived is drawn between interest in the methods used to solve problems and the solution of the problems themselves. In other words, all scientists use computers, but very few do scientific computation.
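The eigenvalue bullet illustrates the same point: the method is the object of study. The simplest such method, power iteration, estimates the dominant eigenvalue of a matrix using only matrix-vector products. The sketch below is in plain Python for real matrices for brevity; the `matvec` callback, iteration count, and tolerance are assumptions of this sketch.

```python
import random

def power_iteration(matvec, n, iters=1000, tol=1e-12):
    """Estimate the dominant eigenvalue and eigenvector of an n x n real
    matrix, accessed only through the matrix-vector product `matvec`."""
    random.seed(0)                       # deterministic starting vector for the sketch
    v = [random.random() for _ in range(n)]
    lam = 0.0
    for _ in range(iters):
        w = matvec(v)
        norm = sum(x * x for x in w) ** 0.5
        w = [x / norm for x in w]
        # Rayleigh quotient: for a unit vector w, w . (A w) estimates lambda
        new_lam = sum(wi * awi for wi, awi in zip(w, matvec(w)))
        converged = abs(new_lam - lam) < tol
        lam, v = new_lam, w
        if converged:
            break
    return lam, v

# Usage: a 2 x 2 diagonal matrix with eigenvalues 3 and 1
lam, vec = power_iteration(lambda v: [3 * v[0], v[1]], 2)
```

Power iteration converges when one eigenvalue strictly dominates in magnitude; making such methods fast and reliable for large, dense, complex matrices is the research problem the bullet points to.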
The following is a suggested list of courses for this department; the actual content will depend on which courses are implemented.
The courses in scientific computing are divided into three main branches, each containing specific courses. It is recommended that you follow the courses within each main branch in the order presented. The main branches themselves are independent. Should dependencies across branches arise, they will be indicated at the top of each course.
The histories of Wikiversity pages indicate who the active participants are. If you are an active participant in this department, you can list your name here (this can help small departments grow and the participants communicate better; for large departments a list of active participants is not needed).
- Yee, Alexander J. (2016). "y-cruncher: A Multi-Threaded Pi Program". Retrieved 28 March 2017.
- Winsberg, Eric. (Apr 23, 2015). "Computer Simulations in Science". The Stanford Encyclopedia of Philosophy (Summer 2015). Ed. Edward N. Zalta. Metaphysics Research Lab, Stanford University. “As computer simulation methods have gained importance in more and more disciplines, the issue of their trustworthiness for generating new knowledge has grown, especially when simulations are expected to be counted as epistemic peers with experiments and traditional analytic theoretical methods. The relevant question is always whether or not the results of a particular computer simulation are accurate enough for their intended purpose.”
- Winsberg, Eric (2010). Science in the Age of Computer Simulation. University of Chicago Press. ISBN 0226902021. Retrieved 2018-01-20.
Winsberg explores the impact of simulation on such issues as the nature of scientific evidence; the role of values in science; the nature and role of fictions in science; and the relationship between simulation and experiment, theories and data, and theories at different levels of description.
- Kaufmann, William J.; Smarr, Larry L. (1993). Supercomputing and the Transformation of Science. Scientific American Library. 43. New York: Scientific American Library. ISBN 0-7167-5038-4.
The development of the supercomputer has given scientists an awesome new capability: the power to virtually re-create the physical world on the computer screen, with a stunning degree of precision and sophistication. Everything from weather systems to biochemical interactions to car crashes to air pollution to high speed subatomic particle collisions can now be simulated, and observed at the scientist's will. As a result, supercomputers have led to profound levels of insight and understanding. Indeed, they have revolutionized the very process of scientific discovery itself. Scientists no longer have to rely exclusively on either experiment-based or theoretical research methodologies, but rather can utilize both direct observation and mathematical modeling to visualize and simulate complex phenomena.
- Davis, Dwight (July 1992). "Big-Time Changes". Computer Graphics World 15 (7): 42-52. ISSN 0271-4159. "Massively parallel processing systems alter the face and the pace of supercomputing."
- Emmett, Arielle (July 1992). "Something for Everybody". Computer Graphics World 15 (7): 29-41. ISSN 0271-4159. "By incorporating visual programming languages and widget-based graphical toolkits into their latest products, visualization software vendors have begun to provide scientists with the means to create sophisticated representations of their data, minus the drudgery of extensive code-writing."
- Alder, Bernie J. (1988). Special Purpose Computers. Computational Techniques. 5. Academic Press. ISBN 0-12-049260-1.
This book describes computers designed and built to solve specific scientific problems, comparing these computers to general purpose computers in both speed and cost. Discussed as well are the effort involved, the amount of time consumed, and the difficulties encountered in completing such projects. The special purpose computers described include the hypercube, the QCD machine, Navier-Stokes hydrodynamic solvers, classical molecular dynamics machines, and Ising model computers.
- Karin, Sidney; Smith, Norris Parker (1987). The Supercomputer Era. ISBN 0-15-186787-9.
Supercomputers - far more powerful than the largest mainframes - comprise one of the fastest growing segments of the computer industry. Spurred by $200 million in federal funds, new supercomputing centers at American universities are making it possible for much larger numbers of researchers in science and industry to take full advantage of the enormous speed and power of supercomputers and at much lower cost.
- Hillis, W. Daniel (1985). The Connection Machine. MIT Press Artificial Intelligence Series. 17. Cambridge, Massachusetts: MIT Press. ISBN 0-262-08157-1.
Today's conventional computers are characterized by sequential, serial operations proceeding back and forth between a single powerful processor and a separate large-scale memory. The Connection Machine, by contrast, is a general, universal parallel computer, in which thousands or millions of independent, simple processors - each integrated with its own small but sufficient memory - operate concurrently on different aspects or segments of the same overall problem.
- Feigenbaum, Edward A.; McCorduck, Pamela (1983). The Fifth Generation: Artificial Intelligence and Japan's Computer Challenge to the World. ISBN 0-201-11519-0.
The Fifth Generation: a new breed of supercomputers so fast they can surpass today's machines a thousand times over, so smart that they can outthink humans. Science fiction? Not quite. Japan has proclaimed to the world that in ten years it intends to develop and market the Fifth Generation of computers - artificially intelligent machines that can reason, draw conclusions, make judgments, and even understand the written and spoken word. In a crash program, comparable to the U.S. space effort, Japan has gathered the best and brightest under a charismatic leader and backed the enterprise with significant resources.