To put the memory sizes and speeds of present-day supercomputers into perspective, it helps to compare them with the computers most readers know from their own desks.
The experiments that we discussed previously represent the type of numerical experiments that are required to learn about the mechanisms that may heat the solar corona.
How long does it take to run such an experiment? The answer is not simple, because it depends on several parameters, not least the computer it runs on:
| Computer                     | Execution time       |
|------------------------------|----------------------|
| PC, Pentium, 100 MHz         | ??                   |
| PC, Pentium II, 233 MHz      | 2800 hours (115 days)|
| PC, Pentium II, 450 MHz      | 1400 hours (58 days) |
| SGI workstation, 1 processor | 560 hours (23 days)  |
| Cray T3D, 64 processors      | 61 hours             |
| Cray T3D, 128 processors     | 34 hours             |
| Cray T3E, 128 processors     | 9 hours              |
The experiment used for this scaling contains relatively few grid points. Even so, the execution time on the single-processor machines is much too long to be of any practical use. We therefore need access to large multi-CPU computers to complete such experiments within a reasonable time scale - a few days.
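Note from the table that doubling the processor count does not halve the execution time. A minimal sketch (in Python, purely illustrative) of how to read the parallel efficiency off the Cray T3D entries:

```python
# Parallel speedup and efficiency computed from the timings in the
# table above (Cray T3D, 64 vs. 128 processors).

def speedup_and_efficiency(t_base, p_base, t_new, p_new):
    """Speedup relative to the base run, and the parallel efficiency."""
    speedup = t_base / t_new
    efficiency = speedup / (p_new / p_base)
    return speedup, efficiency

# Cray T3D: 61 hours on 64 processors, 34 hours on 128 processors.
s, e = speedup_and_efficiency(61.0, 64, 34.0, 128)
print(f"speedup: {s:.2f}x, parallel efficiency: {e:.0%}")
# -> speedup: 1.79x, parallel efficiency: 90%
```

The efficiency is below 100% because communication between processors eats into the gain from adding more of them.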
The amount of data that one such run produces depends on how often the solution is written to disk. For an experiment of this size - 72^3 grid points - each snapshot takes up 12 megabytes. Following the dynamical development with a reasonable time resolution requires about 400 of these, or 4.8 gigabytes of data from a single experiment. For comparison, new PCs typically have hard disks of 4-6 gigabytes.
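The 12 megabytes per snapshot is consistent with, for example, eight field variables stored in single precision. A minimal sketch of the arithmetic (the variable count and precision are plausible assumptions, not taken from the actual code):

```python
# Back-of-the-envelope estimate of snapshot and total data sizes for
# a 72^3 experiment. Eight field variables in 4-byte floats is an
# assumption that reproduces the quoted 12 MB per snapshot.

n = 72               # grid points per dimension
variables = 8        # e.g. density, energy, velocity and magnetic
                     # field components (assumed)
bytes_per_value = 4  # single-precision floats (assumed)
snapshots = 400      # snapshots for reasonable time resolution

snapshot_mb = n**3 * variables * bytes_per_value / 1e6
total_gb = snapshots * snapshot_mb / 1e3
print(f"one snapshot: {snapshot_mb:.0f} MB, full run: {total_gb:.1f} GB")
# -> one snapshot: 12 MB, full run: 4.8 GB
```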
When the grid is doubled from 72^3 to 144^3, the data size increases by a factor of 8, but the time it takes the experiment to reach the same physical time increases by a factor of 16! The reason is that each time step now costs 8 times as much work, and the time step itself must be halved to keep the numerical scheme stable. Therefore, even with today's computers it is difficult to perform very large experiments within a reasonable time scale.
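A minimal sketch of this scaling argument, assuming an explicit time-stepping scheme so that the Courant condition halves the time step when the grid spacing is halved:

```python
# Why doubling the resolution multiplies the run time by 16 rather
# than 8: the grid grows as 2^3, and the Courant condition halves the
# allowed time step (a standard argument for explicit schemes, assumed
# to apply here).

refine = 2                        # doubling the grid in each dimension
data_factor = refine**3           # 8x more grid points -> 8x more data
time_factor = refine**3 * refine  # 8x work per step, 2x more steps

print(f"data grows by {data_factor}x, run time by {time_factor}x")
# -> data grows by 8x, run time by 16x
```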
If we struggle today to perform large-scale numerical experiments, how did researchers manage in the past?