Millennium Run, Simulating the Universe
By: Developer Shed
2005-07-19

Table of Contents:
• Millennium Run, Simulating the Universe
• Using the Results
• The Earth Simulator
• The Blue Brain


A supercomputer research project known as the “Millennium Run” aims to simulate the evolution of the universe, from the Big Bang to the present. To test and refine theories about the universe’s formation, the simulation follows billions of mass points, tracking the movement of dark matter; the results can then be compared to the current state of the universe. Read on to find out more.

The Virgo Consortium is responsible for the Millennium Run, which is performed on one of the fastest supercomputers in the world. Located at the Max Planck Society in Garching, Germany, the machine is ranked #85 on the 2004 Top 500 list (http://www.top500.org). Its 822 processors are reported to reach 2.198 TFLOPS (trillion floating point operations per second), which is still impressive even though the world’s top machine reaches 70.72 TFLOPS. Five years ago, the fastest supercomputer ran at a mere 0.170 TFLOPS.
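A quick calculation from those figures gives a sense of how fast supercomputing was moving at the time. This is a simple illustration in Python; the variable names are illustrative, only the TFLOPS numbers come from the article.

    # Peak performance figures cited above, in TFLOPS.
    millennium_machine   = 2.198    # Max Planck machine, #85 on the 2004 Top 500
    fastest_2004         = 70.72    # the world's top machine
    fastest_5yrs_earlier = 0.170    # the fastest machine five years before

    print(f"top machine: ~{fastest_2004 / fastest_5yrs_earlier:.0f}x faster in five years")
    print(f"Millennium machine vs. that older champion: ~{millennium_machine / fastest_5yrs_earlier:.0f}x")

In other words, the world’s best machine improved by a factor of roughly 400 in five years, and the Millennium Run hardware alone was about 13 times faster than the former world champion.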

Researchers are using data about the radiation that spread after the Big Bang, collected by heat-detecting satellites. The simulation also uses the laws of physics established here on Earth and our current understanding of the makeup of the universe. This information is turned into equations and algorithms and used to track some of the largest masses since the universe began. The underlying principle of the experiment is that every factor in the expansion of the universe can be quantified and understood as numbers, and that there must be an equation for everything with a noticeable impact on the universe’s evolution.
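The central piece of physics being turned into an algorithm here is Newton’s law of universal gravitation. As a minimal sketch (the constant is the standard measured value; the example masses and separation are illustrative, not from the article):

    # Newton's law of universal gravitation: the attractive force between
    # masses m1 and m2 (in kg) separated by a distance r (in meters).
    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

    def gravitational_force(m1, m2, r):
        return G * m1 * m2 / r ** 2

    # For example, two billion-solar-mass points (like those in the simulation)
    # placed an illustrative one megaparsec apart:
    M_SUN = 1.989e30   # mass of the sun, kg
    MPC = 3.086e22     # one megaparsec, meters
    print(gravitational_force(1e9 * M_SUN, 1e9 * M_SUN, MPC), "newtons")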

The supercomputer is tracking 10,000,000,000 points of mass, the largest such simulation to date, which still accounts for only about 0.003% of the total mass of the universe that we know about. Each mass point is roughly a billion times the mass of our sun, and the points do not correspond to anything physical; a mass point is not the location of a star or black hole. The mass points exist purely for simulation purposes, adding proportionate gravity to the regions of the universe where stellar objects reside.
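Those figures support a quick back-of-envelope check; the short calculation below is illustrative and uses only the numbers from the paragraph above.

    # Scale check using the article's own figures.
    n_points = 10_000_000_000            # mass points tracked
    point_mass = 1e9                     # solar masses per point
    print(f"simulated mass: ~{n_points * point_mass:.0e} solar masses")   # ~1e19

    # If that is ~0.003% of the universe's known mass, a full simulation at
    # the same resolution would need about 1 / 0.00003 times as many points:
    print(f"points needed for full coverage: ~{n_points / 3e-5:.0e}")     # ~3e14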

For each step of the simulation, the program must calculate the gravitational pull of each object on every other object; every single mass point is moved by every one of the billions of other points. With ten billion points, that is on the order of 10^20 pairwise interactions per step, and doing those calculations across billions of simulated years would take 60 thousand years on the current hardware. Because this is a little too long to be helpful to researchers, they developed a scheme that divides the simulated universe into smaller sections. The mass points within each section are summed, and their combined mass is then used in the calculations: instead of measuring the relationship between every single pair of mass points, each point is moved by the combined mass of whole sections of the universe. After the revised programming, it took the supercomputer roughly a month of constant work to produce the first set of results.
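To make that trade-off concrete, here is a minimal sketch in Python (using NumPy) of the two approaches: the exact pairwise sum, and the approximation that replaces whole sections of the universe with their combined mass. This is a toy illustration under stated assumptions (a unit gravitational constant, a fixed 8×8×8 grid of sections, and invented function names), not the Virgo Consortium’s actual code, which used a far more refined tree-based solver.

    import numpy as np

    G = 1.0  # gravitational constant in simulation units (an assumption for this toy)

    def direct_accelerations(pos, mass, eps=1e-2):
        # Exact O(N^2) sum: every point is pulled by every other point.
        # This is the cost the article says would take ~60,000 years at N = 10^10.
        acc = np.zeros_like(pos)
        for i in range(len(pos)):
            d = pos - pos[i]                      # vectors to all other points
            r2 = (d * d).sum(axis=1) + eps * eps  # softened squared distances
            r2[i] = np.inf                        # exclude self-interaction
            acc[i] = G * (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
        return acc

    def grouped_accelerations(pos, mass, box=1.0, cells=8, eps=1e-2):
        # Approximation in the spirit the article describes: bin the points
        # into sections, replace each section by its combined mass placed at
        # its center of mass, then pull every point with sections, not points.
        idx = np.clip((pos / box * cells).astype(int), 0, cells - 1)
        flat = (idx[:, 0] * cells + idx[:, 1]) * cells + idx[:, 2]
        ncell = cells ** 3
        m_cell = np.bincount(flat, weights=mass, minlength=ncell)
        com = np.stack([np.bincount(flat, weights=mass * pos[:, k], minlength=ncell)
                        for k in range(3)], axis=1)
        filled = m_cell > 0
        com[filled] /= m_cell[filled, None]
        cm, cmass = com[filled], m_cell[filled]
        acc = np.zeros_like(pos)
        for i in range(len(pos)):
            d = cm - pos[i]
            r2 = (d * d).sum(axis=1) + eps * eps
            acc[i] = G * (cmass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
        return acc

    # Example: 1,000 random points in a unit box -- tiny next to the real run's
    # ten billion, but enough to compare the exact and grouped answers.
    rng = np.random.default_rng(0)
    pos = rng.random((1000, 3))
    mass = np.ones(1000)
    a_exact = direct_accelerations(pos, mass)
    a_fast = grouped_accelerations(pos, mass)

With N points and C occupied sections, each step drops from roughly N × N interactions to N × C. The price is that a point’s own neighborhood is also treated as a single lump; production codes refine this by summing nearby points directly and grouping only the distant ones.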
