Cloud computing is a hot topic in the technology world these days. Even if you're not a technophile, chances are that if you've watched much television or skimmed a business magazine, you've heard someone talking about cloud computing as the way of the future.
While it's difficult to predict the future, a cloud computing infrastructure project developed at Argonne National Lab, called Nimbus, is demonstrating that cloud computing's potential is being realized now.
So what exactly is cloud computing? Definitions vary, but cloud computing is essentially a form of distributed computing that lets users tap into a vast network of computing resources through the Internet to complete their work. If, for example, someone wanted to analyze traffic patterns on the nation's highways, they could upload their data to the 'cloud' and have multiple computers crunch the numbers, then receive the results back in a unified form, as if the work had been completed by one giant machine.
Why the word 'cloud'? Some sources believe the term originated in 20th-century telephone systems. Kate Keahey, the lead on the Nimbus project at Argonne, believes the phrase was coined when researchers trying to visualize this type of computing on a whiteboard drew a circular set of squiggles to represent the many components on the internet that would do the computational work. Since these drawings looked like clouds, Keahey says, researchers soon started saying that data would go 'up to the cloud' for processing.
If all of this sounds familiar, you may have heard this concept before, according to Keahey. Previous decades brought us something called grid computing, which was another type of distributed computing that allowed users to tap into computing resources through a network to get their computational jobs done. But Keahey argues that cloud computing is an evolution of grid computing, with some important differences. With grid computing, you submit what you want computed to a batch scheduler, which puts your job in a queue for a specific set of computing resources, for example a supercomputer, to work on.
"This means you have no control over when your job might execute," Keahey says. You may have to wait as long as a few days before your job is called up, and you're pretty much at the mercy of how that particular grid asset is set up. If its configuration doesn't quite match the complexities of your job, fixing the problem may get very complicated.
Cloud computing, on the other hand, can greatly mitigate this one-size-fits-all approach to distributed computing. Many cloud computing platforms let users see ahead of time how much computing capacity is available from the cloud, so the work can be done faster. Users can also configure a 'virtual machine' within the cloud to match the particulars of the jobs they are trying to accomplish. Once users have configured the type of virtual machine they need for their work, they can go to different cloud computing providers and recreate that system, making computation power a commodity.
Nimbus is an example of such an adaptable system. Keahey and her team developed this open source cloud computing infrastructure so that scientists working on data-intensive research projects can use such virtual machines with a cloud provider. Nimbus also lets users create multiple virtual machines for specific computational jobs that can be deployed throughout the cloud and still work in tandem with each other. This flexibility allows a user to configure a virtual machine and then connect it to resources on a cloud, regardless of who is providing that cloud.
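As a toy sketch of this "configure once, deploy anywhere" idea (this is not Nimbus's actual API; the provider names and request fields are invented for illustration), a single machine description can be rendered into the request formats of different hypothetical cloud providers:

```python
from dataclasses import dataclass

@dataclass
class VMSpec:
    """A provider-independent virtual machine description (illustrative)."""
    image: str      # disk image carrying the user's software stack
    cpus: int       # virtual CPU count
    memory_mb: int  # RAM in megabytes
    count: int      # how many identical VMs to launch in tandem

def to_request(spec: VMSpec, provider: str) -> dict:
    """Translate one spec into a hypothetical provider's request format."""
    if provider == "provider-a":
        return {"ami": spec.image, "vcpus": spec.cpus,
                "ram": spec.memory_mb, "instances": spec.count}
    if provider == "provider-b":
        return {"image_ref": spec.image, "cores": spec.cpus,
                "mem": spec.memory_mb, "replicas": spec.count}
    raise ValueError(f"unknown provider: {provider}")

# One spec, recreated at two different clouds -- computation as a commodity.
cluster = VMSpec(image="analysis.img", cpus=2, memory_mb=4096, count=8)
req_a = to_request(cluster, "provider-a")
req_b = to_request(cluster, "provider-b")
print(req_a["instances"], req_b["replicas"])  # both 8: same cluster, two clouds
```

The point of the sketch is the separation of concerns: the scientist maintains one `VMSpec`; only the thin translation layer knows about any particular provider.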
Having this kind of flexibility and on-demand computing power is vital to projects that are extremely data-intensive, such as research efforts in experimental and theoretical physics. Nimbus has already been deployed successfully to support the STAR nuclear physics experiment at Brookhaven National Laboratory's Relativistic Heavy-Ion Collider. When researchers there needed to turn the massive amounts of data they had generated into viable simulations for an international conference, they used Nimbus to create virtual machines that were run through commercial cloud computing providers.
Creating the virtual machines was relatively easy. "With Nimbus, a virtual cluster can be online in minutes," Keahey says, and the computing cloud they tapped into provided the computational power they needed to get the simulations done on time. Keahey and her team are now collaborating with CERN in Europe to process the data generated by physics experiments being done there.
Keahey and others in the field believe that these custom-crafted virtual machines, which are relatively easy to configure on computing clouds, will handle more and more of the heavy computational lifting in the future.
Dana W. Cruikshank | EurekAlert!