Within the new institution, the “Bernstein Facility Simulation and Database Technology” will be established and integrated into the National Bernstein Network Computational Neuroscience, which is funded by the Federal Ministry of Education and Research (BMBF). By providing advisory support to the network’s scientists, the new facility will be firmly embedded in the field of computational neuroscience in Germany.
Computer simulations and theoretical models are increasingly important tools for understanding the brain’s complex processes. The Simulation Laboratory supports neuroscientists from all over Europe in making optimal use of the Jülich supercomputers. It will also spur the development of theoretical models and of standardisation in the field of brain research.
The most powerful computer in the world sits in our head. About 100 billion nerve cells interact in the brain. The rules by which cells and brain areas communicate with each other, and how these rules are altered by neurological diseases, are increasingly being investigated in simulations. But the more realistic the simulations are, the more computationally intensive they become. In addition, neuroscientific methods that produce very large amounts of data in a very short time are gaining in importance. Such high-throughput methods require new approaches to data processing. The researchers at Europe’s largest computing centre in Jülich will therefore also develop methods that enable the analysis of ever larger neuroscientific data sets.
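A rough back-of-envelope calculation makes the scale concrete (the synapse count per neuron and the per-synapse memory figure below are commonly quoted orders of magnitude, not numbers from the Jülich team):

    # Why full-scale brain simulation is hard: a back-of-envelope estimate.
    # All figures are assumed orders of magnitude, for illustration only.
    NEURONS = 1e11               # ~100 billion nerve cells, as cited above
    SYNAPSES_PER_NEURON = 1e4    # commonly quoted order of magnitude
    BYTES_PER_SYNAPSE = 10       # assumed per-synapse state; varies by simulator

    total_synapses = NEURONS * SYNAPSES_PER_NEURON        # ~1e15 connections
    memory_petabytes = total_synapses * BYTES_PER_SYNAPSE / 1e15

    print(f"synapses: {total_synapses:.0e}")              # 1e+15
    print(f"memory:   {memory_petabytes:.0f} PB")         # ~10 PB for synapse state alone

Even storing just a few bytes per synapse would exceed the memory of any single machine, which is why such simulations must be spread across a supercomputer.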
To fully exploit the performance of the Jülich supercomputers such as JUGENE, the simulations of brain processes must be adapted to the machines’ specific requirements and capabilities. “Today's supercomputers consist of hundreds of thousands of cores. To efficiently distribute a simulation across these processors, we need completely new data structures and communication algorithms than those we used for smaller systems,” explains Markus Diesmann, Professor for Computational Neuroscience at the Research Center Jülich.
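As a minimal sketch of what such a distribution can look like, the snippet below assigns neurons to compute processes in round-robin fashion, a scheme used by distributed simulators such as NEST, which Diesmann co-develops. It is an illustration, not the facility’s actual code:

    # Minimal sketch (not the Jülich code) of round-robin neuron distribution:
    # neuron i lives on MPI rank i % size.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # this process's index
    size = comm.Get_size()   # total number of processes/cores

    N_NEURONS = 1_000_000    # assumed network size for illustration

    # Each rank stores only its own slice of the network, so per-rank
    # memory shrinks as the machine grows instead of being replicated.
    local_neurons = [i for i in range(rank, N_NEURONS, size)]

    print(f"rank {rank}/{size} owns {len(local_neurons)} neurons")

The point of this layout is that a neuron’s home process can be computed from its index alone, without any global lookup table, and memory per process decreases as more cores are added.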
With the support of experts in computational neuroscience, data analysis, anatomy, virtual reality and supercomputing, neuroscientists can adapt and optimise their programs. Through improved standardisation of model descriptions, the researchers hope to achieve both better comparability and a simpler combination of different sub-models.
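What a standardised, simulator-independent model description might buy is easiest to see in a small sketch. The format and names below are invented for illustration (real efforts in this direction include description languages such as PyNN and NeuroML):

    # Hypothetical illustration of a standardised model description: sub-models
    # are declared as plain data, so they can be compared and combined
    # independently of any particular simulator. All names here are invented.
    from dataclasses import dataclass, field

    @dataclass
    class PopulationSpec:
        name: str
        size: int
        neuron_model: str            # e.g. "iaf_psc_alpha"
        params: dict = field(default_factory=dict)

    @dataclass
    class ConnectionSpec:
        source: str
        target: str
        rule: str                    # e.g. "fixed_indegree"
        weight: float

    def combine(*submodels):
        """Merge independently developed sub-models into one network description."""
        populations, connections = [], []
        for pops, conns in submodels:
            populations.extend(pops)
            connections.extend(conns)
        return populations, connections

    cortex = ([PopulationSpec("L4_exc", 1000, "iaf_psc_alpha")], [])
    thalamus = ([PopulationSpec("relay", 200, "iaf_psc_alpha")],
                [ConnectionSpec("relay", "L4_exc", "fixed_indegree", 0.5)])

    network = combine(cortex, thalamus)

Because each sub-model is plain data rather than simulator-specific code, models from different groups can be compared field by field and merged into a larger network, which is exactly the comparability and composability described above.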
By integrating the “Bernstein Facility Simulation and Database Technology” into the National Bernstein Network Computational Neuroscience, the facility is well connected to the German neuroscience research landscape from the outset. The Bernstein Network connects more than 200 research groups. Here, large amounts of relevant neurobiological data are collected, and complex models and simulations are used. The latter rely on the long-term availability and development of simulation software, and some of them can only be run on the Jülich supercomputers. The cooperation with the Bernstein Network is an excellent example of how long-term institutional funding from the Helmholtz Association and BMBF project funding can complement each other towards a common goal: understanding the brain.