Change is coming to the IT world, says Dave Robertson, coordinator of the EU-funded OpenKnowledge research project.
Just as individuals are now storing, editing and sharing photos and videos on the web, other users from small businesses to CERN’s Large Hadron Collider are moving their data, computation, and collaboration into “the cloud” – the internet’s worldwide network of servers and computers, but also the millions of handheld devices, monitors, sensors and other components linked to it.
“More and more companies are pushing much of what they do out into the cloud,” says Robertson. “If that’s the way things are going, and if it’s going to be very large, then society needs some way to be able to take control of how that gets coordinated.”
Creating a toolkit to access, coordinate and exploit the cloud’s dynamic resources is what OpenKnowledge set out to accomplish.
If the IT world embraces OpenKnowledge, says Robertson, users will no longer have to rely on a small number of big companies to access interactive internet services.
“You would see lots of people who weren’t so specialised writing and using and sharing lots of specific keys that would unlock what’s available on the internet for themselves and others,” says Robertson.
Roles, rules, results
Suppose, says Robertson, a number of potential partners want to create a new service or product. One manages a database, another can analyse the data, a third can package and present the results, and a fourth has marketing and management skills.
These potential partners could be anywhere in the world and could be using a wide variety of software, natural and computer languages, internet interfaces and devices.
The OpenKnowledge researchers – who, as members of an EU-funded project are themselves scattered from Scotland to Spain – set out to create a user-friendly system that would let these virtual partners find each other, define their respective roles, figure out the rules and sequences that will let them interact smoothly, and get their new enterprise up and running.
To accomplish that goal, the OpenKnowledge team created a new language for specifying the kind of processes that let different systems interact with each other. The language is called LCC, for Lightweight Coordination Calculus.
“We’ve gone for the simplest way to understand a process that we could possibly devise,” says Robertson.
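The core idea behind LCC is that an interaction is specified as a set of roles, each defined by the sequence of messages it sends and expects. The following is a hedged sketch of that role-based style in Python, not actual LCC syntax: the roles, message names and the buyer/seller scenario are illustrative assumptions, chosen only to show how two independently written roles can interlock step by step.

```python
from dataclasses import dataclass

@dataclass
class Message:
    performative: str   # the kind of speech act, e.g. "ask" or "tell"
    content: dict

def buyer(item, budget):
    """Buyer role: open with an ask, wait for a price, then buy or stop."""
    reply = yield Message("ask", {"item": item})
    if reply.content["price"] <= budget:
        yield Message("buy", {"item": item})
    # if the price is too high, the role simply ends

def seller(prices):
    """Seller role: wait for an ask, answer with the quoted price."""
    query = yield None                    # pause until a message arrives
    yield Message("tell", {"item": query.content["item"],
                           "price": prices[query.content["item"]]})

def run(buyer_role, seller_role):
    """Relay messages between the two roles and record the transcript."""
    transcript = []
    next(seller_role)                     # prime the seller: it listens first
    msg = next(buyer_role)                # buyer's opening ask
    transcript.append(msg)
    reply = seller_role.send(msg)         # seller answers with a price
    transcript.append(reply)
    try:
        transcript.append(buyer_role.send(reply))   # buyer buys, or stops
    except StopIteration:
        pass
    return [(m.performative, m.content) for m in transcript]
```

Run against a catalogue such as `{"widget": 80}`, a buyer with budget 100 completes all three steps (ask, tell, buy), while a buyer with budget 50 stops after the quote; the coordination logic lives entirely in how the two role definitions mesh, which is the property LCC makes explicit.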
The researchers also found a way to deal with the fact that the same step in a process is likely to be labelled in different ways by different components of the system.
For example, a handheld device might use an asterisk to signal that it is about to send a number while the database where that number is needed might expect to receive an input labelled “price.”
System engineers often approach this semantic problem by building what are called global ontologies – essentially dictionaries that specify the labels and properties of all the objects or events within a system.
In situations where such rules of interaction already exist, OpenKnowledge will find and use them.
Most of the time, however, that approach will not work because there is no way of knowing in advance what devices or systems will be interacting in a particular exchange.
In that case, OpenKnowledge uses statistical regularities to build a much smaller dictionary that defines only the steps that are needed for the purpose at hand.
“You know that you’re at some specific point,” says Robertson, “and you look to see what other people were doing at the same point. As the system gets used, you have a lot of interactions, possibly thousands or millions. That’s where your mapping comes from.”
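The mapping Robertson describes can be sketched as a simple co-occurrence count: for each point in the interaction, tally which label one component used against which label the other expected, and keep the pairing that recurs most often. This is a minimal illustration of that statistical idea, not OpenKnowledge's actual matching algorithm; the function name and log format are assumptions.

```python
from collections import Counter, defaultdict

def learn_mapping(logs):
    """Infer label correspondences from many logged interactions.

    logs: a list of interactions; each interaction is a list of steps,
    and each step pairs the label used by one component with the label
    the other component used at the same point, e.g. ("*", "price").
    """
    counts = defaultdict(Counter)
    for interaction in logs:
        for label_a, label_b in interaction:
            counts[label_a][label_b] += 1
    # map each label to the counterpart it co-occurred with most often
    return {a: c.most_common(1)[0][0] for a, c in counts.items()}

# the handheld's "*" lines up with the database's "price" in most logs,
# with a little noise from other interactions
logs = [[("*", "price"), ("#", "quantity")]] * 40 + [[("*", "total")]] * 3
mapping = learn_mapping(logs)
```

With enough logged interactions, the regularity ("*" meaning "price") outweighs the noise, which is exactly why the approach needs the thousands or millions of interactions Robertson mentions.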
But can I trust you?
Like anyone using the internet, OpenKnowledge clients are vulnerable. For example, a partner might provide poor-quality services, or not be who they claim to be.
The researchers believe they have solved that problem to some degree by building measures of reputation into their software package. One approach is to measure how often interactions with a potential partner have gone well. Another is to see how often they have interacted with other trustworthy partners.
“We do exactly the same things that are used to rate web pages, but with these more complicated forms of information,” says Robertson.
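The two signals described above can be combined in a simple score: a partner's own track record, blended with the average record of the partners it has dealt with. This is an illustrative sketch under those assumptions, not OpenKnowledge's actual reputation algorithm; the function names, data layout and blending weight are invented for the example.

```python
def direct_score(history):
    """Fraction of a partner's past interactions that went well.

    history is a list of outcomes: 1 for a good interaction, 0 for a bad
    one. An unknown partner gets a neutral 0.5.
    """
    return sum(history) / len(history) if history else 0.5

def reputation(partner, histories, interactions, weight=0.7):
    """Blend direct experience with the company a partner keeps."""
    own = direct_score(histories.get(partner, []))
    peers = interactions.get(partner, [])
    if not peers:
        return own
    peer_avg = sum(direct_score(histories.get(p, [])) for p in peers) / len(peers)
    return weight * own + (1 - weight) * peer_avg

# alice's own record is 3 successes out of 4; she has dealt with bob
# (mixed record) and carol (clean record)
histories = {"alice": [1, 1, 1, 0], "bob": [1, 0], "carol": [1, 1]}
interactions = {"alice": ["bob", "carol"]}
```

The second term is what gives the scheme its web-page-ranking flavour: trust flows indirectly through the partners a participant chooses, so consistently dealing with trustworthy parties raises a score even before many direct interactions have accumulated.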
All of the key OpenKnowledge functions – discovering and interpreting interactions, ontology matching, and reputation checking – reside in the OpenKnowledge kernel, an open-source software package that can be downloaded from the project’s website.
Robertson and his colleagues have tested OpenKnowledge in three real-world areas: healthcare coordination, proteomics research, and emergency response. These applications will be featured in a subsequent ICT Results feature on 29 December.
In the meantime, they are eager for others to use OpenKnowledge to unlock the cloud’s capabilities and choreograph their own ideas.
“It will only become revolutionary,” Robertson writes, “if clever people invent interactions that are really useful for lots of other people.”
The OpenKnowledge project received funding from the ICT strand of the Sixth Framework Programme for research.
This is the first of a two-part feature on OpenKnowledge.
Christian Nielsen | alfa