First author Brendan Cottrell in the field. Image Credit: DFO (Fisheries and Oceans Canada)
Information Technology

Innovative Researcher Uses Smartphone for Sea Creature Reports

Q&A with Brendan Cottrell, who investigated the use of smartphones to create 3D scans of stranded marine life that can help scientists protect marine species. What inspired you to become a researcher? My interest in research began with an early love for nature, particularly the ocean and its wildlife. Drawn to conservation, I am fascinated by how technology can help study and protect marine mammals. Can you tell us about the research you’re currently working on? This research focuses on…

The overall architecture of the MLOB framework. Image Credit: Zhiming Dong et al.
Information Technology

Machine Learning on Blockchain: Enhancing Computational Security

A new study published in Engineering presents a novel framework that combines machine learning (ML) and blockchain technology (BT) to enhance computational security in engineering. The framework, named Machine Learning on Blockchain (MLOB), aims to address the limitations of existing ML-BT integration solutions that primarily focus on data security while overlooking computational security. ML has been widely used in engineering to solve complex problems, offering high accuracy and efficiency. However, it faces security threats such as data tampering and logic corruption….
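
The excerpt does not detail how MLOB couples the two technologies, but the general idea of protecting computational integrity with a ledger can be sketched: record tamper-evident digests of both the model parameters and the inference logic, then verify them before any computation is trusted. The ToyLedger class, the digests, and the toy model below are illustrative assumptions, not the MLOB implementation described in the paper.

```python
import hashlib
import json

def digest(obj) -> str:
    """Deterministic SHA-256 digest of a JSON-serializable object."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

class ToyLedger:
    """Append-only hash chain standing in for a blockchain."""
    def __init__(self):
        self.blocks = []

    def append(self, record: dict) -> str:
        prev = self.blocks[-1]["hash"] if self.blocks else "0" * 64
        block = {"prev": prev, "record": record}
        block["hash"] = digest(block)
        self.blocks.append(block)
        return block["hash"]

# Register the model weights and the inference logic once.
weights = {"w": [0.4, -1.2], "b": 0.1}
inference_src = "score = w[0]*x[0] + w[1]*x[1] + b"
ledger = ToyLedger()
ledger.append({"type": "model", "digest": digest(weights)})
ledger.append({"type": "logic", "digest": digest(inference_src)})

def verify(weights_now, logic_now) -> bool:
    """Recompute digests and compare them with the on-chain records."""
    expected = {r["record"]["type"]: r["record"]["digest"] for r in ledger.blocks}
    return (digest(weights_now) == expected["model"]
            and digest(logic_now) == expected["logic"])

print(verify(weights, inference_src))                        # True: nothing tampered
print(verify({"w": [9.9, -1.2], "b": 0.1}, inference_src))   # False: weights altered
```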

Engineer developing innovative artificial intelligence solutions. Image Credit: DC_Studio, Envato
Information Technology

Better Poverty Mapping: New Machine-Learning Approach Enhances Aid

Leveraging national surveys, big data, and machine learning, Cornell University researchers have developed a new approach to mapping poverty that could help policymakers and NGOs better identify the neediest populations in poor countries and allocate resources more effectively. To eliminate extreme poverty, defined as surviving on less than $2.15 per person per day, governments and development and humanitarian agencies need to know how many people live under that threshold, and where. Yet that information is often lacking in the countries that…
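
As a rough illustration of the headcount arithmetic behind such maps, the sketch below predicts per-person daily consumption from a few hypothetical features and counts households under the $2.15 line. The features, coefficients, and data are invented and do not represent the Cornell team's model.

```python
# Toy poverty-rate estimate from predicted per-person daily consumption.
POVERTY_LINE = 2.15  # USD per person per day (extreme poverty threshold)

# Hypothetical survey/remote-sensing features per household:
# (nightlight intensity, phone-use index, distance to road in km)
households = [
    (0.2, 0.1, 18.0),
    (0.9, 0.7, 3.0),
    (0.4, 0.3, 9.5),
    (1.3, 0.9, 1.2),
]

def predicted_consumption(nightlight, phone_use, road_km):
    """Illustrative linear predictor of daily per-person consumption (USD)."""
    return 1.0 + 2.0 * nightlight + 1.5 * phone_use - 0.05 * road_km

below = [h for h in households if predicted_consumption(*h) < POVERTY_LINE]
print(f"Estimated headcount ratio: {len(below) / len(households):.0%}")
```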

This illustrates the principle of two oscillators giving in-phase and out-of-phase oscillation modes. Image Credit: Victor H. González
Information Technology

New Low-Cost Computer Breakthrough Enhances Accessibility

A low-energy challenger to the quantum computer that also works at room temperature may be the result of research at the University of Gothenburg. The researchers have shown that information can be transmitted using magnetic wave motion in complex networks. Spintronics explores magnetic phenomena in nano-thin layers of magnetic materials that are exposed to magnetic fields, electric currents and voltages. These external stimuli can also create spin waves, ripples in a material’s magnetisation that travel with a specific phase and…
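
The image caption above refers to two oscillators with in-phase and out-of-phase modes. A minimal numerical sketch of that textbook effect, using two coupled harmonic oscillators with illustrative parameters rather than the spintronic devices in the study, looks like this:

```python
import numpy as np

# Two identical oscillators (natural frequency w0) with coupling k.
# The normal modes are the in-phase mode (frequency w0) and the
# out-of-phase mode (frequency sqrt(w0**2 + 2*k)); values are illustrative.
w0, k = 1.0, 0.2

# Equations of motion in matrix form: x'' = -M x
M = np.array([[w0**2 + k, -k],
              [-k, w0**2 + k]])

eigvals, eigvecs = np.linalg.eigh(M)
for lam, vec in zip(eigvals, eigvecs.T):
    mode = "in-phase" if np.sign(vec[0]) == np.sign(vec[1]) else "out-of-phase"
    print(f"{mode:12s} frequency = {np.sqrt(lam):.3f}")
```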

Like the teeth of a comb, a microcomb consists of a spectrum of evenly distributed light frequencies. Optical atomic clocks can be built by locking a microcomb tooth to an ultranarrow-linewidth laser, which in turn locks to an atomic transition with extremely high frequency stability. That way, frequency combs act like a bridge between the atomic transition at an optical frequency and the clock signal at a radio frequency that is electronically detectable for counting the oscillations – enabling extraordinary precision. The researchers’ photonic chip, on the right-hand side of the image, contains 40 microcomb generators and is only five millimeters wide. Image Credit: Chalmers University of Technology | Kaiyi Wu
Information Technology

Microcomb Chips Enhance GPS Accuracy by 1000 Times

Optical atomic clocks can increase the precision of time and geographic position a thousandfold in our mobile phones, computers, and GPS systems. However, they are currently too large and complex to be widely used in society. Now, a research team from Purdue University, USA, and Chalmers University of Technology, Sweden, has developed a technology that, with the help of on-chip microcombs, could make ultra-precise optical atomic clock systems significantly smaller and more accessible – with significant benefits for navigation, autonomous…
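
The bridging role of the comb described in the caption follows the standard comb relation f_n = f_ceo + n * f_rep: locking one tooth to an optical transition ties the electronically countable repetition rate to the optical reference. The numbers below are illustrative, not values from the Purdue–Chalmers work.

```python
# Frequency-comb "bridge" relation: each comb tooth sits at
#   f_n = f_ceo + n * f_rep
# so locking tooth n to an optical atomic transition pins the electronically
# countable repetition rate f_rep to the optical reference.
# All numbers below are illustrative, not from the study.
f_optical = 429.228e12   # Hz, e.g. an optical clock transition (~429 THz)
f_ceo = 0.0              # Hz, assume the carrier-envelope offset is stabilized to zero
f_rep = 20e9             # Hz, microcomb line spacing (tens of GHz)

n = round((f_optical - f_ceo) / f_rep)     # index of the locked tooth
f_rep_locked = (f_optical - f_ceo) / n     # exact spacing once locked
print(f"tooth index n = {n}")
print(f"locked repetition rate = {f_rep_locked / 1e9:.6f} GHz")
```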

Ruishan Liu, WiSE Gabilan Assistant Professor of Computer Science, USC. Image Credit: Alexis Situ
Information Technology

AI Unlocks Genetic Insights for Personalized Cancer Care

New study uncovers how specific genetic mutations influence cancer treatment outcomes. A groundbreaking study led by USC Assistant Professor of Computer Science Ruishan Liu has uncovered how specific genetic mutations influence cancer treatment outcomes—insights that could help doctors tailor treatments more effectively. The largest study of its kind, the research analyzed data from more than 78,000 cancer patients across 20 cancer types. Patients received immunotherapies, chemotherapies, and targeted therapies. Using advanced computational analysis, the researchers identified nearly 800 genetic changes that directly…
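
The excerpt does not describe the statistical machinery, but the basic question, whether outcomes differ between patients with and without a given mutation, can be illustrated with a simple two-proportion comparison. The counts are synthetic and the test shown is generic, not the study's analysis pipeline.

```python
# Illustrative association check between a mutation and treatment response.
# The counts are invented; this is not the study's method.
from math import sqrt

def response_rate_gap(resp_mut, n_mut, resp_wt, n_wt):
    """Difference in response rates and a rough z-statistic for it."""
    p1, p2 = resp_mut / n_mut, resp_wt / n_wt
    p = (resp_mut + resp_wt) / (n_mut + n_wt)            # pooled response rate
    se = sqrt(p * (1 - p) * (1 / n_mut + 1 / n_wt))      # pooled standard error
    return p1 - p2, (p1 - p2) / se

gap, z = response_rate_gap(resp_mut=60, n_mut=200, resp_wt=300, n_wt=1800)
print(f"response-rate difference = {gap:.1%}, z = {z:.2f}")
```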

Information Technology

D2-GCN: Dynamic Disentanglement for Node Classification

Classic Graph Convolutional Networks (GCNs) often learn node representations holistically, ignoring the distinct impacts of different neighbors when aggregating their features to update a node’s representation. Disentangled GCNs have been proposed to divide each node’s representation into several feature channels. However, current disentangling methods do not attempt to determine how many inherent factors the model should assign to extract the best representation of each node. To solve these problems, a research team led by Chuliang WENG published…
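
To make the contrast concrete, here is a minimal sketch of channel-wise (disentangled) neighbor aggregation: each node's representation is split into channels, and neighbors are weighted per channel by their affinity to it. The routing rule, dimensions, and graph are illustrative assumptions and do not reproduce D2-GCN itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, n_channels = 6, 8, 2
chan_dim = dim // n_channels
X = rng.normal(size=(n_nodes, dim))          # node feature matrix
neighbors = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0, 4], 4: [3, 5], 5: [4]}

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def disentangled_update(v):
    """Split node v's features into channels and weight each neighbor
    per channel by its affinity to that channel (illustrative routing)."""
    chunks = X[v].reshape(n_channels, chan_dim)
    out = []
    for c in range(n_channels):
        nb = np.stack([X[u].reshape(n_channels, chan_dim)[c] for u in neighbors[v]])
        affinity = softmax(nb @ chunks[c])    # which neighbors matter for channel c
        out.append(chunks[c] + affinity @ nb) # channel-wise aggregation
    return np.concatenate(out)

print(disentangled_update(0).shape)           # (8,): updated representation of node 0
```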

Automotive Engineering

TU Graz AI System Boosts E-Mobility Powertrain Development

The new method optimises the technical design with regard to classic objectives such as costs, efficiency and package space requirements, and also takes greenhouse gas emissions along the entire supply chain into account. The development of vehicle components is a lengthy and therefore very costly process. Researchers at Graz University of Technology (TU Graz) have developed a method that can shorten the development phase of the powertrain of battery electric vehicles by several months. A team led by Martin Hofstetter…
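
The excerpt lists the competing objectives (cost, efficiency, package space, supply-chain emissions). A toy weighted comparison of candidate designs, with invented attributes and weights rather than the TU Graz tool's actual formulation, shows what trading these off can look like:

```python
# Toy multi-objective comparison of candidate powertrain designs.
# Candidates, attribute values, and weights are invented for illustration.
candidates = {
    #                cost(EUR)  loss(%)   volume(L)  GHG(t CO2e)
    "design_A": dict(cost=4200, loss=9.0,  volume=85, ghg=3.1),
    "design_B": dict(cost=4800, loss=7.5,  volume=78, ghg=2.6),
    "design_C": dict(cost=3900, loss=10.5, volume=92, ghg=3.4),
}
weights = dict(cost=0.35, loss=0.25, volume=0.15, ghg=0.25)  # all minimised

def normalised_score(design):
    """Weighted sum of min-max normalised objectives (lower is better)."""
    score = 0.0
    for key, w in weights.items():
        vals = [c[key] for c in candidates.values()]
        lo, hi = min(vals), max(vals)
        score += w * (design[key] - lo) / (hi - lo)
    return score

best = min(candidates, key=lambda name: normalised_score(candidates[name]))
print({name: round(normalised_score(c), 3) for name, c in candidates.items()})
print("preferred design:", best)
```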

Parsimonious models may be the norm in science, but complex models can be more flexible and accurate.
Information Technology

Exploring Ockham’s Razor: Simplifying Complex Innovations

Medieval friar William of Ockham posited a famous idea: always pick the simplest explanation. Often referred to as the parsimony principle, “Ockham’s razor” has shaped scientific decisions for centuries. But lately, incredibly complex AI models have begun outperforming their simpler counterparts. Consider AlphaFold for predicting protein structures, or ChatGPT and its competitors for generating humanlike text. A new paper in PNAS argues that by relying too much on parsimony in modeling, scientists make mistakes and miss opportunities. First author and…

Information Technology

AI Tool Analyzes Speech Patterns to Identify Depression

Evaluation of an AI-based voice biomarker tool to detect signals consistent with moderate to severe depression. Background and Goal: Depression impacts an estimated 18 million Americans each year, yet depression screening rarely occurs in the outpatient setting. This study evaluated an AI-based machine learning biomarker tool that uses speech patterns to detect moderate to severe depression, aiming to improve access to screening in primary care settings. Study Approach: The study analyzed over 14,000 voice samples from U.S. and Canadian adults….
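
The study's tool is not described in detail here, so the sketch below only illustrates the generic shape of a speech-screening pipeline: frame the waveform, extract a few prosodic features, and apply a simple scoring rule. The features, weights, and threshold are assumptions for illustration, not the evaluated biomarker tool.

```python
import numpy as np

# Generic sketch of a speech-based screening pipeline on a synthetic waveform.
sr = 16000
t = np.arange(0, 3.0, 1 / sr)
# Toy "speech" signal: a 140 Hz tone gated on and off to mimic pauses.
wave = 0.1 * np.sin(2 * np.pi * 140 * t) * (np.sin(2 * np.pi * 0.8 * t) > 0)

frame = sr // 50                                       # 20 ms frames
frames = wave[: len(wave) // frame * frame].reshape(-1, frame)
energy = (frames ** 2).mean(axis=1)

features = {
    "mean_energy": energy.mean(),
    "pause_ratio": (energy < 1e-4).mean(),             # fraction of near-silent frames
    "energy_variability": energy.std(),
}

# Illustrative linear scoring rule; not the study's model or threshold.
weights = {"mean_energy": -2.0, "pause_ratio": 3.0, "energy_variability": -1.0}
score = sum(weights[k] * v for k, v in features.items())
print(features)
print("flag for follow-up screening:", score > 1.0)
```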

Information Technology

Humans vs Machines—Who’s Better at Recognizing Speech?

Are humans or machines better at recognizing speech? A new study shows that in noisy conditions, current automatic speech recognition (ASR) systems achieve remarkable accuracy and sometimes even surpass human performance. However, the systems need to be trained on an incredible amount of data, while humans acquire comparable skills in less time. Automatic speech recognition (ASR) has made incredible advances in the past few years, especially for widely spoken languages such as English. Prior to 2020, it was typically assumed…
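
Comparisons of this kind are usually reported in terms of word error rate (WER): the word-level edit distance between a reference transcript and the system's output, divided by the reference length. Below is a standalone implementation of that standard metric, not code from the study.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Standard WER: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("turn the volume down please", "turn volume down sneeze"))  # 0.4
```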

Information Technology

Not Lost in Translation: AI Increases Sign Language Recognition Accuracy

Additional data can help differentiate subtle gestures, hand positions, and facial expressions. The Complexity of Sign Languages: Sign languages have been developed by nations around the world to fit the local communication style, and each language consists of thousands of signs. This has made sign languages difficult to learn and understand. Using artificial intelligence to automatically translate the signs into words, known as word-level sign language recognition, has now gained a boost in accuracy through the work of an Osaka Metropolitan…
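
As a hedged illustration of why extra modalities help, the sketch below fuses hypothetical body-pose, hand, and facial feature vectors and classifies a sign by nearest template. The dimensions, vocabulary, and classifier are invented and are not the Osaka Metropolitan model.

```python
import numpy as np

rng = np.random.default_rng(1)

def extract_features():
    """Stand-in feature extractor: one random vector per modality."""
    body_pose = rng.normal(size=32)    # coarse upper-body keypoints
    hand_shape = rng.normal(size=64)   # detailed hand position/shape cues
    face = rng.normal(size=16)         # facial-expression descriptors
    return np.concatenate([body_pose, hand_shape, face])

# Stand-in class templates for a tiny sign vocabulary.
sign_vocab = ["thank_you", "help", "where"]
centroids = {sign: extract_features() for sign in sign_vocab}

def classify(features):
    """Assign the sign whose template is closest in the fused feature space."""
    return min(centroids, key=lambda s: np.linalg.norm(features - centroids[s]))

# A noisy observation of "help" should still land on the right template.
query = centroids["help"] + 0.05 * rng.normal(size=centroids["help"].size)
print(classify(query))   # expected: "help"
```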

Illustration of multiferroic heterostructures enabling energy-efficient MRAM with giant magnetoelectric effect.
Information Technology

Magnetic Memory Unlocked with Energy-Efficient MRAM

Researchers from Osaka University introduced an innovative technology to lower power consumption for modern memory devices. Stepping up the Memory Game: Overcoming the Limitations of Traditional RAM. Osaka, Japan – Numerous memory types for computing devices have emerged in recent years, aiming to overcome the limitations imposed by traditional random access memory (RAM). Magnetoresistive RAM (MRAM) is one such memory type, offering several advantages over conventional RAM, including non-volatility, high speed, increased storage capacity and enhanced endurance. Although…

Framework for automating RBAC compliance checks using process mining and policy validation tools.
Technology Offerings

Next-Level System Security: Smarter Access Control for Organizations

Cutting-Edge Framework for Enhancing System Security: Researchers at the University of Electro-Communications have developed a groundbreaking framework for improving system security by analyzing business process logs. This framework focuses on ensuring that role-based access control (RBAC) rules, which are critical to managing who can access specific system resources, are correctly implemented. Noncompliance with these rules, whether due to error or malicious activity, can result in unauthorized access and pose significant risks to organizations. Challenges in Ensuring Compliance with RBAC Policies: RBAC is a widely…
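
A minimal sketch of the core check, comparing events in a process log against a role-to-permission policy, is shown below. The roles, permissions, and log entries are hypothetical and do not reproduce the university's framework.

```python
# Minimal sketch of checking process-log events against an RBAC policy.
policy = {
    "clerk":   {"create_order", "view_order"},
    "manager": {"create_order", "view_order", "approve_order"},
    "auditor": {"view_order"},
}
user_roles = {"alice": "clerk", "bob": "manager", "carol": "auditor"}

event_log = [
    {"user": "alice", "action": "create_order"},
    {"user": "carol", "action": "approve_order"},   # outside the auditor role
    {"user": "bob",   "action": "approve_order"},
]

def find_violations(log):
    """Return events whose action is not permitted for the actor's role."""
    return [e for e in log
            if e["action"] not in policy.get(user_roles.get(e["user"], ""), set())]

for e in find_violations(event_log):
    print(f"violation: {e['user']} performed {e['action']} outside assigned role")
```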

Information Technology

NTU and NUS Spin Off Cutting-Edge Quantum Control Technology

AQSolotl’s quantum controller is designed to be adaptable, scalable and cost-efficient. Quantum technology jointly developed at Nanyang Technological University, Singapore (NTU Singapore) and National University of Singapore (NUS) has now been spun off into a new deep tech startup, AQSolotl. The startup’s flagship product, CHRONOS-Q, is a quantum controller that acts as a translator between conventional computing systems and quantum computers. Developed by university researchers affiliated with Singapore’s Centre for Quantum Technologies (CQT), it enables users to control quantum computers…

Information Technology

Microelectronics Science Research Centers to Lead Charge on Next-Generation Designs and Prototypes

Pacific Northwest National Laboratory to contribute leadership to national effort in microelectronics design and development. Microelectronics run the modern world. Staying ahead of the development curve requires an investment that doesn’t just keep pace but sets new standards for the next generation of technological advances. Today, the Department of Energy announced the creation of three Microelectronics Science Research Centers to address the nation’s specific needs for microelectronics designed to operate in extreme environments such as high radiation, extreme cold, and…
