Researchers at the University of Copenhagen have developed a prototype app that could prescribe the optimal dose of medicine for an individual patient, as well as help prevent counterfeiting.
A team of Australian researchers has designed a reliable strategy for testing the physical abilities of humanoid robots. Using a blend of machine learning methods, the research team succeeded in enabling test robots to react effectively to unknown changes in a simulated environment, improving their odds of functioning in the real world. The findings have promising implications for the broad use of humanoid robots in fields such as healthcare, education, disaster response and entertainment.
To walk or run with finesse, roaches and robots coordinate leg movements via signals sent through centralized systems. Though their moving parts could hardly be more different, researchers have devised handy principles and equations to assess how both beasts and bots locomote and to improve robotic gait.
A team of scientists from the School of Engineering at Far Eastern Federal University (FEFU), the Institute of Automation and Control Processes, and the Institute of Marine Technology Problems of the Far Eastern Branch of the Russian Academy of Sciences has developed a software module that automatically diagnoses defects in the sensors and electric drives of various kinds of robots. The system can compensate for detected defects in real time.
CSHL neuroscientist Anthony Zador shows how evolution and animal brains can be a rich source of inspiration for machine learning, especially to help AI tackle some enormously difficult problems, like doing the dishes.
The majority of soft robots today rely on external power and control, keeping them tethered to off-board systems or rigged with hard components. Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and Caltech have developed soft robotic systems, inspired by origami, that can move and change shape in response to external stimuli, paving the way for fully untethered soft robots.
An experiment with a water-saving 'smart' faucet shows potential for reducing water use. The catch? Unbeknownst to study participants, the faucet's smarts came from its human controller.
In the blink of an eye, the human visual system can process an object, determining whether it's a cup or a sock within milliseconds, and with seemingly little effort. It's well-established that an object's shape is a critical visual cue to help the eyes and brain perform this trick. A new study, however, finds that while the outer shape of an object is important for rapid recognition, the object's inner 'skeleton' may play an even more important role.
Researchers at the UW have used machine learning to develop a new system that can monitor factory and warehouse workers and tell them how ergonomic their jobs are in real time.
UC Berkeley neuroscientists have created interactive maps that can predict where different categories of words activate the brain. Their latest map is focused on what happens in the brain when you read stories.