IBM must be brimming with confidence in its new automated system for chemical synthesis, because Big Blue just had twenty or so journalists watch a live demo of the complex technology in a virtual room. IBM even let one of the journalists choose the molecule for the demo: a compound in a potential Covid-19 treatment. We then watched as the system synthesized and tested the molecule and delivered its analysis in a PDF document, displayed for everyone on that journalist’s screen. It all worked; again, that’s confidence. The system builds on technology IBM began developing three years ago that uses artificial intelligence (AI) to predict chemical reactions. In August 2018, IBM made this service available via the cloud and dubbed it RXN for Chemistry. Now the company has added a new wrinkle to its cloud-based AI: robotics. This new and improved system is no longer named simply RXN for Chemistry, but Continue reading Robotics, AI, and Cloud Computing Combine to Supercharge Chemical and Drug Synthesis
The research described in this article has been published on a preprint server but has not yet been peer-reviewed by scientific or medical experts. During the current coronavirus pandemic, one of the riskiest parts of a health care worker’s job is assessing people who have symptoms of Covid-19. Researchers from MIT and Brigham and Women’s Hospital hope to reduce that risk by using robots to remotely measure patients’ vital signs. The robots, which are controlled by a handheld device, can also carry a tablet that allows doctors to ask patients about their symptoms without being in the same room. “In robotics, one of our goals is to use automation and robotic technology to remove people from dangerous jobs,” says Henwei Huang, an MIT postdoc. “We thought it should be possible for us to use a robot to remove the health care worker from the risk of directly exposing themselves to Continue reading Robot takes contact-free measurements of patients’ vital signs
Brains are talking to computers, and computers to brains. Are our daydreams safe? More details
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

CLAWAR 2020 – August 24-26, 2020 – [Online Conference]
ICUAS 2020 – September 1-4, 2020 – Athens, Greece
ICRES 2020 – September 28-29, 2020 – Taipei, Taiwan
AUVSI EXPONENTIAL 2020 – October 5-8, 2020 – [Online Conference]
IROS 2020 – October 25-29, 2020 – Las Vegas, Nev., USA
CYBATHLON 2020 – November 13-14, 2020 – [Online Event]
ICSR 2020 – November 14-16, 2020 – Golden, Colo., USA

Let us know if you have suggestions for next week, and enjoy today’s videos.
At IROS last year, Caltech and JPL presented a prototype for a ballistically launched quadrotor—once folded up into a sort of football shape with fins, the drone is stuffed into a tube and then fired straight up with a blast of compressed CO2, at which point it unfolds itself, stabilizes, and then flies off. It’s been about half a year, and the prototype has been scaled up in both size and capability, now with a half-dozen rotors and full onboard autonomy that can (barely) squeeze into a 6-inch tube.
The transition from the Middle Stone Age (MSA) to the Later Stone Age (LSA) marks a major cultural shift amongst our hunter-gatherer ancestors, but distinguishing between these two industrial complexes is not straightforward. New research demonstrates how machine learning can provide a valuable tool for archaeologists and can identify what differentiates the MSA and LSA. More details
Robotic support pets used to reduce depression in older adults and people with dementia acquire bacteria over time, but a simple cleaning procedure can keep them from spreading illnesses, according to a new study. More details
A collaboration has created the first microscopic robots that incorporate semiconductor components, allowing them to be controlled – and made to walk – with standard electronic signals. More details
The U.S. National Science Foundation (NSF) announced today an investment of more than $100 million to establish five artificial intelligence (AI) institutes, each receiving roughly $20 million over five years. One of these, the NSF AI Institute for Artificial Intelligence and Fundamental Interactions (IAIFI), will be led by MIT’s Laboratory for Nuclear Science (LNS) and become the intellectual home of more than 25 physics and AI senior researchers at MIT and Harvard, Northeastern, and Tufts universities. By merging research in physics and AI, the IAIFI seeks to tackle some of the most challenging problems in physics, including precision calculations of the structure of matter, gravitational-wave detection of merging black holes, and the extraction of new physical laws from noisy data. “The goal of the IAIFI is to develop the next generation of AI technologies, based on the transformative idea that artificial intelligence can directly incorporate physics intelligence,” says Jesse Thaler, Continue reading National Science Foundation announces MIT-led Institute for Artificial Intelligence and Fundamental Interactions