Deep learning for mechanical property evaluation

A standard method for testing some of the mechanical properties of materials is to poke them with a sharp point. This “indentation technique” can provide detailed measurements of how the material responds to the point’s force, as a function of its penetration depth. With advances in nanotechnology over the past two decades, the indentation force can now be measured to a resolution on the order of one-billionth of a newton (a newton being roughly the force you feel when you hold a medium-sized apple in your hand), and the sharp tip’s penetration depth can be captured to a resolution as small as a nanometer, or about 1/100,000 the diameter of a human hair. Such instrumented nanoindentation tools have provided new opportunities for probing physical properties in a wide variety of materials, including metals and alloys, plastics, ceramics, and semiconductors.
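The force-depth curves these instruments record are conventionally analyzed with the Oliver-Pharr method to extract hardness. A minimal sketch, using entirely synthetic data and an idealized Berkovich tip area function (both assumptions for illustration, not details from the article):

```python
import numpy as np

def oliver_pharr_hardness(depth_nm, load_nN):
    """Estimate indentation hardness from a load-depth curve.

    Illustrative sketch of the classic Oliver-Pharr analysis (not the
    deep-learning approach the article describes). Assumes a Berkovich
    tip with ideal area function A = 24.5 * hc^2.
    """
    p_max = load_nN.max()                     # peak load (nN)
    i_max = load_nN.argmax()
    h_max = depth_nm[i_max]                   # depth at peak load (nm)

    # Contact stiffness S = dP/dh from the initial unloading slope
    # (linear fit over the top 20% of the unloading segment).
    unload_h = depth_nm[i_max:]
    unload_p = load_nN[i_max:]
    mask = unload_p >= 0.8 * p_max
    S = np.polyfit(unload_h[mask], unload_p[mask], 1)[0]

    # Contact depth and projected contact area (Berkovich geometry,
    # epsilon = 0.75).
    h_c = h_max - 0.75 * p_max / S
    area_nm2 = 24.5 * h_c**2

    return p_max / area_nm2                   # nN/nm^2 is numerically GPa

# Synthetic loading-unloading cycle for demonstration only.
h_load = np.linspace(0, 100, 50)
p_load = 5.0 * h_load**2                      # parabolic loading
h_unload = np.linspace(100, 60, 50)
p_unload = 50000 * ((h_unload - 60) / 40)**1.5  # power-law unloading
depth = np.concatenate([h_load, h_unload])
load = np.concatenate([p_load, p_unload])

print(f"hardness ~ {oliver_pharr_hardness(depth, load):.2f} GPa")
```

The unit bookkeeping is the convenient part: one nanonewton per square nanometer works out to exactly one gigapascal, so nanoscale force and depth resolution translate directly into hardness values in familiar units.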

Video Friday: Autonomous Security Robot Meets Self-Driving Tesla

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

HRI 2020 – March 23-26, 2020 – Cambridge, U.K. [CANCELED]
ICARSC 2020 – April 15-17, 2020 – Ponta Delgada, Azores
ICRA 2020 – May 31-June 4, 2020 – Paris, France
ICUAS 2020 – June 9-12, 2020 – Athens, Greece
CLAWAR 2020 – August 24-26, 2020 – Moscow, Russia

Let us know if you have suggestions for next week, and enjoy today’s videos.

3Q: Collaborating with users to develop accessible designs

Academic researchers and others have long struggled with making data visualizations accessible to people who are blind. One technological approach has been 3D printing tactile representations of data, in the form of raised bar graphs and line charts. But often the intended users have little say in the actual design process, and the end result isn’t as effective as planned. A team of MIT researchers hopes to fix that. They used a collaborative project with staff and students at the Perkins School for the Blind as a case study of the accessible design process, and generated a list of “sociotechnical” considerations to guide researchers in similar work. A paper detailing the work appears in the journal IEEE Transactions on Visualization and Computer Graphics. Co-authors include Alan Lundgard, a graduate student in the Department of Electrical Engineering and Computer Science (EECS), and Crystal Lee, a graduate student in the Program in Science, Technology, and Society.

Autonomous Robots Are Helping Kill Coronavirus in Hospitals

The absolute best way of dealing with the coronavirus pandemic is to just not get coronavirus in the first place. By now, you’ve (hopefully) had all of the strategies for doing this drilled into your skull—wash your hands, keep away from large groups of people, wash your hands, stay home when sick, wash your hands, avoid travel when possible, and please, please wash your hands.  At the top of the list of the places to avoid right now are hospitals, because that’s where all the really sick people go. But for healthcare workers, and the sick people themselves, there’s really no other option. To prevent the spread of coronavirus (and everything else) through hospitals, keeping surfaces disinfected is incredibly important, but it’s also dirty, dull, and (considering what you can get infected with) dangerous. And that’s why it’s an ideal task for autonomous robots.

Satellites and AI Monitor Chinese Economy’s Reaction to Coronavirus

Researchers on WeBank’s AI Moonshot Team have taken a deep learning system developed to detect solar panel installations from satellite imagery and repurposed it to track China’s economic recovery from the novel coronavirus outbreak. This, as far as the researchers know, is the first time big data and AI have been used to measure the impact of the new coronavirus on China, Haishan Wu, vice general manager of WeBank’s AI department, told IEEE Spectrum. WeBank is a private Chinese online banking company founded by Tencent. The team used its neural network to analyze visible, near-infrared, and short-wave infrared images from various satellites, including the infrared bands from the Sentinel-2 satellite. This allowed the system to look for hot spots indicative of actual steel manufacturing inside a plant. In the early days of the outbreak, this analysis showed that steel manufacturing had dropped to a low of 29 percent of capacity.
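WeBank’s actual pipeline is a trained neural network, but the underlying signal can be illustrated with a much simpler proxy: compare the fraction of “hot” pixels in a short-wave infrared image against a pre-outbreak baseline. A toy sketch, with all arrays, names, and the threshold being hypothetical stand-ins:

```python
import numpy as np

def activity_index(swir_band, baseline_band, threshold=0.6):
    """Rough proxy for plant activity from SWIR imagery.

    Illustrative only: the fraction of normalized-radiance pixels above
    a threshold in the current image, relative to the same fraction in
    a pre-outbreak baseline image of the same site.
    """
    def hot_fraction(band):
        norm = (band - band.min()) / (np.ptp(band) + 1e-9)
        return (norm > threshold).mean()

    base = hot_fraction(baseline_band)
    return hot_fraction(swir_band) / base if base else 0.0

# Synthetic 'images': a busy baseline plant vs. a mostly idle one
# that still has a single hot region.
rng = np.random.default_rng(0)
baseline = rng.uniform(0.5, 1.0, size=(64, 64))  # plant at capacity
current = rng.uniform(0.0, 0.5, size=(64, 64))   # reduced activity
current[:16, :16] = 0.9                          # one hot corner

print(f"activity vs. baseline: {activity_index(current, baseline):.0%}")
```

A real analysis would need radiometric calibration, cloud masking, and co-registration across acquisition dates; the ratio-against-baseline framing is what lets a number like “29 percent of capacity” fall out of imagery.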

Gill Pratt on “Irrational Exuberance” in the Robocar World

A lot of people in the auto industry talked for way too long about the imminent advent of fully self-driving cars. In 2013, Carlos Ghosn, now very much the ex-chairman of Nissan, said it would happen in seven years. In 2016, Elon Musk, then chairman of Tesla, implied his cars could basically do it already. From 2017 right through early 2019, GM Cruise was promising a 2019 launch. And Waymo, the company with the most to show for its efforts so far, is speaking in more measured terms than it used just a year or two ago. It’s all making Gill Pratt, CEO of the Toyota Research Institute in California, look rather prescient. A veteran roboticist who joined Toyota in 2015 with the task of developing robocars, Pratt from the beginning emphasized just how hard the task would be and how important it was to aim for intermediate goals—notably by making a car that could help drivers now, not merely replace them.

Skin-like, Flexible Sensor Lets Robots Detect Us

A new sensor for robots is designed to make our physical interactions with these machines a little smoother—and safer. The sensor, which is now being commercialized, allows robots to measure the distance and angle of approach of a human or object in close proximity. Industrial robots often work autonomously to complete tasks. But increasingly, collaborative robots are working alongside humans. To avoid collisions in these circumstances, collaborative robots need highly accurate sensors to detect when someone (or something) is getting a little too close.