Visualizing the world beyond the frame

Most fire trucks come in red, but it’s not hard to picture one in blue. Computers aren’t nearly as creative. Their understanding of the world is colored, often literally, by the data they’ve trained on. If all they’ve ever seen are pictures of red fire trucks, they have trouble drawing anything else. To give computer vision models a fuller, more imaginative view of the world, researchers have tried feeding them more varied images. Some have tried shooting objects from odd angles and in unusual positions, to better convey their real-world complexity. Others have asked the models to generate pictures of their own, using a form of artificial intelligence called generative adversarial networks, or GANs. In both cases, the aim is to fill in the gaps of image datasets to better reflect the three-dimensional world and make face- and object-recognition models less biased. In a new study presented at the International Conference on Learning Representations, MIT researchers …
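The adversarial idea behind GANs can be shown in miniature: a generator learns to produce samples that a discriminator can no longer distinguish from real data. The sketch below is a hypothetical one-dimensional toy in plain NumPy, not the models from the study — real image GANs use deep networks on both sides — but the alternating training loop is the same. The data distribution, learning rate, and affine generator are all assumptions made for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
DATA_MEAN, DATA_STD = 4.0, 0.5   # the "real" data distribution (assumed for this demo)

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

# Generator G(z) = w_g*z + b_g maps noise to samples;
# discriminator D(x) = sigmoid(w_d*x + b_d) scores how "real" a sample looks.
w_g, b_g = 1.0, 0.0
w_d, b_d = 0.0, 0.0
lr, batch = 0.05, 64

for step in range(2000):
    real = rng.normal(DATA_MEAN, DATA_STD, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = w_g * z + b_g

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0.
    d_real = sigmoid(w_d * real + b_d)
    d_fake = sigmoid(w_d * fake + b_d)
    gs_real = d_real - 1.0           # dL/ds for real samples
    gs_fake = d_fake                 # dL/ds for fake samples
    w_d -= lr * (np.mean(gs_real * real) + np.mean(gs_fake * fake))
    b_d -= lr * (np.mean(gs_real) + np.mean(gs_fake))

    # Generator update (non-saturating loss): make D(fake) move toward 1.
    z = rng.normal(0.0, 1.0, batch)
    fake = w_g * z + b_g
    d_fake = sigmoid(w_d * fake + b_d)
    gs = (d_fake - 1.0) * w_d        # dL_G/dx for each fake sample
    w_g -= lr * np.mean(gs * z)
    b_g -= lr * np.mean(gs)

# After training, generated samples should cluster near the real data's mean.
fake_mean = float(np.mean(w_g * rng.normal(0.0, 1.0, 10000) + b_g))
print(f"mean of generated samples: {fake_mean:.2f}")
```

For dataset augmentation of the kind described above, the generator would instead produce images (say, fire trucks in unseen colors or poses), which are then added to the training set of the downstream recognition model.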

Boston Dynamics’ Spot Robot Gets Even More Capable With Enhanced Autonomy, Mobility

Boston Dynamics’ Spot has been out in the world doing useful stuff (and some other things) for long enough now that it’s high time for a software update packed with more advanced skills and new features. Spot Release 2.0, launching today, includes improvements to navigation, autonomy, sensing, user programmability, payload integration, communications, and more. Among those improvements are upgrades to Spot’s physical capabilities: the robot is better at dealing with slippery surfaces (something Boston Dynamics has always excelled at) and now has a better understanding of stairs, the nemesis of legged robots everywhere. We’ll take a look at what’s new with Spot, and talk with Boston Dynamics founder Marc Raibert as well as Zack Jackowski, lead robotics engineer on Spot, about some of the highlights of the 2.0 update, how Spot now understands what stairs are, and when we’ll finally be seeing that arm hit commercial production.

Study finds stronger links between automation and inequality

This is part 3 of a three-part series examining the effects of robots and automation on employment, based on new research from economist and Institute Professor Daron Acemoglu. Modern technology affects different workers in different ways. In some white-collar jobs — designer, engineer — people become more productive with sophisticated software at their side. In other cases, forms of automation, from robots to phone-answering systems, have simply replaced factory workers, receptionists, and many other kinds of employees. Now a new study co-authored by an MIT economist suggests automation has a bigger impact on the labor market and income inequality than previous research would indicate — and identifies the year 1987 as a key inflection point in this process, the moment when jobs lost to automation stopped being replaced by an equal number of similar workplace opportunities. “Automation is critical for understanding inequality dynamics,” says MIT economist Daron Acemoglu, co-author of …