Flipping or Turning? This Massive Database of Video Clips Will Help AIs Understand the Difference

Researchers can use the “Moments in Time” project to train AI systems to recognize and understand actions and events in videos

Imagine if we had to explain all of the actions that take place on Earth to aliens. We could provide them with non-fiction books or BBC documentaries. We could try to explain verbally what twerking is. But, really, nothing conveys an action better than a three-second video clip.

Thanks to researchers at MIT and IBM, we now have a clearly labeled dataset of more than one million such clips. The dataset, called Moments in Time, captures hundreds of common actions that occur on Earth, from the beautiful moment of a flower opening to the embarrassing instance of a person tripping and eating dirt. (We’ve all been there.) Moments in Time, however, wasn’t created to provide a bank of GIFs, but to lay…
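To make the training setup concrete, here is a toy sketch of how short, verb-labeled clips like these can become examples for an action classifier: sample a handful of frames from each roughly three-second clip and pair them with the clip’s action label. The file names and annotation format below are hypothetical placeholders, not the actual Moments in Time scheme.

```python
# Toy sketch: turning short, verb-labeled clips into classifier training examples.
# (Hypothetical paths and labels, not the real Moments in Time annotations.)
import numpy as np

annotations = {
    "clips/flower_0001.mp4": "opening",
    "clips/runner_0042.mp4": "tripping",
    "clips/cat_0117.mp4": "falling",
}
labels = sorted(set(annotations.values()))
label_to_index = {name: i for i, name in enumerate(labels)}

def sample_frames(num_frames_in_clip: int, num_samples: int = 8) -> np.ndarray:
    """Pick evenly spaced frame indices from a clip (~90 frames at 30 fps)."""
    return np.linspace(0, num_frames_in_clip - 1, num_samples).round().astype(int)

for clip, verb in annotations.items():
    frame_indices = sample_frames(num_frames_in_clip=90)
    print(clip, frame_indices.tolist(), "->", verb, f"(class {label_to_index[verb]})")
```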

Shimi Will Now Sing to You in an Adorable Robot Voice

Deep learning helps one of Georgia Tech’s musical robots to understand humans and sing to them

Human-robot interaction is easy to do badly, and very difficult to do well. One approach that has worked well for robots from R2-D2 to Kuri is to avoid the problem of language—rather than use real words to communicate with humans, you can do pretty well (on an emotional level, at least) with a variety of bleeps and bloops. But as anyone who’s watched Star Wars knows, R2-D2 really has a lot going on with the noises that it makes, and those noises were carefully designed to be both expressive and responsive. Most actual robots don’t have the luxury of a professional sound team (and as much post-production editing as you need), so the question becomes how to teach a robot to make the right noises at the right times. At Georgia Tech’s Center for…

Putting Skin, Heart, and Soul in the Game of Solving Biomedical Challenges

Millimeter-wave imaging is cheaper, safer, less power-intensive, and much more portable than other types of body imaging

Developing algorithms and learning-based systems to support potentially life-saving biomedical devices is more than abstract research for Negar Tavassolian, an electrical and computer engineering assistant professor at Stevens Institute of Technology and an IEEE senior member. “I’ve always been interested in solving medical problems with commercially viable technology,” says Tavassolian, whose work is affiliated with the Stevens Institute for Artificial Intelligence. “I like to make things and see how they can help somebody.” Tavassolian was granted a National Science Foundation (NSF) CAREER Award to leverage millimeter-wave technology, along with artificial intelligence and other emerging technologies, in a portable dermatological device that creates high-resolution images of a patient’s skin for early detection of skin cancers. Millimeter-wave imaging (at frequencies of 30 to 300 GHz) is cheaper, safer, less power-intensive, and…
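The name comes straight from the physics: across 30 to 300 GHz the free-space wavelength runs from about 10 mm down to 1 mm. A quick back-of-the-envelope check in Python:

```python
# Why 30-300 GHz is called "millimeter-wave": free-space wavelength = c / f.
C = 299_792_458  # speed of light, m/s

def wavelength_mm(freq_ghz: float) -> float:
    """Free-space wavelength in millimeters for a frequency given in GHz."""
    return C / (freq_ghz * 1e9) * 1e3

for f in (30, 300):
    print(f"{f} GHz -> {wavelength_mm(f):.1f} mm")
# 30 GHz -> 10.0 mm
# 300 GHz -> 1.0 mm
```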

Can AI Detect Deepfakes To Help Ensure Integrity of U.S. 2020 Elections?

Startup Deeptrace is racing to develop automated detection of fake videos and images as U.S. 2020 elections loom

A perfect storm arising from the world of pornography may threaten the U.S. elections in 2020 with disruptive political scandals having nothing to do with actual affairs. Instead, face-swapping “deepfake” technology that first became popular on porn websites could eventually generate convincing fake videos of politicians saying or doing things that never happened in real life—a scenario that could sow widespread chaos if such videos are not flagged and debunked in time.

Animal-AI Olympics Will Test AI on Intelligence Tasks Designed for Crows and Chimps

Experiments drawn from Aesop’s Fables can gauge general intelligence

Are today’s best artificial intelligence (AI) systems as smart as a mouse? A crow? A chimp? A new contest aims to find out. The Animal-AI Olympics, which will begin this June, sets out to “benchmark the current level of various AIs against different animal species using a range of established animal cognition tasks.” At stake are bragging rights and $10,000 in prizes. The project, a partnership between the University of Cambridge’s Leverhulme Centre for the Future of Intelligence and GoodAI, a research institution based in Prague, is a new way to evaluate the progress of AI systems toward what researchers call artificial general intelligence. While AI systems have recently bested humans in a host of challenging competitions, including the board game Go, the poker game Texas Hold’em, and the video game StarCraft, these matchups only proved that AIs were astoundingly…

Can Machine Learning Teach Us Anything?

Games, Computers, and Humans

The breathless headline caught my eye: “Computer Shows Human Intuition—AI Breakthrough!” (or words to that effect). I was intrigued but skeptical. Reading further, I learned that a computer program, AlphaZero, developed by a team at DeepMind, in London, had beaten other champion chess-playing programs, as well as (of course) humans. That wasn’t the interesting news, as we take that kind of dominance for granted these days. What fascinated me was how the program had been constructed. Instead of being tuned by expert players, AlphaZero initially knew nothing more than the rules of chess. It learned how to play, and to win, by playing against itself. Soon it got so good it could beat everyone and everything. But, I wondered, isn’t this what humans have been doing for centuries—learning by playing chess against ourselves? What, if anything, has the computer learned so quickly that we…
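To see the self-play idea in miniature, here is a toy sketch: a tabular agent that learns the simple game of Nim (take one or two stones; whoever takes the last stone wins) entirely by playing against itself. This is not AlphaZero, which pairs deep neural networks with tree search; it is only the bare learn-from-your-own-games loop, under toy assumptions.

```python
# Toy self-play learner for Nim: the agent plays both sides and updates a
# table of state values from the outcomes of its own games.
import random

N_STONES = 7
MOVES = (1, 2)
LEARNING_RATE = 0.1
EPSILON = 0.2        # how often to explore a random move
value = {0: 0.0}     # stones left -> estimated win rate for the player to move
                     # (facing 0 stones means you have already lost)

def pick_move(stones):
    """Mostly greedy: leave the opponent in the lowest-valued state."""
    legal = [m for m in MOVES if m <= stones]
    if random.random() < EPSILON:
        return random.choice(legal)
    return min(legal, key=lambda m: value.get(stones - m, 0.5))

for game in range(20_000):
    stones, mover, visited = N_STONES, 0, []
    while stones > 0:
        visited.append((stones, mover))
        stones -= pick_move(stones)
        mover = 1 - mover
    winner = 1 - mover    # whoever just took the last stone
    for state, player in visited:
        reward = 1.0 if player == winner else 0.0
        old = value.get(state, 0.5)
        value[state] = old + LEARNING_RATE * (reward - old)

# From 7 stones the first player has a winning strategy (always leave a
# multiple of 3), so the learned value of the start state ends up well above 0.5.
print(round(value[N_STONES], 2))
```

The same learn-from-your-own-games loop, scaled up with deep networks and tree search, is the engine behind AlphaZero.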

4 Experts Respond to Trump’s Executive Order on AI

The new “American AI Initiative” is heavy on bombast, light on specifics

Yesterday, U.S. President Donald Trump signed an executive order establishing the American AI Initiative, with the aim of “accelerating our national leadership” in artificial intelligence. The announcement framed it as an effort to win an AI arms race of sorts: “Americans have profited tremendously from being the early developers and international leaders in AI. However, as the pace of AI innovation increases around the world, we cannot sit idly by and presume that our leadership is guaranteed.” While extremely light on details, the announcement mentioned five major areas of action:

- Having federal agencies increase funding for AI R&D
- Making federal data and computing power more available for AI purposes
- Setting standards for safe and trustworthy AI
- Training an AI workforce
- Engaging with international allies—but protecting the tech from foreign adversaries

IEEE Spectrum asked four experts for their take on the…

Pictionary-Playing AI Sketches the Future of Human-Machine Collaborations

As either “guesser” or “drawer,” the Allen Institute’s new AI cooperates with a human player

What do the games of chess, Jeopardy!, Go, Texas Hold’em, and StarCraft have in common? In each of these competitive arenas, an AI has resoundingly beaten the best human players in the world. These victories are astounding feats of artificial intelligence—yet they’ve become almost humdrum. Another day, another triumph over humans. At the Allen Institute for Artificial Intelligence (AI2), in Seattle, researchers set out to do something different. Their AllenAI collaborates with a human player in a Pictionary-style drawing and guessing game, which is won through human-AI cooperation. Want to see for yourself? Go play it. AI2 has just launched a public version of the game, a simplified take on Pictionary that it calls Iconary. The current version of AllenAI has limited abilities—but as it engages with a diverse set of players, with different aptitudes…

DeepMind’s AI Shows Itself to Be a World-Beating World Builder

Bests human professional gamers in the complex strategy game StarCraft II

At the end of 2018, Dario “TLO” Wünsch, a well-known professional gamer from Germany, was ranked 42nd in the world in the video game StarCraft II. He’d lost some matches—especially as he battled debilitating carpal tunnel syndrome—but he’d won enough to still be considered among the world’s best players. But last week, as he sat before his screen executing the unorthodox moves that have become his signature, he watched helplessly as his opponent slaughtered his armies and laid waste to his StarCraft II kingdom. There was no fist-pumping excitement coming from TLO’s opponent. The German gamer lost to an artificial intelligence agent created by DeepMind Technologies as part of its mission to push the boundaries of AI. The company, which is measuring its progress by testing its algorithms’ ability to play StarCraft II, is celebrating a major milestone: the introduction last week of AlphaStar, its StarCraft II player. To…

A Choice of Grippers Helps Dual-Arm Robot Pick Up Objects Faster Than Ever

Dex-Net 4.0 enables “ambidextrous” robots to choose the best gripper for the job

We’ve been following Dex-Net’s progress towards universal grasping for several years now, and today in a paper in Science Robotics, UC Berkeley is presenting Dex-Net 4.0. The new and exciting bit about this latest version of Dex-Net is that it’s able to successfully grasp 95 percent of unseen objects at a rate of 300 objects per hour, thanks to some added ambidexterity that lets the robot dynamically choose between two different kinds of grippers.
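The gist of that ambidexterity can be sketched in a few lines: give each gripper its own learned grasp-quality model, score candidate grasps for every gripper, and execute whichever grasp scores highest. The names below are hypothetical placeholders, not the actual Dex-Net 4.0 code or API.

```python
# Minimal sketch of "ambidextrous" grasp planning (hypothetical, not Dex-Net itself):
# score candidate grasps per gripper and pick the overall best one.
import random
from dataclasses import dataclass

@dataclass
class Grasp:
    gripper: str      # e.g. "parallel_jaw" or "suction"
    pose: tuple       # planned end-effector pose (placeholder)
    quality: float    # predicted probability of grasp success

def plan_best_grasp(depth_image, candidate_generators, quality_models):
    """candidate_generators: gripper name -> fn(image) -> list of Grasp
    quality_models:          gripper name -> fn(image, grasp) -> success probability
    Returns the highest-scoring grasp across all available grippers."""
    best = None
    for gripper, generate in candidate_generators.items():
        score = quality_models[gripper]
        for grasp in generate(depth_image):
            grasp.quality = score(depth_image, grasp)
            if best is None or grasp.quality > best.quality:
                best = grasp
    return best

# Tiny mock demo with one made-up candidate per gripper and random scores.
generators = {
    "suction": lambda img: [Grasp("suction", (0.0, 0.0, 0.1), 0.0)],
    "parallel_jaw": lambda img: [Grasp("parallel_jaw", (0.0, 0.0, 0.1), 0.0)],
}
models = {name: (lambda img, g: random.random()) for name in generators}
print(plan_best_grasp(None, generators, models).gripper)
```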