A company acquired by Google that develops robots for the U.S. military appears to have greatly reduced its dependence on government funding, suggesting a reluctance on Google’s part to align itself too closely with military projects.
When Google acquired Boston Dynamics last December, some questioned whether the firm’s military focus conflicted with Google’s pledge of “don’t be evil” and the virtuous image it nurtures for itself.
“Google search and destroy,” quipped the U.K.’s Independent newspaper. “The internet giant (motto: ‘Don’t be evil’) has bought a pioneer of scary robot animals. Can its ethics survive?”
Google said little to address the issue at the time and, perhaps illustrating the sensitivity of the topic, declined several requests this week to discuss Boston Dynamics and its work.
But an analysis of federal procurement records for unclassified projects shows Boston Dynamics has accepted far less government money in the eight months since it was acquired by Google.
In 2013, the company received US$31.2 million in grants from the Defense Advanced Research Projects Agency (DARPA), the U.S. Army and the U.S. Navy. That was roughly in line with the $33.2 million it received in 2012 and the $27.8 million in 2011.
But so far in 2014, Boston Dynamics has received just one payment, of $1.1 million, the records show. The money came from DARPA in April and was for participation in the organization’s robotics challenge, which aims to stimulate robot research and attracts teams from around the world.
It’s unclear why the funding has dropped so sharply, but the decline suggests a reluctance by Google to pursue military projects and align itself publicly with the U.S. government, especially at a time when suspicion about clandestine government programs is running high.
Two weeks before the April payment, DARPA said that Tokyo-based Schaft, another robotics company acquired by Google last year, had decided to stop accepting military funding and would pay for its own work.
Receiving such government funding, particularly tied to military projects, can be an ethical quandary for scientists and others.
“I know researchers really wrestle with this,” said George Lucas, a professor of ethics and public policy at the Naval Postgraduate School in Monterey, California. “It isn’t necessarily evil in all cases to work for the military, but it always raises the question of what exactly DARPA wants you to do.”
Part of the concern, he said, comes from the researchers not having control over how their work will be used, or if it will be shared with the wider community.
The robots developed by Boston Dynamics are some of the most complex and agile yet built. They include the LS-3, a four-legged pack robot designed to follow soldiers into battle carrying heavy gear; Robogator, a river surveillance robot; Cheetah, a four-legged robot built for high-speed running; and Atlas, a two-legged humanoid robot.
The LS-3’s role in battle was highlighted this week when U.S. Marines put the robot through its paces in a variety of terrains as part of the RIMPAC multinational naval exercise in Hawaii.
Scott Strawn, an IDC analyst who follows Google closely, said its “don’t be evil” mantra is more a PR statement than a fixed rule for business. Still, it reflects an awareness at Google that making money depends on maintaining the trust of its users, and that can create tension when areas of its work overlap with government and military goals.
“There are just inherent aspects of their business that are going to be very interesting from a defense perspective,” he said.
While Google’s robot ambitions are unclear, it may have little interest in becoming a supplier to the military. In December, it said it would “honor existing military contracts,” but that it did not plan to become “a military contractor on its own,” according to a news report at the time.
Instead, it’s believed to be interested in robots for use in factory automation, home help, package delivery and even as explorers on future space missions.