Kijun Times

Kijun Times is our school's English magazine and newspaper club. Members write articles on a wide range of topical issues and produce an English-language magazine.

Beyond publishing the magazine each year, members write individual and group articles related to their own career interests. Because there is no restriction on topics, the club not only improves English skills but also connects to many different career paths.


We are looking for a new journalist for The KIJUN TIMES.

Anyone can be a journalist for The KIJUN TIMES.


Is 'killer robot' warfare closer than we think?

Author: Yoon So-yeon | Posted: 17.09.15 | Views: 610


[Image: Terminator robot. "Killer robots" may seem like something from a sci-fi film, but reality is catching up. (Getty Images)]

More than 100 of the world's top robotics experts wrote a letter to the United Nations recently calling for a ban on the development of "killer robots" and warning of a new arms race. But are their fears really justified?

Entire regiments of unmanned tanks; drones that can spot an insurgent in a crowd of civilians; and weapons controlled by computerised "brains" that learn like we do, are all among the "smart" tech being unleashed by an arms industry many believe is now entering a "third revolution in warfare".

"In every sphere of the battlefield - in the air, on the sea, under the sea or on the land - the military around the world are now demonstrating prototype autonomous weapons," says Toby Walsh, professor of artificial intelligence at Sydney's New South Wales University.

"New technologies like deep learning are helping drive this revolution. The tech space is clearly leading the charge, and the military is playing catch-up."

[Image: Kalashnikov gun. Russian arms maker Kalashnikov is developing a suite of fully automated weapons. (Kalashnikov Group)]

One reported breakthrough giving opponents of killer machines sleepless nights is Kalashnikov's "neural net" combat module.

It features a 7.62mm machine gun and a camera attached to a computer system that its makers claim can make its own targeting judgements without any human control.

According to Russia's state-run Tass news agency it uses "neural network technologies that enable it to identify targets and make decisions".

Unlike a conventional computer that uses pre-programmed instructions to tackle a specific but limited range of predictable possibilities, a neural network is designed to learn from previous examples then adapt to circumstances it may not have encountered before.
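The contrast the paragraph draws can be sketched in a toy Python example. Everything below is invented for illustration (the thresholds, labels and "nearest example" model are assumptions, not how any real weapons system works): a pre-programmed rule only handles the cases its author anticipated, while a system that learns from examples will produce an answer even for inputs it has never seen.

```python
# Toy contrast: pre-programmed rules vs learning from previous examples.
# All values are illustrative; no real targeting system works this way.

def rule_based_identify(speed_kmh):
    """Conventional approach: fixed, pre-programmed thresholds."""
    if speed_kmh > 800:
        return "missile"
    return "unknown"  # anything outside the rules cannot be classified

class NearestExampleIdentifier:
    """Learning approach: generalise from previously seen examples."""
    def __init__(self, examples):
        self.examples = examples  # list of (speed_kmh, label) pairs

    def identify(self, speed_kmh):
        # Adapt to unseen inputs by picking the closest past example.
        nearest = min(self.examples, key=lambda e: abs(e[0] - speed_kmh))
        return nearest[1]

model = NearestExampleIdentifier([(900, "missile"), (60, "vehicle")])
print(rule_based_identify(400))  # "unknown" - no rule covers this case
print(model.identify(400))       # "vehicle" - a guess based on past data
```

Note that the learned answer for the unseen input is still only a guess, which is exactly the property the rest of the article worries about.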

[Image: Kalashnikov mobile rocket unit. Would robot combat systems make fewer mistakes than humans? (Kalashnikov Group)]

And it is this supposed ability to make its own decisions that is worrying to many.

"If weapons are using neural networks and advanced artificial intelligence then we wouldn't necessarily know the basis on which they made the decision to attack - and that's very dangerous," says Andrew Nanson, chief technology officer at defence specialist Ultra Electronics.

But he remains sceptical about some of the claims arms manufacturers are making.

Automated defence systems can already make decisions based on an analysis of a threat - the shape, size, speed and trajectory of an incoming missile, for example - and choose an appropriate response much faster than humans can.
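The kind of pre-programmed decision logic described above can be sketched as a simple lookup over threat features. This is a hypothetical toy (the feature names, thresholds and response labels are all assumptions), but it shows the key point: within its anticipated cases the system responds instantly, and anything it does not recognise must fall through to a human.

```python
# Hedged sketch of an automated defence decision table.
# Features and thresholds are entirely invented for illustration.

def choose_response(size_m, speed_kmh, closing):
    """Pick a response from pre-programmed rules on threat features."""
    if not closing:
        return "monitor"            # not approaching: just track it
    if speed_kmh > 1000 and size_m < 10:
        return "intercept"          # fast and small: likely a missile
    if speed_kmh < 300 and size_m < 2:
        return "jam"                # slow and tiny: likely a drone
    return "alert_operator"         # unfamiliar profile: hand to a human

print(choose_response(5, 2500, True))  # intercept
```

The final branch is the safety valve: a purely rule-based system has no "best guess" for situations outside its table, which is precisely what changes when that branch is replaced by a learned model.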

But what happens when such systems encounter something they have no experience of, but are still given the freedom to act using a "best guess" approach?

Mistakes could be disastrous - the killing of innocent civilians; the destruction of non-military targets; "friendly fire" attacks on your own side.

[Image: Drone with bombs. Remotely piloted drones have been used to carry out missile attacks since 2001. (Getty Images)]

And this is what many experts fear: not that AI will become too smart - taking over the world like the Skynet supercomputer from the Terminator films - but that it is too stupid.

"The current problems are not with super-intelligent robots but with pretty dumb ones that cannot flexibly discriminate between civilian targets and military targets except in very narrowly contained settings," says Noel Sharkey, professor of artificial intelligence and robotics at Sheffield University.

Despite such concerns, Kalashnikov's latest products are not the only autonomous and semi-autonomous weapons being trialled in Russia.

The Uran-9 is an unmanned ground combat vehicle and features a machine gun and 30mm cannon. It can be remotely controlled at distances of up to 10km.


And the diminutive Platform-M combat robot boasts automated targeting and can operate in extremes of heat and cold.

Meanwhile the Armata T-14 "super tank" has an autonomous turret that designer Andrei Terlikov claims will pave the way for fully autonomous tanks on the battlefield.

Manufacturer Uralvagonzavod didn't respond to BBC requests for an interview, but Prof Sharkey - a member of the pressure group The Campaign to Stop Killer Robots - is wary of its potential.

"The T-14 is years ahead of the West, and the idea of thousands of autonomous T-14s sitting on the border with Europe does not bear thinking about," he says.

And it's not just Russia developing such weapons.

[Image: Prof Toby Walsh, who says all militaries are developing autonomous weapons. (Toby Walsh)]

Last summer, the US Defence Advanced Research Projects Agency (Darpa) equipped an ordinary surveillance drone with advanced AI designed to discern between civilians and insurgents during a test over a replica Middle Eastern village in Massachusetts.

And Samsung's SGR-A1 sentry gun, capable of firing autonomously, has been deployed along the South Korean side of the Korean Demilitarised Zone.

The UK's Taranis drone - roughly the size of a Red Arrows Hawk jet - is being developed by BAE Systems. It is designed to carry a range of weapons over long distances and will have "elements" of full autonomy, BAE says.

At sea, the USA's Sea Hunter autonomous warship is designed to operate for extended periods without a single crew member, and even to guide itself in and out of port.

[Image: Sea Hunter ship in dock. The Sea Hunter can operate without a single crew member. (Darpa)]

All the Western arms manufacturers contacted by the BBC, including Boeing's Phantom Works, Northrop Grumman, Raytheon, BAE Systems, Lockheed Martin and General Dynamics, refused to co-operate with this feature, an indication perhaps of the controversial nature of this technology.

But could autonomous military technology also be used simply as support for human military operations?

Roland Sonnenberg, head of defence at consultancy firm PricewaterhouseCoopers, says combat simulation, logistics, threat analysis and back office functions are the more mundane - but equally important - aspects of warfare that robots and AI could perform.

"The benefits that AI has to offer are only useful if they can be applied effectively in the real world and will only be broadly adopted if companies, consumers and society trust the technology and take a responsible approach," he says.
