
On the other hand, Israel is using AI to generate its bombing targets and pound the Gaza Strip with bombs non-stop [0].

And, according to the UN, Turkey has used AI-powered, autonomous loitering drones to hit military convoys in Libya [1].

Regardless of us vs. them, AI shouldn't be a part of warfare, IMHO.

[0]: https://www.theguardian.com/world/2023/dec/01/the-gospel-how...

[1]: https://www.voanews.com/a/africa_possible-first-use-ai-armed...



> AI shouldn't be a part of warfare, IMHO.

Nor should nuclear weapons, guns, knives, or cudgels.

But we don’t have a way to stop them being used.


Sure we do. We enforce it through the threat of warfare and subsequent prosecution, the same way we enforce the bans on chemical weapons and other war crimes.

We may lack the motivation and agreement to ban particular methods of warfare, but the means to enforce that ban exists, and drastically reduces their use.


"We enforce it through the threat of warfare and subsequent prosecution, the same way we enforce the bans on chemical weapons and other war crimes."

Do we, though? Sometimes, against smaller misbehaving players. Note that it doesn't necessarily stop them (Iran, North Korea), even though it makes their international position somewhat complicated.

Against the big players (the US, Russia, China), "threat of warfare and prosecution" does not really work to enforce anything. Russia rains death on Ukrainian cities every night, or attempts to do so while being stopped by AA. Meanwhile, Russian oil and gas are still being traded, including in the EU.


We lack the motivation precisely because of information warfare that is already being used.


This is literally the only thing that matters in this debate. Everything else is useless hand-wringing from people who don't want to be associated with the negative externalities of their work.

The second that this tech was developed it became literally impossible to stop this from happening. It was a totally foreseeable consequence, but the researchers involved didn't care because they wanted to be successful and figured they could just try to blame others for the consequences of their actions.


> the researchers involved didn't care because they wanted to be successful and figured they could just try to blame others for the consequences of their actions

Such an absurdly reductive take. Or how about: just like nuclear energy and knives, they are incredibly useful, society-advancing tools that can also be used to cause harm. It's not as if AI can only be used for warfare. And like pretty much every technology, it ends up being used 99.9% for good and 0.1% for evil.


I think you're missing the point. I don't think we should have prevented the development of this tech. It's just absurd to complain about things that we always knew would happen as though they're some sort of great surprise.

If we cared about preventing LLMs from being used for violence, we would have poured more than a tiny fraction of our resources into safety/alignment research. We did not. Ergo, we don't care; we just want people to think we care.

I don't have any real issue with using LLMs for military purposes. It was always going to happen.


You say ‘we’ as if everyone is the same. Some people care, some people don’t. It only takes a few who don’t, or who feel the ends justify the means. Because those people exist, the people who do care are forced into a prisoner’s dilemma, compelling them to develop the technology anyway.


Safety or alignment research isn't going to stop it from being used for military purposes. Once the tech is out there, it will be used for military purposes; there's just no getting around it.


If it ever happens again, they'll develop the lists in seconds from data collected from our social media and intercepts. What took organizations warehouses and thousands of agents will be done in a matter of seconds.


Why not? Maybe AI is what is needed to finally tear Hamas out of Palestine root and branch. As long as humans are still in the loop vetting the potential targets, it doesn't seem particularly different from the IDF just hiring a bunch of analysts to produce the same targets.


There is no "removing Hamas from Palestine". The only way to remove the desire of the Palestinian people for freedom is to remove the Palestinian people themselves. And that is what the IDF is trying to do.


Hamas isn't the only path to freedom for Palestinians. In fact, they seem to be the major impediment to it.


If we're going to be reductive, at least include the other main roadblock to a solution which is the current government of Israel.


That doesn't explain why deals weren't reached with the previous governments of Israel.


Sure it doesn't explain that. Would be nice if things were that easy wouldn't it?


Generally if a main roadblock is removed, you can get a little farther down the road.


Hamas doesn't exist in a vacuum where you can just remove it and then it's gone. You have to offer a life that's better than Hamas.


Considering the incredible amount of civilian casualties, I don't think the target vetting is working very well.


I would be very surprised if Turkey were capable of doing that. If they did, that's all Erdoğan would be talking about. Also, it's a bit weird that the linked article's author has a Turkish name (economy and theology major, too).

I am not saying this is anything, but it's definitely tingling my "something's up" senses.


Voice of America generally employs a country's own nationals for its reporting. There are some other resources:

    - NPR: https://www.npr.org/2021/06/01/1002196245/a-u-n-report-suggests-libya-saw-the-first-battlefield-killing-by-an-autonomous-d
    - Lieber Institute: https://lieber.westpoint.edu/kargu-2-autonomous-attack-drone-legal-ethical/
    - ICRC: https://casebook.icrc.org/case-study/libya-use-lethal-autonomous-weapon-systems
    - UN report itself (search for Kargu): https://undocs.org/Home/Mobile?FinalSymbol=S%2F2021%2F229&Language=E&DeviceType=Desktop&LangRequested=False
    - Kargu itself: https://www.stm.com.tr/en/kargu-autonomous-tactical-multi-rotor-attack-uav

From my experience, the Turkish military doesn't like to talk about all the things it has.


The major drone manufacturer is Erdoğan's son-in-law. He's being groomed as one of his possible successors on the throne. They looove to talk about those drones.

I will check out the links. Thanks a lot.


You're welcome.

The drones in question (Kargu) are not built by his company.


True. I had been reading about how other drones are in service but they never get mentioned anymore.



