Concerns Rise Over AI’s Role in Gaza Conflict
Reports indicate that Israel has used AI systems, including Lavender, Gospel, and Where's Daddy?, to conduct widespread surveillance, identify targets, and carry out strikes on tens of thousands of people in Gaza, often inside their homes, with minimal human oversight.
Experts and human rights organizations argue that these AI systems have been central to Israel's relentless and seemingly indiscriminate strikes, which have devastated large areas of Gaza and resulted in over 50,000 Palestinian deaths, most of them women and children.
Heidy Khlaaf, a former systems safety engineer at OpenAI, told a news agency that the use of AI models known to lack precision and accuracy is normalizing large-scale civilian casualties, a troubling trend already visible in Gaza.
