Microsoft says it provided AI to Israel for Gaza war but denies use to harm Palestinians

haaretz.com·The Associated Press·2025-05-17
Noticeable — persuasion techniques worth noting

This article tries to convince you that major tech companies, especially Microsoft, are acting responsibly even when their technology is used in conflicts like the war in Gaza. It does this by repeatedly quoting officials and figures associated with the company and presenting their statements as unquestionable facts. While it mentions concerns from human rights groups, it doesn't give many details or specific examples to back up those worries, leaving out information that might challenge the idea that these companies are managing things ethically.

FATE Analysis

Four dimensions of psychological manipulation: how content captures Focus, exploits Authority, triggers Tribal identity, and engineers Emotion.

Focus: 4/10 · Authority: 5/10 · Tribe: 3/10 · Emotion: 4/10

Focus signals

unprecedented framing
"Microsoft acknowledged Thursday that it sold advanced artificial intelligence and cloud computing services to the Israeli military during the war in Gaza and aided in efforts to locate and rescue Israeli hostages. But the company also said it has found no evidence to date that its Azure platform and AI technologies were used to target or harm people in Gaza."

This is framed as Microsoft's first public acknowledgement of its deep involvement, presenting it as a significant, previously unrevealed development.

attention capture
"The unsigned blog post on Microsoft's corporate website appears to be the company's first public acknowledgement of its deep involvement in the war."

Highlights the novelty and significance of Microsoft's statement by framing it as the company's 'first public acknowledgement,' drawing the reader's attention.

unprecedented framing
"We are in a remarkable moment where a company, not a government, is dictating terms of use to a government that is actively engaged in a conflict," she said. "It's like a tank manufacturer telling a country you can only use our tanks for these specific reasons. That is a new world."

This quote from an expert is used to frame the situation as 'remarkable' and 'a new world,' emphasizing its extraordinary nature to capture and hold attention.

Authority signals

institutional authority
"Microsoft acknowledged Thursday that it sold advanced artificial intelligence and cloud computing services to the Israeli military during the war in Gaza..."

The article uses 'Microsoft acknowledged' to leverage the institutional weight of a major tech company, giving credibility to the information presented.

expert appeal
"Emelia Probasco, a senior fellow for the Center for Security and Emerging Technology at Georgetown University, said the statement is noteworthy because few commercial technology companies have so clearly laid out standards for working globally with international governments."

Leverages the credentials and institutional affiliation of Emelia Probasco (senior fellow, Georgetown University) to validate the significance and impact of Microsoft's statement.

institutional authority
"Cindy Cohn, executive director of the Electronic Frontier Foundation, applauded Microsoft Friday for taking a step toward transparency."

Citing the 'executive director of the Electronic Frontier Foundation' lends institutional authority to the idea that Microsoft's statement is a move towards transparency, despite remaining questions.

institutional authority
"It comes nearly three months after an investigation by The Associated Press revealed previously unreported details about the American tech giant's close partnership with the Israeli Defense Ministry..."

The article cites 'The Associated Press' (a major news organization) and its 'investigation' to establish the credibility and factual basis of the background information provided.

Tribe signals

us vs them
"Meanwhile, human rights groups have raised concerns that AI systems, which can be flawed and prone to errors, are being used to help make decisions about who or what to target, resulting in the deaths of innocent people."

This creates an implicit 'us vs. them' dynamic between 'human rights groups' and the entities using AI systems, highlighting a conflict of perspectives and concerns.

us vs them
"No Azure for Apartheid, a group of current and former Microsoft employees, called on Friday for the company to publicly release a full copy of the investigative report."

This identifies a specific group, implicitly framing an 'us vs. them' dynamic against Microsoft's official stance. The name 'No Azure for Apartheid' is itself a tribal marker.

us vs them
"Hossam Nasr, a former Microsoft worker fired in October after he helped organize an unauthorized vigil at the company's headquarters for Palestinians killed in Gaza."

This detail highlights a clear 'us vs. them' dynamic between Microsoft management and employees who are advocating for Palestinians, with consequences for those taking a stand.

Emotion signals

outrage manufacturing
"Meanwhile, human rights groups have raised concerns that AI systems, which can be flawed and prone to errors, are being used to help make decisions about who or what to target, resulting in the deaths of innocent people."

This sentence is engineered to evoke outrage and concern by highlighting the potential for 'flawed and prone to errors' AI systems to cause 'deaths of innocent people.'

moral superiority
"We provided this help with significant oversight and on a limited basis, including approval of some requests and denial of others," Microsoft said. "We believe the company followed its principles on a considered and careful basis, to help save the lives of hostages while also honoring the privacy and other rights of civilians in Gaza."

Microsoft's statement, as quoted, attempts to project moral superiority and careful ethical conduct by claiming to have acted 'to help save the lives of hostages while also honoring the privacy and other rights of civilians.'

outrage manufacturing
"It's very clear that their intention with this statement is not to actually address their worker concerns, but rather to make a PR stunt to whitewash their image that their relationship with the Israeli military has tarnished," said Hossam Nasr, a former Microsoft worker fired in October after he helped organize an unauthorized vigil at the company's headquarters for Palestinians killed in Gaza.

The quote from the former Microsoft worker is framed to ignite outrage by suggesting Microsoft's statement is a 'PR stunt' to 'whitewash their image,' particularly given the context of being fired for advocating for Palestinians killed in Gaza.

Narrative Analysis (PCP)

How the article reshapes thinking: Perception (what beliefs are targeted), Context (what information is shifted or omitted), and Permission (what behavior is being encouraged).

What it wants you to believe

The article aims to instill the belief that major tech companies, specifically Microsoft, are acknowledging their involvement in sensitive geopolitical conflicts and are making efforts towards transparency and ethical oversight of their technology's use. It suggests that even in a conflict, these companies are attempting to uphold ethical principles and investigate concerns. The target perception is that corporations can, or at least attempt to, act responsibly even when their products are used by militaries.

Context being shifted

The article shifts context by focusing on Microsoft's self-assessment and stated policies. By emphasizing the company's internal investigation and its claims of 'no evidence' of misuse, it shifts the frame from the direct impact of the technology on the ground in Gaza to the company's internal governance and public relations efforts. The narrative focuses on what Microsoft says it is doing to be ethical, rather than extensively on the documented consequences of its technology's use in actual conflict.

What it omits

The article mentions that human rights groups express concerns about AI systems being flawed and leading to civilian deaths, but it omits detailed examples or evidence from these groups specific to Microsoft's technology use in Gaza. While The AP investigation is referenced, the article does not extensively delve into the specifics of *how* the Israeli military's use of Azure for 'mass surveillance' and 'targeting systems' would likely manifest in harm, or the specific criticisms by human rights groups that led Microsoft to launch its review. The statement that Microsoft 'does not have visibility into how customers use our software on their own servers or other devices' is presented as a neutral fact, without deep analysis of its implications for accountability or the potential for evasion of stated 'Acceptable Use Policies'.

Desired behavior

The reader is nudged towards accepting, or at least tolerating, the ongoing involvement of major tech companies in military conflicts, particularly when these companies present themselves as acting ethically or taking steps towards transparency. It implicitly grants permission to believe that corporations can self-regulate and manage the ethical dimensions of their products even in warfare, potentially leading to less public pressure for stricter governmental oversight or divestment.

SMRP Pattern

Four manipulation maintenance tactics: Socializing the idea as normal, Minimizing concerns, Rationalizing with logic, and Projecting blame.

Socializing: not detected.

Minimizing: "But the company also said it has found no evidence to date that its Azure platform and AI technologies were used to target or harm people in Gaza."

Rationalizing: "We provided this help with significant oversight and on a limited basis, including approval of some requests and denial of others," Microsoft said. "We believe the company followed its principles on a considered and careful basis, to help save the lives of hostages while also honoring the privacy and other rights of civilians in Gaza."

Projecting: "Microsoft said the Israeli military, like any other customer, was bound to follow the company's Acceptable Use Policy and AI Code of Conduct, which prohibit the use of products to inflict harm in any way prohibited by law."

Red Flags

High-severity indicators: silencing dissent, coordinated messaging, or weaponizing identity to shut down debate.

Silencing indicator: not detected.

Controlled release (spokesperson test): "The unsigned blog post on Microsoft's corporate website appears to be the company's first public acknowledgement of its deep involvement in the war."

Identity weaponization: not detected.

Techniques Found (8)

Specific propaganda techniques identified using the SemEval-2023 academic taxonomy of 23 techniques across 6 categories.

Obfuscation/Vagueness (Manipulative Wording)
"Microsoft acknowledged its deep involvement in the Gaza war for the first time, but did not directly address questions about precisely how the Israeli military is using its technologies"

This statement highlights a lack of clarity and directness regarding a crucial aspect, indicating that Microsoft is being vague about the specifics of its technology's use.

Obfuscation/Vagueness (Manipulative Wording)
"But the company also said it has found no evidence to date that its Azure platform and AI technologies were used to target or harm people in Gaza."

The phrase 'no evidence to date' leaves open the possibility that evidence could emerge later, or that the investigation was not comprehensive enough, making the statement vague.

Obfuscation/Vagueness (Manipulative Wording)
"The statement did not identify the outside firm or provide a copy of its report."

The lack of identification of the external firm and the report itself makes the review process opaque and vague, hindering independent verification.

Obfuscation/Vagueness (Manipulative Wording)
"The statement also did not directly address several questions about precisely how the Israeli military is using its technologies, and the company declined Friday to comment further."

This directly points to the company's refusal to provide precise details, indicating an intentional act of vagueness regarding critical information.

Obfuscation/Vagueness (Manipulative Wording)
"Microsoft declined to answer written questions from The AP about how its AI models helped translate, sort and analyze intelligence used by the military to select targets for airstrikes."

The refusal to answer specific questions about the AI models' role in target selection demonstrates a deliberate vagueness on a sensitive topic.

Exaggeration/Minimisation (Manipulative Wording)
"We provided this help with significant oversight and on a limited basis, including approval of some requests and denial of others"

The terms 'significant oversight' and 'limited basis' minimize the extent of the company's involvement and responsibility without providing concrete details.

Obfuscation/Vagueness (Manipulative Wording)
"The company did not answer whether it or the outside firm it hired communicated or consulted with the Israeli military as part of its internal probe. It also did not respond to requests for additional details about the special assistance it provided to the Israeli military to recover hostages or the specific steps to safeguard the rights and privacy of Palestinians."

The article explicitly states Microsoft's failure to answer critical questions about its probe and the specifics of its assistance, illustrating a lack of transparency and vagueness.

Obfuscation/Vagueness (Manipulative Wording)
"In its statement, the company also conceded that it 'does not have visibility into how customers use our software on their own servers or other devices.' The company added that it could not know how its products might be used through other commercial cloud providers."

This statement uses vagueness to deflect responsibility, implying a lack of knowledge about how their products are ultimately used, which might be a way to avoid accountability.
