Microsoft says its Azure and AI tech hasn’t harmed people in Gaza

theverge.com·Tom Warren·2025-05-16
0 out of 100
High — clear manipulation patterns detected

This article uses Microsoft's assurances to downplay concerns about its technology's use in Gaza, creating a sense that criticism is unfounded. It relies on emotional language and a strong us-vs-them dynamic to dismiss employee and activist concerns, portraying them as unsubstantiated despite acknowledging limits to Microsoft's own investigation.

FATE Analysis

Four dimensions of psychological manipulation: how content captures Focus, exploits Authority, triggers Tribal identity, and engineers Emotion.

Focus: 2/10
Authority: 3/10
Tribe: 5/10
Emotion: 7/10

Focus signals

attention capture
"Microsoft says it has found no evidence that the Israeli military has used its Azure and AI technology to harm Palestinian civilians or anyone else in Gaza."

The opening sentence directly addresses a controversial and newsworthy topic, immediately capturing attention by presenting a definitive statement from a major corporation on a highly sensitive issue.

Authority signals

institutional authority
"Microsoft says it has “conducted an internal review and engaged an external firm,” to perform a review..."

Leverages Microsoft's institutional weight and the implied credibility of 'an external firm' to validate its claims and findings, suggesting a thorough and unbiased process.

institutional authority
"Microsoft says that its relationship with the Israel Ministry of Defense (IMOD) is “structured as a standard commercial relationship,”"

Microsoft's statement frames its relationship with IMOD as standard business practice, using its corporate authority to normalize and justify the contract.

Tribe signals

us vs them
"The group accuses Microsoft of “supporting and enabling an apartheid state,” by not suspending sales of cloud and AI services to Israel, like it did to Russia when it invaded Ukraine."

This quote from 'No Azure for Apartheid' creates a clear us-vs-them dynamic, pitting Microsoft (and by extension, those who support its contracts) against the group's stance, framing Microsoft as 'enabling an apartheid state' and drawing a parallel to a clear moral binary (Russia's invasion).

identity weaponization
"“There is no form of selling technology to an army that is plausibly accused of genocide — whose leaders are wanted for war crimes and crimes against humanity by the International Criminal Court — that would be ethical,” says Nasr. “That’s the premise that we reject."

This statement from Hossam Nasr weaponizes ethical identity, suggesting that anyone who does not reject selling technology to such an army is complicit or unethical. This can make readers feel that their moral standing is tied to their agreement with Nasr's position.

us vs them
"“Not once did they name Palestinians or Palestinian people or Palestine” in the blog post. “I think that still speaks to where Microsoft’s business interests truly lie."

Nasr's observation creates a subtle us-vs-them tribal marker, implying Microsoft's omission of 'Palestinians' signals where its loyalties and 'business interests' truly lie, thereby aligning Microsoft with one side of a conflict and distancing it from the other.

Emotion signals

outrage manufacturing
"one calling Microsoft’s AI CEO, Mustafa Suleyman, a “war profiteer” and demanding that Microsoft “stop using AI for genocide in our region.”"

The terms 'war profiteer' and 'genocide' are highly charged and engineered to evoke strong outrage and moral condemnation, bypassing rational debate about technology use.

moral superiority
"Hossam Nasr, an organizer of No Azure for Apartheid, is quoted calling out Microsoft’s statement as contradictory in a response from the group: “In one breath, they claim that their technology is not being used to harm people in Gaza,” while also admitting “they don’t have insight into how their technologies are being used.” According to the group, “In their statement yesterday, Microsoft has actually put on record the company’s direct involvement in the Palestinian genocide.”"

The accusation of 'direct involvement in the Palestinian genocide' is a highly emotional statement, designed to elicit profound moral outrage and position the group as morally superior for calling out this alleged involvement. The framing of Microsoft's statement as 'contradictory' further fuels this, suggesting evasiveness in the face of grave accusations.

outrage manufacturing
"“There is no form of selling technology to an army that is plausibly accused of genocide — whose leaders are wanted for war crimes and crimes against humanity by the International Criminal Court — that would be ethical,” says Nasr."

This statement is designed to provoke outrage and moral indignation by linking Microsoft's actions directly to accusations of 'genocide' and 'war crimes' against the Israeli military, implying that Microsoft is acting unethically.

Narrative Analysis (PCP)

How the article reshapes thinking: Perception (what beliefs are targeted), Context (what information is shifted or omitted), and Permission (what behavior is being encouraged).

What it wants you to believe

The article aims to instill the belief that Microsoft is a responsible and ethical corporate actor, diligently investigating concerns and finding no wrongdoing related to its technology's use in Gaza. It wants readers to believe that Microsoft's internal and external reviews are thorough and conclusive, and that criticism from employees/activists is unsubstantiated by actual evidence.

Context being shifted

The article establishes the context of a 'standard commercial relationship' and a company committed to ethical AI use, making Microsoft's actions appear normal and above board even amidst accusations. It frames the debate around a technical compliance review rather than the broader ethical implications of providing technology to a military entity involved in conflict.

What it omits

The article notes that Microsoft 'does not have visibility into how customers use our software on their own servers or other devices.' This critical limitation on the scope of the review is presented almost as an aside, rather than as something that fundamentally undermines the conclusiveness of the 'no evidence' finding. The specific allegations from The Guardian and the Associated Press, that the Israeli military used Azure and OpenAI technology for mass surveillance and AI-assisted intelligence gathering, are mentioned but never weighed against the review's findings. The broader context of the conflict in Gaza and the widely reported civilian casualties, the root cause of the employee concerns, is largely absent, making the company's 'no harm' claim feel more absolute than it might otherwise.

Desired behavior

The article nudges the reader to accept Microsoft's assurances, dismiss the concerns raised by employees and activists as unfounded, and continue to view Microsoft as an ethically sound technology provider. It seeks to license continued trust in Microsoft and its business practices.

SMRP Pattern

Four manipulation maintenance tactics: Socializing the idea as normal, Minimizing concerns, Rationalizing with logic, and Projecting blame.

Socializing: not detected
Minimizing: detected

"Microsoft says it has found no evidence that the Israeli military has used its Azure and AI technology to harm Palestinian civilians or anyone else in Gaza."

Rationalizing: detected

"“It is worth noting that militaries typically use their own proprietary software or applications from defense-related providers for the types of surveillance and operations that have been the subject of our employees’ questions,” says Microsoft in its blog post. “Microsoft has not created or provided such software or solutions to the IMOD.”"

Projecting: not detected

Red Flags

High-severity indicators: silencing dissent, coordinated messaging, or weaponizing identity to shut down debate.

Silencing indicator: not detected
Controlled release (spokesperson test): detected

"Microsoft says it has “conducted an internal review and engaged an external firm,” to perform a review, after some Microsoft employees have repeatedly called on the company to cut its contracts with the Israeli government. Microsoft says that its relationship with the Israel Ministry of Defense (IMOD) is “structured as a standard commercial relationship,” and that it has “found no evidence that Microsoft’s Azure and AI technologies, or any of our other software, have been used to harm people or that IMOD has failed to comply with our terms of service or our AI Code of Conduct.”"

Identity weaponization: not detected

Techniques Found (10)

Specific propaganda techniques identified using the SemEval-2023 academic taxonomy of 23 techniques across 6 categories.

Obfuscation/Vagueness · Manipulative Wording
"Microsoft says it has “conducted an internal review and engaged an external firm,” to perform a review, after some Microsoft employees have repeatedly called on the company to cut its contracts with the Israeli government."

The phrase 'engaged an external firm' is vague, as it does not name the firm, its qualifications, or the specific terms of its engagement, making it difficult to assess the thoroughness or impartiality of the review.

Obfuscation/Vagueness · Manipulative Wording
"Microsoft says that its relationship with the Israel Ministry of Defense (IMOD) is “structured as a standard commercial relationship,”"

The term 'standard commercial relationship' is vague and provides no specific details about the nature of the services, products, or support provided, thus obscuring what such a relationship entails in this context.

Minimisation · Manipulative Wording
"The review process included “interviewing dozens of employees and assessing documents,” looking for evidence that Microsoft technologies were being used to target or harm anyone in Gaza. However, the company notes that it “does not have visibility into how customers use our software on their own servers or other devices,” so the evidence to inform its review is clearly very limited in scope."

Microsoft's admission that its 'evidence to inform its review is clearly very limited in scope' minimizes the impact of its own investigation, effectively downplaying the thoroughness and reliability of its findings while still presenting the review as an authoritative statement.

Loaded Language · Manipulative Wording
"with one calling Microsoft’s AI CEO, Mustafa Suleyman, a “war profiteer” and demanding that Microsoft “stop using AI for genocide in our region.”"

The terms 'war profiteer' and 'genocide' are emotionally charged and designed to evoke strong negative reactions, immediately framing Microsoft's involvement in a highly critical and accusatory light.

Name Calling/Labeling · Attack on Reputation
"with one calling Microsoft’s AI CEO, Mustafa Suleyman, a “war profiteer” and demanding that Microsoft “stop using AI for genocide in our region.”"

The label 'war profiteer' is used to discredit Mustafa Suleyman and Microsoft by associating them with unethical and exploitative behavior during a conflict.

Loaded Language · Manipulative Wording
"According to the group, “In their statement yesterday, Microsoft has actually put on record the company’s direct involvement in the Palestinian genocide.”"

The phrase 'direct involvement in the Palestinian genocide' uses extremely strong, emotionally charged language to directly accuse Microsoft of complicity in severe human rights abuses, aiming to inflame public opinion.

False Dilemma · Simplification
"The group accuses Microsoft of “supporting and enabling an apartheid state,” by not suspending sales of cloud and AI services to Israel, like it did to Russia when it invaded Ukraine."

This statement presents a false dilemma by implying that Microsoft's only choices are either to suspend sales to Israel (thus not supporting an 'apartheid state') or to continue sales and be complicit. It ignores other potential actions or considerations Microsoft might have.

Causal Oversimplification · Simplification
"The group accuses Microsoft of “supporting and enabling an apartheid state,” by not suspending sales of cloud and AI services to Israel, like it did to Russia when it invaded Ukraine."

This quote attributes the accusation of 'supporting and enabling an apartheid state' solely to Microsoft's decision not to suspend sales, oversimplifying the complex political and ethical considerations of business operations in conflict zones to a single cause or inaction.

Causal Oversimplification · Simplification
"“There is no form of selling technology to an army that is plausibly accused of genocide — whose leaders are wanted for war crimes and crimes against humanity by the International Criminal Court — that would be ethical,” says Nasr."

Nasr oversimplifies the ethics of selling technology by asserting that any sale to an army accused of certain crimes is inherently unethical, without acknowledging the nuances of technology use, contractual obligations, or the potential for abuse by individual actors within a larger organization.

Loaded Language · Manipulative Wording
"“There is no form of selling technology to an army that is plausibly accused of genocide — whose leaders are wanted for war crimes and crimes against humanity by the International Criminal Court — that would be ethical,” says Nasr."

The phrases 'plausibly accused of genocide' and 'wanted for war crimes and crimes against humanity' use highly charged language designed to evoke moral outrage and condemn any associated action as unethical, without presenting concrete evidence or a full legal determination.
