Microsoft says it provided AI to Israel for Gaza war but denies use to harm Palestinians
Analysis Summary
This article attempts to portray Microsoft as a responsible company even as it acknowledges the company's role in the Gaza war. It does so by repeatedly citing an internal review and an unnamed external firm while remaining vague about how those reviews were conducted or what they found, and it never explains how Microsoft's AI might contribute to civilian harm even as it notes the Israeli military's sharply increased use of AI.
Cross-Outlet PSYOP Detected
This article is part of a narrative being pushed across multiple outlets.
FATE Analysis
Four dimensions of psychological manipulation: how content captures Focus, exploits Authority, triggers Tribal identity, and engineers Emotion.
Focus signals
"Microsoft acknowledged Thursday that it sold advanced artificial intelligence and cloud computing services to the Israeli military during the war in Gaza and aided in efforts to locate and rescue Israeli hostages. But the company also said it has found no evidence to date that its Azure platform and AI technologies were used to target or harm people in Gaza."
The article frames this as Microsoft's 'first public acknowledgement' of deep involvement, presenting the information as new and significant in order to capture attention.
"Emelia Probasco, a senior fellow for the Center for Security and Emerging Technology at Georgetown University, said the statement is noteworthy because few commercial technology companies have so clearly laid out standards for working globally with international governments. 'We are in a remarkable moment where a company, not a government, is dictating terms of use to a government that is actively engaged in a conflict,' she said. 'It's like a tank manufacturer telling a country you can only use our tanks for these specific reasons. That is a new world.'"
The expert's commentary emphasizes the 'remarkable' and 'new world' aspect of a company dictating terms to a government in conflict, framing the situation as unprecedented and therefore worthy of sustained focus.
Authority signals
"It comes nearly three months after an investigation by The Associated Press revealed previously unreported details about the American tech giant's close partnership with the Israeli Defense Ministry..."
Leverages the institutional credibility of 'The Associated Press' and its 'investigation' to lend weight and veracity to the claims about Microsoft's involvement.
"Meanwhile, human rights groups have raised concerns that AI systems, which can be flawed and prone to errors, are being used to help make decisions about who or what to target, resulting in the deaths of innocent people."
Invokes the 'concerns' of 'human rights groups' to add moral and ethical authority to the critique of AI use in targeting, even without naming specific groups.
"Emelia Probasco, a senior fellow for the Center for Security and Emerging Technology at Georgetown University, said the statement is noteworthy because few commercial technology companies have so clearly laid out standards for working globally with international governments."
Presents Emelia Probasco as an expert through her title and affiliation ('senior fellow for the Center for Security and Emerging Technology at Georgetown University') to validate and analyze the significance of Microsoft's statement.
Tribe signals
"No Azure for Apartheid, a group of current and former Microsoft employees, called on Friday for the company to publicly release a full copy of the investigative report. 'It's very clear that their intention with this statement is not to actually address their worker concerns, but rather to make a PR stunt to whitewash their image that their relationship with the Israeli military has tarnished,' said Hossam Nasr, a former Microsoft worker fired in October after he helped organize an unauthorized vigil at the company's headquarters for Palestinians killed in Gaza."
Creates an 'us-vs-them' dynamic between Microsoft (portrayed as whitewashing its image) and disgruntled 'current and former Microsoft employees' and activists (portrayed as holding Microsoft accountable on behalf of Palestinians). Hossam Nasr's quote clearly delineates this conflict.
"Tech workers from Google, Meta and Amazon protest against Big Tech supplying Israel with intelligence tools outside Google offices in Manhattan, NY in 2024."
This image caption, coupled with the broader narrative, subtly reinforces a division between 'Big Tech' and those protesting its involvement with the Israeli military, creating an 'us-vs-them' visual.
Emotion signals
"Meanwhile, human rights groups have raised concerns that AI systems, which can be flawed and prone to errors, are being used to help make decisions about who or what to target, resulting in the deaths of innocent people."
This statement uses language designed to provoke outrage and moral indignation by associating 'flawed and prone to errors' AI systems with 'deaths of innocent people', linking technological failure directly to severe human cost.
"Microsoft acknowledged Thursday that it sold advanced artificial intelligence and cloud computing services to the Israeli military during the war in Gaza and aided in efforts to locate and rescue Israeli hostages."
The phrase 'aided in efforts to locate and rescue Israeli hostages' coupled with 'during the war in Gaza' implicitly evokes a sense of urgency and high stakes, playing on the reader's empathy for the hostages.
"No Azure for Apartheid, a group of current and former Microsoft employees, called on Friday for the company to publicly release a full copy of the investigative report. 'It's very clear that their intention with this statement is not to actually address their worker concerns, but rather to make a PR stunt to whitewash their image that their relationship with the Israeli military has tarnished,' said Hossam Nasr, a former Microsoft worker fired in October after he helped organize an unauthorized vigil at the company's headquarters for Palestinians killed in Gaza."
Hossam Nasr's quote directly calls Microsoft's actions a 'PR stunt to whitewash their image' and connects it to his firing over a vigil for 'Palestinians killed in Gaza', intentionally designed to elicit outrage and sympathy for workers and victims against a calculating corporation.
Narrative Analysis (PCP)
How the article reshapes thinking: Perception (what beliefs are targeted), Context (what information is shifted or omitted), and Permission (what behavior is being encouraged).
The article aims to instill the belief that Microsoft, despite its involvement in the Gaza war, is a responsible and principled corporate actor that adheres to ethical standards and tries to mitigate harm, even when dealing with militaries in conflict zones. It suggests that Microsoft's internal review and stated principles provide a sufficient safeguard against misuse of its technology.
The article shifts the context from focusing on the direct impact of AI technology in warfare and potential civilian casualties to the company's internal policies, its 'principles,' and efforts to save hostages. This framing makes Microsoft's limited disclosures and the absence of clear answers about military use seem acceptable, as the company is presented as being caught between competing ethical demands.
The article omits detailed context regarding the scale and nature of civilian casualties in Gaza, which would highlight the potential severity of 'targeting' errors and the implications of AI involvement. It also largely omits the specific mechanisms by which AI-enabled targeting systems might contribute to or accelerate decision-making processes that result in harm, beyond a general mention of systems being 'flawed and prone to errors.' Specific details from the AP investigation, such as the nearly 200-fold increase in the military's use of commercial AI, are mentioned but not analyzed for their operational implications in light of reports of civilian deaths.
The reader is nudged towards accepting Microsoft's current level of transparency and its internal review process as sufficient, and towards believing that corporations can ethically engage with militaries in conflict zones while upholding their 'principles.' It implicitly grants permission for readers to perceive Microsoft as a conscientious actor, despite significant unanswered questions regarding the application and impact of its technologies.
SMRP Pattern
Four manipulation maintenance tactics: Socializing the idea as normal, Minimizing concerns, Rationalizing with logic, and Projecting blame.
"But the company also said it has found no evidence to date that its Azure platform and AI technologies were used to target or harm people in Gaza."
"We provided this help with significant oversight and on a limited basis, including approval of some requests and denial of others."
Red Flags
High-severity indicators: silencing dissent, coordinated messaging, or weaponizing identity to shut down debate.
"The unsigned blog post on Microsoft's corporate website appears to be the company's first public acknowledgement of its deep involvement in the war."
Techniques Found (11)
Specific propaganda techniques identified using the SemEval-2023 academic taxonomy of 23 techniques across 6 categories.
"Microsoft acknowledged its deep involvement in the Gaza war for the first time, but did not directly address questions about precisely how the Israeli military is using its technologies"
The article highlights Microsoft's lack of direct address to critical questions, indicating a deliberate use of vagueness regarding its technology's specific military applications.
"But the company also said it has found no evidence to date that its Azure platform and AI technologies were used to target or harm people in Gaza."
This statement is vague: it omits the scope and methodology behind the 'no evidence to date' claim, leaving room for interpretation about what was actually investigated or found.
"The statement did not identify the outside firm or provide a copy of its report."
This conceals crucial details about the external firm and its findings, contributing to a lack of transparency and making it difficult for readers to assess the credibility of Microsoft's internal review.
"The statement also did not directly address several questions about precisely how the Israeli military is using its technologies, and the company declined Friday to comment further."
Microsoft's refusal to directly answer questions and its decline to comment further demonstrates a pattern of obfuscation regarding the specific uses of its technology.
"Microsoft declined to answer written questions from The AP about how its AI models helped translate, sort and analyze intelligence used by the military to select targets for airstrikes."
This direct refusal to provide information about how its AI models are used in target selection exemplifies obfuscation, avoiding a clear and specific account of its involvement.
"Microsoft said it had also provided 'special access to our technologies beyond the terms of our commercial agreements' and 'limited emergency support' to Israel as part of the effort to help rescue the more than 250 hostages taken by Hamas on October 7."
The phrases 'special access beyond the terms of our commercial agreements' and 'limited emergency support' are vague, lacking specific details about what these entail and the extent of the support provided.
"We provided this help with significant oversight and on a limited basis, including approval of some requests and denial of others"
The terms 'significant oversight' and 'limited basis' are vague, as they do not specify what that oversight entails or the criteria for approving versus denying requests, leaving the reader with an unclear understanding.
"We believe the company followed its principles on a considered and careful basis, to help save the lives of hostages while also honoring the privacy and other rights of civilians in Gaza."
The phrase 'followed its principles on a considered and careful basis' is vague and provides no verifiable details or evidence to support the claim of upholding principles and honoring rights.
"The company did not answer whether it or the outside firm it hired communicated or consulted with the Israeli military as part of its internal probe. It also did not respond to requests for additional details about the special assistance it provided to the Israeli military to recover hostages or the specific steps to safeguard the rights and privacy of Palestinians."
This highlights multiple instances where Microsoft omits crucial details about its probe and assistance, preventing a clear understanding of its actions and their implications.
"In its statement, the company also conceded that it "does not have visibility into how customers use our software on their own servers or other devices." The company added that it could not know how its products might be used through other commercial cloud providers."
This statement uses vagueness to minimize Microsoft's responsibility, suggesting they cannot fully track or be accountable for how their technology is used once it's on customer servers or through other cloud providers.
"In its statement, the company said it had found "no evidence" that the Israeli military had violated those terms."
Similar to previous instances, the claim of 'no evidence' is vague without specifying the scope, duration, or methodology of the investigation that led to this conclusion, leaving its validity open to question.