In pulverized Gaza, life has become almost medieval and extremely difficult for traumatized residents a year into Israeli aggression that obliterated their homes and killed their neighbors. In contrast, Israel's offensive stands out as defiantly modern and has raised profound questions over the role of Big Tech in war.
More than any other major conflict this century, Israel's war on Gaza has spotlighted how technologies such as artificial intelligence (AI) and machine learning can be used on the battlefield and what responsibility the makers of these tools should bear.
"From the early days of the war, the campaign has been framed as an opportunity to test and refine how AI is used in war," said Sophia Goodfriend, a post-doctoral fellow at Harvard University who studies Israel's use of AI and automation in war.
"Gaza, like Ukraine, is seen as a war lab for the future."
Employees at Amazon, Microsoft and Alphabet Inc.'s Google became increasingly concerned over whether their companies were empowering Israel's military forces after a slew of reports alleging the use of Big Tech products in Gaza.
Outrage over civilian casualties – the death toll in Gaza stands at more than 41,600, with at least 10,000 believed missing under the rubble – has fanned the flames of employee anger.
Some tech employees have joined workplace protests and been fired, others have resigned, and still others have voiced support for Palestinians in internal groups.
Around this time last year, the number of Amazon workers who had joined a virtual community for employees supporting Palestinian rights numbered around 800.
Today, it has grown fivefold, said an Amazon employee in the group who did not want their name to be used for fear of repercussions.
"Knowing how Amazon Web Services (AWS) are being used (in Gaza) is enough to drive people to get involved," the employee told the Thomson Reuters Foundation.
Asked to comment on the use of its services by the Israeli military, an Amazon spokesperson said: "AWS is focused on making the benefits of our world-leading cloud technology available to all our customers, wherever they are located."
The spokesperson added that the company was committed to ensuring employees were safe and supporting those affected by war.
The destructive war on Gaza followed the Oct. 7, 2023, Hamas incursion, which killed around 1,200 people.
Goodfriend said the conflict in Gaza revealed the "lethal" effect of the application of high-tech systems in war.
"The scale of the destruction has made it hard to see any of this technology as neutral – and it's made a lot of people within the tech industry quite critical of supplying systems that are driving warfare," she told the Thomson Reuters Foundation.
"There's no way Israel could have the technical infrastructure that they do without the support of private companies and cloud computing infrastructure – they just wouldn't be able to operate their AI systems without the major tech conglomerates."
Very little is officially known about how exactly Big Tech firms' systems are being used by the Israeli military in Gaza.
Much scrutiny has focused on Project Nimbus, a $1.2 billion contract jointly awarded to Google and Amazon Web Services to supply the Israeli government with cloud computing infrastructure, AI and other tech services.
When it unveiled the project in May 2021, the Israeli government said it was intended to "provide a comprehensive and thorough response for the provision of cloud services to the Government, the Security Services and other entities."
Later that year, Google and Amazon employees published an open letter in the Guardian, condemning the project, which they said "allows for further surveillance of and unlawful data collection on Palestinians, and facilitates the expansion of Israel's illegal settlements on Palestinian land."
Since the start of the war in Gaza, there has been a slew of media reports alleging the use of Project Nimbus technology by the Israel Defense Forces (IDF) in the crowded strip, which is home to 2.3 million people.
But the extent to which Project Nimbus is used by the military remains unclear.
"Where (tech companies) draw the line is blurry," said Deborah Brown, a technology researcher at Human Rights Watch (HRW). "What they offer in the contracts and what the military does with it is shrouded in mystery."
In April, Time magazine said it had seen a Google company document that showed the firm provides cloud computing services to the Israeli Ministry of Defense and that the tech giant had negotiated to deepen its partnership during the war in Gaza.
In August, nonprofit +972 Magazine, which is run by Israeli and Palestinian journalists, published a story citing a leaked recording it said was of a senior IDF commander confirming that the army was using cloud storage and AI services sourced from Google, Microsoft and Amazon.
The previous November, the magazine had published a report saying that the IDF uses AI to generate targets in Gaza.
Following that report, a group of tech workers known as No Tech For Apartheid said that U.S. tech companies doing business with the IDF, including Google and Amazon, were "enabling the first AI-powered genocide."
Google did not respond to a request for comment.
Israel's Ministry of Defense declined to answer questions about the IDF's reliance on infrastructure provided by U.S. tech companies or the allegations in the +972 report.
Previously, the IDF has denied using AI to identify suspected targets.
Microsoft – which had cloud contracts with the IDF that predated Nimbus – is still providing cloud computing services to the military, according to reports.
In May, Microsoft employees launched "No Azure for Apartheid," a campaign to pressure the company to stop providing its Azure Cloud Services to Israel.
Microsoft did not respond to multiple requests for comment about whether and how its technology was being used in Gaza.
Hossam Nasr, a Microsoft worker involved in the No Azure for Apartheid campaign, said workers inside Microsoft became increasingly uneasy as press reports described the IDF's reliance on AI on the battlefield.
"Real humans are being fed into machines and processed by algorithms, and then with the press of a button, it's decided if they get to live another day," he said. "Seeing this happen has been radicalizing."
As the death toll mounted in Gaza, so too did dissent among employees at the tech giants.
Two former Google employees, who were among 50 people fired in April for protesting over Project Nimbus, told the Thomson Reuters Foundation that many employees at their company came to believe that their work extended beyond civilian applications. They were frustrated by the company's refusal to act following reports about how their tech was being used.
"Generally speaking, Google has just dismissed and downplayed concerns throughout the entire time," said Cheyne Anderson, who worked out of Google's Washington office.
Any public discussion within the company around Project Nimbus was shut down by Google during employee town halls and in internal communications, said Mohammad Khatami, who was also fired in April.
"Google would basically either take down the question, delete the question or close the email chains associated with any kind of dissent regarding Project Nimbus," he said.
A few weeks after Oct. 7, Khatami, who is Muslim, circulated an internal petition to pressure Google to drop Project Nimbus, he said. He was the only person to be called in by the human resources department and reprimanded, he added.
"They ... told me that essentially you are justifying terrorism and you got to shut up about this and just put your head down and keep working," he said.
Google did not respond to a request for comment on the incident recounted by Khatami.
Tech giants have also come under fire for censoring pro-Palestinian content on their social media sites, particularly Meta, which owns Facebook and Instagram.
In a report released last year, Human Rights Watch said, "Censorship of content related to Palestine on Instagram and Facebook is systemic and global."
Asked about the report, Meta said: "We aim to apply our global policies fairly, but doing so at scale and during a fast-moving, highly polarized, and intense conflict brings challenges. We acknowledge we make errors that can be frustrating for people, but the implication that we deliberately and systematically suppress a particular voice is false."
Saima Akhter, who worked at the company's New York office until she was fired in June, said reports of censorship of pro-Palestinian content prompted Meta staff to question team leaders.
"At this time, we started noticing how Meta was deleting our internal posts," she said, referring to posts in support of Palestinians on employee resource groups, including posts offering condolences to Meta staff who had lost family in Gaza.
"We were internally being heavily censored, and our product concerns were not being taken seriously," Akhter said.
Akhter says she was fired after she uploaded to her personal cloud storage application a copy of a document put together by staff detailing how the company allegedly censors Palestinians.
Meta said it tried to foster a company culture based around "mutual respect and inclusivity" and that there were many channels where employees could raise concerns.
It did not comment when asked to respond to Akhter's comments that she was fired for uploading the document.
Regarding Big Tech's involvement in the war, HRW's Brown said more scrutiny was needed before firms enter into lucrative military contracts.
"Unless there is someone forcing them to do human rights due diligence, to show their work, to be able to explain how they will not be contributing to abuses, and to be able to stop services if they are, then they're going to pursue their profits."