Google officials had concerns about potential human rights violations that might be linked to its $1.2 billion contract with the Israeli government before even signing the deal, according to documents first reported on by The New York Times today.
"Google Cloud services could be used for, or linked to, the facilitation of human rights violations, including Israeli activity in the West Bank," Google lawyers, members of the company's policy team, and outside consultants wrote in the documents prepared for executives and reviewed by the Times. The documents date to several months before Google announced the deal in May 2021 and show that the company was worried the contract might harm its reputation.
The company has staunchly defended the deal since inking it in 2021, going so far as to fire dozens of employees who protested a contract they believed might involve them in violence against Palestinians. Now, it seems Google was weighing those risks too, but it ultimately decided to move forward with the deal anyway.
Dubbed Project Nimbus, the contract gives the Israeli government access to cloud services from Google and Amazon. Project Nimbus enabled the use of AI tools to analyze and identify objects in images and videos, according to the Times. It also included videoconferencing and "services to store and analyze large amounts of data."
The most profitable part of the deal was $525 million from Israel's Ministry of Defense expected between 2021 and 2028, the Times reports. That's not a huge sum for Google, which reportedly made $258 billion in sales in 2021. But it was enough to give the company some clout with other potential military and intelligence customers.
Google didn't immediately respond to a request for comment from The Verge. But in April, it said in an emailed statement that "the Nimbus contract is for workloads running on our commercial cloud by Israeli government ministries, who agree to comply with our Terms of Service and Acceptable Use Policy. This work is not directed at highly sensitive, classified, or military workloads relevant to weapons or intelligence services." A Google spokesperson provided a similar statement to the Times.
However, separate Israeli government contract documents recently reported on by The Intercept suggest that Project Nimbus is subject to "adjusted" terms of service rather than Google's general terms of service.
In the months leading up to the contract in 2021, Google reportedly sought input from consultants including the firm Business for Social Responsibility (BSR). Consultants apparently recommended that the contract bar the sale and use of its AI tools to the Israeli military "and other sensitive customers," the report says. BSR also reportedly recommended "due diligence" on Google's part to make sure its services weren't being misused, and that Google write its AI principles, which prohibit surveillance and weapons work, into the contract.
Ultimately, the contract reportedly didn't reflect those recommendations. The contract did, however, include a right to suspend customers for breaching Google's terms of service and acceptable use policy.
Before signing the deal, the Times says, Google had additional concerns about the company itself potentially running into legal quandaries because of the contract:
The company also worried that it would be forced to accept "onerous" risks, such as the possibility that it could run into conflicts with foreign or international authorities if they sought Israeli data and that it might have to "breach international legal orders" under the deal terms, according to the documents.
Project Nimbus has become an even bigger flashpoint within the company since the Israel-Hamas war, which has killed more than 44,000 people in Gaza. Google has fired roughly 50 employees for their alleged involvement in protests against Project Nimbus.
"We did not come to Google to work on technology that kills. By engaging in this contract leadership has betrayed our trust, our AI Principles, and our humanity," Billy Van Der Laar, a Google software engineer, said in an emailed statement following protests in April that called on Google to exit Project Nimbus.