
 

United States Securities and Exchange Commission Washington, D.C. 20549

 

NOTICE OF EXEMPT SOLICITATION Pursuant to Rule 14a-103

 

Name of the Registrant: Alphabet Inc.

Name of persons relying on exemption: Edward Feigen

Address of persons relying on exemption: 1907 Gaspar Dr. Oakland, CA 94611

 

 

Written materials are submitted pursuant to Rule 14a-6(g)(1) promulgated under the Securities Exchange Act of 1934. Submission is not required of this filer under the terms of the Rule, but is made voluntarily in the interest of public disclosure and consideration of these important issues.

 

 

 

 

 

Shareholder Rebuttal to Alphabet, Inc.

 

The Proponents urge you to vote FOR Proposal 21 on the proxy, the shareholder proposal requesting that the board issue a risk assessment regarding Alphabet Inc.’s support for military and militarized policing agency activities.

 

Summary

 

Proposal

 

The Proposal requests that the board issue a report reassessing policies of Alphabet, Inc. (the “Company”) on support for military and militarized policing agency activities and their impacts on stakeholders, user communities, and the Company’s reputation and finances. The supporting statement contains recommendations for the content of the report, at board and management discretion, and also states that the report should assess potential changes to current policies, such as a policy to avoid entering into or renewing contracts with military and militarized policing agencies.

 

Why You Should Support this Resolution

 

Deep division within the company and public controversy over the Company’s past, current, and pending future contracts with militaries and militarized policing agencies may jeopardize the Company’s competitive advantage, public image, reputation, and commitments to hiring and retaining a diverse workforce. These concerns may have a significant impact on the Company’s current and future prospects, and therefore on its shareholders.

 

   
 

 

Ethical opposition from employees and the public to the Company’s military and militarized policing agency partnerships has its roots in Google’s “don’t be evil” origins and self-image, which set the Company apart from its competitors. This internal and external opposition could cast doubt among current and future clients as to Google’s ability to fulfill contractual obligations without significant headwinds. Management has asserted continued interest in pursuing controversial contracts with militaries and militarized policing agencies, in sharp contrast to previous decisions not to renew or bid for such contracts in the wake of employee and public concern.1 A risk assessment is urgently needed to protect the interests of stakeholders and shareholders alike.

 

1. Company partnerships with militaries and militarized policing agencies run a high risk of causing harm to stakeholders and will likely continue to face opposition from employees and the public alike. While the Company could otherwise be seen as the ethical leader among its competitors, the partnerships seem to contradict Google’s “don’t be evil” roots and recently issued AI Principles. This may undercut the Company’s positive image and positioning among its peers.

 

As the New York Times so aptly put it in 2018: “Many tech companies have sought military business without roiling their work forces. But Google’s roots and self-image are different. ‘We have kind of a mantra of “don’t be evil,” which is to do the best things that we know how for our users, for our customers and for everyone,’ Larry Page told Peter Jennings in 2004, when ABC News named Mr. Page and his Google co-founder, Sergey Brin, ‘People of the Year.’”2

 

Public and employee objection to Google pursuing contracts with militaries and militarized policing agencies is grounded in the harmful impact that such contracts have on society and all communities, especially marginalized ones. Providing Google technology to militaries and militarized policing agencies—whether by providing basic Cloud services or by supplying AI technology for specific activities, such as improving support for weaponized drone technology in the case of Project Maven3—means, in one way or another, providing technology that has a high likelihood of enabling violent warfare, illegal military occupation and land grabs that violate international law, mass surveillance, and other activities overseen by these military entities. These contracts run a particularly high risk of contributing to physical harm to people, as well as threatening privacy and other human rights, despite Google’s AI Principles.

 

_____________________________

1 It has been widely publicized that after management’s decision not to renew Google’s “Project Maven” contract with the U.S. Department of Defense in the wake of significant employee and public pushback, the Company declined to bid on the JEDI contract with the Pentagon. Management has now announced that the Company is vying for the Joint Warfighter Cloud Capability contract with the Pentagon.

 

Brustein, Joshua and Mark Bergen. “Google wants to do business with the military–Many of its employees don’t.” Bloomberg Businessweek, 21 Nov., 2019, https://www.bloomberg.com/features/2019-google-military-contract-dilemma/

 

Simonite, Tom. “3 years after the Project Maven uproar, Google cozies to the Pentagon.” WIRED, 18 Nov., 2021, https://www.wired.com/story/3-years-maven-uproar-google-warms-pentagon/

 

2 Shane, Scott, Cade Metz, and Daisuke Wakabayashi. “How a Pentagon contract became an identity crisis for Google.” The New York Times, 30 May, 2018, https://www.nytimes.com/2018/05/30/technology/google-project-maven-pentagon.html

3 Stroud, Matt. “The Pentagon is getting serious about AI weapons.” The Verge, 12 April, 2018, https://www.theverge.com/2018/4/12/17229150/pentagon-project-maven-ai-google-war-military

 

   
 

 

Such contracts with militaries and militarized policing agencies may continue to put Google’s public image and reputation at risk because these partnerships continue to associate the Company with harmful and unpopular activities. For example, Google’s “Project Nimbus” Cloud computing contract with the Israeli military and government continues to associate Google with what Human Rights Watch4 and Amnesty International5 have both deemed a military-enforced system of apartheid that continues to harm Palestinians.6 While the Company has evidently determined that such a contract passes muster under Google’s AI Principles, Project Nimbus associates Google with harmful, ethically questionable activities, such as military-enforced, racially discriminatory restrictions on movement that cruelly separate families,7 and the displacement and dispossession of Palestinian families alongside the expansion of illegal settlements.8 We believe that while Google’s AI Principles have helped improve Google’s public image by committing the Company to ethical behavior, military and militarized policing agency contracts risk continuing to associate the Company with ethically questionable, harmful activities.

 

In recent years, a focal point for Company employees, in light of Google’s original “don’t be evil” moral creed,9 has been the Company’s equivocation over whether it should support military operations, especially where there is a high risk that the supported activities will infringe on human rights:

 

In 2018, Google employees rushed to petition and protest against the use of artificial intelligence for drone technology under Project Maven, through which Google sold AI technology to the Pentagon to potentially improve drone targeting. A wave of employee resignations10 and protest—as well as press coverage of the quasi-civil war within Google over employee opposition and public opinion regarding the use of Google AI for the Department of Defense11—ultimately contributed to the Company’s decision not to bid for contract renewal. That upheaval also led to the creation of Google’s AI Principles,12 a welcome commitment to ethics in AI that could set Google apart from its competitors, but for its military contracts.

 

_____________________________

4 Human Rights Watch. “A Threshold Crossed: Israeli Authorities and the Crimes of Apartheid and Persecution.” 27 April, 2021, https://www.hrw.org/report/2021/04/27/threshold-crossed/israeli-authorities-and-crimes-apartheid-and-persecution

5 Amnesty International. “Israel’s Apartheid Against Palestinians: Cruel System of Domination and Crime Against Humanity.” 2022, https://www.amnesty.org/en/wp-content/uploads/2022/02/MDE1551412022ENGLISH.pdf

6 “Israel’s occupation of Palestinian Territory is ‘apartheid’: UN rights expert.” 25 March, 2022. United Nations News, https://news.un.org/en/story/2022/03/1114702

7 Amnesty International. “Israel’s Apartheid Against Palestinians: Cruel System of Domination and Crime Against Humanity.” 2022, https://www.amnesty.org/en/wp-content/uploads/2022/02/MDE1551412022ENGLISH.pdf, pg. 98-104

8 B’Tselem and Keren Navot. “This is Ours – And This, Too: Israel’s Settlement Policy in the West Bank.” March 2021. https://www.btselem.org/sites/default/files/publications/202103_this_is_ours_and_this_too_eng.pdf, pg. 11

9 “[D]on’t be evil, and if you see something that you think isn’t right – speak up,” was removed from the preface of Google’s code of conduct but still remains in the document.

10 Burns, Janet. “Google employees resign over company’s Pentagon contract, ethical habits.” Forbes, 14 May, 2018, https://www.forbes.com/sites/janetwburns/2018/05/14/google-employees-resign-over-firms-pentagon-contract-ethical-habits/?sh=3a4634bd4169

11 Shane, Scott, Cade Metz, and Daisuke Wakabayashi. “How a Pentagon contract became an identity crisis for Google.” The New York Times, 30 May, 2018, https://www.nytimes.com/2018/05/30/technology/google-project-maven-pentagon.html

12 Sandoval, Greg. “Google just released a set of ethical principles about how it will use AI technology.” Business Insider, 7 June, 2018, https://www.businessinsider.com/google-principles-ai-tech-2018-6

 

   
 

 

In 2019, employee activism surfaced in opposition to Google’s partnership with the militarized policing agency U.S. Customs and Border Protection (CBP). The legal battles resulting from Google leadership’s handling of that activism13 yet again generated negative press regarding the militarization of Google technology. In the wake of news exposing the inhumane treatment of immigrants at the US-Mexico border, Google employees’ public petition14 and protests opposing Google bidding on a contract with CBP once again sparked debate over Google’s use of technology for militarized policing agencies. Controversy surrounding the firing of five Google employees involved in activism against the CBP collaboration,15 an ensuing lawsuit,16 and a National Labor Relations Board complaint alleging that the firings were illegally retaliatory17 have all continued to jeopardize the Company’s public image and to reinforce the idea that employee activism against military contracts may be a liability.

 

 

 

In 2021, nearly 700 Google employees signed a public letter18 and several employees spoke out in a variety of mainstream press outlets calling on Google to end the “Project Nimbus” contract with the Israeli military and government—specifically citing concerns with the use of Google Cloud technology to oppress Palestinians by upholding an illegal military occupation.19

 

_____________________________

13 Schneider, Joe. “Google workers sue over firings stemming from border project.” Bloomberg, 29 Nov., 2021, https://www.bloomberg.com/news/articles/2021-11-29/google-workers-sue-over-firings-stemming-from-border-project

14 “Google must stand against human rights abuses #NoGCPforCBP.” Medium, 14 Aug, 2019, https://medium.com/@no.gcp.for.cbp/google-must-stand-against-human-rights-abuses-nogcpforcbp-88c60e1fc35e

15 Wong, Julia Carrie. “Google fires employee who protested company’s work with US border patrol.” The Guardian, 25 Nov., 2019, https://www.theguardian.com/technology/2019/nov/25/google-firing-protest-rebecca-rivers

16 Schneider, Joe. “Google workers sue over firings stemming from border project.” Bloomberg, 29 Nov., 2021, https://www.bloomberg.com/news/articles/2021-11-29/google-workers-sue-over-firings-stemming-from-border-project

17 Paul, Kari. “Google broke US law by firing workers behind protests, complaint says.” The Guardian, 2 Dec, 2020, https://www.theguardian.com/technology/2020/dec/02/google-labor-laws-nlrb-surveillance-worker-firing

18 https://www.theguardian.com/commentisfree/2021/oct/12/google-amazon-workers-condemn-project-nimbus-israeli-military-contract

19 Schubiner, Gabriel and Bathool Sayed. “New Amazon and Google contracts with Israel betray company values, and workers like us.” NBC News, 12 Oct., 2021, https://www.nbcnews.com/think/opinion/new-amazon-google-contracts-israel-betray-company-values-workers-us-ncna1281349

 

   
 

 

2. Consistent ethical opposition to military and militarized policing agency partnerships could cast doubt among current and future clients as to the Company’s ability to quietly, confidently fulfill contractual obligations.

 

In Alphabet’s opposition statement, the Board of Directors states that a risk assessment of military and militarized policing agency contracts is not needed for shareholders because Google’s AI Principles and transparency surrounding such partnerships ensure ethical behavior and accountability.

 

We see Google’s AI Principles—issued in 2018 after employee dissent and public controversy surrounding the application of Google technology to military drone use (“Project Maven”)—as welcome steps to help ensure that employees, the public, shareholders, and prospective clients alike can rest assured that Google is an ethical actor.

 

These measures, however, have neither stopped employee activism in opposition to these contracts nor kept Google out of mainstream news coverage of controversy and significant labor unrest tied to employee opposition to military and militarized policing contracts.

 

Such unrest (including employee protest, risk of whistle-blowing, firings tied to employee protest, lawsuits connected with labor violations, etc.) may give the impression that in a highly competitive field, Google is not a reliable choice for prospective clients looking for contractual obligations to be fulfilled without disruption or unwanted attention.

 

Without a risk assessment, neither shareholders nor Company management can fully understand the impact of military and militarized policing contracts on the long-term success of the Company.

 

3. Contracts with military and militarized policing agencies may jeopardize the Company’s ability to recruit and maintain a diverse, competitive workforce, thus undermining the Company’s commitments to Diversity, Equity, and Inclusion.

 

As shareholders, we are proud that the Company is committed to diverse and inclusive hiring from marginalized and minority communities in order to maintain a diverse, innovative workforce that gives the Company a competitive edge and a positive public image. However, given the contexts in which militaries and militarized policing agencies operate, marginalized and minority communities are the most likely to be harmed, and such partnerships will therefore continue to associate the Company with violence against, surveillance of, and criminalization of marginalized communities, including Black, brown, and Muslim communities in the U.S. and abroad. Such partnerships may continue to be perceived as creating a hostile environment for current and future employees who identify as members of these marginalized communities.

 

   
 

 

There are several examples of how the activities of the militaries and militarized policing agencies that Google has done or continues to do business with often most harm marginalized communities, in some cases in violation of international law. For instance, there is now ample evidence that the U.S. military committed and enabled human rights abuses against Afghan civilians as part of its invasion of and warfare in Afghanistan.20 In addition, Mexican and Central American immigrants, including children, have been among the communities most harmed by the inhumane practices of the militarized policing agency U.S. Customs and Border Protection.21 Human Rights Watch, Amnesty International, and the Israeli human rights organization B’Tselem have documented extensively the harm that the Israeli military-enforced system of apartheid continues to do to Palestinians through systematic racial discrimination in everything from voting rights22 to freedom of movement.23 Because the activities of militaries and militarized policing agencies so often most harm marginalized communities, these partnerships give current and prospective Company employees the sense that the Company condones discriminatory activities against communities including Black, brown, and Muslim communities in the U.S. and abroad.

 

Alphabet’s Opposition Statement

 

As previously referenced, in its opposition statement the Board of Directors states that a risk assessment of military and militarized policing agency contracts is not needed for shareholders because 1) Google’s AI Principles prevent its military and militarized policing contracts from using Google technology to harm users, enable illegal surveillance, or build weapons; and 2) the Company’s transparency regarding its military and militarized policing contracts is sufficient to assure shareholders that Google is behaving ethically within these contracts.

 

_____________________________

20 Gossman, Patricia. “Afghanistan Papers Detail US Role in Abuse.” Human Rights Watch. 11 Dec, 2019, https://www.hrw.org/news/2019/12/11/afghanistan-papers-detail-us-role-abuse

21 Long, Clara. Written testimony submitted to the U.S. House Committee on Oversight and Reform Subcommittee on Civil Rights and Civil Liberties for a hearing on “Kids in Cages: Inhumane Treatment at the Border.” 10 July, 2019, https://www.hrw.org/sites/default/files/supporting_resources/kids_in_cages_testimony.pdf

22 As Human Rights Watch summarizes on page 146 of their report “Israel’s Apartheid Against Palestinians: Cruel System of Domination and Crime Against Humanity”: “Palestinians in Israel are citizens who have the right to vote in national elections, unlike Palestinians in the West Bank, Gaza, and East Jerusalem (except for the small minority of Palestinian Jerusalemites who have applied for and been granted Israeli citizenship).”

23 As is stated on pages 3-4 of the 2016 Report of the Secretary-General to the United Nations Human Rights Council: “Palestinians’ freedom of movement is restricted through a complex and multilayered system of administrative, bureaucratic and physical constraints that permeate almost all facets of everyday life,” including a “permit regime” that “allows Israeli authorities to limit and control Palestinians’ movement in the Occupied Palestinian Territory beyond their immediate residential area…Movement by Palestinians within the West Bank is restricted by a system of checkpoint and permit requirements, as well as by the expansion of settlements and related infrastructure.”

 

Report of the Secretary-General to the United Nations Human Rights Council. “Freedom of Movement: Human rights situation in the Occupied Palestinian Territory, including East Jerusalem.” February 2016, https://www.ohchr.org/sites/default/files/Documents/Countries/PS/SG_Report_FoM_Feb2016.pdf

 

   
 

 

Google’s AI Principles, which were the Company’s response to employee and public concern about the use of Google technology in Department of Defense drone programming, together with its transparency measures, are welcome mechanisms to assure shareholders and the wider public that Google is behaving ethically.

 

Unfortunately, in the opinion of the Proponent, the Board of Directors in its opposition statement has neither

 

a) provided sufficient evidence that military and militarized policing agency contracts do not pose a risk to shareholders by jeopardizing the Company’s public image, its ability to assure clients that it can fulfill contractual obligations, or its commitment to hiring and retaining a diverse workforce;

 

 nor

 

b) assured shareholders that the Company has a plan to address the risk of once again pursuing large, controversial Cloud computing contracts with militaries and militarized policing agencies (for example, its recently secured Project Nimbus contract and its recent bid for the Joint Warfighter Cloud Capability contract) after previously and publicly ending, or declining to bid on, such contracts due to employee unrest and bad publicity.

 

If management has decided that pursuing such contracts is in the best interests of shareholders, especially as it seeks to expand further into the Cloud computing market, a risk assessment is a responsible, needed step to assuage shareholder concerns about whether this change in direction poses a risk in the aforementioned areas and to equip the Company to address any such risk effectively.

 

 

 

THIS IS NOT A SOLICITATION OF AUTHORITY TO VOTE YOUR PROXY. PROXY CARDS WILL NOT BE ACCEPTED BY THE FILER OR ANY CO-FILER. PLEASE DO NOT SEND YOUR PROXY TO THE FILER OR ANY CO-FILER. PLEASE VOTE YES ON ITEM 21 ON THE MANAGEMENT’S PROXY CARD.

 

 

THE FOREGOING INFORMATION MAY BE DISSEMINATED TO SHAREHOLDERS VIA TELEPHONE, U.S. MAIL, E-MAIL, CERTAIN WEBSITES AND CERTAIN SOCIAL MEDIA VENUES, AND SHOULD NOT BE CONSTRUED AS INVESTMENT ADVICE.