PHI 314 Week 1 Case Study
Wilmington University
PHI 314 Ethics for Computer Professionals
Apple is delaying its plan to use neuralMatch
Apple Computer, Inc. was founded in 1976 by Steve Wozniak and Steve Jobs, whose vision was to bring user-friendly computers into homes and offices worldwide. In the years that followed, Apple made its mark on the technology industry with the Apple II, the Macintosh, the iMac, and the iBook, and it continued to innovate into the 2000s with the iPod, the iPhone, and the iPad. In 2011, Apple unveiled iCloud, an online storage and syncing service for music, files, software, and photos.
In 2021, Apple announced a new technological tool named neuralMatch that would allow Apple to scan photos stored on iCloud and on iOS devices. The tool would scan Apple users' photos and cross-reference the images against an existing database of known child sexual abuse material. If a match were found, a human reviewer would conduct a thorough assessment and notify the National Center for Missing and Exploited Children (NCMEC), and the user's account would be disabled. Apple explained, "Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known child sexual abuse material (CSAM) hashes," and claimed the system has an "extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account" (Ahmed, 2021).
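To make the mechanics concrete, here is a minimal sketch of how on-device hash matching of this kind generally works. It is an illustration only: the open-source pHash algorithm from Python's imagehash package stands in for Apple's proprietary NeuralHash, and the hash database, distance threshold, and file name are hypothetical placeholders, not Apple's actual values.

    # Illustration: pHash stands in for Apple's proprietary NeuralHash.
    from PIL import Image
    import imagehash

    # Hypothetical known-hash database; in the real system, NCMEC and other
    # child-safety organizations supply the hashes of known CSAM.
    KNOWN_HASHES = {imagehash.hex_to_hash("f0e4c2d7a1b3958d")}
    MATCH_THRESHOLD = 4  # assumed maximum Hamming distance for a "match"

    def scan_before_upload(path: str) -> bool:
        """Hash a photo on-device and compare it against the known-hash set.
        Returns True if the photo should be flagged for human review."""
        photo_hash = imagehash.phash(Image.open(path))
        # Perceptual hashes tolerate small edits (resizing, re-encoding),
        # so matching uses a distance threshold rather than exact equality.
        return any(photo_hash - known <= MATCH_THRESHOLD
                   for known in KNOWN_HASHES)

    if scan_before_upload("photo_to_upload.jpg"):
        print("Flagged: queue for human review and possible NCMEC report.")

The notable design choice is that the hash is computed on the device before upload, so Apple never inspects the photo itself unless a match is reported.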
While this technology sounds like a vital tool for stopping the perpetrators of these heinous acts, Apple decided to indefinitely delay the release of neuralMatch. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple stated in its press release. The technology affects numerous groups, including Apple itself, users of Apple products, law enforcement, child protection agencies such as the NCMEC, privacy advocates, victims of child abuse, and society as a whole. None of these stakeholders favors letting victims continue to be abused, yet the technology raises ethical concerns about privacy. Exposing CSAM and bringing perpetrators to justice is unquestionably an important societal goal, but does that goal justify violating individual privacy rights?
Privacy advocates and consumers argue that individuals have the right to keep their personal data, information, and photos private. Even Apple's CEO, Tim Cook, in his April 2022 keynote speech at the IAPP conference in Washington, D.C., repeated his long-standing claim that Apple regards privacy as "a fundamental human right" (Lomas, 2022).
The utilitarian view promotes the greatest possible balance of good over evil. Fighting CSAM is undoubtedly good, but I argue that neuralMatch violates individual rights that are not worth sacrificing, no matter the greater good. Furthermore, neuralMatch comes with flaws that may not have been considered at the outset: it could be used to accuse individuals falsely, or even to frame them maliciously. Matthew Green, a top cryptography researcher at Johns Hopkins University, said the system could easily be used to trap innocent people by sending them images designed to trigger a match with known child sexual abuse material. He added, "Researchers have been able to do this pretty easily. What happens when the Chinese government says, 'Here is a list of files that we want you to scan for…' Does Apple say no? I hope they say no, but their technology won't say no" (Hunt, 2023).
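Green's worry follows directly from the matching logic: the scanner sees only hash distance, never image content, so any image whose hash lands within the threshold of a database entry is flagged, whether or not it depicts abuse. Continuing the hypothetical pHash sketch above (again a stand-in for NeuralHash, with assumed names and threshold):

    # The matcher cannot distinguish a crafted near-collision from a true
    # match. Researchers demonstrated such collisions against NeuralHash
    # shortly after it was announced.
    from PIL import Image
    import imagehash

    database_entry = imagehash.hex_to_hash("f0e4c2d7a1b3958d")  # hypothetical
    # An attacker perturbs a harmless-looking image until its perceptual
    # hash falls within the threshold, then sends it to the victim.
    received = imagehash.phash(Image.open("looks_harmless.jpg"))

    if received - database_entry <= 4:  # same assumed threshold as above
        print("Flagged: an innocent user now faces review and account loss.")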
There are also concerns that this technology could expand beyond CSAM to private and personal content entirely unrelated to it. Allowing Apple to scan all of its users' images would essentially mean the end of encryption and would leave the door open to broader abuses (McKinney & Portnoy, 2021). In India, for example, new laws require platforms to identify the origins of messages and to pre-screen content, and similar laws are in effect in Turkey and Indonesia. The potential misuse of neuralMatch by governments or other bad actors could lead down a slippery slope toward the erosion of individual privacy rights.
Stepping back to weigh the utilitarian case for the greater good against the rights-based morality of individuals, Apple's decision to delay rolling out this technology is a wise one. There are many factors for everyone involved to consider. Is it worth giving up individual privacy rights? Can the technology be advanced and improved to stop perpetrators without giving up those rights? Do Apple's users retain a right to privacy when they sign the user agreement?
I still side with rights-based morality in this debate, but ultimately the decision lies with everyone involved to weigh the potential risks, the benefits of stopping CSAM, the ethical issues of privacy, the consequences of misuse, and the advancement of this technology before moving forward.
References:
Ahmed, D. (2021, August 7). Apple's neuralMatch tool will scan iPhones for child abuse content. HackRead. https://www.hackread.com/apples-neuralmatch-tool-scan-iphone-child-abuse/
Associated Press. (2021, September 3). Apple delays plan to scan U.S. iPhones for images of child sex abuse. KTLA. https://ktla.com/news/nexstar-media-wire/nationworld/apple-delays-plan-to-scan-u-s-iphones-for-images-of-child-sex-abuse/
Bajak, F. (2021, August 6). Apple to scan U.S. iPhones for images of child sexual abuse. AP News. https://apnews.com/article/technology-business-child-abuse-apple-inc-7fe2a09427d663cda8addfeeffc40196
Bradshaw, T., & Murgia, M. (2021, August 6). WhatsApp attacks Apple's child safety "surveillance" despite government acclaim. Financial Times. https://www.ft.com/content/43c1aa97-c4e7-4301-913d-2c89cc63800d
Hunt, G. (2023, April 17). Apple's new tool to scan iPhone for child sexual abuse material. VPNRanks. https://www.vpnranks.com/blog/apples-new-tool-to-scan-iphone-for-child-sexual-abuse-material/
Library of Congress. (n.d.). The founding of Apple Computer, Inc. This Month in Business History: Research Guides at Library of Congress. https://guides.loc.gov/this-month-in-business-history/april/apple-computer-founded
Lomas, N. (2022, April 12). Tim Cook uses privacy keynote to attack sideloading. TechCrunch. https://techcrunch.com/2022/04/12/tim-cook-iapp-keynote/
O'Sullivan, F. (2021, September 6). iCloud neuralMatch 2023: Apple's child safety photo scan update. Cloudwards. https://www.cloudwards.net/icloud-neuralmatch/
Portnoy, E., & McKinney, I. (2022, February 18). Apple's plan to "think different" about encryption opens a backdoor to your private life. Electronic Frontier Foundation. https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life
Spinello, R. A. (2021). Cyberethics: Morality and law in cyberspace. Jones & Bartlett Learning.