Lesson 1 Progress Check : WAR-961S Ethics of Emerging Military Technologies
https://canvas.asu.edu/courses/153180/quizzes/1151412
Lesson 1 Progress Check
Due: No due date
Points: 100
Questions: 10
Time Limit: 15 Minutes
Allowed Attempts: Unlimited
Instructions
This quiz checks your understanding of lesson concepts.
This is a timed assessment.
You are allowed multiple attempts.
Minimum passing score is 80%.

Attempt History
Attempt | Time | Score
KEPT: Attempt 5 (https://canvas.asu.edu/courses/153180/quizzes/1151412/history?version=5) | 9 minutes | 80 out of 100
LATEST: Attempt 5 (https://canvas.asu.edu/courses/153180/quizzes/1151412/history?version=5) | 9 minutes | 80 out of 100
Attempt 4 (https://canvas.asu.edu/courses/153180/quizzes/1151412/history?version=4) | 11 minutes | 70 out of 100
Attempt 3 (https://canvas.asu.edu/courses/153180/quizzes/1151412/history?version=3) | 9 minutes | 70 out of 100
Attempt 2 (https://canvas.asu.edu/courses/153180/quizzes/1151412/history?version=2) | 15 minutes | 0 out of 100
Attempt 1 (https://canvas.asu.edu/courses/153180/quizzes/1151412/history?version=1) | 9 minutes | 30 out of 100

Take the Quiz Again (https://canvas.asu.edu/courses/153180/quizzes/1151412/take?user_id=486526)
Correct answers are hidden.
Score for this attempt: 80 out of 100
Submitted Dec 23 at 12:56am
This attempt took 9 minutes.
Question 1 (10 / 10 pts)
According to Roff, in order to meet the principle of distinction, a Lethal Autonomous Robot [LAR] must be capable of making ______ decisions.
- tactical, but not operational or strategic
- tactical, operational, strategic, and grand-strategic
- tactical and operational, but not strategic
- tactical, operational, and strategic
  Answer is correct. Roff writes on page 217, “Perhaps LARs will ultimately affect grand strategy, but this is not the level of strategy that they must be capable of creating. Rather, if they are to meet the international legal (and moral) requirements of the principle of distinction, they must be capable of making strategic, operational, and tactical decisions.”
Question 2 (10 / 10 pts)
According to Johnson and Axinn, the decision ______________
- whether to kill a human should be left up to properly-programmed autonomous systems, as they offer greater precision and impartiality than humans can.
- whether to kill a human should be left up to properly-programmed autonomous systems, as they do not suffer moral injury or post-traumatic stress.
- whether to kill a human should always be made by a human.
  Answer is correct. Johnson and Axinn note on page 134 that “[w]e . . . argue that the decision to take a human life must be an inherently human decision and that it would be unethical to allow a machine to make such a critical choice.”
- whether to kill a human should be left up to properly-programmed autonomous systems, as they are not susceptible to emotions like rage.
Question 3 (10 / 10 pts)
According to “The Weaponization of Increasingly Autonomous Technologies: Considering how Meaningful Human Control might move the discussion forward,” the foundational issue with autonomous weapons is ____________
- the possibility that machines may turn against their users.
- achieving sufficiently discriminate intelligence and surveillance capability.
- the potential for machines to learn.
- machines making life and death decisions independently.
  Answer is correct. “The Weaponization of Increasingly Autonomous Technologies: Considering how Meaningful Human Control might move the discussion forward” notes on page 8 that “This approach [from human intent] attempts to address what is seen as the fundamental issue presented by the weaponization of highly autonomous technologies: that a machine could take life and death decisions, without any human intent involved.”
Question 4 (0 / 10 pts) - Incorrect
According to Kirkpatrick’s “Drones and the Martial Virtue Courage,” courage as a virtue differs from courage as an act. A terrorist flying an airplane into a building ______________
- acts with courage but does not possess the virtue of courage.
- acts with courage and possesses the virtue of courage.
- acts without courage but possesses the virtue of courage.
- does not possess courage and does not act with courage.
  Answer is incorrect. Kirkpatrick writes on page 206, “An agent, like the terrorist, can act courageously without possessing and exercising the virtue of courage. The reason for this is because the terrorist does not engage the will in the sense that she intends to serve a just cause, nor does she serve an objectively good end.” (Emphasis in the original.)
Question 5 (10 / 10 pts)
According to Sparrow’s “Robotic Weapons and the Future of War,” many believe that military service helps to develop virtues of character. Yet, when military members are safe from harm because they operate their weapons remotely, they are denied the opportunity to display _____
- willingness to work hard.
- excellence.
- courage and willingness to sacrifice.
  Answer is correct. Sparrow writes on page 126, “Taking humans ‘out of harm’s way’ by allowing them to fight remotely will also remove them from the opportunity to exercise those virtues that are currently constitutive of being a good warrior. Most dramatically, it does not seem possible to demonstrate courage. . . . It is also hard to see how . . . operators could demonstrate a willingness to sacrifice themselves for the good of the unit. It will still be possible for operators to work hard. . . .”
- obedience.
Question 6 (0 / 10 pts) - Incorrect
Strawser notes that someone might object to the use of Uninhabited Aerial Vehicles (UAVs) on the ground that using them offends against the idea that morally justified combat should be a “fair fight.” He offers several defenses against this claim, among them that _____________
- that a “fair fight” might engender violations of the jus in bello criterion of proportionality.
- that a “fair fight” might engender violations of the jus ad bellum criterion of proportionality.
- that a “fair fight” might engender violations of the jus in bello criterion of discrimination.
  Answer is incorrect. It is unlikely that a fair fight would violate jus in bello criteria. If it did, it would not be fair.
- the idea of a “fair fight” is not morally compelling.
Question 7 (10 / 10 pts)
According to Cook, if a commander transfers the continuous ability to make ethically good decisions to an autonomous weapon, he or she _________
- will probably be able to concentrate on tactical matters more effectively.
- shirks the responsibility of command.
  Answer is correct. Cook writes on page 222, “If a commander voluntarily surrenders . . . [the] persistent ability to make the morally right decision, he or she has effectively reneged on the responsibility of command.”
- will probably make fewer ethical mistakes.
- will probably be better able to concentrate on human aspects of leadership.
Question 8 (10 / 10 pts)
According to Cook, when autonomous weapons make mistakes, _________
- it is clearly the manufacturer’s responsibility.
- it is clearly the programmer’s responsibility.
- it is clearly the user’s responsibility.
- it is unclear how to place blame.
  Answer is correct. Cook writes on page 218, “It is not entirely clear who would bear responsibility, moral or legal, for mistakes involving automation.”
Question 9 (10 / 10 pts)
According to Horowitz and Scharre, the principle of “meaningful human control” is intended, in part, _____________
- to ensure that autonomous weapons can control humans when necessary to accomplish their mission.
- to ensure that a human is accountable for the actions of an autonomous weapon.
  Answer is correct. Horowitz and Scharre write on page 6, “[O]ne of the motivations behind a principle of ‘meaningful human control’ is the concern that greater autonomy in weapons could lead to an ‘accountability gap,’ where so many key tasks have been programmed to be performed by machine that, if there were an incident resulting in excessive civilian deaths, no human would have responsibility and be held accountable.”
- to ensure that autonomous weapons operate with at least 99.999% reliability in the absence of onboard redundant systems.
- to ensure that autonomous weapons operate with at least 99.999% reliability, necessitating onboard redundant systems.
Question 10 (10 / 10 pts)
According to Roff, it is difficult to know who is to blame if an autonomous weapon makes a mistake. For Roff, this difficulty results, in part __________
- from the classified nature of such machines.
- from the Rules of Engagement governing the use of such machines.
- from the classified nature of operations involving such machines.
- from the complexities inherent in developing such machines.
  Answer is correct. Roff writes on page 214, “[T]he complexity required in creating autonomous machines strains the causal chain of responsibility. . . . This is because we must not only understand how much distance between creator/manufacturer/programmer and machine is required to hold those persons to account, but also to what extent we can hold those responsible for the decision to field or develop LARs [lethal autonomous robots] in the first place.”
Quiz Score: 80 out of 100