gcs_projects (PDF, 10 pages)
School: Washington State University
Course: CptS/EE 302, Electrical Engineering
Uploaded: Feb 20, 2024
CptS/EE 302 Group Case Study Project Case Descriptions
3.1.22

Background

An important component of CptS/EE 302 is a group project based on a real case, which accounts for 30% of your grade in the class. Your ethical analysis will culminate in a group case study (GCS) report, which is the capstone project of this course and serves in place of a final exam.

Please skim through the case descriptions below, noting the ones you find most interesting and reading them more thoroughly. After your team has decided on several possible cases, the team captain should go to the Google Docs site (GCS Cases and Teams) and choose an available case (first-come, first-served). Each case can be chosen by only one team. The team captain should write the team's name in the column next to the case. If you know of an interesting case not included below, feel free to send a brief summary to me via email (shira@wsu.edu). If I think it's a good match for this course, I'll add it to this list, and you'll be allowed to use it for your case study. For those teams that haven't chosen a case by 11:59 pm, Monday, 3.7.22, I'll assign a case.

Due Dates (more info in assignment prompt)

3.7.22: Topic choice due
3.25.22: GCS-PM plan due (10%)
4.8.22: GCS draft report due (5%)
5.2.22: GCS final report due (25%)

Case Descriptions

Safety Critical Hardware and Software Systems

1. Korean Airlines Flight 007. On September 1, 1983, Korean Airlines Flight 007 departed Anchorage, Alaska, for Seoul, South Korea. Ten minutes after take-off, the plane began deviating from its planned route and continued to fly off-route for the next five and a half hours. The plane strayed into Soviet airspace and was shot down by a Soviet fighter jet. Why did the plane fly off-course, why was it shot down, and what could have been done to prevent the incident? See en.wikipedia.org/wiki/Korean_Air_Lines_Flight_007 for an entry point into the case.

2. Three Mile Island Nuclear Disaster.
Three Mile Island is a nuclear power plant near Middletown, Pennsylvania. On March 28, 1979, the plant experienced one of the most significant accidents in the history of nuclear power generation. One of its reactors suffered a partial meltdown, resulting in the release of radioactive material into the surrounding environment. Why did the meltdown occur? What could have been done to prevent it? See en.wikipedia.org/wiki/Three_Mile_Island_accident for an entry point into the case.

3. Fukushima Nuclear Disaster. On March 11, 2011, a magnitude 9.0 earthquake hit northeastern Japan, causing a tsunami. Both the earthquake and the tsunami damaged the Fukushima Daiichi Nuclear Power Plant on Japan's eastern coast. The damage disabled the reactor cooling systems, which in turn resulted in the release of a large amount of radioactive material and the subsequent evacuation of a 30 km area surrounding the plant. Why did the accident occur, and what could have been done to prevent it? See en.wikipedia.org/wiki/Fukushima_Daiichi_nuclear_disaster for an entry point into the case.

4. NASA's Culture: Past and Present Problems. In class, we discussed the Space Shuttle Challenger disaster, which occurred in 1986, but in 2003 the Space Shuttle Columbia disintegrated upon
reentering Earth's atmosphere. NASA was criticized for the way it assessed risks as well as for its decision-making, and the shuttle missions were suspended for two years. Later, space shuttles were used in a limited capacity until the program ended in 2011. NASA has had many successes since then, but it has also incurred some bad press, e.g., for an incident involving the International Space Station. Does NASA have a cultural problem, or is it subject to too much political pressure? Can something be done to prevent disasters, or are they just a part of space exploration that we must accept? To get started with your research, refresh your knowledge of the Space Shuttle Challenger disaster, learn about the Space Shuttle Columbia disaster, and read an opinion piece on the International Space Station incident.

5. Denver International Airport Baggage Handling System. Denver International Airport, which opened in 1995, awarded BAE Automated Systems a $193 million contract to develop the most advanced automated airline baggage handling system in the world. The baggage system was supposed to reduce delays and shorten waiting times at carousels. It relied on a complex 21-mile network of metal track that automatically routed luggage to its proper destination. The project is recognized as one of the most epic failures in software and hardware engineering; it caused a 16-month delay in the opening of the airport and a budget overrun of $118 million. The system never worked as designed and has not been used since 2005. Why did the project fail? What should have been done to ensure its success? See calleam.com/WTPF/wp-content/uploads/articles/DIABaggage.pdf for an entry point into the case.

6. Ariane 5 Satellite. On June 4, 1996, the Ariane 5 satellite launch vehicle began its maiden flight. Some 40 seconds after it took off, it veered sharply off course and self-destructed. Satellites worth $500 million were lost in the incident. Why did it happen?
How could it have been avoided? www.s1m0n3.org/s0cr4t3s/main/data/ariane5_final.pdf provides an entry point into the case.

7. Boeing 737 MAX. Boeing's 737 MAX model was involved in crashes in October of 2018 (Lion Air Flight 610) and again in March of 2019 (Ethiopian Airlines Flight 302). The crashes claimed the lives of 346 people, the largest number of casualties in recent airline history. Both flights had similar sequences of events leading up to the crash, leading investigators to identify the cause as a defective software system, MCAS (the Maneuvering Characteristics Augmentation System). The Boeing 737 MAX was taken out of service from March 2019 to November 2020, and Boeing paid more than $2.5 billion in settlements after being charged with fraud. What led to the flaws in the design of the Boeing 737 MAX, and how could they have been avoided? spectrum.ieee.org/aerospace/aviation/how-the-boeing-737-max-disaster-looks-to-a-software-developer and www.nytimes.com/2019/06/01/business/boeing-737-max-crash.html provide entry points into the case.

Privacy

8. Electronic Search Warrants. Changes to Rule 41 allow judges to grant electronic search warrants for devices with unknown locations. In addition, judges can now grant bulk search warrants, which allow law enforcement personnel to search millions of devices, even remotely. A starting point for this case is "Imagine That," the Dec. 8, 2016, episode of "On the Media." Access it at www.wnyc.org/story/on-the-media-2016-12-09. Focus on the segment entitled "Expanding the Government's Hacking Powers, Under the Radar." Is it ethically acceptable for such broad search warrants to be granted? If not, what are some ethically acceptable alternative solutions?

9. Airport Full Body Scanners. In 2007, full-body scanners began being used to screen passengers in U.S. airports.
While the scanners are intended to detect forbidden objects (including non-metal ones) on a person without making physical contact, the devices are capable of taking an image of a person's naked body. This has led to concerns over personal privacy: Should a naked picture of a person be taken for screening purposes? Are the full-body scanners used in airports
ethically acceptable? Under what circumstances? See en.wikipedia.org/wiki/Full_body_scanner for an entry point into this case.

10. RFID Tag Implants. In Taiwan, all pet dogs must receive an implanted RFID tag so they can be identified and returned to their owners in case they go missing. The VeriChip Corporation has created a similar implantable chip for humans which can be loaded with important health information and accessed in the event of a medical emergency. It can also be used to track missing children or even as an embedded credit card. Are RFID tag implants for humans ethically acceptable? Under what conditions? What types of usage policies should be put into place by companies producing them or by governments? See spectrum.ieee.org/computing/hardware/rfid-inside for an entry point into this case.

11. Electronic Medical Records. In the medical profession, there has been a concerted push toward digitizing all medical records. While this so-called modernization of the medical profession promises many benefits, it also raises numerous privacy concerns. What are the ethical issues surrounding the digitization of medical records? What policies should be put into place to address these issues? See www.ncbi.nlm.nih.gov/pmc/articles/PMC4394583/ for an entry point into this case.

12. Data Mining of Consumer Information. The online collection of consumer data, including web browsing data, is now a routine practice among companies that maintain an online presence. Data mining involves the post hoc analysis of data in order to identify patterns, make business decisions, and tailor products and services to customers' needs. What ethical problems does this practice raise? What policies should be put into place to address these problems? For an entry point into this case, see www.computerworld.com/article/2485493/enterprise-applications-big-data-blues-the-dangers-of-data-mining.html.

13. Joint Vision 2010 and 2020. In this document, a Major in the U.S.
Marines describes the U.S. Department of Defense's vision for Full Spectrum Dominance using information-age technological advances in "military operations other than war" (MOOTW). This case raises ethical challenges, questions, and compromises in the DoD's approach. For an entry point into this case, see mattcegelske.com/joint-vision-2020-americas-military-preparing-for-tomorrow-strategy/. What information is it ethically permissible for a government's military to collect for MOOTWs? What means of information collection are ethically permissible?

14. Ashley Madison. In August 2015, The Impact Team publicly released data on 37 million users of Ashley Madison, an adult dating website frequently used by people wishing to commit adultery. For an entry point into the case, see www.tripwire.com/state-of-security/security-data-protection/cyber-security/the-ashley-madison-hack-a-timeline/. Is it ethically acceptable for a company like Ashley Madison to knowingly promote adultery? What data security measures should be in place? What screening process should be in place to ensure that the data used to sign up with the site are genuine? Once the data were released, was it ethically acceptable for researchers and journalists to examine it? Overall, who is at fault in this case? Did site users deserve any consequences they suffered? Are users or their family members entitled to compensation? What kind?

15. Employee Surveillance. Are you entitled to privacy at work? Should your employer be able to access, for example, your workplace telephone and online activity without your knowledge? What about during lunch breaks? Bourke v. Nissan, Smyth v. Pillsbury, and Shoars v. Epson are all court cases on employee privacy that were won by the employer. For an entry point into this case, see employment.findlaw.com/workplace-privacy/privacy-in-the-workplace-overview.html.

16. Huawei and Global Telecommunications.
The Chinese telecom company Huawei is the largest telecommunications equipment manufacturer in the world. It's involved in building and maintaining a portion of the global telecommunications network which reaches one third of the world's population. It's also vying to play a significant role in the creation of the world's 5G communications network. In 2019, the U.S. government restricted U.S. companies from doing
business with Huawei due to concerns that Huawei's networks could pose a substantial cybersecurity risk by surreptitiously collecting communications data through its networks. What are the United States government's specific concerns, and are they justified? How can the world build a global high-speed communications network that we can be confident safeguards the privacy of its users? For an entry point into this case, see www.bbc.co.uk/news/resources/idt-sh/Huawei.

17. Amazon Echo and Other Smart Speakers. Smart speakers are becoming ubiquitous, and because of their convenience, users don't seem to worry about the possibility of their privacy being violated. Smart speakers like the Amazon Echo are always listening for their activation word, and by default, they store all the data generated by users. Suppose someone created a device capable of listening, through a smart speaker, to every sound made in someone's house. This could lead to any number of unpleasant outcomes, including the possibility of blackmail. Is there any way to stop such an invention and invasion of privacy? In addition, almost all the 100k+ apps available for the Echo are written by third parties, and users don't know how trustworthy these apps are. Start your research at en.wikipedia.org/wiki/Amazon_Echo and news.ncsu.edu/2021/03/alexa-skill-vulnerabilities/.

18. Home Surveillance. The availability of webcams, small video cameras, and technology that allows homeowners to view a person ringing their doorbell from within their home or even away from home raises many privacy issues. Should companies be allowed to monitor employees working from home? What recourse does an individual have when their neighbor's video cameras capture views of them working in their yard or leaving home? Should hosts be required to inform their guests that video cameras installed throughout their home will capture their actions without their knowledge?
How can we prevent homeowners from installing video cameras in their bathrooms and bedrooms? As starting points, check out www.nbcnews.com/tech/tech-news/big-tech-call-center-workers-face-pressure-accept-home-surveillance-n1276227 and www.bbc.com/news/technology-58911296.

Intellectual Property

19. SCO Group v. IBM. In 2003, the SCO Group filed a $5 billion lawsuit against IBM alleging that IBM used code from the Unix operating system, owned by SCO, to build the open-source Linux operating system. See en.wikipedia.org/wiki/SCO_Group,_Inc._v._International_Business_Machines_Corp. for an entry point into the case, which was revived in 2015 (see www.sltrib.com/news/2303519-155/its-alive-sco-group-suit-revived). The case raises many ethical questions surrounding copyright law, the GPL, and open source software development. What are the issues, and how should they best be resolved?

20. Enfish LLC v. Microsoft. This case goes to the heart of the controversy over whether software features are eligible for patents. For an entry point into this case, see www.ipwatchdog.com/2016/05/13/federal-circuit-says-software-patent-claims-not-abstract-are-patent-eligible/id=69147/. Should software be patentable, and, if so, under what circumstances?

21. Microsoft Office Open XML vs. Open Office XML. In 2003, Microsoft proposed a new open standard for their Office 2003 XML format. At the same time, Open Office XML was submitted for standardization. In a controversial fast-track process, the International Organization for Standardization (ISO) approved Microsoft's proposal. However, the ISO later removed its approval of the Microsoft standard. See en.wikipedia.org/wiki/Standardization_of_Office_Open_XML for an entry point into this case. Was the standardization process the ISO used ethically acceptable? Why did the process proceed as it did? In your view, how should the process have proceeded?
22. Linksys Use of Open Source Code in Routers. Linksys used the Linux kernel in their WRT54G Wi-Fi routers but, when they were caught, refused to release their source code; they were trying to protect their driver implementations. The company was sued by the FSF and was forced to release the source code. Linksys then started using QNX for future devices but continued to release a WRT54GL router running Linux. For an entry point into this case, see www.linuxinsider.com/story/43996.html. What issues surrounding open source software are raised by this case? Was Linksys's behavior ethically acceptable? Was the remedy mandated by the court ethically acceptable? If you had worked for Linksys when it released the WRT54G router, what would you have done?

23. Universal Studios, Inc. v. Reimerdes. Digital Rights Management (DRM) technology controls access to digitized intellectual property, such as CDs, DVDs, and games. In a lawsuit, Universal Studios sued to prevent the distribution of DeCSS, a computer program capable of decoding a DRM scheme for DVDs. Universal Studios won the suit, and websites providing access to DeCSS were taken down. For an entry point into the case, see en.wikipedia.org/wiki/Universal_City_Studios,_Inc._v._Reimerdes. Did the court make the right decision? Should the development of software to undermine DRM be banned? Why or why not?

Business Ethics and Whistleblowing

24. Whistleblowing at Hughes Aircraft. In the 1980s, Hughes Aircraft manufactured hybrid computer chips for the U.S. military, which used the chips in a variety of weapons systems. The U.S. military required the chips to pass stringent quality assurance tests. About 10 percent of the chips failed to pass these tests. In August 1986, it was discovered that one of the supervisors was ordering inspectors to pass chips that were actually defective and to repair defective chips in violation of the required process.
Floor workers reported the violations to one of the supervisors, who relayed them to management. That supervisor and a quality control agent who also reported the violations were ultimately fired. They paid a high price for whistleblowing: both were unemployed for an extended period of time. Eventually, a lawsuit found Hughes Aircraft guilty of defrauding the government. An entry point into this case can be found at citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.19.4043&rep=rep1&type=pdf. Was the whistleblowing worth it in this case? What would you have done?

25. Uber and the ADA. The popular Uber app helps people who need rides to find drivers. Under the Americans with Disabilities Act (ADA), a series of lawsuits were brought against Uber in 2015. The lawsuits alleged that Uber was not complying with ADA regulations, which require companies providing transportation services to accommodate the disabled. For an entry point into this case, see www.thedailybeast.com/uber-disability-laws-dont-apply-to-us. Should Uber be required to comply with the ADA's regulations? Why or why not?

26. Intel's Approach to Benchmark Testing. One of Intel's graphics media accelerator drivers was found to be optimized for running certain benchmark tests. An entry point into the case can be found at techreport.com/review/17732/intel-graphics-drivers-employ-questionable-3dmark-vantage-optimizations. Have any other companies engaged in this practice? Is this practice ethically acceptable? Why or why not?

27. VW Emissions Scandal. In September 2015, the United States EPA discovered that some of Volkswagen's vehicles had software designed to cheat in required laboratory emissions tests. The software allowed the vehicles to pass the emissions tests, even though the vehicles could emit up to 40 times the acceptable limit during regular driving. Roughly 11 million vehicles were later discovered to have the cheating software.
What was it about VW that allowed this kind of scandal to happen? How could it have been prevented? What impact did the scandal have on the
numerous stakeholders involved in the case? See en.wikipedia.org/wiki/Volkswagen_emissions_scandal for an entry point for your research.

Online Speech, Commerce, and Communities

28. LICRA v. Yahoo!. In 2000, French organizations, including LICRA, brought an international court case against Yahoo!. At issue was a Yahoo! auction site, accessible in France, that sold Nazi memorabilia, which is illegal in France. The plaintiffs wanted the auction site to be blocked in France, while Yahoo! claimed that, since its servers were located in the U.S., Yahoo! should not be under the jurisdiction of French law. For an entry point into this case, see en.wikipedia.org/wiki/LICRA_v._Yahoo!. Which country's laws should be in effect in this case? How should the conflict between the two countries' laws best be handled?

29. Facebook Emotion Experiment. In 2014, Facebook published a study in which they surreptitiously manipulated the content of the activity feeds of over 600,000 users. They found that when they added more positive content to the activity feeds, users were more likely themselves to post positive content. The study is reported in www.pnas.org/content/111/24/8788.full.pdf. The editorial boards of numerous newspapers and magazines weighed in on the study (e.g., www.usnews.com/opinion/articles/2014/06/30/was-facebooks-emotional-contagion-experiment-ethical). Was Facebook's experiment ethically acceptable? Why or why not? If not, how might you have run a more ethical experiment that got at the same kinds of research issues?

30. The Usability of Open Source Software. Designing software to be humanly usable requires a commitment to involving real users early and often in the development process. Typically, however, the development of open-source software does not proceed in this way. Instead, the developers themselves take a leading role in defining the features, functionality, and design of the software under development.
There is little time or incentive to run usability studies or to make usability a priority. For an entry point into this case, see firstmonday.org/article/view/1018/939. Do open source software developers have an ethical obligation to follow best practices for designing humanly usable software? Why or why not?

31. Technical Intervention in the Fake News Era. There has been an increase in the presence of fake news articles and sites on the internet. In fact, some have argued that fake news had an influence on the 2016 U.S. presidential election. Several social media companies, such as Facebook, have decided to address the proliferation of fake news using technological solutions (see, e.g., www.wsj.com/articles/facebook-moves-to-curtail-fake-news-on-trending-feature-1485367200). For an entry point into this case, listen to the Fresh Air podcast "How Fake News Spreads & Why People Believe It," which aired December 14, 2016. It can be accessed at www.npr.org/player/embed/505594273/505594281. Should technology companies attempt to identify and/or filter out fake news on the internet, or should this responsibility fall to individual internet users? What are feasible technological and human solutions?

Software and Hardware Security and Crime

32. Stuxnet. In 2010, the Natanz Nuclear Facility in Iran suffered continuing problems that were eventually traced to the Stuxnet worm, malware that had infected the control systems of centrifuges used for uranium enrichment. Analysts attributed the worm to sabotage by Western intelligence agencies, and speculation most popularly attributes the attack to an Israeli-U.S. collaboration. See en.wikipedia.org/wiki/Stuxnet for an entry point into this case. What rules of war should apply in this novel form of cyberwarfare? Should economic harm be governed by agreements like the Geneva Convention?
Beyond international agreements, the Law of Armed Conflict (LOAC) appeals to traditional practices and custom, neither of which applies to novel approaches in cyberwarfare. Should limits be set? The established approach in military ethics is to appeal to what makes going to war just, what is just practice during wars, and what is just after a war has
ended. What is justice in cyberwarfare when countries are not at war and do not necessarily intend to go to war?

33. Hacking the Power Grid. Smart power grids are relatively open and, thus, vulnerable to attacks. If the power supply grid is interrupted or shut down, do all stakeholders suffer the same degrees of harm? Who should have priority? Are the ethical considerations in dealing with a shutdown different if the cause is a geological or meteorological event rather than intentional hacking? See www.forbes.com/sites/emc/2014/03/20/protecting-power-is-our-electricity-safe-from-hackers/#7f2a29757dff for an entry point into this case.

34. NSA Backdoors. The National Security Agency (NSA) has requested that it be granted a backdoor to encrypted data when a court order is obtained for access to the data. For entry points into this case, see www.theguardian.com/us-news/2015/feb/23/nsa-director-defends-backdoors-into-technology-companies and www.slate.com/articles/technology/bitwise/2015/09/fbi_cia_nsa_want_backdoor_access_to_data_yet_they_can_t_keep_their_own_data.html. Should the government have special access to private data under certain circumstances? If so, under what circumstances? Why or why not?

35. Target Credit Card Fiasco. In 2013, attackers lifted an unprecedented 40 million credit and debit cards from retail mega-chain Target's point-of-sale systems. Investigators suspect the attackers initially gained access to Target's network using credentials obtained from heating, ventilation, and air-conditioning subcontractor Fazio Mechanical Services via a phishing email that included the Citadel Trojan. See www.nbcnews.com/technolog/massive-target-credit-card-breach-new-step-security-war-hackers-2D11778083 for an entry point into this case. How can we trust that those we trust with our data are not too trusting?

Emerging Technologies

36. Unmanned Aerial Vehicle (Drone) Use in Warfare.
In 2010, Faisal Shahzad, a Pakistani-born American, planted a bomb in Times Square, New York, as revenge for the death of Baitullah Mehsud, a Pakistani Taliban leader who had been killed in a drone strike in 2009. Shahzad's bombing attack failed, but he also said it was payback for the U.S. occupations of Afghanistan and Iraq. He said, "the drone hits in Afghanistan and Iraq, they don't see children, they don't see anybody. They kill women, children, they kill everybody." Are targeted drone attacks more ethical than other kinds of attacks? Are ethical concerns different for drone pilots than for other "distance warriors"? To get started, see spectrum.ieee.org/to-protect-against-weaponized-drones-we-must-understand-their-key-strengths and spectrum.ieee.org/lethal-autonomous-weapons-exist-they-must-be-banned.

37. Westworld. The Westworld series, with its depiction of a futuristic amusement park where visitors can live out their fantasies, raises serious questions concerning the ethical responsibility of technologists who are looking to push the limits of artificial intelligence. A starting point for this case is any episode of Westworld or else en.wikipedia.org/wiki/Westworld_(TV_series). Using the futuristic world of Westworld as your starting point, explore the ethical dilemmas such a futuristic vision poses for computer scientists and electrical engineers. When do machines gain consciousness? Can their actions even be controlled or predicted? What responsibilities do the computer scientists and electrical engineers who develop these technologies bear? How can they approach the development of these technologies in an ethically acceptable way?

38. Driverless Vehicles. One of the world's first autonomous bus pilot programs has begun in the Hernesaari district of Helsinki, Finland, where driverless buses are possible because Finnish law does not require a driver for vehicles on the road.
The start-up Otto created technology for driverless big rigs and was purchased by Uber in 2016. To get started, see www.techradar.com/news/self-driving-cars and www.leeds.ac.uk/main-index/news/article/4931/making-self-driving-cars-human-friendly. Are ethical concerns different for driverless cars than for cars with drivers? How will safety be ensured? How should
software engineers approach the development of algorithms to avoid accidents? Who is responsible in the case of an accident?

39. 23andMe. This company samples your saliva to test your genetic makeup so it can report your ancestry. The sequencing results can include indicators of a proclivity to inherited health conditions. In 2013, the Food and Drug Administration ordered 23andMe to stop providing health information until it could show its conclusions were accurate, but FDA approval was granted for a revised health report in 2015. Are companies like 23andMe exploiting fears or providing a valuable service? Should the FDA be able to determine whether or not people have access to such services? To begin your research, see en.wikipedia.org/wiki/23andMe.

40. Health Data. Big Data systems enable the collection of masses of health data through hospitals and doctors' offices. See www.hrcsonline.net/pages/data on the United Kingdom and www.cihi.ca/en on Canada. Since the UK and Canada have public health systems with socialized medicine, are they morally justified in collecting these data for research purposes with or without a patient's knowledge? Does making the data anonymous by excluding any identifiers make their collection morally acceptable? What if someone's data show they are carrying a disease they aren't aware of, e.g., a cancer not yet large enough to generate symptoms but likely to be beyond treatment once discovered? Wouldn't that person want and deserve to know? Does the risk of a confidentiality or privacy breach for everyone outweigh the need for some to receive potentially life-saving information?

41. Bitcoin. Bitcoin is subject to a plague of problems, including hacking and stealing using malware, ransomware, mining Trojans and botnets, double-spending and race attacks, Finney attacks, and 51% attacks. The darknet Sheep Marketplace shut down in 2013 after bitcoins worth $6 million were stolen.
Bitcoins can also be used on the darknet through sites like Silk Road, Silk Road 2, or the less restrictive Black Market Reloaded, all of which have been closed by the FBI, to buy drugs and other illegal items. The bitcoin exchange Bitfinex was hacked on August 2, 2016, for an alleged $60 million. Is bitcoin the economy of the future? Or is it morally compromised as the ideal currency for illegal drug, weapon, and human trafficking because of its capacity to maintain anonymity? Is bitcoin theft too risky? See en.wikipedia.org/wiki/Bitcoin to get started.

42. Affective Computing and Microexpressions. Currently, computers and computer programs are incapable of incorporating human emotions into their interactions with humans. Interdisciplinary researchers are trying to change this by studying and developing systems and devices capable of identifying human feelings, emotions, and moods. Their goal is to develop computers that can emulate empathy. One approach they're exploring is the use of facial microexpressions, expressions so fleeting that other humans rarely notice them. These microexpressions "give away" feelings that people themselves don't necessarily know they feel. Affective computing can have many benefits for humankind, but what other unintended consequences may it have? Can the emotional data captured be used for unethical purposes? What other publicly available data might be combined with emotional data, and how could the combination be used? For a starting point, see en.wikipedia.org/wiki/Affective_computing.

43. Video Surveillance and Facial Recognition Software. Video surveillance has been used for many years now, but formerly it was primarily used by retail stores. Now, however, it's becoming pervasive. Video cameras are located atop traffic signals; they're in hospitals, prisons, companies, and even in many homes.
In some instances, they're so small that they can't be detected, so a person has no idea whether their actions are being recorded. In addition, facial recognition software is becoming increasingly popular, and it is now used regularly in combination with video surveillance by police departments. Unfortunately, its use has led to the arrests of innocent citizens because of inaccuracies and biases in the software. Is it ethical to use facial recognition software when it's faulty? Should video surveillance be limited? What can people do to reclaim some privacy? In light of the recent camera hacking (see www.ndtv.com/world-news/thousands-of-security-cameras-hacked-exposing-tesla-jails-hospitals-2387447), should video cameras be required to have security features? To get started on your research, see medium.com/digital-diplomacy/video-surveillance-and-facial-recognition-is-this-the-end-of-a-free-society-29d6956702fb and thehill.com/policy/technology/569543-federal-agencies-planning-to-expand-use-of-facial-recognition.

44. AI Applications in Mental Health. Researchers are finding more ways to integrate AI into our daily lives, and one promising area of application is the diagnosis and treatment of mental illness. Some AI apps already use social media to determine whether someone has a mental illness based on their interactions, comments, and posts. Because people still believe a stigma is attached to having a mental illness, many won't seek help even when they're having problems. An AI application might be especially useful for these people. However, what if these applications are inaccurate? What if an algorithm misdiagnosed someone, leading to a bad outcome? Will there be controls to prevent companies from targeting a market based on an application they've used which diagnoses mental health problems? Will companies use AI mental health applications to screen potential employees? See psychologytoday.com/us/blog/integrative-mental-health-care/201910/artificial-intelligence-ai-and-mental-health-care and digitaldxventures.com/use-of-ai-for-detection-diagnosis-and-treatment-of-mental-health-disorders.html to get started on your research.

45. Wearables. It started with wearable fitness devices, moved to smart watches, and is now moving into smart glasses and smart clothing. It seems inevitable that wearables are not only here to stay but that they'll increase in prevalence. There are many obvious benefits to wearables, but as with so many technological innovations, there are downsides as well. What impact do (and will) wearables have on privacy? How will privacy be safeguarded? Will wearables be allowed as evidence in court proceedings? What if a smart device provides an inaccurate reading, e.g., the wrong heart rate, which leads to a health issue or even death? Who would the responsible party be in such an event? Should there be age restrictions on the use of wearables and, if so, on which wearables, at what age(s), and who will decide? To start exploring this case, see en.wikipedia.org/wiki/Wearable_technology, spectrum.ieee.org/wearable-data-court, and www.reuters.com/technology/italy-data-authority-asks-facebook-clarifications-smart-glasses-2021-09-10/.

Other

46. QAnon, Facebook, and the Storming of the U.S. Capitol on Jan. 6, 2021. On Jan. 6, 2021, rioters stormed the U.S. Capitol in a last-ditch effort to overturn the 2020 presidential election. Five people died, including one police officer, and numerous people were injured. Members of Congress feared for their lives. Many of the rioters were associated with the conspiracy movement QAnon, which believes that a secret cabal plotted against President Donald Trump while he was in office. While the actions of the rioters were illegal, were they unethical? How should the U.S. deal with groups of citizens who hold strong beliefs that aren't necessarily true? What can the American public do to prevent such occurrences? Does Facebook deserve any blame for what happened? To get started, see en.wikipedia.org/wiki/QAnon, en.wikipedia.org/wiki/2021_storming_of_the_United_States_Capitol, spectrum.ieee.org/computing/networks/the-careful-engineering-of-facebooks-filter-bubble, and abcnews.go.com/ABCNews/live-updates/live-updates-Facebook/?id=80401245.

47. Law Enforcement and the Use of AI and Facial Recognition Software. In recent years, police departments have started relying on facial recognition software to assist in policing. In fact, several individuals involved in the Jan. 6, 2021, break-in of the U.S.
Capitol building were identified by means of facial recognition software and subsequently arrested. However, it has been shown that facial recognition software is biased and less accurate for people of color, and several police departments have arrested and jailed persons of color who were later proven to be innocent. Most recently, U.S. prisons are considering the use of AI to monitor the phone calls
of prison inmates. Police departments claim that citizens are safer with the use of such software. Others argue that it's a violation of privacy and that it magnifies the racial bias already existing in the criminal justice system. Should such approaches to policing be banned? If we don't ban them, then how do we ensure that innocent people aren't affected? Where do we draw the line between the safety of the public and the rights of the individual? To get started, check out www.reuters.com/article/us-usa-tech-prison-idUSKBN2FA0OO, www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html, www.usatoday.com/story/tech/2019/11/19/police-technology-and-surveillance-politics-of-facial-recognition/4203720002/, and sites.law.berkeley.edu/thenetwork/2021/03/01/the-future-of-face-recognition-software-for-policing-is-there-one/.