{"id":4153,"date":"2016-10-05T16:00:27","date_gmt":"2016-10-05T16:00:27","guid":{"rendered":"https:\/\/www.trafficsafetystore.com\/blog\/?p=4153"},"modified":"2016-10-05T16:00:27","modified_gmt":"2016-10-05T16:00:27","slug":"self-driving-cars-programmed-to-kill","status":"publish","type":"post","link":"https:\/\/staging.trafficsafetystore.com\/blog\/self-driving-cars-programmed-to-kill\/","title":{"rendered":"Should Self Driving Cars Be Programmed to Kill?"},"content":{"rendered":"<p>When it comes to self-driving cars, one of the biggest questions on many minds is, \u201cwould you let your car kill you for the greater good?\u201d<\/p><p>It\u2019s a modern spin on the <a href=\"http:\/\/people.howstuffworks.com\/trolley-problem.htm\">trolley problem<\/a> (which has <a href=\"https:\/\/www.facebook.com\/TrolleyProblemMemes\/\">become a joke<\/a>): a vehicle loses control and two options arise. It can crash into a group of pedestrians crossing the tracks, saving the driver, or crash into a sandpit, saving the pedestrians and killing the driver.<\/p><p>Since this is a tough topic to touch on, researchers at the Massachusetts Institute of Technology (MIT) have created <a href=\"http:\/\/moralmachine.mit.edu\/\">The Moral Machine<\/a>. It\u2019s a game (for lack of a better term) which puts the user in control of the situation.<\/p>\n<iframe width=\"840\" height=\"473\" data-src=\"https:\/\/www.youtube.com\/embed\/XCO8ET66xE4?feature=oembed\" frameborder=\"0\" allowfullscreen src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" class=\"lazyload\" data-load-mode=\"1\"><\/iframe>\n<p>The game\u2019s three modes are judge, design, and browse.<\/p><p>The first mode, as the name implies, presents users with random moral dilemmas revolving around the idea of a car losing its brakes. 
Design mode enables users to create their own scenarios.<\/p><p>Although the default impact is death, users can set the fates of each character independently using a dropdown menu, and they can also add legal implications by setting the pedestrian signal.<\/p><p>Finally, users can browse the scenarios, use the like button to express appreciation, and share or link scenarios using the corresponding buttons.<\/p><p>Although a novel concept, there are a few key limitations worth noting. <a href=\"http:\/\/www.theverge.com\/2016\/6\/23\/12010476\/social-dilemma-autonomous-vehicles-car-moral-machine-trolley-problem\">The Verge<\/a> had a conversation with Anuj K. Pradhan, an assistant research scientist in UMTRI\u2019s Human Factors Group specializing in human behavior systems.<\/p><p>Pradhan said that while these studies and tools are helpful, they shouldn\u2019t be used as a direct comparison:<\/p><blockquote><p><i>Because human drivers who face these situations may not even be aware that they are [facing a moral situation], and cannot make a reasoned decision in a split-second. Worse, they cannot even decide in advance what they would do, because human drivers, unlike driverless cars, cannot be programmed.<\/i><\/p><\/blockquote><p>Although self-driving cars pose a moral dilemma to the average consumer, for many programmers responsible for these systems, it\u2019s not as big of a deal as you would expect.<\/p><p>As <a href=\"https:\/\/www.theguardian.com\/technology\/2016\/aug\/22\/self-driving-cars-moral-dilemmas\">The Guardian<\/a> found in recent conversations with employees at X, the Google sibling in charge of developing self-driving cars, the dilemma simply hasn\u2019t come up in practice.<\/p><p>Simply put, rather than worrying about programming logic to determine who lives and dies, the priority is on ensuring the situation never arises. 
The article went on to note that even if the situation did reach that level, there\u2019s no time to make a moral decision.<\/p><p>Andrew Chatham, a principal engineer on the self-driving car project, mentioned that you\u2019re much more confident about things directly in front of you because of how the system works, and that slamming on the brakes gives much more precise control than trying to swerve into anything.<\/p><p>So it would need to be a pretty extreme situation before braking becomes anything other than the correct answer. Nathaniel Fairfield, another engineer on the project, joked with The Guardian that the real question is \u201cwhat would you \u2026oh, it\u2019s too late\u201d<\/p><p>Image Source: \u00a0<a href=\"https:\/\/www.flickr.com\/photos\/bekathwia\/8612888280\/in\/photolist-e86iBN-p1FYTd-eBWbNx-8XUCAY-pehkzt-cvhYgh-iKrUr5-g4P8w-ENwDQj-oSLAA2-qPTpog-eFeCPe-9sgehf-4zEjo8-4zJAgw-oQLPXk-cHWdgW-o7T6qb-wpZxdf-diJQue-hRrw8M-o5Zrtr-rntpFL-eb2sof-9o1FrD-9iYZ98-7RcU7D-dLznp8-8hr8UZ-99aahD-Asbs1P-CJdtas-a7mSqE-HwL6zY-df5xw3-gTzd3u-a4EhoN-raUYJg-48dwvP-2aXF8b-zFbgxp-6rdujU-xgS3j-cDdb3y-pcZ8z-FnbXbs-F5Qc1P-HraHQ9-8wXj6S-4jRJVh\">Becky Stern<\/a><\/p>","protected":false},"excerpt":{"rendered":"<p>When it comes to self-driving cars, one of the biggest questions on many minds is, \u201cwould you let your car kill you for the greater good?\u201d A modern spin on the trolley problem (which has become a joke), a situation where a vehicle loses control and two options arise. 
Crash into a group of pedestrians crossing the tracks, and saving &hellip; <a href=\"https:\/\/staging.trafficsafetystore.com\/blog\/self-driving-cars-programmed-to-kill\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Should Self Driving Cars Be Programmed to Kill?&#8221;<\/span><\/a><\/p>\n","protected":false},"author":3,"featured_media":4156,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[141],"tags":[218,219],"class_list":["post-4153","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-self-driving-cars","tag-moral-machine","tag-car-morals"],"_links":{"self":[{"href":"https:\/\/staging.trafficsafetystore.com\/blog\/wp-json\/wp\/v2\/posts\/4153","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/staging.trafficsafetystore.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/staging.trafficsafetystore.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/staging.trafficsafetystore.com\/blog\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/staging.trafficsafetystore.com\/blog\/wp-json\/wp\/v2\/comments?post=4153"}],"version-history":[{"count":0,"href":"https:\/\/staging.trafficsafetystore.com\/blog\/wp-json\/wp\/v2\/posts\/4153\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/staging.trafficsafetystore.com\/blog\/wp-json\/wp\/v2\/media\/4156"}],"wp:attachment":[{"href":"https:\/\/staging.trafficsafetystore.com\/blog\/wp-json\/wp\/v2\/media?parent=4153"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/staging.trafficsafetystore.com\/blog\/wp-json\/wp\/v2\/categories?post=4153"},{"taxonomy":"post_tag","embeddable":true,"hr
ef":"https:\/\/staging.trafficsafetystore.com\/blog\/wp-json\/wp\/v2\/tags?post=4153"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}