A St Cross Special Ethics Seminar, recorded at St Cross College, Oxford in February 2024. Demandingness objections have become a stock argument in ethics, claiming that individual moral demands or entire moral theories must be given up or altered if they ask too much of agents. But can we clearly distinguish an acceptable level of demandingness from one that is too high? I argue that demandingness objections inevitably fail to make that distinction without borderline cases because they are sorites-susceptible. First, I show that the heap paradox applies to demandingness objections and the expression “overdemanding” because two conditions are met. There is an ordering of values on one dimension decisive for the expression’s application: the cost to the agent. Also, the expression “overdemanding” is tolerant, because the difference between two neighbouring levels of demandingness is so small that it does not allow us to say that this is the difference between an acceptable level of demandingness and critical overdemandingness. Second, I discuss attempts to overcome or bypass the vagueness of demandingness objections. I argue that these strategies are not very promising and that we should instead embrace the vagueness.
--------
38:29
Morality and Personality
Professor Predrag Cicovacki uses a comparison of money and morality to explore the mutual relationship between morality and personality. To clarify the tension that exists between morality and personality, Cicovacki opens his talk by comparing the development of the money economy and morality. Money and morality serve a similar function with respect to social interactions: they make the most diverse things commensurable and impose rules that should have universal validity, regardless of to whom they apply. Personality, by contrast, is characterized by the uniqueness of each individual, as well as by a need for continuous development. To close an unhealthy gap between morality and personality, morality should be conceived not on the model of the money economy, but by becoming more sensitive to who we are and what kinds of situations we find ourselves in. Cicovacki argues that we should favor a maximalist rather than a minimalist conception of morality: one that urges us to become as good human beings as we can, rather than focusing merely on enabling acceptable social intercourse. The questions that such a conception of morality should ask are: 1. What is the moral cost of being who you are? and 2. What is the moral cost of not being who you are?
--------
45:51
Is AI bad for democracy? Analyzing AI’s impact on epistemic agency
Professor Mark Coeckelbergh considers whether AI poses a risk to democracy in this St Cross Special Ethics Seminar. Cases such as Cambridge Analytica or the use of AI by the Chinese government suggest that the use of artificial intelligence (AI) creates some risks for democracy. This paper analyzes these risks by using the concept of epistemic agency and argues that the use of AI risks influencing the formation and revision of beliefs in at least three ways: the direct, intended manipulation of beliefs, the type of knowledge offered, and the creation and maintenance of epistemic bubbles. It then suggests some implications for research and policy.
--------
30:38
Shallow Cognizing for Self-Control over Emotion & Desire
In the first St Cross Special Ethics Seminar of 2023, Dr Larry Lengbeyer explores 'shallow cognizing' as a form of self-control. Shallow cognizing is a familiar but overlooked practice of self-control, typically initiated without conscious intention, that enables us to short-circuit potential upwellings of emotion and desire in ourselves. We will consider the range of contexts in which the practice is manifest, speculate about its roots in the compartmentalized structure of our cognitive systems, ponder its benefits and costs (its uses and misuses), and contemplate its relation to virtue. We will then continue in this exploratory vein by asking whether taking account of this neglected phenomenon might improve our understanding of issues in practical ethics, such as doctors' duties to obtain informed consent from patients, and how to balance free expression with proper care for others' sensibilities, in the classroom and perhaps elsewhere.
--------
45:26
The Moral Machine Experiment
In this St Cross Special Ethics Seminar, Dr Edmond Awad discusses his project, the Moral Machine, an internet-based game exploring the ethical dilemmas faced by driverless cars. I describe the Moral Machine, an internet-based serious game exploring the many-dimensional ethical dilemmas faced by autonomous vehicles. The game enabled us to gather 40 million decisions from 3 million people in 200 countries/territories. I report the various preferences estimated from this data, and document interpersonal differences in the strength of these preferences. I also report cross-cultural ethical variation and uncover major clusters of countries exhibiting substantial differences along key moral preferences. These differences correlate with modern institutions, but also with deep cultural traits. I discuss how these three layers of preferences can help progress toward global, harmonious, and socially acceptable principles for machine ethics. Finally, I describe other follow-up work that builds on this project.