Trust

Definition

Definitions of trust[1][2] typically refer to a situation characterized by the following aspects:

  • Trustor/Trustee Relationship: one party (trustor) is willing to rely on the actions of another party (trustee).
  • Temporal aspect: situations of trust are directed towards the future.
  • Control: the trustor (voluntarily or by force) abandons control over the actions performed by the trustee.
  • Uncertainty: as a consequence of the abandoned control, the trustor is uncertain about the outcome of the other's actions; they can only develop and evaluate expectations. The uncertainty involves the risk of failure or harm to the trustor if the trustee does not behave as desired. Vladimir Ilyich Lenin expressed this idea with the sentence "Trust is good, control is better".[3] In a social context, trust has several connotations.[4]


Learning Tasks

  • (COVID-19) Contact tracing is a relevant measure in an epidemic containment strategy, i.e. limiting the impact of an epidemiological spread to a minimal number of people (without a lockdown) by identifying those community members who had contact with someone who was infected. Contact tracing needs trust at different stakeholder levels: the apps performing the contact tracing must respect the privacy of the people who support IT-based epidemiological control by installing the app. Discuss the following aspects:
    • (Privacy of Epidemiological Status of Citizens) Trust in the software that it does not expose the epidemiological status of individuals to people who do not need to know it. Compare this with public health strategies for notifiable diseases!
    • (Commercial Data Harvesting and Health Related Data) Now we compare a public health app for contact tracing with basic principles of data collection in the economy. For tailored advertisements to users of mobile devices, Commercial Data Harvesting is a strategy to show only specific advertisements that users are more likely to respond to (generating a user profile for the tailored advertisements). With this strategy user profiles can be aggregated, traded between companies, and used for business management. Analyse the technical possibilities to collect data at different levels of the IT infrastructure (e.g. hardware, operating systems, applications, network, ...) for Commercial Data Harvesting and identify measures that assure the privacy of health related data in the app (a minimal sketch of one such measure follows this list).
    • (Risk Literacy) The measures to assure the privacy of health related data must be comprehensible to the public, because contact tracing will only work for epidemiology if a large majority of the population uses the app (e.g. 60%). If you look at the basic Risk Management cycle of knowing about a risk and being able to select appropriate measures for risk mitigation, then the response to that risk also needs to be addressed. Knowledge about the principles, benefits and weaknesses of specific technologies must therefore be comprehensible (e.g. a Wikiversity course about different options for contact tracing), so public acceptance of technology needs capacity building and learning, so that people know what they are doing. Identify the main steps of such a capacity building course so that public health authorities and the public would finally use that app for contact tracing.
  • (Open Innovation Ecosystem) How can you use an Open Innovation Ecosystem to build trust? Assume you have 1000 contributors who want to contribute to an innovative step, and the group of contributors performs experiments with different specifications to find the best solutions in a process of innovation. Only 1 contributor comes up with the best solution (999 have a result of lower quality). Discuss the principle of "the winner takes it all" in comparison to trust in a shared benefit for all 1000 contributors. What is a trusted open innovation ecosystem in which the contributors share the benefits and drawbacks of the work for the innovative step? An Open Innovation Ecosystem could provide open access to the results of problem solving for all 1000 contributors. Design an Open or Closed Innovation Ecosystem for yourself and explain to the other students why you designed the innovation ecosystem the way you did. Ask the other students if they would trust your proposal for the innovation ecosystem and why they would or would not trust it.
  • (Neuroscience) Some studies indicate that trust can be altered, e.g. by the application of oxytocin.[5] Identify the current state of the art in the neurosciences and discuss the societal challenges of those results. What ethical guidelines are required?
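
To make the privacy measures asked for above more concrete, here is a minimal Python sketch of a decentralized exposure-notification approach with rotating random identifiers, loosely inspired by DP-3T/GAEN-style designs. The key derivation, the epoch length and the data flow are simplified assumptions, not the actual protocols; the point is only that phones exchange unlinkable pseudonyms and that matching happens locally, so neither the server nor other users learn an individual's epidemiological status.

```python
import hashlib
import hmac
import os

EPOCHS_PER_DAY = 96  # assumption: one rotating identifier per 15-minute epoch


def new_daily_key() -> bytes:
    """Random secret key for one day; it never leaves the phone unless the
    user voluntarily reports a positive test."""
    return os.urandom(32)


def ephemeral_ids(daily_key: bytes) -> list:
    """Derive the short-lived identifiers that would be broadcast via Bluetooth.
    They look random to observers and cannot be linked to a person."""
    return [
        hmac.new(daily_key, f"epoch-{i}".encode(), hashlib.sha256).digest()[:16]
        for i in range(EPOCHS_PER_DAY)
    ]


# Phone A (which later tests positive) generates a daily key and broadcasts identifiers.
key_a = new_daily_key()
broadcast_a = ephemeral_ids(key_a)

# Phone B stores only the identifiers it heard nearby -- never names or locations.
heard_by_b = {broadcast_a[10], broadcast_a[11], os.urandom(16)}

# After a positive test, A uploads only its daily key(s) to the health authority.
published_keys = [key_a]

# B re-derives identifiers from the published keys and matches them locally, so
# neither the server nor other users learn A's identity or health status directly.
matches = [eid for key in published_keys for eid in ephemeral_ids(key) if eid in heard_by_b]
print(f"Locally detected exposure events: {len(matches)}")
```

In a real deployment the identifiers would be broadcast over Bluetooth Low Energy, keys would expire after the infectious period, and additional metadata such as contact duration and signal strength would feed a risk score; the sketch only shows why local matching keeps the reported status private.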

Trust between People and Collaboration

Trust can be attributed to relationships between people. Collaborative work requires mutual trust between humans and respectful interaction based on human rights and a sustainable way of interacting. Decision making in a trusted network of people is done by many individuals and is composed of multiple single decisions.

  • Explain how a collaborative objective (e.g. of the Humanitarian OpenStreetMap Team, Doctors Without Borders, Wikipedia, ...) can be accomplished and contribute to a Common Good.
  • Define the terms Collaboration and Interoperability, and identify examples in which interoperability improves collaboration and, vice versa, in which collaboration defines requirements for interoperability.
  • Analyze learning environments among students and the relationship between students and teachers with respect to trust and privacy. The final degree will show the success at the end of a learning process. Explain the role of trust between students and teachers within the learning process, i.e. being able to make errors and learn from them without fear, with teachers acting as facilitators of learning processes and as supporters when difficulties and problems occur.

Trust between groups/organisations

Conceptually, trust is also attributable to relationships within and between social groups (families, friends, communities, organisations, companies, nations, etc.). It is a popular approach to frame the dynamics of inter-group and intra-group interactions in terms of trust.[6] The trust formations of people towards different actors in society have been shaped into a generic model of trust with three levels of application, i.e. Macro, Micro, and Meso.[7]

Trust of People in Technology/ICT

The Digital Revolution, the Internet of Things (IoT) and the notion of alternative facts create a high scientific demand for trust in technology, ICT and privacy, especially given the commercial data harvesting strategies of TwoogleBook et al.[8] One of the key current challenges in the social sciences is to re-think how the rapid progress of technology has impacted constructs such as trust. This is specifically true for information technology that dramatically alters causation in social systems.[9]

We access data, information and knowledge with information and communication technology. Key questions are:

  • Is the information built on scientific facts and evidence?
  • Who generated the information, and are the named authors the real authors of the information?
  • What are the drivers for publishing a certain piece of information?

In general we have to distinguish between:

  • (Technology) trust in technology, i.e. that it works as expected (no exploitation of user data),
  • (Information Channel) trust in an information channel, and
  • (Information) trust in the information exchanged via the available information channel (a sketch contrasting channel trust and information trust follows this list).
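
As a minimal illustration of the difference between trusting the channel and trusting the information itself, the following Python sketch compares a received document against a content fingerprint published by its author (the document text and the publication path are purely illustrative assumptions). A secure channel such as TLS protects the transmission, while the fingerprint supports trust in the information regardless of which channel delivered it.

```python
import hashlib


def sha256_fingerprint(data: bytes) -> str:
    """Content fingerprint: supports trust in the information itself,
    independently of the (possibly untrusted) channel it arrived through."""
    return hashlib.sha256(data).hexdigest()


# Hypothetical document received via some information channel (mail, web, USB stick, ...).
received_document = b"Contact tracing guidance, version 1.0 ..."

# Fingerprint published separately by the author, e.g. on a page reached via a trusted channel.
published_fingerprint = sha256_fingerprint(b"Contact tracing guidance, version 1.0 ...")

if sha256_fingerprint(received_document) == published_fingerprint:
    print("The content matches what the author published: trust in the information is supported.")
else:
    print("Fingerprint mismatch: do not trust this copy of the information.")
```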

When it comes to the relationship between people and technology, the attribution of trust is a matter of dispute. The intentional stance[10] demonstrates that trust can be validly attributed to human relationships with complex technologies. However, rational reflection leads to the rejection of an ability to trust technological artefacts.[11]

In the social sciences, the subtleties of trust are a subject of ongoing research. In sociology and psychology the degree to which one party trusts another is a measure of belief in the honesty, fairness, or benevolence of another party. The term "confidence" is more appropriate for a belief in the competence of the other party. Based on the most recent research [citation needed], a failure in trust may be forgiven more easily if it is interpreted as a failure of competence rather than a lack of benevolence or honesty. In economics, trust is often conceptualized as reliability in transactions. In all cases trust is a heuristic decision rule, allowing the human to deal with complexities that would require unrealistic effort in rational reasoning.

See also

Resources

References

  1. Mayer, R.C., Davis J.H., Schoorman F.D. (1995). An integrative model of organizational trust. Academy of Management Review. 20 (3), 709-734.
  2. Bamberger, Walter (2010). "Interpersonal Trust – Attempt of a Definition". Scientific report, Technische Universität München. Retrieved 2011-08-16.
  3. Seligman, Adam B. (1998). "On the limits of Confidence and Role Expectations". American Journal of Economics and Sociology. 
  4. McKnight, D. H., and Chervany, N. L. (1996). The Meanings of Trust. Scientific report, University of Minnesota.
  5. Kosfeld, M., Heinrichs, M., Zak, P. J., Fischbacher, U., and Fehr, E. (2005). "Oxytocin increases trust in humans". Nature 435, 673-676.
  6. Hardin, R. (eds.) (2002). Trust and trustworthiness. Russell Sage Foundation.
  7. Khosrowjerdi, M. (2016). Trust in people, organizations, and society: a generic model. International Journal of Electronic Government Research, 12(3), 55-70. doi:10.4018/IJEGR.2016070104
  8. "TwoogleBook et. al." is an artificial word for data harvesting companies mainly used to avoid real company names.
  9. Luhmann, N. (2005) Risk: a sociological theory. AldineTransaction.
  10. Dennett, D.C. (1989) The Intentional Stance. Bradford Books.
  11. Shneiderman, B. (2000) Designing trust into online experiences. Communications of the ACM Volume 43, Number 12, Pages 57-59