Advocacy in Technology and Society/Emerging Tech - Identifying the Stakeholders



Topic Summary

Summarize what was covered during this topic through the lecture, discussion, and the readings assigned this week. 

Stakeholder Analysis

Our class in Week 4 centered on how to identify the stakeholders of emerging technologies. To perform a stakeholder analysis, we must systematically narrow in on who is most impacted by emerging technologies, who holds the control or decision-making power to shape the technology, who is likely to benefit from it, and what the relationship is between the impacted and the impacting stakeholders. Systematically organizing this information may include grouping people by demographics, location, or socioeconomic status.

Once we can narrow in on the ‘who’ and ‘what’ of the situation, we can dive deeper into the changes likely to happen over time and how each stakeholder will play a role in the direction of decision-making.

The goal of a stakeholder analysis is to identify risk: to see who and what is missing from the current activity so that we can organize to mitigate future negative impacts and establish concrete criteria for who should be held accountable if things do go wrong.

To prepare for this week's conversation, students read:
Data Feminism by Catherine D'Ignazio and Lauren Klein: In "What Gets Counted Counts," Chapter 4 of Data Feminism, D'Ignazio and Klein discuss how classification systems, although essential to any working infrastructure, encompass and enforce binaries and hierarchies. The chapter prompts readers to understand one of the principles of a data feminist perspective: rethinking binaries and hierarchies. In assessing the classification systems that currently exist, Klein and D'Ignazio use the gender binary as an example that illustrates how acts of counting and classification must always find a balance between harming and benefitting marginalized communities. Chapter 4 further challenges readers not only to assess who is and is not being counted, but also to recognize the power that classification systems hold in telling an individual's or a collective's experience of oppression. Klein and D'Ignazio identify how counting and classification systems act as reflections of the social, cultural, or political values of a given community or time period. Thus, to provide an accurate reflection and depiction of marginalized communities, Klein and D'Ignazio assert that the role of the storyteller should be held by a community member.
Data Justice: Social Work and a More Just Future by Lauri Goldkind, Lea Wolf, and Walter LaMendola: Goldkind et al. urge social workers to understand the extent to which new data-driven technologies oppress the populations they aim to serve at the micro, mezzo, and macro levels. The article highlights a selection of technologies whose data and algorithmic strategies have impacted the well-being of individuals, organizations, and communities. In discussing the ramifications of emerging data-driven technologies, Goldkind et al. note that theories and frameworks have likewise emerged to ameliorate the impact of those technologies. Ultimately, the article promotes the need for the social work profession to adopt a data justice framework, one that brings social work ethics up to date with the data revolution.
Students were provided with a couple of optional readings:
Automating Inequality by Virginia Eubanks: In Chapter 1, "From Poorhouse to Database," Virginia Eubanks provides a historical analysis of the poorhouse. Eubanks identifies how, from their conception, these institutions were developed to provide public economic relief to low-income folks yet instead functioned as systems that punished people experiencing poverty. The 19th century relied heavily on poorhouses as a method of regulating poverty, acting as a classification system that differentiated the "deserving" from the "undeserving" poor. This binary of poorness was reinforced by the scientific charity movement, which advocated for caseworkers whose primary job was to filter out those not identified as "deserving" of public welfare benefits. Eubanks further discusses how periods of economic crisis, elected officials, and policies contributed to the shapeshifting of poorhouses over time. Although the physical infrastructure of poorhouses no longer exists, the foundational values and beliefs that upheld those institutions for over a century persist and have transformed into automated decision-making systems that continue to profile, police, and punish people experiencing poverty.
In Chapter 3, "High-Tech Homelessness in the City of Angels," Eubanks highlights the automated decision-making system created to streamline the process of connecting a person who is unhoused to a housing resource in Los Angeles: the Coordinated Entry System (CES). In discussing the steady shrinking of Skid Row, a 50-square-block area in downtown Los Angeles that is home to a community of people who are unhoused, Eubanks reveals that the CES does not account for the lack of affordable housing. That absence of affordable housing is attributed to the many instances in which the city of Los Angeles, whether through city officials or community members, chose to prioritize profit and to reinforce the binary of "deserving" and "undeserving" poor. Eubanks asserts that this binary is ever-present in the CES algorithm, which ranks people who are unhoused according to their vulnerability level. In discussing the CES data entry process, Eubanks shines a light on the personhood of the people behind the CES data. Although CES data has informed the public on the state of the homelessness crisis in Los Angeles, through a critical analysis of the CES privacy practices, or lack thereof, Eubanks reveals that it is yet another system that surveils and criminalizes people experiencing poverty.

Stakeholders

In order to systematically categorize information, people must be grouped into one of three levels of stakeholders:

Primary Stakeholders: the people most affected by the technology and most dependent on the resources. These are the people targeted during the development of an emerging technology and who have the most to gain or lose from its impacts. For example, in class we discussed an app that sent automatic alerts if the user had been exposed to Covid-19. In this example, the primary stakeholders are the general public, or those who had the app and were either receiving or giving Covid exposure information.

Secondary Stakeholders: the people who are indirectly affected, less affected by the technology than the primary stakeholders, or less dependent on the resource. For example, Apple may house the app in its App Store without directly shaping the technology.

Key Stakeholders: the people who have significant influence over the technology or are the most impacted by it (they may also be primary or secondary stakeholders). In the Covid exposure alert example, key stakeholders would be the technology builders, philanthropists, health advocacy groups, or policymakers.
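To make the grouping above concrete, the sketch below is a hypothetical illustration in Python (not part of the course material) that organizes the Covid exposure app stakeholders by level; the stakeholder names are taken from the examples above, and the data structure itself is only an assumption about how one might keep the grouping organized.

# Hypothetical sketch: grouping the Covid exposure app stakeholders
# into the three levels described above. Names and structure are
# illustrative assumptions, not part of the course material.
covid_app_stakeholders = {
    "primary": [
        "app users receiving exposure alerts",
        "app users reporting positive tests",
    ],
    "secondary": [
        "Apple (houses the app in its App Store)",
        "Google (houses the app in its Play Store)",
    ],
    "key": [
        "technology builders",
        "philanthropists",
        "health advocacy groups",
        "policymakers",
    ],
}

for level, stakeholders in covid_app_stakeholders.items():
    print(f"{level.title()} stakeholders:")
    for name in stakeholders:
        print(f"  - {name}")

A structure like this simply keeps the "who" organized before moving on to the analysis grids described in the next section.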


Stakeholder Analyses

Most stakeholder analyses are built on a grid, a format made even more popular by the widespread use of Microsoft Excel.

Power/Interest Grid (Eden & Ackermann, 2014): This four-square grid analyzes how much power and control a stakeholder has over the outcome (high or low) and how invested, interested, or impacted they are in the success of the project. It can be important to closely engage those in power who also have an interest in a positive outcome (a minimal sketch of this grid appears at the end of this list of methods).

Power/Dynamism Grid (Gardner et al., 1996): This four-square grid analyzes how power and influence shift over time and how readily that power can be influenced (dynamism). The factors influencing the outcome must be aligned with the values of the mission, adding in the cultural aspects of change such as policy and current advocacy efforts.

3dSA (Three-Dimensional Stakeholder Analysis): A three-dimensional grid that uses stakeholder power, interest, and attitude. Attitude adds a layer of culture and context, looking at how everyday shifts affect a stakeholder's level of engagement. This allows a more dynamic evaluation by incorporating the size and degree of influence.

Stakeholder Salience Method: A Venn diagram of three overlapping circles analyzing power, legitimacy, and urgency. Legitimacy represents the people and things that matter most, and viewing its overlap with power and urgency identifies where goals are aligned or misaligned.

RACI (Responsible, Accountable, Consulted, and Informed) Matrix: A chart mapping an array of stakeholders in an organization, based on their positionality, to their level of engagement with specific tasks.

Ethical Matrix (Cathy O'Neil): A matrix used in auditing algorithms.

Looking at different levels of ecosystems: community, neighborhoods, individuals, and individuals belonging to certain groups. For example, for the Covid exposure app, the evaluation may look at groups such as healthcare providers, Google/Apple, or those in high- or low-risk areas (ideally allowing users to identify and label themselves, and thinking about structural oppression). It is also important to keep in mind the layers of data protection: for example, the risks imposed if a user identified as having pre-existing conditions or a certain insurance coverage, and how that information could be used against the user in the future.

Once the stakeholders are identified, they are compared against criteria such as well-being, autonomy, fairness, and anti-oppressive, anti-racist, and inclusive practice.
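As a minimal illustrative sketch of the power/interest grid referenced above (an example built on assumptions, not a prescribed tool from the course), the snippet below scores each stakeholder on power and interest and places them into one of the four quadrants. The stakeholder names, numeric scores, and quadrant labels are all hypothetical.

# Hypothetical sketch of a power/interest grid. Scores, threshold,
# and quadrant labels are illustrative assumptions; the grid simply
# crosses power (high/low) with interest (high/low).
from typing import NamedTuple

class Stakeholder(NamedTuple):
    name: str
    power: float     # 0.0 (no power) to 1.0 (high power)
    interest: float  # 0.0 (no interest) to 1.0 (high interest)

def quadrant(s: Stakeholder, threshold: float = 0.5) -> str:
    """Place a stakeholder into one of the four power/interest quadrants."""
    if s.power >= threshold and s.interest >= threshold:
        return "high power / high interest: engage closely"
    if s.power >= threshold:
        return "high power / low interest: keep satisfied"
    if s.interest >= threshold:
        return "low power / high interest: keep informed"
    return "low power / low interest: monitor"

# Example scores for the Covid exposure app (illustrative only)
stakeholders = [
    Stakeholder("app users", power=0.2, interest=0.9),
    Stakeholder("policymakers", power=0.9, interest=0.7),
    Stakeholder("Apple/Google app stores", power=0.8, interest=0.3),
]

for s in stakeholders:
    print(f"{s.name}: {quadrant(s)}")

The 0.5 threshold is arbitrary; in practice, placement on the grid would come from discussion with the stakeholders themselves rather than from fixed scores.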

Crisis Text Line Case Study: Crisis Text Line is a tech-based startup that responds to users who reach out in a mental health emergency. The goal is to create care in a tech setting; however, the company is funded by Silicon Valley. This creates a conflict because the organization was invested in not for the sake of equity, but for access to coded user data. Until February 2022, Crisis Text Line shared its anonymized data with a company developing an artificial intelligence product. This inherently benefits that company; while it does not receive personal information, it still uses the conversations to further its AI learning.

Additionally, organizations like Crisis Text Line can take advantage of the gaps in the current mental health care system: users may benefit from the immediate help, but it may prevent them from receiving more extensive mental health services. At the same time, these text lines may be a very effective option for people who do not have the access or resources to utilize other mental health services.

One additional consideration is that the crisis responders are often volunteers, while the trainers are clinically based providers. While it is great that there is clinical oversight, the majority of the organization's work is done by volunteers.

Insights

Go beyond your summary; synthesize what you learned on this topic, drawing on personal insights and those of your peers, the content, and what you read in class and in the media.

Throughout this class it became clear that social workers have a vast role to play in the technology field. Considering who the stakeholders are and who has a genuine interest in the outcome leads me to believe that emerging tech companies would greatly benefit from the social work code of ethics and from the knowledge and expertise that social workers bring to the table. There are many conversations about whether it is more effective to work within systems or to influence them externally. While these models note the influence of external key stakeholders such as advocacy organizations or philanthropists, it is important to recognize the value of having social workers within the companies themselves. From within, a social worker can take on the role of an advocate for the primary stakeholders while retaining the influence of an internal stakeholder.

Furthermore, across all stakeholders, a wide array of perspectives and a diverse group of people are imperative to conducting the analysis effectively and thoroughly, as they increase the chances that all aspects of the evaluation are considered. The prevalence of systemic oppression across our systems and institutions creates an increased level of risk for marginalized communities, particularly when there are no advocates in the room who share their identity. Assembling a more diverse group of stakeholders in turn protects the most vulnerable, and thus the wider population as a whole, benefitting all primary stakeholders involved and mitigating potential risks.


What does advocacy look like here?

Considering your topic, respond to the following: 1. How can advocates use technology and data-based practice to further their advocacy efforts? And 2. How are current and emergent technologies presenting challenges in society, and how can advocates work in pursuit of better, fairer, and more just technology?

Equitable Participation of Key Stakeholders

When evaluating stakeholders, it is important to understand who is at the table and has power in the decision-making, and who does not have influence but should. Stakeholders should represent a diverse range of perspectives and backgrounds so that risks can be mitigated and primary stakeholders have advocates acting on their behalf. Considering the power/dynamism grid, the values of the key stakeholders should align with the goals of the company, keeping those in power more invested in the work they are doing.

Data Justice Framework (Linnet Taylor, 2017):

Visibility: Analyzing how the users of a technology are represented and how the privacy of their personal data is protected.

Engagement with Technology: How, in practice, do we give people the option to opt in or out of tools that can be potentially harmful if companies use their data in precarious or exploitative ways or sell sensitive data to other actors?

Nondiscrimination: During the development of a technology, who are the people in decision-making positions, and what data is being fed into algorithms that could exacerbate or challenge existing biases? How do we ensure that users will not be discriminated against based on biased data inputted into the system?

Adding Internal and External Audits: Rather than placing the full onus on community members, bringing in additional stakeholder representatives to evaluate the technology.

Echo Park Lake Residents Displaced

A photo of Echo Park Lake located in Los Angeles, California. Photo captures a lake, waterfall, and palm trees on a sunny Saturday in March of 2022.

Social workers can incorporate a data justice framework when assessing the impact of automated decision-making technologies like the Coordinated Entry System (CES) and the manner in which it was used to displace people who were unhoused and had established a home in the grassy parts of Echo Park Lake. Echo Park is one of the public parks owned and operated by Los Angeles City Parks. Over the course of the COVID-19 pandemic, people struggling with homelessness began to establish a commune-like encampment within the perimeter of the park. This community was short-lived: the people who were unhoused were uprooted from their homes and displaced on March 26, 2021. Nearly a year later, the park that was once open and bustling with community members and visitors is surrounded by a seemingly temporary fence and is nearly empty most days.

A Los Angeles Times column highlighted the experience of David Busch-Lilly, a former Echo Park Lake resident who was displaced by police and city officials who promised that he, along with his neighbors, would be connected to housing.

Unfortunately, David's case of being handed an empty promise was not unique:

Of the 183 people who were unhoused and residing at Echo Park Lake, 17 received long-term housing, 48 are on a waiting list, 15 were unsuccessfully housed, 6 died, and at least 15 went back to living on the streets, as reported and discussed by the UCLA Luskin Institute on Inequality and Democracy and Erika Smith, a columnist for the Los Angeles Times.


In thinking about the experiences of these community members, social workers may advocate for their well-being by urging city and police officials to view the CES as a tool that should support their efforts to establish funding for long-term affordable housing.


Successful Advocacy Efforts in Protecting the Digital Rights of Low-Income Folks

As mentioned by Goldkind et al. in their discussion of data justice, Dutch civil rights activists successfully advocated for the digital rights and well-being of low-income folks to be prioritized over the Dutch government's surveillance practices.

Through these activists' efforts, it was revealed that Dutch government agencies were utilizing a personal data system called SyRI, which stands for "system risk indication," to seek out "unlikely citizen profiles that warrant further investigation". Civil rights activists challenged these acts of surveillance, and by doing so prompted public organizations to reevaluate their processes for collecting, managing, and protecting data gathered from the public. Social workers can learn from successful efforts like these to identify the stakeholders who need to be involved in effecting similar change.



Annotated References

Cite what we covered in class and include 4 additional resources you find on the topic. Cite resources that are related to the topic; they can agree with the topic, extend it, disagree with it, or present a conflict with it.

D'Ignazio, Catherine, & Klein, Lauren (2020). Data Feminism, Chapter 4: "What Gets Counted Counts" (Principle: Rethink Binaries and Hierarchies). MIT Press.

Goldkind, Lauri, Wolf, Lea, & LaMendola, Walter (2021). Data Justice: Social Work and a More Just Future. https://www.tandfonline.com/doi/full/10.1080/10705422.2021.1984354

Eubanks, Virginia (2018). Automating Inequality, Chapter 1: "From Poorhouse to Database" & Chapter 3: "High-Tech Homelessness in the City of Angels". St. Martin's Press.

MindTools. (n.d.). Stakeholder analysis: Winning support for your projects. Project Management Skills From MindTools.com. Retrieved April 25, 2022, from https://www.mindtools.com/pages/article/newPPM_07.htm

Smith, Erika D. “Column: He Had to Leave Echo Park Lake and He’s Still Homeless. Is This What L.A. Wants?” Los Angeles Times, 26 Mar. 2022, https://www.latimes.com/california/story/2022-03-26/echo-park-lake-encampment-raises-questions-los-angeles-homeless-policy.

Vervloesem, Koen. “How Dutch Activists Got an Invasive Fraud Detection Algorithm Banned.” Automating Society Report 2020, Algorithm Watch, https://automatingsociety.algorithmwatch.org/report2020/netherlands/netherlands-story/.