Advocacy in Technology and Society/Social Work + Technology

Topic Summary

In this lecture we learned that we are all embedded in technological systems, and that technological systems are embedded in our social lives. Some people are most at risk from, or most vulnerable to, technological systems because of their race, class, or lack of privilege. Social workers use Data Justice to fight for fairer systems, both for ourselves and for others. Social workers also act as advocates by shaping tech systems from within. We use technology tools, such as easy transfer of information and feedback loops, to further our efforts, and we discussed how social workers can insert ourselves into the process to create better, fairer, and more just technology. Our framework asked how to be critical of something we depend on: technology.

We then discussed Digital Social Work and the idea that every social worker is a digital social worker. We must view the internet as a site where everyone, including social workers, can meet, even as we all go to different venues, or websites. Digital tools can enhance many of our interactions, even though they are not face to face as in-person meetings would be. Service organizations need to move to a digital model and away from more analog ways of doing things, such as paper charts. The issue is that these organizations do not necessarily have the bandwidth or skills to move everything over to digital, which becomes a problem now that 90% of people are online. We also discussed how to take our knowledge work, or mental models of social work, and digitize it: how can we encode knowledge for those who are technology resistant? Finally, we discussed neoliberalism, how it has infiltrated social work, and how it has caused a shift away from grassroots organizing toward standardized social work.

Data Justice

The Data Justice Framework builds on pursuits of fairness and goodness in the world. It draws on the OECD's Fair Information Practice principles: collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability.

The framework is a way to unify movements against tools that can harm people. It is used globally, not just in the United States.

We discussed "Data For Good" which helps community based organizations that do not have the internal capacity to make sustainable technology platforms. The way to combat this issue is making the participation from other groups more sustainable. Social workers can help with these efforts by using our own knowledge as well. In order to sustain these organizations and move forward, we need feedback from stakeholders and who is impacted by them and skills that are needed. We need to also learn how to deal with Technology Superiority. The rise in automated products began in 2017, and this is when people believed that Data Science should also have a hippocratic oath, similar to social work. Data science borrowed and used the NASW Code of Ethics for their oath of data science. Their minimum standards include transparency, accountability, participatory, design/decision making. We also discussed using new tools to combat the issue of those who are vulnerable to being harmed online. We can create new tools using a PROP lens.

The Future of Data Justice: Community Power and Data-Driven Systems, a conversation with the Digital Equity Lab, based at the Milano School of Public Policy at The New School.

Data Feminism: "What Gets Counted Counts"

This chapter discussed websites that require signing up or logging in. Most websites require you to choose your gender, but they only offer male or female. This excludes non-binary individuals, and the authors believe there are enough such people to warrant change. Joni Seager, a geographer, says that "what gets counted counts" because gender is used as the basis for policymaking and resource allocation, and being non-binary means becoming invisible in these decisions. More global data on gender is being collected than ever before, but it still leaves people out, including non-binary people, lesbians, and older women. Women are also asked very narrow questions about their lives, such as what contraception they use, rather than about their interests. Evidently, such design decisions were made so that advertisers could more easily market to one gender or the other.

"This discrepancy leads right back to the issues of power we’ve been discussing since the start of this book: it’s corporations like Facebook, and not individuals like Maria Munir, who have the power to control the terms of data collection. This remains true even as it is people like Munir who have personally (and often painfully) run up against the limits of those classification systems—and who best know how they could be improved, remade, or in some cases, abolished altogether." (D’Ignazio, Catherine & Klein, Lauren,2020)

The criteria by which people are divided into the categories of man and woman constitute exactly such a classification system. Feminist scholars tried to separate sex from gender but could not, because both gender and sex are social constructs. The American Medical Association now calls gender a "spectrum" rather than a binary, and as of 2018 it issued a firm statement that "sex and gender are more complex than previously assumed."

Early on, increasingly racist systems of classification began to emerge, along with pseudosciences like comparative anatomy and physiognomy. These allowed elite white men to provide a purportedly scientific basis for the differential treatment of people of color, women, disabled people, and gay people, among other groups. In each of these cases, as is true of any case of not fitting (or not wanting to fit) neatly into a box, it’s important to ask whether it’s the categories that are inadequate, or whether—and this is a key feminist move—it’s the system of classification itself. The matrix of domination describes how race, gender, and class (among other things) intersect to enhance opportunities for some people and constrain opportunities for others. Under the matrix of domination, normative bodies pass through scanners, borders, and bathrooms with ease; these systems have been designed by people like them, for people like them, with an aim—sometimes explicit—of keeping people not like them out. Being represented, though, also means being made visible, and being made visible to the matrix of domination—which continuously develops laws, practices, and cultural norms to police the gender binary. This poses significant risks to the health and safety of minorities.


The book says that data must be classified in some way to be put to use; by the time information becomes data, it has already been classified in some way. "What distinguishes data from other forms of information is that it can be processed by a computer, or by computer-like operations," as Lauren has written in an essay coauthored with information studies scholar Miriam Posner. And to enable those operations, which range from counting to sorting and from modeling to visualizing, the data must be placed into some kind of category—if not always into a conceptual category like gender, then at the least into a computational category like Boolean (a type of data with only two values, like true or false), integer (a type of number with no decimal points, like 237 or −1), or string (a sequence of letters or words, like "this") (D’Ignazio & Klein, 2020).
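To make the computational point concrete, here is a minimal Python sketch (our own illustration, not an example from the book) of how the category a designer chooses for a field determines who can be counted: a Boolean gender field cannot represent non-binary respondents at all, while an open string field lets every answer survive into the counts.

  # A minimal sketch (illustrative, not from the book) of how the
  # computational category chosen for a field determines what gets counted.
  from collections import Counter
  from dataclasses import dataclass

  @dataclass
  class BinaryRecord:
      is_female: bool  # Boolean field: True/False is the entire answer space

  @dataclass
  class OpenRecord:
      gender: str  # string field: can hold answers the designers never anticipated

  responses = ["female", "male", "non-binary", "female", "non-binary"]

  # Under the Boolean schema, two of the five answers are unrepresentable
  # and are silently dropped before anything is counted.
  binary_rows = [BinaryRecord(is_female=(r == "female"))
                 for r in responses if r in ("female", "male")]
  print(len(binary_rows))  # 3 -- two respondents have vanished from the data

  # Under the open schema, every answer survives into the counts.
  open_rows = [OpenRecord(gender=r) for r in responses]
  print(Counter(row.gender for row in open_rows))
  # Counter({'female': 2, 'non-binary': 2, 'male': 1})

The point is not these particular types but their timing: the schema decision happens before counting begins, so whoever chooses the category has already decided who can appear in the data.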


The ethical complexity of whether, when, and how to count gender illuminates the complexity of acts of classification against the backdrop of structural oppression. By challenging the binary thinking that erases the experiences of certain groups while elevating others, we can work toward more just and equitable data practices, and consequently toward a more just and equitable future. Binary categories fail to acknowledge what rests between the two categories or outside them, and the creators who fit within these binary groups rarely think about those who do not. For example, Facebook’s algorithms disproportionately flag Native American names as violations because those names often differ in structure and form from Anglo-Western names. The Facebook example demonstrates the fundamental importance of obtaining consent when counting, and of enabling individuals to refuse acts of counting and classification in light of potential harms. When data are collected about real people and their lives, risks, ranging from exposure to violence, are always present. But when counting is deliberately considered, and when consent is obtained, it can contribute to efforts to increase valuable and desired visibility; and when a community is counting for itself, about itself, data collection has the potential to be not only empowering but also healing.
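To see how such disproportionate flagging can arise mechanically, consider the deliberately naive toy validator below (a hypothetical Python sketch, not Facebook's actual system): any rule tuned to the structure of Anglo-Western names will flag other naming traditions at a far higher rate.

  # A deliberately naive, hypothetical name rule -- not Facebook's actual
  # system -- tuned to the two-token structure of Anglo-Western names.
  import re

  # Assumes a "real" name is exactly two single-word capitalized tokens.
  ANGLO_PATTERN = re.compile(r"^[A-Z][a-z]+ [A-Z][a-z]+$")

  def flagged(name: str) -> bool:
      """Return True if the toy rule would flag the name as 'inauthentic'."""
      return ANGLO_PATTERN.match(name) is None

  # Multi-word Native American surnames fail the rule; Anglo names pass.
  for name in ["Dana Lone Hill", "Robin Kills The Enemy", "John Smith"]:
      print(f"{name}: {'flagged' if flagged(name) else 'ok'}")
  # Dana Lone Hill: flagged
  # Robin Kills The Enemy: flagged
  # John Smith: ok

Nothing in the rule mentions race, yet the flag rate is entirely determined by whose naming conventions the designer treated as the default.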


No person can fit their whole self into a form, regardless of how many blank text fields are provided. Counting and measuring do not always have to be tools of oppression. We can also use them to hold power accountable, to reclaim overlooked histories, and to build collectivity and solidarity. When we count within our own communities, with consideration and care, we can work to rebalance unequal distributions of power.

Organizing in the Digital Age: Digital Macro Practice is Here…to Stay

These articles illustrate how digital technologies, including social media, have impacted the ways individuals, groups, and communities come together to advocate and effect social change.

The authors bring awareness to the need for social workers in the technology industry. Their hope is to encourage social workers to take a more proactive role in integrating digital technologies in meaningful ways and in advocating for digital justice. "The social work profession is at a critical crossroad where we can take a proactive role in influencing the ethical use of digital technologies to benefit social good and advance social change, rather than be reactive to the whims of technology companies and developers that thus far, have dictated the rules of digital engagement and participation"[1].

The aspirational goal for the future of social work and technology is to improve the quality and length of life for the most socially underrepresented and sidelined members of society.

Thus far, the technological advancements of the 21st century have primarily benefited the privileged. It has become crucial for social workers to understand, and to protest, advancing technological mechanisms of disenfranchisement. This responsibility also includes the necessary step of being involved in producing and implementing technology and in evaluating its myriad unintended impacts.

What is Data Justice? The Case for Connecting Digital Rights and Freedoms Globally

This paper addresses the concern that data-driven discrimination is advancing at the same pace as data processing technologies, while awareness of it and mechanisms for combating it are not. There is a noticeable shift in policymaking worldwide from being data-informed to being data-driven. Data sources that allow researchers to gather people’s "movements, activities and behavior have ethical, political and practical implications for the way people are seen and treated by the state and by the private sector (and, importantly, by both acting in combination)"[2]. This uneven distribution of privacy has significant social and political implications for lower-income communities, where authorities’ ability to gather accurate statistical data has previously been limited.

The power of data to sort, categorize, and intervene has not yet been connected to a social justice agenda by the agencies and authorities who collect, manage, and use data. There is also little awareness among the architects of these systems that new data technologies may not be unbiased in their access, use, or impacts.

Data justice, which is described as "fairness in the way people are made visible, represented and treated as a result of their production of digital data"[2], is necessary to determine ethical paths through our data-driven world.

The article proposes three pillars for international data justice:

  • (in)visibility
  • (dis)engagement with technology
  • (anti)discrimination

These pillars challenge the current data protection regulations and the growing assumption that being visible through the data we emit is part of the contemporary social contract.

Insights

In this topic, we discussed and identified the groups disproportionately at risk from the implementation of new technology. Social workers can add enormous value to the technology industry by sharing ethical standards of practice for fairer and more just technologies that ensure human rights on digital platforms. As social workers, we can offer our unique skill set and knowledge in human rights, social justice, and social and environmental ethics. These skills are widely lacking in the technology world, and their absence has contributed to social problems such as blatant discrimination in AI products and services, which has resulted in the victimization and criminalization of marginalized communities. Going forward, it will be crucial for social workers to apply their skills and knowledge to technologies for social justice. There is an emerging need to engage social workers in technology development, deployment, and implementation to prevent further harm to our society.

What does advocacy look like here?

How can advocates use technology and data-based practice to further their advocacy efforts?

Social workers and advocates can use technology to bring further awareness to data injustice in many ways. As discussed in class, social workers are needed throughout all stages of data and technology innovation. We must hold technology companies accountable when data is mishandled or when data science disregards one or more of its minimum standards of transparency, accountability, and participatory design/decision-making.

Additionally, technological advances are expanding advocates' ability to communicate with a wider audience and have the potential to make advocacy efforts more effective and efficient. Advocates can use technology to effect societal change by recruiting a larger audience, organizing efforts online, raising funds, and communicating with major stakeholders.

How are current and emergent technologies presenting challenges in society, and how can advocates work in pursuit of better, fairer, and more just technology?

Breakthrough technological advances are being made across a range of fields. These technologies are highly disruptive, bringing about major transformative shifts in how our society functions. The challenges they present include:

  • Dislocation of labor markets and other market disruptions
  • Deepened inequalities
  • New risks to public and national security

Social workers and data justice advocates are crucial to mitigating the disruptive impact technology has on our society. There needs to be a stronger presence of social workers in the technology field to advocate for the groups most vulnerable to the implementation of new technology. Finally, social workers must advocate for our community members to have the opportunity to assist in developing, implementing, and evaluating emerging technologies.

As society develops, the significance of information technology increases with the technologization of human activity, and technology becomes a driver of social transformation, affecting various structures and subsystems of society, including social relations. Information technology directly affects the development of the economy and society. Advocacy for technology looks like the following:

  • The impact of information technology on social life has mainly meant that information has become more accessible to people.
  • Social attitudes have changed, and people now know that various elements of society are better informed than before. People also expect to be able to access more information about a particular product, service, or organization in order to make informed decisions about their future behavior.
  • Information technologies make it possible to optimize and, in many cases, automate information processes, which in recent years have occupied an increasing place in the life of human society (Wolff, 2021). The development of civilization is moving toward an information society, in which the objects and results of the labor of the majority of the employed population are no longer material values but mainly information and scientific knowledge.
  • Information technologies today occupy a central place in the intellectualization of society and in the development of its educational system and culture (Brey et al., 2012). In developed and developing countries alike, computer and television equipment, educational programs on optical discs, and multimedia technologies are becoming common attributes of higher educational institutions as well as ordinary primary and secondary schools. Educational information technologies have proven to be a very effective method for self-education and for systems of advanced training and retraining of personnel.
  • By reducing the fixed costs of employment, widespread telecommuting helps people work flexibly part-time, share jobs, or hold two or more jobs simultaneously (Wolff, 2021). Since changing employers does not necessarily require a change of location, telecommuting should improve job mobility and accelerate career growth. This increased flexibility may also reduce job stress and increase job satisfaction.



References

  • Brey, P., Briggle, A., & Spence, E. (2012). The good life in a technological age (Routledge Studies in Science, Technology and Society). Abingdon: Routledge.
  • Wolff, J. (2021). How is technology changing the world, and how should the world change technology? Global Perspectives, 2(1), 27353. https://doi.org/10.1525/gp.2021.27353

Darla Drendel

Annotated References

  • D’Ignazio, C., & Klein, L. (2020). Data Feminism (Chapter 4: "What Gets Counted Counts"; Principle: Rethink Binaries and Hierarchies). MIT Press.
  • https://datajusticelab.org/
  • Poversky, A. (n.d.). Where Digital Justice Can Begin | TEDxUBC [Video]. YouTube. https://www.youtube.com/watch?v=6FnWiXAV-aQ
  • Kavanagh, C. (2019, August 28). New Tech, new threats, and new governance challenges: An opportunity to craft smarter responses? Carnegie Endowment for International Peace. Retrieved April 25, 2022, from https://carnegieendowment.org/2019/08/28/new-tech-new-threats-and-new-governance-challenges-opportunity-to-craft-smarter-responses-pub-79736
  • Rodriguez, M., Storer, H., & Shelton, J. (2021). Organizing in the digital age: Digital macro practice is here…to stay. Journal of Community Practice, 29(3), 199–202. https://doi.org/10.1080/10705422.2021.1984178
  • Thackeray, R., & Hunter, M. (2010). Empowering youth: Use of technology in advocacy to affect social change. Journal of Computer-Mediated Communication, 15(4), 575–591. https://doi.org/10.1111/j.1083-6101.2009.01503.x
  • Taylor, L. (2017). What is data justice? The case for connecting digital rights and freedoms globally. Big Data & Society, 4(2). https://doi.org/10.1177/2053951717736335
  1. Rodriguez, Maria; Storer, Heather; Shelton, Jama (2021). "Organizing in the digital age: digital macro practice is here…to stay". Journal of Community Practice 29(3): 199–202. doi:10.1080/10705422.2021.1984178. ISSN 1070-5422.
  2. 2.0 2.1 Taylor, Linnet (2017). "What is data justice? The case for connecting digital rights and freedoms globally". Big Data & Society 4(2). doi:10.1177/2053951717736335. ISSN 2053-9517.