Still Searching…

From 2012 to 2023, the discourse blog of Fotomuseum Winterthur engaged interdisciplinarily with all aspects of photography and its role in visual culture. The nearly 50 invited bloggers of Still Searching… discussed photographic media and forms as components of complex technological, capitalist and ideological networks and addressed the most current and relevant questions surrounding photography.

Blog series: Digital Infrastructures of Race and Gender

Safiya Umoja Noble | 06.12.2017 – 31.01.2018
Digital Infrastructures of Race and Gender

Until the end of January, Safiya U. Noble explores the intersectional ways race and gender are embedded in digital infrastructures. Noble suggests that logics and structures of race are a matter of network and platform design, which encode values that cannot be divorced from the digital. To open, she investigates the erosion of humanities and social science courses in the education of engineers and suggests that the erasure of sociality impacts conceptions of technology’s promise. Later in the series, she explores other dimensions of the social stack and how race and gender are embedded in contemporary conceptions of the digital.

Engineering Beyond Bias: It’s Time To Call the Experts

Wednesday, 06.12.2017
This month, data scientist Cathy O’Neil caused a Twitter storm when she alleged that academics are “asleep at the wheel” when it comes to critiquing artificial intelligence and algorithms and their impact on society. Within 24 hours, academics from the United States and Europe began to weigh in with evidence to the contrary, citing studies, conferences, scholars, and academic departments that have devoted more than three decades to the study of such questions.

The Problems of Profiting from Internet Pollution

Monday, 08.01.2018
At the end of 2017, I attended and participated in an international conference on internet content moderation, All Things in Moderation (https://atm-ucla2017.net/), at the University of California, Los Angeles, organized by my long-time research collaborator, Dr. Sarah T. Roberts (http://newsroom.ucla.edu/experts/preview/5877c4372cfac202470891d1/), an authority on commercial content moderation. This conference was the first of its kind, bringing in stakeholders for public conversations that reflected the concerns of industry, activists, content moderation workers, journalists, academics, and policy makers. In today’s blog post, I want to talk about the ethical dimensions of regulating the internet and digital media platforms, whether by content moderation, algorithms and automated decision-making systems, or by public policy.

The Problems of Platform Protections

Tuesday, 16.01.2018
Yesterday, I celebrated the national Martin Luther King Jr. holiday in the United States, on the heels of the President of the United States doubling down on his racist agenda (http://www.newsweek.com/trump-america-shithole-country-780888) with abhorrent comments against people of color, immigrants, and those who don’t reflect his vision of America, an America where Nazis and white supremacists are legitimated (http://www.slate.com/blogs/the_slatest/2017/08/14/donald_trump_s_ties_to_alt_right_white_supremacists_are_extensive.html) through more than just his re-tweets on Twitter. When we discuss Dr. King’s legacy, we spend considerable time talking about his commitments to ending poverty and economic oppression (https://www.theatlantic.com/business/archive/2014/01/remembering-martin-luther-king-jrs-solution-to-poverty/283193/), which is fundamentally tied to racial and gender oppression.

Robots, Race, and Gender

Tuesday, 30.01.2018
Last week, I attended a meeting organized by Gendered Innovations at Stanford University (https://genderedinnovations.stanford.edu/what-is-gendered-innovations.html) in Northern California. While there, I was thinking about the algorithmically driven software that will be embedded in anthropomorphized computers – or robots – that will be entering the market soon. In this post, I want to offer a provocation and suggest that we continue to gather interdisciplinary scholars to engage in research that asks questions about the re-inscribing of gender in both the software and the hardware.