Under Pressure – On Forms of Authority and Decision-Making Power
Opening: Thu, Sep 27, 7 PM
Duration: Sep 28 to Nov 25, 2018
Opening hours: Tue-Sun 1-4 PM, 4:30-8 PM
Admission: free
Guided Tour for the Press: Tue, Sep 25, 10:00
Guided Tour with Sabine Winkler, curator, and Elisabeth Hajek, artistic director: Thu, Nov 22, 17:00, as part of VIENNA ART WEEK
Artists:
Rod Dickinson (GBR)*, İnci Eviner (TUR), HARD-CORE (NLD, FRA, ISL)*, Minna Henriksson (FIN)*, Bernd Hopfengärtner* & Ludwig Zeller (GER), Vladan Joler (SHARE Lab)(SRB), Isabella Kohlhuber (AUT), Stéphanie Lagarde (FRA), Liz Magic Laser (USA), Daniela Ortiz (PER/ESP), Olivia Plender (GBR), Sebastian Schmieg (GER), Stefanie Schroeder (GER)*, Superflux (IND/GBR, GBR), Nick Thurston & Steven Zultanski (GBR, USA), Pinar Yoldas (TUR/USA)
*Q21 Artists-in-Residence
The point of departure for this exhibition is the authoritarian tendencies that have become increasingly apparent in recent times and that are manifesting themselves more and more in political and economic fields and in AI development. Authoritarian politicians attracting majorities, a manifestly authoritarian financial market, authoritarian structures in IT systems and authoritarian tendencies in AI development – all of these prompt the question of whether new forms of authoritarianism are distinguished by the fact that they allow individual freedoms, or at least give the impression of doing so.
The exhibition looks at the mechanisms, strategies and tactics used to restrict decision-making freedom, and at the way security/protection, economy/finance, nation/cultural identity or the declared necessity of competition and efficiency are defined and fixed as decision premises. Proceeding from the dogmas of neoliberalism and neo-nationalism, the exhibition is concerned with authoritarianism in politics, the economy, technology and art. The marginalisation of the political driven by neoliberalism, together with the financial crisis, has led to an accelerated authoritarian capitalism. Authoritarian neo-nationalism, on the other hand, can be seen as one reaction to this development.
At the same time, control over future decisions and actions, as well as influence over them, is the goal of pre-emptive algorithmic systems and forms of government. Anticipatory information technology and technical assistants such as Siri, Cortana and Alexa internalise users’ action programs and ‘needs’, and either anticipate decisions or make them outright. Data analysis and the creation of profiles aim not just to predict our future decisions but also to inform them through technical systems.
Efficient machines (automation) now compete with people not just in work itself, but also in making decisions. What are the consequences of transferring decision-making power to neural networks? Are we at risk of having our scope for action and thinking increasingly controlled and programmed through surveillance, Big Data rankings, social credit systems and the like, and are we on a path towards a digitally structured totalitarianism? Without our noticing, scope for decision-making and responsibility is disappearing in the disintegrating action continuities of the real and the virtual (hyper-reality).
This exhibition investigates dispositions of decision-making as well as forms of involvement and complicity in authoritarian political and algorithmic systems. What decisions will be required of us, how will decisions be automated, in what ways do we ourselves consciously or unconsciously contribute to economic, algorithmic and state systems making decisions for us, and to what extent are we ourselves acting in authoritarian ways? When algorithms know more about us than we know about ourselves, does this mean they can decide on our respective needs in a more rational and thus more appropriate way, and what does that mean for the conception of the autonomous self?
Could the resulting marginalisation of the subject (or the disengagement of subjectivity from subject, persona and person) bring an opportunity to effect change and prevent authoritarian tendencies, or should we fear that this will merely displace or automate them without ridding us of them? In art, new dispositions are emerging from the questioning of the subject positions of artist, curator and observer. Whether authoritarian structures in the art system are thereby abolished or merely shifted remains speculative. What new forms of collaborative decision-making processes do we need?
In Rod Dickinson's installation “Zero Sum”, exhibition visitors are invited to assume various roles in a classic game-theory dilemma (the “volunteer’s dilemma” or “free-rider problem”) and to replay the four possible outcomes of the dilemma, guided by a virtual moderator.
Rod Dickinson shows how, disguised as a gain in freedom, automated work systems limit and control individual freedom of action.
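For readers unfamiliar with the dilemma, the four situations visitors can replay are: both players volunteer, only one of the two volunteers (two cases), or neither does. The sketch below is a minimal illustration with stylised payoff values assumed purely for this example, not the values or rules used in “Zero Sum”:

```python
# Illustrative two-player volunteer's dilemma (assumed payoffs, not the
# installation's): the shared benefit is produced if at least one player
# volunteers, but volunteering carries a personal cost.
from itertools import product

BENEFIT, COST = 2, 1  # stylised values chosen only for illustration

def payoff(me_volunteers: bool, other_volunteers: bool) -> int:
    produced = me_volunteers or other_volunteers
    return (BENEFIT if produced else 0) - (COST if me_volunteers else 0)

# Enumerate the four possible combinations of the dilemma.
for a, b in product([True, False], repeat=2):
    print(f"A volunteers={a}, B volunteers={b} -> "
          f"payoff A={payoff(a, b)}, payoff B={payoff(b, a)}")
```

The resulting payoffs show the tension the work stages: each player is best off when the other volunteers, and worst off when nobody does.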
HARD-CORE deals with robotic curating and has developed the software Asahi 4.0, with which exhibitions can be curated automatically. Asahi 4.0 selects works of art via a random generator, de-subjectifying and collectivising decision-making processes. In the video “The Universal Blob (2)”, five personalised entities reflect on a collective self and on curatorial practices that can be de-hierarchised and detached from aesthetic experiences and decisions. Or is Asahi 4.0, with its random decisions, itself dominant and not really collaborative?
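Selection by random generator, as attributed to Asahi 4.0, can be pictured in a few lines. The snippet below is a hypothetical illustration, not the actual software; the pool of works and the size of the selection are assumed placeholders.

```python
# Hypothetical sketch of curating by chance: the "curatorial" decision is
# reduced to a random draw from a pool of available works.
import random

archive = ["Work A", "Work B", "Work C", "Work D", "Work E"]  # assumed pool
exhibition = random.sample(archive, k=3)  # selection by chance, not by taste
print(exhibition)
```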
Minna Henriksson investigates both lobbying and political engagement around the Finnish paper industry and its trade with South Africa after the Second World War. While left-wing unions opposed trade with the increasingly despotic apartheid regime, the association of Finnish paper manufacturers supported its continuation. History and the present day offer numerous examples of economic relationships with authoritarian governments and of economic exploitation by authoritarian means. The fear is that if authoritarian systems deliver larger profits in global competition, authoritarian capitalism will become the standard.
In their video “Life Is Good For Now”, Bernd Hopfengärtner and Ludwig Zeller stage a speculative view of a Switzerland that has decided to fully implement the right to informational self-determination. Mont Data, a cooperative, coordinates the mountain of scientific and commercial data, which confronts citizens with new decision-making possibilities and tasks. The two artists present fictitious accounts of experiences in medicine, culture and everyday life, in which possibilities for action against digital control and exploitation, achieved by retaining power over one’s own data, are presented and anticipated.
In a monologue, a voice describes Facebook’s conduct from the company’s perspective: how data is evaluated and how behaviour patterns and profiles are created. Using data visualisations and graphics, Vladan Joler (SHARE Lab) illustrates Facebook’s quantification methods, showing how a profile of each user is built and how the evaluated data is sold for product advertising, micro-targeting campaigns and the like, which in turn inform users’ decisions. The collected knowledge of our preferences, social and financial status, health, etc. can be used as leverage against us. But users themselves, trapped in echo chambers, add to the pressure through the compulsion to perpetually comment on everything, something that is frequently mistaken for participation yet can exert enormous pressure on others.
In The Substance of Value, Isabella Kohlhuber takes a passage from the beginning of the first volume of Karl Marx’s Capital and translates it into form, material and space, using a typeface she designed herself. “According to Marx, value consists in the ‘abstract work’ represented in goods, which forms the substance of the exchange value accorded the goods.” (Robert Kurz) This gives rise to contradictions and pressure scenarios, such as that which posits the unrestrained accumulation of value as the goal, or that which, conversely, continually undermines the value substance of goods. How are meaning and value generated in this zone of representation, abstraction and reality, and what power relations does this express?
In “Déploiements (Deployments)”, Stéphanie Lagarde presents stagings of state control systems in public space in the form of two simulation processes that, from a potential future, play out engagements in the here and now. On show are French Air Force pilots rehearsing an aerobatic mission for the French national holiday. With gestures, hand movements and coded language they simulate the choreography of the upcoming air show. These images are combined with police training software used for monitoring demonstrations and crowds. On the one hand, motion sequences and procedures are automated according to certain patterns; on the other, future behaviour is both informed and controlled by pattern recognition, in order to directly and indirectly restrict decision-making and the room for action.
Liz Magic Laser stages a therapeutic situation with actors in which a therapist invites the participants to bring together formative personal experiences and current political frustrations. Based on the method of primal therapy, the negative effects of traumatic experiences are to be reduced by re-experiencing them. In her installation “Primal Speech”, Liz Magic Laser adapts primal therapy as a political form of therapy. Coping with personal and political traumas is trained in order to defend oneself against the feeling of powerlessness and against authoritarian father figures, educators and political figures – against outside influence. The work can be understood as a commentary on the 2016 US election campaign and on Brexit.
In her work, Daniela Ortiz explores concepts of nationality, class, race, equality and civil rights politics. She examines how the European system of immigration control and colonial racism are based on patterns of exclusion. In “The ABC of Racist Europe”, conceived as a picture book, Daniela Ortiz contrasts Eurocentric narratives with narratives from anticolonial and anti-racist perspectives. The authoritarian rejection of immigration manifests itself as a demand for cultural and societal decision-making power by a white middle class that appears to be losing its control of political decision-making. Daniela Ortiz deconstructs hegemonic narratives and practices and radically demands equality.
Olivia Plender’s work Set Sail for the Levant is based on a 16th-century board game, “The Game of the Goose”, which can be seen as a forerunner of Monopoly. In this game, only players who steal money from other participants and flee to the Middle East to evade criminal prosecution can win, or at least avoid debt. Olivia Plender satirises ideological narratives such as that of Monopoly, which teaches people how to act in a capitalist system, by pointing out contradictions inherent in that system. The more radical and authoritarian its implementation, the more the promises of neoliberalism such as self-actualisation, success and free choice are combined with traditional narratives, national mythologies or cultural identity. Olivia Plender investigates how official, historical and contemporary narratives are constructed, and the hierarchies that stand behind the “voice of authority” traditionally (re)produced in public institutions such as the museum, the academy and the media.
In his project Decisive.Camera, Sebastian Schmieg investigates human and automated processes of categorisation and deals with questions about photography and machine learning. Machine learning uses algorithms that can learn from data sets how to make predictions, to classify and to make decisions. The Decisive.Camera software analyses photos taken in the exhibition space and decides the extent to which they represent the problem, the solution, the past or the future. These decisions are made automatically and are the result of a learning process that Schmieg outsourced to visitors to The Photographers' Gallery in London in a previous project: they could assign the photos in the gallery's archive, including shots by renowned artists, to the above categories, thereby training the machine-learning system that is now being used in Decisive.Camera. How do we influence AI systems, which in turn change our perception? What decisions will AI systems take and who will control them?
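The classification step described above can be pictured as a small image-classification model with four output categories. The sketch below is a hypothetical illustration and not Schmieg's actual software: the backbone, library and label handling are assumptions, and the four-class head would have to be trained on the visitor-labelled archive photos the text mentions.

```python
# Hypothetical sketch: a four-category photo classifier of the kind the
# text describes. Labels, model choice and file handling are assumptions.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

LABELS = ["problem", "solution", "past", "future"]  # the four categories named in the text

# Pretrained backbone with a new four-class head; in practice the head
# would be trained on the crowd-labelled photos (the "outsourced" step).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(LABELS))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify(path: str) -> dict:
    """Return the model's confidence that a photo shows each category."""
    image = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        scores = torch.softmax(model(image), dim=1)[0]
    return {label: float(scores[i]) for i, label in enumerate(LABELS)}

# Example use (hypothetical file name): print(classify("exhibition_photo.jpg"))
```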
To accelerate the development of her independent career as an artist and to get off welfare, Stefanie Schroeder was encouraged to take part in a company optimisation measure. A development prognosis drawn up in this context determines whether she retains or is forced to surrender her self-employment. The artist analyses the structures and functions of the optimisation imperative and investigates how optimisation and efficiency are used as measures of value and generalised as criteria for decision-making. The authoritarian claim to optimisation and efficiency becomes a primary decision-making criterion. With profit evaluation now the standard, exclusion and stigmatisation become formalised, raised to the level of dogma, to an authoritarian structure.
In their film Our Friends Electric, Superflux explore alternative forms of interaction with speech-based AI assistants. Superflux designed three AI devices to imagine potential relationships with speech assistants. These fictitious AI assistants, Eddi, Karma and Sig, are not real products; rather, they represent archetypes of the potential qualities of such devices. Superflux investigate how command structures determine our dealings with technology and what other forms of interaction might be possible. The work also addresses the control functions of AI systems in relation to the unforeseeable development of self-learning machines.
In collaboration with writer Steven Zultanski, Nick Thurston has designed the text-based room installation Authorithy, which is concerned with the relation between power, authority and the author in literature. The four text blocks describe how literary speech can create and dismantle conceptions and images of powerful characters. The diagram on the floor shows six words taken from a three-line poem by Nick Thurston: “Spelling worlds/With words/As work”. In the pictorial form of the diagram, the mono-linear structure of language is exchanged for a cartographic representation, visually referencing potential combinations of word and meaning among the six words. The diagram refers to text passages on the wall which claim that literature can conjure complex worlds in our imagination with minimal means, create spaces of meaning, and dismantle (power) fantasies.
In “Kitty AI: Artificial Intelligence for Governance”, Pinar Yoldas imagines an artificial intelligence (AI) that has assumed world domination. In the video, a 3D-animated cat talks about herself and her tasks as the ruler of a megalopolis in 2039. Kitty AI acts as the emotional representative of an all-embracing AI regime in which “kitty love” and technology are to replace politicians. Do AI systems make better decisions because they can collect and evaluate more data? The technical problem-solving capabilities of AI are highly promising, but the associated control functions and the loss of decision-making freedom are often concealed. Kitty AI presents herself as an agent of optimised governance in the form of emotional care.