
Abstract

Schneider, Diana (2019, May 6): Pros and Cons of the concept of an Algorithm TÜV for the governance of applied AI, 18th Annual STS Conference Graz 2019, Track Digitalization of Society – S2: The politics of algorithmic governance: Data subjects and social ordering in the digital age, Graz. [Tandem 5]

The use of algorithms and Big Data Analytics is becoming a matter of course in more and more areas of application. Not least the AMS model from Austria [1], which is intended to determine unemployed people's need for support measures on their way back into work, makes clear that we should not only discuss the opportunities and challenges of this technology but also establish concepts for the regulation of algorithms. This applies in particular when algorithms and Big Data Analytics are used to assess people in social contexts, as in this example. Here, well-known key issues such as data protection and data security, double subjectivity, and false positives and negatives are relevant and useful starting points. How the desire for efficiency on the one hand can be reconciled with the danger of stigmatisation and discrimination by such systems on the other is debated at length in Germany. One proposed solution is the call for the establishment and implementation of a so-called “Algorithm TÜV” or “TÜV for Artificial Intelligence”. The acronym TÜV stands for “Technischer Überwachungsverein”, a technical inspection organisation that performs safety checks required by state laws or regulations. The demand, voiced for example by the Federation of German Consumer Organisations (Verbraucherzentrale Bundesverband), usually refers to algorithms in car software and to automated decision systems such as scoring algorithms [2; 3]. In this way, societal principles and standards such as data protection compliance, the prevention of discrimination and, not least, transparency could be ensured.

In the proposed contribution, I will discuss this solution using the MAEWIN project as an example. The project is part of the Digital Society research program funded by the Ministry of Culture and Science of the German State of North Rhine-Westphalia and aims, among other things, to develop a prototype of a decision support system for use in social work. Because the effects of this technology in the context of welfare state institutions, especially in the area of social work, have hardly been researched, an “Algorithm TÜV” does not, for a start, seem to be a suitable solution to the problem that social disparities may be transferred into the digital world.

References

[1] Holl, J.; Kernbeiß, G. and M. Wagner-Pinter (2018): Das AMS-Arbeitsmarktchancen-Modell. Dokumentation zur Methode, Konzeptunterlage vom Oktober 2018, SynthesisForschung, www.forschungsnetzwerk.at/downloadpub/arbeitsmarktchancen_methode_%20dokumentation.pdf, last visited 17/12/2018.

[2] Verbraucherzentrale Bundesverband (2017): Algorithmen-TÜV für Autosoftware, press release, 20/06/2017, https://www.vzbv.de/pressemitteilung/algorithmen-tuev-fuer-autosoftware, last visited 20/12/2018.

[3] SVRV (2018): Verbrauchergerechtes Scoring. Gutachten des Sachverständigenrats für Verbraucherfragen. Berlin: Sachverständigenrat für Verbraucherfragen, http://www.svr-verbraucherfragen.de/dokumente/verbrauchergerechtes-scoring/, last visited 20/12/2018.