• Invisible Images (Your Pictures Are Looking at You)

    http://thenewinquiry.com/essays/invisible-images-your-pictures-are-looking-at-you

    Great article, very important. A must-read.

    When you put an image on Facebook or other social media, you’re feeding an array of immensely powerful artificial intelligence systems information about how to identify people and how to recognize places and objects, habits and preferences, race, class, and gender identifications, economic statuses, and much more.

    Neural networks cannot invent their own classes; they’re only able to relate images they ingest to images that they’ve been trained on. And their training sets reveal the historical, geographical, racial, and socio-economic positions of their trainers.
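    A minimal sketch of that constraint (not from the article; it assumes torch/torchvision are installed and uses a hypothetical local file "photo.jpg"): an off-the-shelf image classifier can only answer with labels drawn from its fixed training taxonomy, here ImageNet's 1,000 classes, no matter what the picture actually contains.

        # Sketch: a pretrained classifier can only emit labels from its training-set taxonomy.
        import torch
        from torchvision import models
        from PIL import Image

        # Pretrained ResNet-50 with ImageNet weights (1,000 fixed categories).
        weights = models.ResNet50_Weights.IMAGENET1K_V2
        model = models.resnet50(weights=weights)
        model.eval()

        # Preprocess the image exactly as the model expects (resize, crop, normalize).
        preprocess = weights.transforms()
        image = Image.open("photo.jpg").convert("RGB")   # hypothetical input file
        batch = preprocess(image).unsqueeze(0)           # shape: (1, 3, H, W)

        with torch.no_grad():
            logits = model(batch)
        probs = logits.softmax(dim=1)[0]

        # The model's entire "vocabulary" is the training-set category list;
        # it cannot report anything outside of it.
        categories = weights.meta["categories"]
        top5 = probs.topk(5)
        for p, idx in zip(top5.values, top5.indices):
            print(f"{categories[idx]}: {p:.2%}")

    Whatever the photograph shows, the output is forced into the categories, and therefore the priorities, of whoever assembled the training set.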

    Ideology’s ultimate trick has always been to present itself as objective truth, to present historical conditions as eternal, and to present political formations as natural. Because image operations function on an invisible plane and are not dependent on a human seeing-subject (and are therefore not as obviously ideological as giant paintings of Napoleon), they are harder to recognize for what they are: immensely powerful levers of social regulation that serve specific race and class interests while presenting themselves as objective.

    As capital searches out new domains of everyday life to bring into its sphere, the ability to use automated imaging and sensing to extract wealth from smaller and smaller slices of everyday life is irresistible. It’s easy to imagine, for example, an AI algorithm on Facebook noticing an underage woman drinking beer in a photograph from a party. That information is sent to the woman’s auto insurance provider, who subscribes to a Facebook program designed to provide this kind of data to credit agencies, health insurers, advertisers, tax officials, and the police. Her auto insurance premium is adjusted accordingly. A second algorithm combs through her past looking for similar misbehavior that the parent company might profit from.

    Machine-machine systems are extraordinarily intimate instruments of power that operate through an aesthetics and ideology of objectivity, but the categories they employ are designed to reify the forms of power that those systems are set up to serve. As such, the machine-machine landscape forms a kind of hyper-ideology that is especially pernicious precisely because it makes claims to objectivity and equality.

    #facebook #ia #machine_learning