
Final project for my class Politics of Code. An art project that tells a story about the transformation of a user into an inaccurate abstraction.


safal312/bodies_and_abstractions


Bodies and Abstractions

YouTube Demo: https://youtu.be/YaR-IbtMQfM

Technologies Used: p5.js, ml5.js

My project focuses on the assumptions that algorithms make about us. These assumptions may not be accurate, and there isn't much that we, as ordinary users, can usually do to stop them from being made and saved.

In this project, I make use of the Betaface API, which, according to its website, counts customers such as 20th Century Fox and Disney. Making assumptions about people's race, gender, and similar attributes has become a popular selling point in recent times. Hikvision in China was called out for marketing an AI product for "Uyghur recognition". In the Western world, too, the use of AI technologies like facial recognition is significant in law enforcement. However, there have been cases like that of Robert Williams, who was wrongfully arrested because of a faulty result from the facial recognition system the authorities were using. Keeping these cases in mind, I wanted to comment on this issue through my project by highlighting the assumptions being made about us. I started looking for providers offering these kinds of predictions for the classification of social groups and other features, and after a couple of searches I found this API and decided to use it.

Along with classifying gender and race, this API tries to predict whether you have a double chin, whether you are bald, whether you have a big nose, and so on. It is worth asking why the creators thought it necessary for the tool to predict these features. What are they useful for? Again, we don't have the answers, because we don't control what is done with these assumptions. Moreover, one of the predictions concerns the user's attractiveness. What does it judge you on to decide that you are attractive? Our personal characteristics are measured against the algorithm's own standard of beauty, and the standard set by the algorithm becomes universal. We are all measured on a scale that is controlled by a central entity and is out of our reach.

This tool may misidentify your gender, your race, your age, and more. It may call you unattractive. It may call you old. It may state, without any emotion, things you are insecure about. Through this process, I want to make viewers conscious of the kinds of assumptions that are routinely made about them. These assumptions add up to form your profile, disintegrating a real, complex human life into a predictable, abstract body stored as code. While this process usually happens behind the scenes, in my project I show it in a more confrontational manner. These assumptions can be quite personal, and even if we feel that data collection and computers making assumptions do not bother us, in this confrontational setting they can leave us weirded out or concerned.

In this project, I am trying to communicate a story, and I've used the p5.js and ml5.js libraries to do it. The user enters the website and is greeted by a friendly voice. It announces that it is going to "take a good look at you". You can't stop it; it just happens. After the response has loaded, the voice calls out, loud and clear, its assumptions, which may be inaccurate or embarrassing to you. The algorithm doesn't care; it keeps saying what it thinks about you. The user's image slowly fades as the voice talks, until a prompt appears on the screen. The user is now unrecognizable, merely a shadow beneath all the assumptions that have been made. The prompt gives you the "option" to save the results, but no matter how hard the user tries, they can't refuse: the button is there, yet it is not clickable. Then, as time passes, the results are saved automatically and the user disappears from the screen. All that's left is the assumptions the tool has made about the user. The voice bids the user goodbye, and the website reloads, ready to greet another user.
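The fade described above could be driven by how many assumptions the voice has already spoken. This is a minimal sketch of that idea, not the repository's actual code: the helper names are hypothetical, and in a real p5.js sketch `portraitAlpha` would feed into `tint()` before drawing the captured video frame.

```javascript
// Hypothetical helpers sketching the fade-out and the fake "Save" button.
// As each assumption is spoken, the portrait's opacity steps down from
// fully opaque (255) toward fully transparent (0).
function portraitAlpha(spokenCount, totalAssumptions) {
  if (totalAssumptions <= 0) return 255; // nothing to speak: stay visible
  const remaining = Math.max(totalAssumptions - spokenCount, 0);
  return Math.round(255 * (remaining / totalAssumptions));
}

// The "Save results" button is rendered but never actually clickable:
// the user is offered the choice in appearance only.
function saveButtonEnabled() {
  return false;
}
```

In the draw loop this would look something like `tint(255, portraitAlpha(spoken, total)); image(capture, 0, 0);`, so the user vanishes exactly in step with the voice's list of assumptions.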
