Human Generated Data

Title

Untitled (couple dancing in high school talent show, man lifting woman)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16723

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.3
Human 99.3
Person 96.6
Person 95.4
Acrobatic 94.5
Leisure Activities 87.8
Dance Pose 84.3
Gymnast 55
Sport 55
Athlete 55
Gymnastics 55
Sports 55
Person 45.1
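
The Amazon values above are label-detection confidence scores on a 0-100 scale. As an illustrative sketch only (the actual pipeline behind this record is not documented here), tags in this style can be retrieved with AWS Rekognition's detect_labels call via boto3; the bucket and object names below are placeholders, not part of the record.

import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example-photo.jpg"}},
    MaxLabels=20,
    MinConfidence=40,
)

for label in response["Labels"]:
    # Each label carries a name and a confidence score (0-100),
    # matching entries such as "Person 99.3" and "Acrobatic 94.5" above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')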

Clarifai
created on 2023-10-29

people 99.7
monochrome 98
man 97.7
one 96.9
adult 95.5
two 95
indoors 94.7
woman 91.7
group together 89
group 88.1
three 87.2
child 84.9
dancing 84.3
family 83.5
street 82.4
music 82.3
window 80.8
wear 80
fun 79.6
many 78.5

Imagga
created on 2022-02-26

man 18.1
male 17.7
building 17.5
musical instrument 16.8
adult 14.7
person 13.5
people 13.4
device 13.3
architecture 12.6
fashion 12.1
balcony 12
men 12
stringed instrument 11.4
dark 10.9
cleaner 10.5
women 10.3
house 10
dirty 9.9
dress 9.9
religion 9.9
posing 9.8
black 9.6
statue 9.3
exterior 9.2
teacher 9
music 9
light 8.8
wall 8.8
couple 8.7
performance 8.6
action 8.6
window 8.6
style 8.2
body 8
business 7.9
urban 7.9
sculpture 7.8
portrait 7.8
grunge 7.7
old 7.7
dance 7.6
balance 7.6
sax 7.5
silhouette 7.4
educator 7.4
alone 7.3
danger 7.3
happiness 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 95
person 91.4
black and white 90.6
text 89.1
concert 87.8
statue 84.6
black 71.6
clothing 69.4
white 64.1
dance 61.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Sad 85.9%
Angry 4.8%
Confused 3.1%
Calm 2.2%
Disgusted 1.6%
Happy 1.1%
Surprised 0.7%
Fear 0.6%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Calm 95%
Sad 2.1%
Happy 1%
Angry 0.8%
Surprised 0.5%
Confused 0.4%
Disgusted 0.2%
Fear 0.1%
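
The age ranges, gender estimates, and emotion scores above follow the shape of AWS Rekognition's face-detection output. A minimal sketch, assuming boto3 and placeholder S3 names (the exact settings used for this record are not documented):

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example-photo.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back unordered; sort by confidence to match the listing above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')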

Feature analysis

Amazon

Person
Person 99.3%
Person 96.6%
Person 95.4%
Person 45.1%

Categories

Imagga

interior objects 99.8%

Text analysis

Amazon

Y-ETTES
y'ettes
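
The detected strings above are typical OCR-style text-detection output. A minimal sketch, assuming boto3 and placeholder S3 names, of pulling detected text lines with AWS Rekognition:

import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "example-photo.jpg"}}
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the per-word detections
        print(detection["DetectedText"])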