Human Generated Data

Title

Untitled (girl standing on drain coming out of wall)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14953

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 100%
Apparel 100%
Person 99.7%
Human 99.7%
Person 99.5%
Person 98.7%
Person 98.3%
Raincoat 91.3%
Overcoat 77.1%
Suit 68.7%
Coat 65.5%
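
The label and confidence pairs above are the kind of output returned by Amazon Rekognition's label-detection call. Below is a minimal sketch of how such tags might be requested, assuming boto3 with AWS credentials already configured; the file name, region, and thresholds are placeholders, not the settings used for this record.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is a placeholder

with open("photograph.jpg", "rb") as f:  # hypothetical scan of the print
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,       # illustrative limit
        MinConfidence=60,   # illustrative cutoff
    )

# Each label carries a name and a 0-100 confidence, e.g. "Raincoat 91.3%".
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}%')
```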

Clarifai
created on 2023-10-29

people 99.9%
woman 98.3%
adult 97.6%
group 95.7%
wear 94.8%
monochrome 94.6%
dress 94.5%
administration 94.4%
leader 93.6%
man 93.6%
wedding 93.2%
three 93.1%
two 92.9%
four 88.5%
indoors 86.7%
outfit 85.8%
bride 84.7%
home 83.6%
actress 83.3%
portrait 82.2%
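
A hedged sketch of how concept tags like those above might be requested from a Clarifai recognition model over its v2 REST API; the endpoint shape, model ID, key, and image URL are assumptions and placeholders that may differ by account and API version.

```python
import requests

API_KEY = "YOUR_CLARIFAI_KEY"                     # placeholder credential
MODEL_ID = "general-image-recognition"            # assumed model identifier
IMAGE_URL = "https://example.org/photograph.jpg"  # placeholder image location

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concepts come back with a 0-1 value; scaling by 100 matches the "people 99.9%" style above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}%')
```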

Imagga
created on 2022-03-05

building 17.5%
people 17.3%
man 16.8%
dress 16.3%
old 16%
shop 15.9%
city 15%
adult 14.6%
window 13.9%
male 13.5%
architecture 13.4%
barbershop 12.3%
wall 12.1%
fashion 12.1%
portrait 11.6%
tourism 11.6%
bride 11.5%
historical 11.3%
travel 11.3%
ancient 11.2%
statue 10.7%
urban 10.5%
couple 10.5%
sculpture 10.4%
person 10.4%
men 10.3%
monument 10.3%
black 10.2%
historic 10.1%
house 10%
clothing 9.7%
art 9.4%
stone 9.3%
weapon 9.2%
wedding 9.2%
case 9.1%
sword 9%
history 8.9%
mercantile establishment 8.9%
women 8.7%
love 8.7%
bouquet 8.5%
room 8.4%
street 8.3%
vintage 8.3%
tourist 8.2%
religion 8.1%
family 8%
home 8%
interior 8%
life 7.9%
happiness 7.8%
luxury 7.7%
culture 7.7%
bathroom 7.5%
world 7.5%
nurse 7.5%
church 7.4%
business 7.3%
hair 7.1%
smile 7.1%
crutch 7.1%
indoors 7%
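
A hedged sketch of how tags like those above might be requested from Imagga's /v2/tags endpoint over HTTP Basic auth; the key, secret, and image URL are placeholders.

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photograph.jpg"},  # placeholder URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),                # placeholder credentials
    timeout=30,
)
resp.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence, e.g. "building 17.5%".
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}%')
```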

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

dress 95.2%
clothing 92.7%
text 87.9%
window 87.2%
person 86.4%
woman 86%
wedding dress 80.1%
black and white 79.8%
footwear 62.6%
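
A hedged sketch of how tags like those above might be requested from the Azure Computer Vision analyze endpoint; the resource host, subscription key, API version, and image URL are placeholders and assumptions.

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder host
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder key

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photograph.jpg"},          # placeholder URL
    timeout=30,
)
resp.raise_for_status()

# Azure reports confidence on a 0-1 scale; scaling by 100 matches the "dress 95.2%" style above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}%')
```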

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Female, 95.9%
Calm 75.5%
Sad 18.6%
Surprised 2.3%
Happy 1.3%
Fear 0.7%
Disgusted 0.7%
Angry 0.5%
Confused 0.4%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Calm 92.8%
Sad 3.6%
Surprised 1.4%
Happy 1%
Confused 0.6%
Disgusted 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 23-33
Gender Male, 79.7%
Calm 100%
Surprised 0%
Sad 0%
Disgusted 0%
Confused 0%
Angry 0%
Happy 0%
Fear 0%
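
The three AWS Rekognition readings above (age range, gender, and emotion scores) are the kind of output returned by Rekognition's detect_faces call. A minimal sketch, assuming boto3 with AWS credentials already configured; the file path and region are placeholders.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is a placeholder

with open("photograph.jpg", "rb") as f:  # hypothetical scan of the print
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```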

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
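
A hedged sketch of how likelihood buckets like those above might be obtained with the google-cloud-vision client library's face detection, assuming version 2.x of the library and application default credentials; the image URI is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "https://example.org/photograph.jpg"  # placeholder URI

response = client.face_detection(image=image)

# Likelihoods are reported as buckets such as VERY_UNLIKELY rather than numeric scores.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```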

Feature analysis

Amazon

Person
Person 99.7%
Person 99.5%
Person 98.7%
Person 98.3%

Coat
Coat 65.5%
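
The per-instance rows above come from the same kind of Rekognition label call, which reports each detected object instance with its own confidence. A hedged sketch with placeholder inputs, as in the earlier label-detection example.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is a placeholder

with open("photograph.jpg", "rb") as f:  # hypothetical scan of the print
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):  # one entry per detected object
        print(f'{label["Name"]} {instance["Confidence"]:.1f}%')
```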

Categories