Human Generated Data

Title

Untitled (women at Women's Club meeting)

Date

1947

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16207

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Clothing 100
Adult 99.3
Female 99.3
Person 99.3
Woman 99.3
Person 99.2
Person 97.6
Boy 97.6
Child 97.6
Male 97.6
People 97.1
Person 97.1
Coat 97
Sitting 96.7
Person 95.8
Overcoat 95.2
Furniture 84.3
Face 82.4
Head 82.4
Table 79.9
Footwear 77.1
Shoe 77.1
Shoe 72.7
Person 71.1
Photography 67.2
Reading 58
Portrait 57.6
Dress 57.2
Indoors 56.9
Desk 56.4
Hat 56.4
Chair 55.9
Restaurant 55.7
Floor 55.7
Flooring 55.7
Dining Table 55.7
Formal Wear 55.3
Suit 55.3
Accessories 55.2
Bag 55.2
Handbag 55.2
Couch 55.2
Jacket 55.1
Fashion 55.1
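
The label-and-confidence pairs above have the shape of output from an automated image-tagging service, presumably AWS Rekognition given the "Amazon" heading. As a purely illustrative sketch (not the museums' documented pipeline), the Python snippet below shows how comparable data could be retrieved with the Rekognition DetectLabels API via boto3; the file name "photo.jpg", the credentials setup, and the 55% confidence cutoff are assumptions, not part of this record.

import boto3

def detect_labels(image_path, min_confidence=55.0):
    # Assumes AWS credentials and a default region are already configured.
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,  # roughly matches the ~55 cutoff seen in the list above
    )
    # Each entry carries a Name and a Confidence score, e.g. ("Clothing", 100.0).
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("photo.jpg"):  # hypothetical file name
        print(f"{name} {confidence:.1f}")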

Clarifai
created on 2018-08-23

people 99.9
group 98.7
adult 98.4
man 97.4
leader 97.3
group together 97.1
two 96.1
administration 95.4
woman 94.7
chair 94.3
three 92.2
room 91
actor 90.5
furniture 90
several 89.5
five 89.5
four 89
outfit 86.4
military 86.1
actress 85.2

Imagga
created on 2018-08-23

male 22.7
man 21.5
black 18.9
call 18.3
people 17.3
telephone 17.2
person 16.8
business 16.4
portrait 16.2
office 13.7
pay-phone 13.5
electronic equipment 13.3
businessman 13.2
adult 13.1
window 12.6
device 11.9
old 11.8
musical instrument 11.5
room 11.2
equipment 11.2
lady 10.5
laptop 10.5
one 10.4
computer 10.3
suit 9.9
career 9.5
men 9.4
wall 9.4
house 9.2
silhouette 9.1
posing 8.9
body 8.8
home 8.8
urban 8.7
model 8.5
building 8.5
dark 8.3
chair 8.3
city 8.3
technology 8.2
style 8.2
working 8
lifestyle 7.9
couple 7.8
smile 7.8
thinking 7.6
fashion 7.5
happy 7.5
vintage 7.4
wind instrument 7.2
looking 7.2
world 7.2
history 7.2
love 7.1
job 7.1

Google
created on 2018-08-23

Microsoft
created on 2018-08-23

person 98.3
man 92.1
posing 58.8

Color Analysis

Face analysis

AWS Rekognition

Age 45-53
Gender Female, 90.7%
Calm 36.6%
Surprised 30.7%
Confused 14.2%
Fear 10.5%
Disgusted 8.1%
Angry 4.4%
Sad 3.5%
Happy 1.9%

AWS Rekognition

Age 51-59
Gender Female, 99.7%
Calm 97.9%
Surprised 6.3%
Fear 5.9%
Sad 2.7%
Angry 0.2%
Confused 0.1%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 60-70
Gender Female, 97.6%
Calm 59.8%
Confused 17.7%
Surprised 11.5%
Fear 10.1%
Sad 3.6%
Disgusted 1.2%
Happy 0.6%
Angry 0.6%

AWS Rekognition

Age 50-58
Gender Female, 100%
Surprised 88.4%
Confused 13.8%
Sad 9.8%
Fear 9.2%
Disgusted 6.6%
Calm 5.3%
Angry 0.7%
Happy 0.3%

AWS Rekognition

Age 6-14
Gender Female, 98.9%
Sad 87.6%
Calm 35.7%
Fear 17%
Surprised 6.7%
Confused 1.9%
Disgusted 1.5%
Angry 1.2%
Happy 0.7%
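
The AWS Rekognition face entries above (an age range, a gender with confidence, and a ranked list of emotion scores) match the fields returned by the Rekognition DetectFaces API when all attributes are requested. A minimal sketch follows, assuming boto3, configured AWS credentials, and a local image file; none of these details come from the record itself.

import boto3

def analyze_faces(image_path):
    # Assumes AWS credentials are configured; the image path is a placeholder.
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
    faces = []
    for face in response["FaceDetails"]:
        faces.append({
            "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            # Emotions come back as {Type, Confidence} entries; sort highest first.
            "emotions": sorted(
                ((e["Type"], e["Confidence"]) for e in face["Emotions"]),
                key=lambda item: item[1],
                reverse=True,
            ),
        })
    return faces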

Microsoft Cognitive Services

Age 25
Gender Female

Microsoft Cognitive Services

Age 37
Gender Male

Microsoft Cognitive Services

Age 8
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely
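
The likelihood ratings above (surprise, anger, sorrow, joy, headwear, blurred) correspond to the per-face fields returned by Google Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision client library and configured credentials; the file name is hypothetical.

from google.cloud import vision

def detect_faces(image_path):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        content = f.read()
    response = client.face_detection(image=vision.Image(content=content))
    for face in response.face_annotations:
        # Each field is a Likelihood enum, e.g. VERY_UNLIKELY or LIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)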

Feature analysis

Amazon

Adult 99.3%
Female 99.3%
Person 99.3%
Woman 99.3%
Boy 97.6%
Child 97.6%
Male 97.6%
Coat 97%
Shoe 77.1%