Human Generated Data

Title

Untitled (girls swim team posing by pool)

Date

1925

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1905

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.5
Human 99.5
Person 98.3
Person 97.3
Person 93.4
Person 92.9
Person 92.4
Person 92.3
Person 91.9
Person 88.9
Person 85.7
Person 85.6
Person 78
Room 78
Indoors 78
Person 73.8
Person 73.6
Person 72.8
Person 69.7
Interior Design 68.6
People 66.4
Leisure Activities 63.5
Person 62.3
Jury 61.8
Figurine 57.3
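The Rekognition tags above are (label, confidence) pairs, with confidence given as a percent. A minimal sketch of filtering such pairs by a confidence threshold — the pairs are copied from the list above, and the threshold value is an arbitrary illustrative choice:

```python
# (label, confidence) pairs copied from the Amazon Rekognition
# tag list above; duplicate "Person" detections collapsed to one.
labels = [
    ("Person", 99.5), ("Human", 99.5), ("Room", 78.0),
    ("Indoors", 78.0), ("Interior Design", 68.6),
    ("People", 66.4), ("Leisure Activities", 63.5),
    ("Jury", 61.8), ("Figurine", 57.3),
]

def confident_labels(pairs, threshold=75.0):
    """Return label names at or above the given confidence (percent)."""
    return [name for name, conf in pairs if conf >= threshold]

print(confident_labels(labels))  # prints ['Person', 'Human', 'Room', 'Indoors']
```

Lower-confidence tags such as "Jury" and "Figurine" illustrate why such a cutoff is useful: they are likely misreadings of the posed group and statuary-like figures in the photograph.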

Imagga
created on 2021-12-14

interior 32.7
counter 20.3
architecture 19.5
room 18.4
dancer 17.7
house 17.5
table 17.5
building 16.6
indoor 16.4
people 16.2
modern 16.1
man 15.5
marble 15.5
art 14.8
person 14.7
performer 14.7
city 14.1
furniture 13.8
window 13.6
decor 13.2
design 12.9
luxury 12.9
home 12.7
sculpture 12.5
chair 12.5
light 12
restaurant 12
women 11.9
hall 11.6
business 11.5
glass 11.2
floor 11.1
entertainer 11.1
wedding 11
structure 10.6
new 10.5
inside 10.1
3d 10.1
adult 9.8
style 9.6
apartment 9.6
elegant 9.4
happy 9.4
life 9.3
statue 8.8
wall 8.8
urban 8.7
bride 8.6
marriage 8.5
contemporary 8.5
monument 8.4
column 8.3
kitchen 8.3
historic 8.2
gymnasium 8.2
dress 8.1
history 8
lamp 7.9
love 7.9
professional 7.9
residence 7.8
travel 7.7
crowd 7.7
old 7.7
residential 7.6
hotel 7.6
stone 7.6
fashion 7.5
historical 7.5
classic 7.4
decoration 7.3
group 7.2
balcony 7.1
male 7.1
businessman 7.1
indoors 7
shop 7
groom 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.1
person 88.9
dance 70.9
old 50.2
posing 36.8

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Male, 77.4%
Sad 93.1%
Calm 6.4%
Confused 0.2%
Angry 0.2%
Happy 0.1%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 22-34
Gender Male, 85.2%
Calm 63.7%
Happy 13.2%
Sad 11.2%
Surprised 8.5%
Confused 1.8%
Angry 1.1%
Disgusted 0.4%
Fear 0.2%

AWS Rekognition

Age 30-46
Gender Male, 77%
Calm 84.2%
Sad 9.1%
Happy 4.7%
Confused 1.2%
Surprised 0.4%
Fear 0.2%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 44-62
Gender Male, 91.4%
Calm 62.3%
Sad 14.2%
Angry 11%
Disgusted 4.3%
Confused 4.2%
Surprised 1.7%
Happy 1.6%
Fear 0.7%

AWS Rekognition

Age 40-58
Gender Male, 53.5%
Sad 64.4%
Calm 24.1%
Confused 3.6%
Happy 2.8%
Angry 1.8%
Surprised 1.5%
Fear 1.4%
Disgusted 0.3%

AWS Rekognition

Age 26-40
Gender Female, 67%
Calm 83.2%
Sad 14%
Happy 2.5%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 25-39
Gender Male, 50.7%
Happy 68.3%
Calm 20.7%
Sad 4.4%
Surprised 2.2%
Confused 2.1%
Angry 1.5%
Fear 0.5%
Disgusted 0.5%

AWS Rekognition

Age 23-35
Gender Female, 94.4%
Happy 56.4%
Calm 36.4%
Sad 4.2%
Confused 1.3%
Disgusted 0.6%
Surprised 0.4%
Angry 0.4%
Fear 0.4%

AWS Rekognition

Age 35-51
Gender Male, 89.3%
Calm 86.1%
Confused 4.9%
Sad 4.3%
Surprised 2.3%
Happy 1.2%
Angry 0.6%
Disgusted 0.4%
Fear 0.2%

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 71.8%
a vintage photo of a group of people posing for a picture 71.7%
a group of people posing for a photo 71.6%