Human Generated Data

Title

Untitled (two girls in dresses taking gifts from chair)

Date

1949

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9226

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Person 96.9
Clothing 96.6
Apparel 96.6
Shorts 89.2
Chair 85.8
Furniture 85.8
Female 83.4
Girl 65.8
Sleeve 63.7
Woman 62.3
Photo 61.2
Photography 61.2
Screen 59.6
Electronics 59.6
Leisure Activities 58.9
Piano 58.9
Musical Instrument 58.9
Display 57.7
Monitor 57.7
Advertisement 57.4
Poster 57.4
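
The numbers above are Rekognition confidence scores on a 0-100 scale. As a hedged illustration only, the Python sketch below shows how labels of this shape can be pulled with boto3's detect_labels; the region, file name, and confidence floor are assumptions, since this record does not say how the tags were produced.

import boto3

# Hypothetical inputs: this record does not name the source image or account.
client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    # Each label pairs a name with a 0-100 confidence score, as listed above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')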

Imagga
created on 2022-01-23

negative 33.4
person 32.8
film 28.3
man 26.9
people 22.9
mask 21.4
adult 20.6
photographic paper 20.4
male 18.4
human 17.2
patient 17.2
black 15
fashion 14.3
photographic equipment 14
professional 13.7
home 13.5
instrument 12.4
medical 12.4
portrait 12.3
happy 11.9
case 11.8
sick person 11.7
laboratory 11.6
studio 11.4
sexy 11.2
work 11.1
happiness 11
equipment 10.7
chemical 10.6
attractive 10.5
urban 10.5
biology 10.4
doctor 10.3
men 10.3
covering 10.2
occupation 10.1
house 10
smile 10
protective covering 9.8
clothing 9.8
interior 9.7
medicine 9.7
style 9.6
device 9.2
business 9.1
suit 9
worker 8.9
working 8.8
indoors 8.8
lab 8.7
scientific 8.7
women 8.7
chemistry 8.7
test 8.7
research 8.6
modern 8.4
elegance 8.4
holding 8.2
technology 8.2
student 8.1
cheerful 8.1
nurse 8
art 8
science 8
posing 8
life 8
brunette 7.8
warrior 7.8
model 7.8
pretty 7.7
health 7.6
city 7.5
room 7.5
vintage 7.4
window 7.4
lady 7.3
salon 7.3
protection 7.3
coat 7.2
dress 7.2
looking 7.2
holiday 7.2
family 7.1
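
Imagga tags use the same 0-100 confidence scale. A minimal sketch against Imagga's public /v2/tags REST endpoint follows; the API key, secret, and file name are placeholders, not details taken from this record.

import requests

# Placeholder credentials; Imagga issues an API key and secret per account.
AUTH = ("YOUR_API_KEY", "YOUR_API_SECRET")

with open("photo.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=AUTH,
        files={"image": f},
    )

for item in response.json()["result"]["tags"]:
    # Each entry pairs an English tag with a 0-100 confidence score.
    print(item["tag"]["en"], round(item["confidence"], 1))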

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 95.6
clothing 91.5
musical instrument 90.6
window 89.8
indoor 87.9
text 81.4
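
The Microsoft tags presumably come from Azure Computer Vision's tagging operation, though this record does not name the exact service. A sketch using the azure-cognitiveservices-vision-computervision SDK, with endpoint, key, and file name as placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key; not taken from this record.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)
with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # Confidence is reported 0-1; scale it to match the percentages above.
    print(tag.name, round(tag.confidence * 100, 1))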

Face analysis

AWS Rekognition

Age 13-21
Gender Female, 61.5%
Calm 97.2%
Happy 1%
Surprised 0.7%
Sad 0.4%
Confused 0.2%
Angry 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 16-24
Gender Male, 65.2%
Calm 84.1%
Sad 12.4%
Happy 1.2%
Confused 0.7%
Angry 0.5%
Disgusted 0.4%
Surprised 0.4%
Fear 0.2%
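
Both face readings follow the shape of Rekognition's DetectFaces response: an estimated age range, a gender guess with its confidence, and a ranked emotion distribution. A minimal sketch, assuming a local file and a default region:

import boto3

# Hypothetical inputs; the source image is not named in this record.
client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; rank them to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')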

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
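
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which matches the entries above. A sketch with the google-cloud-vision client; the file name is an assumption:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum such as VERY_UNLIKELY.
    print("Joy", face.joy_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Surprise", face.surprise_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)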

Feature analysis

Amazon

Person 99.5%
Piano 58.9%

Captions

Microsoft

a man and a woman standing in front of a window 52.6%
a person standing in front of a window 52.5%
a person standing in front of a window 52.4%
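
Ranked candidate captions like these are characteristic of Azure Computer Vision's Describe operation, which returns several sentences with 0-1 confidences; whether this record used that exact operation is an assumption. A sketch with placeholders for the endpoint, key, and file:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key; not taken from this record.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)
with open("photo.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    # Candidates arrive ranked by confidence (0-1).
    print(caption.text, f"{caption.confidence * 100:.1f}%")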