Human Generated Data

Title

Untitled (family in Christmas living room with several gathered around tree)

Date

1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9524

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 98.1
Human 98.1
Person 98
Room 98
Indoors 98
Person 94.5
Person 93.9
Apparel 92.3
Clothing 92.3
Interior Design 91.6
Person 90.4
Furniture 90.2
Person 86.8
Dressing Room 81.3
People 73.6
Bedroom 73
Chair 70.5
Female 64.4
Photography 60.1
Photo 60.1
Urban 59.6
Living Room 59.6
Bed 57.5
Suit 56
Coat 56
Overcoat 56

Imagga
created on 2022-01-28

man 25
people 22.9
shop 18.2
clinic 18.1
medical 17.6
male 16.4
person 15.8
room 15.1
doctor 15
mercantile establishment 14.2
medicine 14.1
professional 14
work 13.9
old 13.9
hospital 13.8
barbershop 13.7
salon 13.6
patient 13
men 12.9
health 12.5
surgeon 12
human 11.2
home 11.2
vintage 10.7
worker 10.7
adult 10.6
working 10.6
life 10.5
seller 10.4
ancient 10.4
business 10.3
city 10
family 9.8
place of business 9.4
office 9.2
care 9
technology 8.9
interior 8.8
instrument 8.8
laboratory 8.7
equipment 8.7
scene 8.6
table 8.6
antique 8.6
illness 8.6
tradition 8.3
occupation 8.2
tourism 8.2
retro 8.2
new 8.1
nurse 8.1
team 8.1
history 8
surgery 7.8
lab 7.8
sitting 7.7
test 7.7
research 7.6
biology 7.6
historic 7.3
religion 7.2
holiday 7.2
women 7.1
job 7.1
businessman 7.1
architecture 7

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

text 98.5
clothing 88.4
black and white 82.2
woman 73.8
person 73.2
dress 57.6
wedding dress 50.5
clothes 25.7
several 10.3
cluttered 10.2

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 99.3%
Calm 84.8%
Surprised 10.8%
Angry 1.7%
Happy 0.8%
Disgusted 0.8%
Confused 0.4%
Fear 0.3%
Sad 0.3%

AWS Rekognition

Age 16-24
Gender Female, 57.2%
Calm 86.4%
Sad 12.6%
Happy 0.4%
Confused 0.2%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.1%

Captions

Microsoft

a group of people in a room 78.9%
a group of people standing next to a window 54.4%
a group of people standing in a room 54.3%

Text analysis

Amazon

e
1013 e
VAGOY
1013
VT2702 VAGOY
VT2702
عام

Google

a
3
t
s
e
t a s 3 e