Human Generated Data

Title

Untitled (Christmas Party Bar)

Date

1948

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1560

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.7
Human 99.7
Person 99.1
Person 94.5
Person 91.9
Person 84.3
Worker 82.9
Person 76.7
Person 75.5
Indoors 72.8
Hairdresser 64.6
Person 64
Photography 63.6
Portrait 63.6
Face 63.6
Photo 63.6
Person 62.1
Washing 59.6
Interior Design 57.1
Room 57
Crowd 56.8
Clinic 55.5
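
The Amazon tags above pair a label with a confidence score. A minimal sketch of how such labels can be retrieved from Amazon Rekognition's label-detection API, assuming boto3 credentials are configured and the photograph is available locally as photo.jpg (the region and file name below are placeholder assumptions, not values from this record):

import boto3

# Placeholder region and file name; credentials come from the boto3 environment.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=30,
        MinConfidence=55,
    )

# One line per label, mirroring the tag/score layout above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")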

Imagga
created on 2021-12-14

silhouette 33.9
people 27.9
house 20.1
happy 20.1
man 19
business 18.2
person 18.1
home 17.9
design 17.4
couple 17.4
male 16.3
life 15.9
adult 14.3
women 14.2
happiness 13.3
art 12.8
modern 12.6
team 12.5
family 12.5
fun 12
love 11.8
businessman 11.5
player 11.4
indoors 11.4
smile 11.4
human 11.2
bright 10.7
sport 10.7
tracing 10.7
mother 10.3
grunge 10.2
glowing 10.2
smiling 10.1
drawing 10.1
room 10
businesswoman 10
fashion 9.8
crowd 9.6
corporate 9.4
men 9.4
symbol 9.4
relationship 9.4
casual 9.3
attractive 9.1
black 9
blond 9
interior 8.8
graphic 8.8
urban 8.7
shiny 8.7
lifestyle 8.7
hand 8.5
portrait 8.4
joy 8.4
leisure 8.3
retro 8.2
painting 8.1
backgrounds 8.1
group 8.1
cartoon 8
idea 8
looking 8
together 7.9
work 7.8
standing 7.8
hands 7.8
frame 7.8
summer 7.7
two 7.6
wife 7.6
poster 7.6
arrow 7.5
style 7.4
stucco 7.4
light 7.4
clothing 7.3
competition 7.3
cheerful 7.3
suit 7.2
activity 7.2
holiday 7.2
vibrant 7
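
The Imagga list follows the same tag-plus-confidence pattern. A rough sketch against Imagga's public v2 tagging endpoint; the URL, credential placeholders, and response field names below are assumptions based on that public API, not values taken from this record:

import requests

# Placeholder credentials and image URL.
IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Each entry carries a confidence score and an English tag name.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")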

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 96.8
person 92.4
man 92
drawing 85.9
clothing 83.6
window 81.4
posing 37
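
The Microsoft tags here, and the ranked caption candidates under "Captions" further down, are the kind of output a single Azure Computer Vision Analyze Image call can return. A sketch assuming the v3.2 REST endpoint, a placeholder resource endpoint, and a placeholder subscription key:

import requests

# Placeholder endpoint and key for an Azure Computer Vision resource.
ENDPOINT = "https://example.cognitiveservices.azure.com"
KEY = "your_subscription_key"

with open("photo.jpg", "rb") as f:
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
response.raise_for_status()
result = response.json()

# Azure reports confidences as 0-1 floats; scale to match the listing.
for tag in result["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
for caption in result["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}")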

Face analysis

AWS Rekognition

Age 44-62
Gender Male, 56.7%
Calm 82.2%
Happy 12.6%
Surprised 1.9%
Angry 1.4%
Sad 0.7%
Fear 0.7%
Confused 0.3%
Disgusted 0.2%

AWS Rekognition

Age 19-31
Gender Female, 97.3%
Happy 73.2%
Sad 15.1%
Calm 7.3%
Angry 1.4%
Surprised 1.4%
Confused 1%
Fear 0.4%
Disgusted 0.2%

AWS Rekognition

Age 29-45
Gender Male, 57.5%
Sad 46.7%
Calm 40%
Happy 11.9%
Confused 0.9%
Angry 0.3%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 30-46
Gender Male, 76.5%
Calm 55.3%
Happy 29.5%
Sad 10.7%
Confused 1.8%
Angry 1.5%
Surprised 0.6%
Disgusted 0.5%
Fear 0.2%

AWS Rekognition

Age 44-62
Gender Female, 59.7%
Calm 77.6%
Sad 16.2%
Fear 1.6%
Surprised 1.3%
Angry 1.3%
Confused 1.2%
Disgusted 0.4%
Happy 0.3%

AWS Rekognition

Age 20-32
Gender Female, 64.9%
Sad 49.8%
Calm 47.4%
Fear 2.1%
Happy 0.5%
Angry 0.1%
Confused 0%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 26-42
Gender Female, 95.4%
Calm 44.5%
Happy 33.9%
Surprised 15%
Confused 3.2%
Angry 1.6%
Sad 0.9%
Disgusted 0.5%
Fear 0.3%
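
Each AWS Rekognition block above (an age range, a gender call with confidence, and emotion percentages) corresponds to one entry in the FaceDetails list returned by Rekognition face detection. A minimal sketch under the same boto3 and photo.jpg assumptions as earlier:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort descending to match the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")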

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
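
The Google Vision face results are reported as likelihood ratings (Very unlikely through Very likely) rather than percentages. A sketch using the google-cloud-vision client library, assuming application default credentials and the same local photo.jpg:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum value (VERY_UNLIKELY ... VERY_LIKELY).
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)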

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a man standing in front of a mirror posing for the camera 67.6%
a man standing in front of a window 67.5%
a group of people standing in front of a mirror posing for the camera 65.5%

Text analysis

Amazon

11
22
if

Google

22 11
22
11
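
The fragments under "Text analysis" read like scraps of sign or card text picked up by OCR; on the Amazon side this is what Rekognition's text-detection call returns. A minimal sketch under the same boto3 assumptions as above:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections; print the word-level text.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")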