Human Generated Data

Title

Untitled (people with packages in office)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16661

Machine Generated Data

Tags

Amazon
created on 2022-02-18

Person 99.6
Human 99.6
Person 99.5
Person 98.6
Person 97.6
Person 97.2
Person 95.9
Face 73.5
People 68.7
Clinic 65.3
Worker 64.8
Clothing 57.9
Apparel 57.9
Display 57.3
Electronics 57.3
Screen 57.3
Monitor 57.3
Hairdresser 56.2

Imagga
created on 2022-02-18

people 31.2
man 26.2
couple 24.4
person 23.7
male 21.3
adult 20.2
barbershop 19.2
happiness 18.8
home 18.3
women 18.2
bride 17.5
room 17.1
happy 16.9
shop 16.9
smiling 16.6
two 16.1
love 15.8
sitting 15.4
salon 15.1
family 15.1
indoors 14.9
portrait 14.9
wedding 14.7
men 14.6
senior 14
groom 13.6
nurse 13.6
dress 13.5
worker 13.1
lifestyle 13
mercantile establishment 12.6
medical 12.3
human 12
old 11.8
husband 11.4
cheerful 11.4
work 11
professional 11
mother 10.8
care 10.7
patient 10.5
elderly 10.5
bouquet 10.5
wife 10.4
togetherness 10.4
hospital 10.4
doctor 10.3
kin 10.3
mature 10.2
smile 10
lady 9.7
health 9.7
two people 9.7
together 9.6
looking 9.6
wind instrument 9.4
brass 9.2
drink 9.2
indoor 9.1
sax 9.1
negative 9.1
musical instrument 9.1
fashion 9
clinic 9
interior 8.8
celebration 8.8
table 8.7
brunette 8.7
clothing 8.6
married 8.6
drinking 8.6
holiday 8.6
place of business 8.4
modern 8.4
pretty 8.4
fun 8.2
romantic 8
businessman 7.9
medicine 7.9
face 7.8
chair 7.6
marriage 7.6
enjoying 7.6
house 7.5
film 7.5
vintage 7.4
new 7.3
aged 7.2
team 7.2
working 7.1
day 7.1

Google
created on 2022-02-18 (no tags returned)

Microsoft
created on 2022-02-18

person 98.4
indoor 91.8
text 89.2
human face 83.8
window 80.7
clothing 77.2
man 56
posing 45.6

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Surprised 96.8%
Sad 1.2%
Confused 0.5%
Calm 0.5%
Happy 0.3%
Disgusted 0.3%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 38-46
Gender Male, 97.1%
Happy 75.8%
Calm 21%
Surprised 1.3%
Sad 0.7%
Disgusted 0.5%
Angry 0.3%
Fear 0.3%
Confused 0.2%

AWS Rekognition

Age 35-43
Gender Male, 86.3%
Calm 99.1%
Happy 0.6%
Disgusted 0.1%
Sad 0.1%
Angry 0%
Confused 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 36-44
Gender Male, 99.6%
Happy 38.3%
Calm 26.4%
Sad 20.9%
Confused 4.4%
Disgusted 3.3%
Surprised 2.5%
Angry 2.2%
Fear 2%

AWS Rekognition

Age 52-60
Gender Male, 99.9%
Sad 64%
Calm 32.3%
Confused 2%
Happy 0.5%
Disgusted 0.4%
Angry 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Male, 90%
Calm 98.6%
Sad 0.5%
Happy 0.4%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people sitting in front of a mirror posing for the camera 83.1%
a group of people sitting in front of a mirror 83%
a group of people sitting in front of a window 82.9%

Text analysis

Amazon

7
ans