Human Generated Data

Title

Untitled (men and women at party)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17080

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.8
Human 99.8
Person 99.1
Person 98.9
Person 97.6
Sitting 89.9
Clothing 86.6
Apparel 86.6
Person 86.2
Female 75.1
Outdoors 71
Nature 70
People 67.3
Face 62.1
Girl 62
Photography 61.8
Photo 61.8
Woman 58.1
Furniture 58.1
Musician 57.9
Musical Instrument 57.9
Leisure Activities 56.9

Imagga
created on 2022-02-26

man 37
hairdresser 34.4
home 30.3
people 30.1
person 29.1
male 27.7
adult 24.2
salon 24.2
couple 23.5
indoors 21.1
senior 20.6
smiling 20.3
lifestyle 19.5
happy 18.8
together 18.4
sitting 18
room 17.5
family 16.9
cheerful 15.4
patient 15.2
love 13.4
mature 13
table 13
women 12.7
happiness 12.5
husband 12.5
men 12
work 12
two 11.9
old 11.8
child 11.8
house 11.7
chair 11.6
interior 11.5
elderly 11.5
drinking 11.5
computer 11.3
attractive 11.2
clothing 11.2
portrait 11
indoor 11
business 10.9
mother 10.9
teacher 10.8
handsome 10.7
kin 10.7
office 10.6
loving 10.5
smile 10
retired 9.7
retirement 9.6
enjoying 9.5
adults 9.5
togetherness 9.4
nurse 9.4
friends 9.4
relationship 9.4
face 9.2
holding 9.1
aged 9
group 8.9
looking 8.8
affectionate 8.7
boy 8.7
couch 8.7
expression 8.5
sick person 8.5
desk 8.5
drink 8.4
color 8.3
case 8.3
wine 8.3
20s 8.2
classroom 8.2
technology 8.2
team 8.1
to 8
holiday 7.9
bonding 7.8
education 7.8
modern 7.7
girlfriend 7.7
pretty 7.7
worker 7.6
casual 7.6
talking 7.6
wife 7.6
father 7.6
females 7.6
meeting 7.5
fun 7.5
hospital 7.4
lady 7.3
dress 7.2
kitchen 7.2
working 7.1
businessman 7.1
day 7.1
medical 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 99.5
text 98.3
clothing 94.9
drawing 66.8
black and white 65.2
woman 59.2
furniture 55.8
table 51.9

Face analysis

Amazon

Google

AWS Rekognition

Age 47-53
Gender Male, 80%
Calm 72.3%
Happy 27.4%
Sad 0.1%
Confused 0.1%
Disgusted 0.1%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 23-33
Gender Male, 74.4%
Calm 98.6%
Sad 0.8%
Disgusted 0.2%
Confused 0.1%
Happy 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 21-29
Gender Female, 51.7%
Calm 29.3%
Fear 29.1%
Sad 14.8%
Disgusted 13%
Surprised 6.2%
Angry 4%
Happy 2.4%
Confused 1.2%

AWS Rekognition

Age 22-30
Gender Male, 79%
Sad 58.9%
Calm 25%
Confused 7.1%
Disgusted 5.1%
Surprised 2%
Angry 0.8%
Fear 0.7%
Happy 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people looking at each other 80.6%
a group of people looking at the camera 79.8%
a group of people in a room 79.7%

Text analysis

Amazon

YT3
KODY
TS3