Human Generated Data

Title

Untitled (adults and children eating in living room)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17932

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Human 99.3
Person 99.3
Person 99.3
Person 97.4
Indoors 96.7
Room 96.7
Furniture 95.9
Person 92.4
Living Room 92.3
Bedroom 79.9
Clothing 72.4
Apparel 72.4
Bed 68.4
Fireplace 62.1
Electronics 57.6
Screen 57.6
Couch 56.1
Display 56
Monitor 56
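The label list above is typical of Amazon Rekognition output: each tag carries a confidence score from 0 to 100, and low-confidence tags (here, "Bedroom", "Bed", "Monitor") are often noise. A minimal sketch of filtering such results by a confidence threshold — the (name, confidence) pairs are transcribed from the record above, and in a real boto3 `detect_labels` response they appear under `response["Labels"]`:

```python
# Sketch: filtering Rekognition-style label output by confidence.
# Sample values transcribed from the record above; a real boto3
# detect_labels call returns them under response["Labels"].

labels = [
    {"Name": "Person", "Confidence": 99.3},
    {"Name": "Indoors", "Confidence": 96.7},
    {"Name": "Living Room", "Confidence": 92.3},
    {"Name": "Bedroom", "Confidence": 79.9},
    {"Name": "Fireplace", "Confidence": 62.1},
    {"Name": "Monitor", "Confidence": 56.0},
]

def confident_labels(labels, threshold=90.0):
    """Keep only label names at or above the given confidence threshold."""
    return [l["Name"] for l in labels if l["Confidence"] >= threshold]

print(confident_labels(labels))  # ['Person', 'Indoors', 'Living Room']
```

The threshold is a judgment call: at 90 the tags match the photograph's human-written title closely, while the Imagga tags below (topping out near 40) would all be discarded at the same cutoff.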

Imagga
created on 2022-03-04

barbershop 39.8
couple 35.7
people 34.6
groom 33.8
shop 32
bride 30.7
love 26.8
happiness 26.6
wedding 25.7
man 25.5
mercantile establishment 25.4
dress 25.3
adult 24.4
married 24
celebration 23.1
husband 22.9
person 21.6
home 21.5
wife 20.9
bouquet 19.5
male 19.1
room 19
happy 18.8
family 17.8
two 16.9
place of business 16.9
ceremony 16.5
men 14.6
smiling 14.5
marriage 14.2
indoors 14.1
romantic 13.4
interior 13.3
together 13.1
flowers 13
cheerful 13
smile 12.8
indoor 12.8
gown 12.7
romance 12.5
face 12.1
women 11.9
wed 11.8
portrait 11.6
world 11.2
event 11.1
suit 10.8
holiday 10.7
table 10.6
grandfather 10.5
grandma 10.1
child 10
wedded 9.9
veil 9.8
new 9.7
party 9.5
pair 9.4
mother 9.3
flower 9.2
traditional 9.1
old 9.1
fashion 9
matrimony 8.9
kiss 8.8
engagement 8.7
day 8.6
sitting 8.6
establishment 8.5
senior 8.4
attractive 8.4
house 8.4
decoration 8.3
teacher 8.2
fun 8.2
kin 8.2
office 8.2
lady 8.1
professional 7.6
females 7.6
elegance 7.6
togetherness 7.6
human 7.5
relationship 7.5
rose 7.5
future 7.4
mature 7.4
tradition 7.4
girls 7.3
chair 7.2
hair 7.1
businessman 7.1

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 94.1
indoor 88.9
fireplace 78.7
clothes 23.2

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Calm 93.1%
Happy 3.6%
Sad 0.9%
Disgusted 0.9%
Confused 0.7%
Angry 0.3%
Fear 0.2%
Surprised 0.2%

AWS Rekognition

Age 45-51
Gender Female, 59.7%
Calm 66.3%
Sad 29.5%
Happy 1.6%
Confused 1.5%
Angry 0.4%
Disgusted 0.3%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 39-47
Gender Male, 87.7%
Calm 95.5%
Surprised 2.4%
Sad 1.2%
Confused 0.4%
Disgusted 0.2%
Angry 0.1%
Happy 0.1%
Fear 0%
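Each AWS Rekognition face block above lists an age range, a gender estimate, and a full emotion distribution; the reported emotion is simply the highest-confidence entry. A minimal sketch of reading that dominant emotion from one face — values are transcribed from the first face above, and in a real boto3 `detect_faces` call (with `Attributes=["ALL"]`) they appear under `response["FaceDetails"][n]["Emotions"]`:

```python
# Sketch: picking the dominant emotion from a Rekognition-style face record.
# Sample values transcribed from the first face in the record above.

face = {
    "AgeRange": {"Low": 36, "High": 44},
    "Gender": {"Value": "Male", "Confidence": 99.9},
    "Emotions": [
        {"Type": "CALM", "Confidence": 93.1},
        {"Type": "HAPPY", "Confidence": 3.6},
        {"Type": "SAD", "Confidence": 0.9},
    ],
}

def dominant_emotion(face):
    """Return the (label, confidence) pair of the highest-confidence emotion."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # ('CALM', 93.1)
```

Note the contrast with the second face above, where "Calm" (66.3%) only narrowly dominates "Sad" (29.5%) — the top label alone can hide an ambiguous distribution.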

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people sitting on a bed 61.9%
a group of people sitting in a room 61.8%
a group of people in a room 61.7%

Text analysis

Amazon

27

Google

MJI7--YT 37A°2--AGO
MJI7--YT
37A°2--AGO