Human Generated Data

Title

Untitled (family in living room, Manchester, New Hampshire)

Date

1931, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.78

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99
Human 99
Person 98.5
Person 93.7
Person 92.8
People 90.6
Room 81.8
Indoors 81.8
Coat 80.8
Overcoat 80.8
Clothing 80.8
Apparel 80.8
Furniture 77.9
Suit 72
Family 70.9
Door 67.7
Living Room 65.3
Couch 65.2
Tie 64.2
Accessories 64.2
Accessory 64.2
Suit 61.1
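
The Amazon labels above are the kind of output returned by Amazon Rekognition's DetectLabels API (a label name plus a 0-100 confidence score). A minimal sketch with boto3, assuming configured AWS credentials and a hypothetical local copy of the photograph named photo.jpg:

import boto3

rekognition = boto3.client("rekognition")

# Read a local copy of the image; the filename is hypothetical.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,        # cap the number of labels returned
        MinConfidence=60,    # drop labels below 60% confidence
    )

# Prints lines like "Person 99.0", matching the layout of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')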

Clarifai
created on 2023-10-15

people 99.9
family 99.6
group 99.1
leader 97.9
wedding 97.3
child 96
portrait 95.7
offspring 95.6
man 93.9
home 93.8
administration 93.2
monochrome 92.9
two 92.4
adult 91.8
groom 91
woman 90.8
three 88.7
documentary 88.3
son 87.9
four 85.7

Imagga
created on 2021-12-14

world 43.4
bow tie 35.4
man 31.6
necktie 27.3
people 24
male 22.8
groom 22.7
person 22.2
kin 21
portrait 19.4
businessman 17.7
happy 17.5
couple 16.6
child 15.2
happiness 14.9
business 14.6
adult 14.4
family 14.2
garment 14.1
black 12.6
love 11.8
clothing 11.3
men 11.2
wedding 11
smile 10.7
bride 10.7
face 10.7
office 10.4
smiling 10.1
head 10.1
silhouette 9.9
old 9.8
future 9.3
mother 9.1
life 8.9
group 8.9
boy 8.7
ancient 8.6
work 8.6
youth 8.5
two 8.5
dress 8.1
religion 8.1
decoration 8
working 8
together 7.9
brunette 7.8
art 7.8
corporate 7.7
culture 7.7
senior 7.5
holding 7.4
retro 7.4
alone 7.3
daughter 7.2
home 7.2
celebration 7.2
romantic 7.1
idea 7.1
women 7.1
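
The Imagga tags above come from Imagga's auto-tagging service, whose REST endpoint returns each tag with a 0-100 confidence value. A minimal sketch using requests, assuming an Imagga API key and secret and a hypothetical local image file; the response structure follows Imagga's v2 /tags documentation:

import requests

IMAGGA_KEY = "your_api_key"        # placeholder credentials
IMAGGA_SECRET = "your_api_secret"

# Upload a local copy of the image to the v2 tagging endpoint.
with open("photo.jpg", "rb") as f:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        files={"image": f},
    )

# Prints lines like "world 43.4", matching the list above.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')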

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 97.8
suit 95.7
clothing 93.3
person 90.7
wedding dress 89.8
man 85.5
bride 84.3
black 83.2
old 82.3
white 79.7
posing 75.4
wedding 68.8
smile 68.7
flower 58.8
dress 55.6
room 47.5
picture frame 37.4

Color Analysis

Face analysis

AWS Rekognition

Age 48-66
Gender Male, 97.7%
Calm 85.3%
Angry 10.3%
Sad 1.6%
Surprised 1.2%
Happy 0.8%
Confused 0.3%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 20-32
Gender Female, 51.2%
Calm 98.3%
Sad 0.4%
Happy 0.3%
Surprised 0.3%
Confused 0.3%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 50-68
Gender Male, 97.6%
Calm 99.4%
Sad 0.2%
Happy 0.2%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Confused 0%
Fear 0%

AWS Rekognition

Age 2-8
Gender Female, 63%
Surprised 99.8%
Fear 0.1%
Calm 0%
Happy 0%
Confused 0%
Angry 0%
Sad 0%
Disgusted 0%

AWS Rekognition

Age 0-3
Gender Female, 87%
Surprised 56.7%
Calm 31.2%
Fear 10.3%
Sad 0.6%
Happy 0.5%
Confused 0.5%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 21-33
Gender Female, 75.3%
Calm 56.2%
Sad 24.8%
Happy 8.8%
Confused 4.5%
Fear 1.9%
Angry 1.6%
Surprised 1.4%
Disgusted 0.9%
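
Per-face estimates like the age ranges, gender confidences, and emotion scores above are what Amazon Rekognition's DetectFaces API returns when all facial attributes are requested. A minimal boto3 sketch, assuming configured AWS credentials and a hypothetical local copy of the image:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:   # hypothetical local copy of the photograph
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],          # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are sorted by confidence to mirror the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')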

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
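
The Google Vision entries above are face-attribute likelihoods (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) reported on a "Very unlikely" to "Very likely" scale rather than as percentages. A minimal sketch with the google-cloud-vision client library, assuming configured Google Cloud credentials and a hypothetical local copy of the image:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:   # hypothetical local copy of the photograph
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (UNKNOWN through VERY_LIKELY).
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)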

Feature analysis

Amazon

Person 99%
Suit 72%
Tie 64.2%

Categories