Human Generated Data

Title

Untitled (portrait of family reading)

Date

1912

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21937

Machine Generated Data

Tags (label with confidence score, 0 to 100)

Amazon
created on 2022-03-11

Clothing 100
Apparel 100
Person 99.6
Human 99.6
Person 98.3
Robe 97.4
Fashion 97.4
Gown 96.2
Person 94.8
Person 94.7
Wedding 92.1
Person 91.5
Female 89
Bride 83.6
Wedding Gown 83.6
Evening Dress 82.6
Suit 78.6
Overcoat 78.6
Coat 78.6
Woman 77.3
Dress 76
Shoe 75.4
Footwear 75.4
Person 70
Furniture 67.1
People 65.5
Bridegroom 60.1
Photography 59.2
Photo 59.2
Portrait 56.5
Face 56.5
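
The Amazon list above pairs each predicted label with a confidence score out of 100, which is the shape of an AWS Rekognition DetectLabels response. A minimal sketch of that call follows, assuming configured AWS credentials; the image file name and the MinConfidence threshold are illustrative, not part of the museum record.

```python
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials and a default region are configured

# Hypothetical local copy of the photograph; Rekognition also accepts S3 objects.
with open("portrait_of_family_reading.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the list above bottoms out in the mid-50s
)

# Print label names with their confidence scores, mirroring the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```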

Clarifai
created on 2023-10-22

people 99.9
group 99.2
group together 98.6
adult 97.4
woman 96.6
man 95.8
wedding 95.4
actor 92.3
leader 92
child 91.8
four 90.3
several 90
three 89
wear 87.4
actress 86.9
outfit 86.5
five 85.7
bride 85.5
music 85.1
family 83.7
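
The Clarifai tags likely come from its general image-recognition model. The sketch below queries that model over Clarifai's v2 REST API; the endpoint path, model id, and response layout follow Clarifai's public documentation as best recalled and should be verified, and the API key and image URL are placeholders.

```python
import requests

CLARIFAI_API_KEY = "YOUR_CLARIFAI_API_KEY"             # placeholder credential
IMAGE_URL = "https://example.org/family-reading.jpg"   # hypothetical image URL

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Clarifai reports concept confidence on a 0-1 scale; the list above shows percentages.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```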

Imagga
created on 2022-03-11

groom 79.3
people 33.5
man 26.2
person 24.7
couple 21.8
adult 21.4
bride 21.4
male 21.3
happiness 20.4
men 18.9
nurse 18.2
wedding 17.5
happy 16.3
love 15.8
two 15.2
women 15
businessman 15
dress 13.5
business 13.4
professional 13.3
life 13
portrait 12.9
fashion 12.8
black 12.6
ceremony 12.6
bouquet 12.5
family 12.5
marriage 12.3
smiling 12.3
clothing 12.2
smile 12.1
corporate 12
celebration 12
worker 11.9
married 11.5
husband 11.4
wife 11.4
waiter 10.8
cheerful 10.6
attractive 10.5
flowers 10.4
world 10
suit 9.9
religion 9.9
wed 9.8
human 9.7
group 9.7
day 9.4
elegance 9.2
employee 9.1
pretty 9.1
matrimony 8.9
looking 8.8
together 8.8
bridal 8.8
dining-room attendant 8.6
brass 8.5
room 8.4
old 8.4
church 8.3
businesswoman 8.2
mother 8.1
romance 8
job 8
wind instrument 7.9
standing 7.8
hands 7.8
face 7.8
gown 7.8
art 7.8
wall 7.7
youth 7.7
outdoor 7.6
adults 7.6
relationship 7.5
fun 7.5
outdoors 7.5
holding 7.4
teenager 7.3
building 7.2
sexy 7.2
handsome 7.1
romantic 7.1
posing 7.1
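
The Imagga tags appear to come from its automatic tagging endpoint. A hedged sketch of that request follows; the URL, the basic-auth key/secret scheme, and the response layout reflect Imagga's documented v2 REST API as best recalled, and the credentials and image URL are placeholders.

```python
import requests

IMAGGA_API_KEY = "YOUR_IMAGGA_API_KEY"                 # placeholder
IMAGGA_API_SECRET = "YOUR_IMAGGA_API_SECRET"           # placeholder
IMAGE_URL = "https://example.org/family-reading.jpg"   # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_API_KEY, IMAGGA_API_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Imagga returns one entry per tag with a 0-100 confidence and a language-keyed name.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```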

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 95.5
clothing 93.3
person 93.1
black and white 72.7
man 65.7
old 59.6
posing 37.8
clothes 19.6
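
The Microsoft tags are consistent with the output of Azure's Computer Vision image-analysis API. A hedged sketch of the REST call follows; the API version, query parameter, and header name reflect Azure's documented interface as best recalled, and the resource endpoint, key, and image URL are placeholders.

```python
import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder resource endpoint
AZURE_KEY = "YOUR_AZURE_KEY"                                          # placeholder credential
IMAGE_URL = "https://example.org/family-reading.jpg"                  # hypothetical image URL

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
resp.raise_for_status()

# Azure reports tag confidence on a 0-1 scale; the list above shows percentages.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```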

Color Analysis

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 97.2%
Calm 64.3%
Angry 20.9%
Surprised 6.1%
Sad 4%
Disgusted 2.6%
Confused 1.2%
Fear 0.6%
Happy 0.3%

AWS Rekognition

Age 20-28
Gender Male, 93.7%
Calm 100%
Sad 0%
Surprised 0%
Disgusted 0%
Fear 0%
Confused 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 19-27
Gender Female, 97.7%
Calm 94%
Sad 3.6%
Surprised 0.9%
Angry 0.5%
Happy 0.4%
Disgusted 0.4%
Fear 0.2%
Confused 0.1%

AWS Rekognition

Age 53-61
Gender Male, 94.8%
Surprised 65.9%
Calm 14.4%
Happy 11.7%
Sad 3.3%
Angry 1.5%
Disgusted 1.4%
Confused 1.1%
Fear 0.7%
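
Each AWS Rekognition block above (age range, gender, and a ranked emotion breakdown) corresponds to one face returned by the DetectFaces API with full attributes requested. A minimal sketch, assuming configured AWS credentials and a hypothetical local file name:

```python
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("portrait_of_family_reading.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions, not just the default bounding box
)

# One FaceDetails entry per detected face, mirroring the blocks above.
for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```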

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
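
The Google Vision rows report qualitative likelihoods ("Very unlikely", "Unlikely") rather than numeric scores because the Cloud Vision face-detection API returns likelihood enums per face. A minimal sketch using the google-cloud-vision client library, assuming application credentials are configured and a hypothetical local file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes Google application credentials are configured

with open("portrait_of_family_reading.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Map the Likelihood enum values to the wording used in the rows above.
LIKELIHOOD_NAMES = (
    "Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
)

for face in response.face_annotations:
    print("Surprise", LIKELIHOOD_NAMES[face.surprise_likelihood])
    print("Anger", LIKELIHOOD_NAMES[face.anger_likelihood])
    print("Sorrow", LIKELIHOOD_NAMES[face.sorrow_likelihood])
    print("Joy", LIKELIHOOD_NAMES[face.joy_likelihood])
    print("Headwear", LIKELIHOOD_NAMES[face.headwear_likelihood])
    print("Blurred", LIKELIHOOD_NAMES[face.blurred_likelihood])
```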

Feature analysis

Amazon

Person 99.6%
Person 98.3%
Person 94.8%
Person 94.7%
Person 91.5%
Person 70%
Shoe 75.4%
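
The per-object percentages in the feature analysis (several Person detections and one Shoe) correspond to the Instances field of an AWS Rekognition DetectLabels response, where labels such as Person carry one bounding box and confidence per detected object. A minimal sketch, assuming configured AWS credentials and a hypothetical file name:

```python
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("portrait_of_family_reading.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_labels(Image={"Bytes": f.read()})

# Labels like "Person" and "Shoe" include one Instance per detected object,
# each with its own bounding box and confidence score.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"(box w={box['Width']:.2f}, h={box['Height']:.2f})")
```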