Human Generated Data

Title

Untitled (family portrait, outdoors, Victorian era clothes)

Date

c.1910

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22064

Machine Generated Data

Tags

The number after each tag is the service's confidence score, expressed as a percentage.

Amazon
created on 2022-03-11

Human 99.7
Person 99.7
Person 99.5
Person 99.5
Person 99.4
Person 99.3
Person 99.2
Clothing 98.9
Apparel 98.9
Person 98.4
Person 93
Person 92.3
Chair 81.9
Furniture 81.9
Outdoors 80.2
Coat 76.3
People 73.1
Overcoat 70.1
Face 67.1
Nature 67
Suit 62.3
Shorts 59.7
Person 57.2
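The Amazon tags above are the kind of output returned by Amazon Rekognition's label-detection API. The following is a minimal sketch with boto3, not the project's actual pipeline; the region and local file name are placeholders.

```python
import boto3

# Placeholder region and file name; illustrative call only.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("family_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out near 57%
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```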

Imagga
created on 2022-03-11

kin 84.7
people 27.9
man 24.2
person 22.9
couple 21.8
male 20.6
love 18.9
happy 17.5
together 17.5
portrait 17.5
happiness 16.4
outdoor 16
park 15.6
outdoors 15
groom 15
lifestyle 14.4
adult 14.4
old 13.9
two 12.7
dress 12.6
outside 12
women 11.9
life 11.8
summer 11.6
married 11.5
stone 11.3
wedding 11
mother 10.8
romantic 10.7
family 10.7
bride 10.5
teenager 10
sky 9.6
boy 9.6
men 9.4
friends 9.4
friendship 9.4
holiday 9.3
garden 9.2
leisure 9.1
girls 9.1
child 9
human 9
religion 9
home 8.8
smiling 8.7
marriage 8.5
relationship 8.4
active 8.2
vacation 8.2
lady 8.1
romance 8
face 7.8
sitting 7.7
elderly 7.7
statue 7.6
fan 7.6
head 7.6
house 7.5
father 7.5
world 7.5
fun 7.5
church 7.4
room 7.3
aged 7.2
sunset 7.2
hair 7.1
smile 7.1
day 7.1
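The Imagga tags above come from Imagga's auto-tagging service. Below is a minimal sketch of its v2 /tags REST endpoint; the API key, secret, and image URL are placeholders, not values from this record.

```python
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credentials
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder credentials
IMAGE_URL = "https://example.org/family_portrait.jpg"  # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key/secret
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```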

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

person 97.9
clothing 97.2
outdoor 92
man 85.7
text 84.6
woman 56.4
smile 55.1
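The Microsoft tags above match the output of Azure Computer Vision's image-tagging feature. A minimal sketch with the Python SDK follows; the endpoint, key, and image URL are placeholders. Azure reports confidences on a 0-1 scale, so the sketch scales them to percentages to match the list above.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                             # placeholder
IMAGE_URL = "https://example.org/family_portrait.jpg"              # hypothetical

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
result = client.tag_image(IMAGE_URL)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```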

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 60.6%
Calm 96%
Happy 1.9%
Surprised 1.3%
Sad 0.2%
Disgusted 0.2%
Angry 0.2%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 23-31
Gender Male, 95.4%
Calm 100%
Confused 0%
Surprised 0%
Sad 0%
Happy 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Female, 78.4%
Calm 54.4%
Surprised 41.6%
Disgusted 0.9%
Angry 0.8%
Confused 0.8%
Sad 0.7%
Happy 0.5%
Fear 0.4%

AWS Rekognition

Age 23-33
Gender Male, 87.8%
Calm 96%
Surprised 2.9%
Happy 0.3%
Sad 0.3%
Disgusted 0.2%
Angry 0.2%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 99.8%
Calm 75.7%
Confused 12.6%
Surprised 4.7%
Happy 2.3%
Disgusted 1.6%
Angry 1.4%
Sad 1.2%
Fear 0.5%

AWS Rekognition

Age 22-30
Gender Female, 72.1%
Calm 90.7%
Surprised 8.9%
Confused 0.1%
Disgusted 0.1%
Sad 0.1%
Angry 0.1%
Happy 0%
Fear 0%
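Each AWS Rekognition block above (age range, gender, and emotion scores) corresponds to one face returned by the DetectFaces API with full attributes requested. The sketch below is illustrative only; the region and file name are placeholders.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

with open("family_portrait.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```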

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
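The Google Vision blocks above report per-face likelihood ratings (Very unlikely through Very likely) rather than percentages. A minimal sketch with recent versions of the google-cloud-vision client follows; credential setup and the file name are assumed.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("family_portrait.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```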

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people posing for a picture 87.5%
a group of people posing for the camera 85.8%
a group of people posing for a photo 81.1%
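The ranked captions above match the output of Azure Computer Vision's image-description feature. A minimal sketch follows, reusing the placeholder endpoint, key, and image URL from the tagging sketch above.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_AZURE_KEY"                                             # placeholder
IMAGE_URL = "https://example.org/family_portrait.jpg"              # hypothetical

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
description = client.describe_image(IMAGE_URL, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```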

Text analysis

Google

A
A
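The detected text above ("A", reported once as the full text and once as an individual element) is typical of Google Vision OCR output. A minimal sketch follows, under the same assumptions as the face-detection sketch earlier.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("family_portrait.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual elements.
for annotation in response.text_annotations:
    print(annotation.description)
```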