Human Generated Data

Title

Untitled (photograph of a family seated and standing in front of trees and house)

Date

c. 1940

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3471

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Person 99.3
Person 99.3
Person 98.8
Apparel 98.4
Clothing 98.4
Person 97.7
Person 97.7
Person 97.6
Person 97.1
Person 96.9
Person 96.3
Person 85.7
Dress 85.1
Female 83.2
People 80.3
Robe 74.5
Fashion 74.5
Suit 73.6
Coat 73.6
Overcoat 73.6
Crowd 72.4
Face 71.3
Gown 66.9
Woman 65.7
Photography 65.5
Photo 65.5
Portrait 65.5
Wedding 63.8
Girl 62.1
Stage 59.8
Priest 58
Wedding Gown 56.4

Clarifai
created on 2023-10-26

people 99.9
group 99.6
adult 99
man 97.6
woman 95.8
education 95.4
uniform 94.7
portrait 91.9
medical practitioner 91.3
many 90
medicine 89.6
wear 88.9
group together 87.7
musician 87.6
healthcare 87.4
coat 86
scientist 85.9
science 83.8
outerwear 83.6
doctor 82.6

Imagga
created on 2022-01-22

kin 26.2
people 22.9
groom 21
couple 20.9
negative 17.2
man 16.9
bride 16.3
old 16
black 15
happiness 14.9
person 14.6
happy 13.8
film 13.5
love 13.4
vintage 13.2
male 12.8
two 12.7
art 12.6
group 12.1
wedding 11.9
religion 11.6
adult 11.5
scene 11.2
portrait 11
dress 10.8
family 10.7
fashion 10.5
photographic paper 10.4
men 10.3
water 10
clothing 9.8
ceremony 9.7
husband 8.7
women 8.7
antique 8.6
life 8.6
married 8.6
marriage 8.5
wife 8.5
bouquet 8.5
dark 8.3
world 8.3
church 8.3
silhouette 8.3
retro 8.2
aged 8.1
nurse 8.1
history 8
celebration 8
holiday 7.9
matrimony 7.9
ancient 7.8
bridal 7.8
statue 7.7
mother 7.7
structure 7.6
gown 7.6
monument 7.5
light 7.3
color 7.2
night 7.1
together 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 98.9
posing 87.3
wedding dress 81.5
clothing 62.7

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 94.6%
Calm 99.9%
Confused 0%
Surprised 0%
Happy 0%
Disgusted 0%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 53-61
Gender Male, 97.7%
Calm 94.1%
Happy 5.4%
Sad 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Confused 0%
Fear 0%

AWS Rekognition

Age 54-64
Gender Male, 50.4%
Calm 88.9%
Happy 6.1%
Sad 2.7%
Confused 1.7%
Disgusted 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 47-53
Gender Male, 99.9%
Calm 99.5%
Surprised 0.1%
Sad 0.1%
Happy 0.1%
Disgusted 0.1%
Angry 0.1%
Confused 0%
Fear 0%

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Calm 92.8%
Happy 1.9%
Confused 1.6%
Sad 1.1%
Surprised 1%
Disgusted 0.8%
Fear 0.5%
Angry 0.2%

AWS Rekognition

Age 39-47
Gender Male, 100%
Calm 99.5%
Happy 0.2%
Surprised 0.1%
Confused 0.1%
Sad 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 25-35
Gender Male, 69%
Calm 89.2%
Sad 9%
Fear 0.8%
Angry 0.4%
Happy 0.2%
Disgusted 0.1%
Surprised 0.1%
Confused 0.1%

AWS Rekognition

Age 50-58
Gender Male, 99.3%
Calm 99.7%
Happy 0.3%
Surprised 0%
Sad 0%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 25-35
Gender Female, 90%
Calm 99.8%
Happy 0.1%
Surprised 0.1%
Confused 0%
Disgusted 0%
Sad 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 26-36
Gender Male, 82.2%
Calm 90%
Happy 5.9%
Surprised 1.6%
Sad 1.5%
Fear 0.4%
Confused 0.3%
Disgusted 0.2%
Angry 0.1%

AWS Rekognition

Age 47-53
Gender Female, 74%
Calm 50.9%
Happy 42.9%
Angry 2.5%
Surprised 1.1%
Sad 0.9%
Confused 0.7%
Disgusted 0.6%
Fear 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 99.7%