Human Generated Data

Title

Untitled (family portrait on steps to porch of house)

Date

c. 1945

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3422

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Clothing 99.7
Apparel 99.7
Person 99
Human 99
Person 98.9
Person 98.9
Person 98.6
Person 98.6
Person 96.5
Accessories 95
Accessory 95
Sunglasses 95
Person 89.6
Dress 89.1
Female 88.8
Face 80.8
People 78.7
Person 77
Fashion 73.8
Robe 72.9
Woman 71.7
Door 71.6
Gown 69.5
Girl 64.6
Portrait 62.7
Photography 62.7
Photo 62.7
Evening Dress 59.9
Costume 58.1
Sailor Suit 56.3
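
The tag lists above are flattened "label confidence" pairs. A minimal sketch (hypothetical helper, not part of the dataset's tooling) of parsing such lines back into structured (label, score) tuples, handling multi-word labels like "Sailor Suit":

```python
# Hypothetical helper: parse flattened "Label confidence" tag lines,
# like the Amazon Rekognition list above, into (label, score) pairs.
def parse_tags(lines):
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank separator lines
        # Split on the LAST space so multi-word labels stay intact.
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

sample = ["Clothing 99.7", "Sailor Suit 56.3"]
print(parse_tags(sample))  # [('Clothing', 99.7), ('Sailor Suit', 56.3)]
```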

Imagga
created on 2022-01-22

nurse 37.8
kin 33.3
people 29
couple 27.8
man 26.9
groom 24.9
bride 24.9
male 23.5
person 22.7
happiness 21.1
wedding 20.2
dress 18.1
men 18
two 17.8
adult 17.5
happy 16.9
love 16.6
portrait 16.2
married 15.3
bouquet 14.3
marriage 14.2
family 14.2
women 13.4
clothing 13.4
ceremony 12.6
smiling 12.3
patient 12.2
human 12
celebration 12
elegance 11.7
together 11.4
fashion 11.3
old 11.1
traditional 10.8
wed 10.8
husband 10.6
businessman 10.6
wife 10.4
day 10.2
suit 9.9
bridal 9.7
black 9.6
loving 9.5
mother 9.4
smile 9.3
life 9.2
new 8.9
romantic 8.9
matrimony 8.9
model 8.5
youth 8.5
business 8.5
hand 8.3
room 8.3
health 8.3
tradition 8.3
cheerful 8.1
lady 8.1
religion 8.1
group 8.1
romance 8
home 8
lifestyle 7.9
newly 7.9
holiday 7.9
flowers 7.8
pretty 7.7
teacher 7.5
outdoors 7.5
future 7.4
pose 7.2
looking 7.2
art 7.2
face 7.1
to 7.1
summer 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.1
wedding dress 97
posing 96.8
bride 94.9
clothing 93.1
person 89.5
dress 88.1
woman 87.5
wedding 50.1

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 57.5%
Surprised 87.6%
Calm 6.7%
Happy 3%
Fear 1.1%
Confused 0.7%
Sad 0.4%
Angry 0.3%
Disgusted 0.2%

AWS Rekognition

Age 25-35
Gender Male, 60%
Surprised 88.6%
Calm 10.1%
Happy 0.5%
Confused 0.3%
Disgusted 0.2%
Sad 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 38-46
Gender Male, 91.9%
Calm 96.6%
Surprised 1.9%
Happy 1%
Sad 0.2%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 43-51
Gender Male, 97.5%
Calm 99.9%
Angry 0%
Happy 0%
Sad 0%
Fear 0%
Disgusted 0%
Surprised 0%
Confused 0%

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Calm 99.7%
Sad 0.2%
Surprised 0%
Happy 0%
Angry 0%
Fear 0%
Confused 0%
Disgusted 0%

AWS Rekognition

Age 41-49
Gender Male, 99.9%
Calm 75.8%
Happy 8.5%
Surprised 7.5%
Confused 2.6%
Sad 2.4%
Disgusted 1.7%
Fear 0.8%
Angry 0.7%

AWS Rekognition

Age 43-51
Gender Male, 100%
Calm 99.8%
Surprised 0.2%
Happy 0%
Disgusted 0%
Confused 0%
Angry 0%
Sad 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 54%
Calm 94.8%
Surprised 4.8%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%
Sad 0%
Fear 0%
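
Each face block above reports a full emotion distribution; the dominant emotion is simply the highest-scoring entry. A small sketch (assuming scores are collected into a dict, which is not how the page stores them):

```python
# Hypothetical sketch: given Rekognition-style emotion scores, as in the
# face blocks above, return the dominant emotion and its confidence.
def dominant_emotion(scores):
    name = max(scores, key=scores.get)  # key with the highest value
    return name, scores[name]

# Scores from the first AWS Rekognition face block above (abbreviated).
face = {"Surprised": 87.6, "Calm": 6.7, "Happy": 3.0, "Fear": 1.1}
print(dominant_emotion(face))  # ('Surprised', 87.6)
```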

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Captions

Microsoft

a group of people posing for a photo 92.8%
a group of people posing for the camera 92.7%
a group of people posing for a picture 92.6%

Text analysis

Amazon

1
hem 1
hem
.

Google

MAOONTARTIMWAMTRA
MAOONTARTIMWAMTRA