Human Generated Data

Title

Untitled (woman posed in front of George Washington portrait)

Date

c. 1910

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.686

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Home Decor 99.5
Person 98.9
Human 98.9
Furniture 83.1
Clothing 78.9
Apparel 78.9
Sleeve 76.9
Linen 68.3
Person 62.5
Architecture 62.5
Building 62.5
Flare 58.7
Light 58.7
Shelf 58.3
Cabinet 57.2
Cupboard 57.1
Closet 57.1
LCD Screen 56.7
Electronics 56.7
Screen 56.7
Monitor 56.7
Display 56.7
Handrail 55.1
Banister 55.1
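Label lists like the Amazon output above pair each tag with a confidence score, so a common first step is filtering on a threshold before using the tags. A minimal sketch, using scores copied from the list above (the 75.0 cutoff is an illustrative choice, not part of the data):

```python
# Filter machine-generated labels by confidence threshold.
# Label/score pairs are copied from the Amazon tag list above;
# the 75.0 cutoff is an assumed, illustrative value.
labels = [
    ("Home Decor", 99.5), ("Person", 98.9), ("Human", 98.9),
    ("Furniture", 83.1), ("Clothing", 78.9), ("Apparel", 78.9),
    ("Sleeve", 76.9), ("Linen", 68.3), ("Architecture", 62.5),
]

def confident_labels(pairs, threshold=75.0):
    """Return label names whose confidence meets the threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))
# ['Home Decor', 'Person', 'Human', 'Furniture', 'Clothing', 'Apparel', 'Sleeve']
```

Lowering the threshold admits weaker guesses such as "Linen" (68.3) and "Architecture" (62.5), which is the usual precision/recall trade-off with these tags.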

Imagga
created on 2022-01-08

trench coat 54.6
raincoat 43.6
coat 40.8
people 26.2
person 25.6
portrait 23.3
garment 23.1
man 22.2
happy 21.9
male 20.2
attractive 19.6
adult 18.9
lady 18.7
standing 18.2
dress 18.1
fashion 17.3
looking 16.8
smile 15.7
couple 15.7
pretty 15.4
home 15.2
happiness 14.9
clothing 14.6
statue 14.2
one 14.2
holding 14
old 13.9
brunette 13.1
love 12.6
smiling 12.3
suit 12
casual 11.9
lifestyle 11.6
model 10.9
monk 10.7
black 10.6
boy 10.4
men 10.3
child 10.1
alone 10
indoor 10
house 10
face 9.9
groom 9.9
family 9.8
cheerful 9.7
jacket 9.7
bathrobe 9.5
cute 9.3
two 9.3
camera 9.2
city 9.1
business 9.1
human 9
sculpture 8.9
sexy 8.8
robe 8.8
bride 8.6
youth 8.5
color 8.3
wedding 8.3
room 8.3
student 8.1
romance 8
interior 8
expression 7.7
married 7.7
bouquet 7.5
light 7.3
aged 7.2
hair 7.1
professional 7.1
women 7.1
lovely 7.1
door 7.1
indoors 7
modern 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 99.6
indoor 97.1
clothing 92.7
standing 90.9
statue 77.7
dress 77.2
woman 73
portrait 70
human face 68.4

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 99.8%
Calm 79.4%
Happy 16.5%
Disgusted 2.2%
Confused 1%
Angry 0.4%
Sad 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 38-46
Gender Male, 97.7%
Calm 94.5%
Angry 1.4%
Sad 1.1%
Confused 0.8%
Disgusted 0.6%
Surprised 0.6%
Happy 0.5%
Fear 0.4%
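The per-face emotion percentages above are often reduced to a single dominant emotion by taking the highest-confidence entry. A minimal sketch, using the scores from the first face block above:

```python
# Reduce Rekognition-style emotion scores to the dominant emotion.
# Values are copied from the first AWS Rekognition face block above.
emotions = {
    "Calm": 79.4, "Happy": 16.5, "Disgusted": 2.2, "Confused": 1.0,
    "Angry": 0.4, "Sad": 0.2, "Surprised": 0.2, "Fear": 0.1,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(emotions))  # ('Calm', 79.4)
```

Note that the two detected faces disagree on gender and age range, a reminder that these scores describe model confidence, not ground truth.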

Microsoft Cognitive Services

Age 43
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a person standing in front of a window 83%
a person standing in front of a door 82.9%
a person standing in a kitchen 82.8%