Human Generated Data

Title

Untitled (full-length portrait of two men standing behind two seated women with painted backdrop of river scene)

Date

c. 1856 - c. 1910

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dr. Andrew S. Dibner, P2003.131.12864

Machine Generated Data

Tags

Amazon
created on 2022-01-24

Human 99.6
Person 99.6
Person 98.6
Person 98.4
Person 98.2
Officer 95.7
Military 95.7
Military Uniform 95.7
People 88.8
Art 87.2
Painting 87.2
Apparel 86
Clothing 86
Hat 69.5
Soldier 68.1
Person 65.1
Sailor Suit 62.8
Captain 61.9
Family 57.8

Imagga
created on 2022-01-24

kin 100
old 23.7
man 22.8
comedian 22.8
dress 22.6
religion 21.5
statue 20.9
person 19.5
portrait 18.8
male 18.6
performer 18.6
people 18.4
art 18.2
sculpture 18.1
culture 17.9
couple 16.5
face 16.3
love 15
ancient 14.7
adult 14.2
traditional 14.1
entertainer 14.1
happy 13.8
romantic 13.4
religious 13.1
god 12.4
church 12
black 12
decoration 11.7
fashion 11.3
historical 11.3
antique 11.2
monument 11.2
style 11.1
mask 11
happiness 11
stone 11
vintage 10.7
together 10.5
detail 10.5
clothing 10.2
costume 9.9
history 9.8
family 9.8
catholic 9.7
holy 9.6
world 9.5
party 9.5
military uniform 9.3
two 9.3
head 9.2
uniform 9.1
lady 8.9
mother 8.8
celebration 8.8
look 8.8
saint 8.7
spiritual 8.6
faith 8.6
golden 8.6
architecture 8.6
husband 8.6
wife 8.5
travel 8.4
relationship 8.4
attractive 8.4
color 8.3
historic 8.3
father 8.2
sibling 8
masquerade 7.9
hidden 7.9
theater 7.8
carnival 7.8
model 7.8
mysterious 7.8
men 7.7
child 7.7
old fashioned 7.6
cover 7.4
makeup 7.3
disguise 7.3
romance 7.1
smile 7.1
posing 7.1
hat 7

Google
created on 2022-01-24

Microsoft
created on 2022-01-24

clothing 98.9
person 98.2
text 97.5
old 97.1
human face 94.8
player 94.6
smile 87.8
posing 87.5
hat 75.3
black 71.8
dress 63.7
white 61.4
vintage 61.1
vintage clothing 58.8
man 54.8
time 52.8

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 99.1%
Calm 68.3%
Surprised 11.3%
Fear 6.8%
Happy 6%
Confused 2.3%
Sad 2.1%
Angry 2%
Disgusted 1.1%

AWS Rekognition

Age 26-36
Gender Male, 100%
Fear 76.8%
Disgusted 7.6%
Surprised 4.8%
Sad 3.8%
Angry 2.7%
Calm 2.4%
Confused 1.1%
Happy 0.7%

AWS Rekognition

Age 25-35
Gender Female, 99.9%
Calm 99.2%
Confused 0.3%
Happy 0.2%
Angry 0.1%
Surprised 0.1%
Sad 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Male, 100%
Calm 99.2%
Confused 0.5%
Sad 0.2%
Angry 0%
Happy 0%
Surprised 0%
Fear 0%
Disgusted 0%

Microsoft Cognitive Services

Age 36
Gender Female

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 36
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Painting 87.2%
Hat 69.5%

Captions

Microsoft

a vintage photo of a group of people posing for the camera 94.3%
a vintage photo of a baseball player posing for a picture 86.6%
a vintage photo of a group of people posing for a picture 86.5%
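The "Name confidence" label lists above follow the output shape of AWS Rekognition's DetectLabels API. A minimal sketch of how such a list can be rendered from a DetectLabels-style response (the sample values below are illustrative, not the actual data for this photograph; a real call would use boto3's `detect_labels` with the image bytes):

```python
# Illustrative sample in the shape of a Rekognition DetectLabels response.
# A real call would look like:
#   import boto3
#   client = boto3.client("rekognition")
#   response = client.detect_labels(Image={"Bytes": image_bytes})
sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.6},
        {"Name": "Military Uniform", "Confidence": 95.7},
        {"Name": "Painting", "Confidence": 87.2},
    ]
}

def format_labels(response):
    """Render labels as 'Name confidence' lines, as in the record above."""
    return [
        f"{label['Name']} {round(label['Confidence'], 1)}"
        for label in response["Labels"]
    ]

for line in format_labels(sample_response):
    print(line)
```

Confidence values are percentages (0-100) reported by the service; the record above appears to round them to one decimal place.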