Human Generated Data

Title

Untitled (Mayer family portrait in front of fireplace at Christmas time)

Date

December 29, 1954

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18132

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.4
Person 99.4
Person 99.1
Person 97.9
Apparel 95.3
Shoe 95.3
Footwear 95.3
Clothing 95.3
Person 89.7
Sitting 83.4
People 73.9
Indoors 70.2
Living Room 70.2
Room 70.2
Shoe 67.6
Bedroom 59.4
Robe 56.7
Evening Dress 56.7
Gown 56.7
Fashion 56.7
Couch 56.5
Furniture 56.5
Shelf 55.3
Floor 55.2
Shoe 52.8

Clarifai
created on 2019-11-16

people 100
child 99.3
group 99.3
adult 98
two 97.4
woman 96.1
administration 94.7
furniture 94.3
offspring 93.9
family 93.4
room 92.9
actress 91.3
chair 91.3
man 91.3
music 91.1
boy 91
leader 90.9
sit 90.2
group together 89.4
wear 88.9

Imagga
created on 2019-11-16

kin 54.8
man 24.8
home 23.1
people 22.3
family 21.3
male 20.1
child 19.2
person 19
barbershop 18.6
room 18.1
couple 17.4
mother 17.3
old 16.7
shop 15.8
happiness 15.7
nurse 15.3
happy 15
adult 14.8
smiling 13
lifestyle 13
sitting 12
women 11.9
mercantile establishment 11.8
portrait 11.6
interior 11.5
parent 11.3
senior 11.2
house 10.9
vintage 10.7
indoors 10.5
father 10.2
couch 9.7
black 9.6
youth 9.4
chair 9.3
businessman 8.8
two 8.5
fashion 8.3
daughter 8.3
window 8.2
children 8.2
retro 8.2
new 8.1
worker 8.1
kid 8
love 7.9
together 7.9
place of business 7.9
smile 7.8
boy 7.8
antique 7.8
ancient 7.8
men 7.7
wall 7.7
grandfather 7.6
joy 7.5
human 7.5
clothes 7.5
future 7.4
office 7.4
holding 7.4
indoor 7.3
group 7.2
aged 7.2
dress 7.2
face 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 98.7
person 94.2
indoor 91.9
furniture 88.2
smile 86.3
human face 84.4
toddler 80.5
baby 79.1
text 76.3
chair 75.7
woman 69.2
footwear 69
child 62.8
family 58.1
old 52.3
room 40.8

Face analysis

AWS Rekognition

Age 4-14
Gender Female, 50.8%
Angry 45%
Sad 45%
Happy 54.3%
Disgusted 45%
Calm 45.7%
Confused 45%
Surprised 45%
Fear 45%

AWS Rekognition

Age 29-45
Gender Female, 54.9%
Fear 45.1%
Angry 45.3%
Sad 45%
Happy 49.5%
Calm 45.1%
Confused 49.2%
Surprised 45.1%
Disgusted 45.5%

AWS Rekognition

Age 5-15
Gender Male, 54.9%
Calm 54.2%
Disgusted 45%
Surprised 45%
Fear 45%
Happy 45.5%
Angry 45.1%
Confused 45.1%
Sad 45.1%

AWS Rekognition

Age 39-57
Gender Male, 54.5%
Sad 45.1%
Surprised 45.1%
Angry 45.3%
Confused 45.2%
Fear 45%
Happy 45.2%
Calm 53.9%
Disgusted 45.2%

Microsoft Cognitive Services

Age 48
Gender Male

Microsoft Cognitive Services

Age 44
Gender Female

Microsoft Cognitive Services

Age 9
Gender Male

Microsoft Cognitive Services

Age 7
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Shoe 95.3%

Categories

Imagga

people portraits 95.1%
paintings art 4.4%

Text analysis

Amazon

E
ARY
O

Google

ARY
ARY