Human Generated Data

Title

Untitled (woman passing out a gift to family member by Christmas tree)

Date

c. 1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3529

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 99.6
Person 99.6
Person 99.6
Person 99.6
Plant 99.3
Tree 99.3
Person 98
Person 97.2
Clothing 94
Apparel 94
Ornament 88
Person 87.3
Furniture 84.3
Living Room 61.4
Indoors 61.4
Room 61.4
Footwear 58.4
Shoe 58.4
Floor 58.3
Couch 57.9
Christmas Tree 57.9
Kid 57.6
Child 57.6
Sitting 56.1
Dress 55.6

Clarifai
created on 2023-10-26

people 99.9
group together 99.5
group 99
child 98
adult 97.5
recreation 97.2
many 97.2
man 97
woman 96.4
several 94
five 88.8
leader 88.7
administration 88
family 87.8
wear 86.3
enjoyment 85.9
sibling 85.1
offspring 85
musician 84.9
three 84.8

Imagga
created on 2022-01-22

man 29
people 24.5
male 22.3
child 21.9
person 20.5
kin 20
world 15.2
boy 14.8
life 14.4
family 14.2
couple 13.9
black 13.8
sport 13.6
businessman 13.2
portrait 12.9
adult 12.6
men 12
happy 11.9
dad 11.2
happiness 11
holding 10.7
outdoor 10.7
bride 10.5
business 10.3
summer 10.3
love 10.3
youth 10.2
room 10.1
silhouette 9.9
father 9.9
grunge 9.4
parent 9.3
wedding 9.2
art 9.1
groom 9.1
park 9.1
dress 9
retro 9
outdoors 9
group 8.9
lifestyle 8.7
play 8.6
wall 8.6
face 8.5
active 8.4
dark 8.3
leisure 8.3
vintage 8.3
human 8.2
new 8.1
mother 8.1
team 8.1
activity 8.1
kid 8
player 7.8
run 7.7
sky 7.6
athlete 7.5
fun 7.5
hold 7.4
success 7.2
dirty 7.2
ball 7.2
looking 7.2
body 7.2
work 7.2
romantic 7.1
women 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 97.8
clothing 94.8
person 93.4
christmas tree 91.3
black and white 80.8
man 74.9
woman 58.1

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 100%
Happy 54.4%
Calm 26.5%
Surprised 5.1%
Sad 3.8%
Disgusted 3%
Angry 2.5%
Confused 2.4%
Fear 2.1%

AWS Rekognition

Age 52-60
Gender Male, 99.9%
Calm 88.5%
Happy 9.1%
Sad 0.7%
Angry 0.6%
Surprised 0.4%
Disgusted 0.3%
Confused 0.2%
Fear 0.2%

AWS Rekognition

Age 39-47
Gender Male, 99.6%
Sad 62.5%
Confused 13%
Happy 9.3%
Calm 6.9%
Angry 2.8%
Surprised 2.6%
Disgusted 2.3%
Fear 0.7%

AWS Rekognition

Age 7-17
Gender Male, 86.5%
Calm 79%
Sad 18.5%
Disgusted 0.6%
Confused 0.6%
Surprised 0.4%
Angry 0.3%
Fear 0.3%
Happy 0.2%

AWS Rekognition

Age 49-57
Gender Male, 99.5%
Calm 97.8%
Surprised 0.6%
Sad 0.4%
Happy 0.4%
Confused 0.2%
Angry 0.2%
Fear 0.2%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 58.4%

Text analysis

Amazon

KODVK-SEELA

Google

YT3RA2-XAGON
YT3RA2-XAGON