Human Generated Data

Title

Untitled (portrait of four children)

Date

1920

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2078

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Shorts 99.9
Clothing 99.9
Apparel 99.9
Person 99.7
Human 99.7
Person 98.8
Person 98.8
Person 97.3
Chair 91
Furniture 91
Face 90.4
Shoe 82.5
Footwear 82.5
People 81.9
Boy 80.3
Shoe 71.1
Kid 69.7
Child 69.7
Portrait 65.3
Photography 65.3
Photo 65.3
Shoe 64.7
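
A label/score list like the Amazon block above is the kind of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch with boto3, assuming default AWS credentials and a hypothetical local copy of the scan (the file name and thresholds are illustrative, not taken from this record):

import boto3

# Hypothetical local copy of the scanned photograph; not part of this record.
IMAGE_PATH = "portrait_of_four_children.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,        # cap on the number of labels returned
        MinConfidence=60.0,  # drop labels below this confidence
    )

# Print "Label Confidence" pairs in the same style as the listing above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")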

Clarifai
created on 2023-10-15

people 99.7
child 99.5
monochrome 97.9
group 97.1
family 95.7
son 94.6
sit 94.2
three 93.6
nostalgia 92.5
man 92.3
portrait 92
retro 89.7
boy 89
indoors 88.6
two 87.8
sibling 86.7
education 85.3
sepia 84.4
baby 84.3
nostalgic 84

Imagga
created on 2021-12-14

kin 37.4
people 23.4
man 21.6
statue 21.6
male 21.4
person 20.3
portrait 18.1
adult 16.3
art 15.4
sculpture 15.2
men 14.6
athlete 14.4
world 13
face 12.8
player 12.3
ballplayer 12.2
sport 11.9
sibling 11.6
black 11.4
dress 10.8
silhouette 10.8
family 10.7
fashion 10.5
group 10.5
mother 10.5
contestant 10.5
love 10.3
decoration 10.2
youth 10.2
city 10
antique 9.5
women 9.5
happiness 9.4
lifestyle 9.4
outdoor 9.2
travel 9.1
tourism 9.1
old 9.1
religion 9
team 9
history 8.9
happy 8.8
child 8.8
marble 8.7
couple 8.7
ancient 8.6
culture 8.5
head 8.4
one 8.2
celebration 8
clothing 8
bride 8
business 7.9
urban 7.9
figure 7.7
motion 7.7
sky 7.6
human 7.5
traditional 7.5
exercise 7.3
pose 7.2
body 7.2
father 7.2
romantic 7.1
summer 7.1
businessman 7.1
look 7

Google
created on 2021-12-14

Shorts 92.6
Black-and-white 83.9
Style 83.8
Smile 79.2
Hat 75.7
Child 73.8
Font 72
Vintage clothing 71.4
Art 71
Monochrome 69.8
Monochrome photography 69.6
Sitting 68.2
Fun 67.6
Room 65.5
Toddler 65.2
Stock photography 64.5
Photo caption 63.6
Happy 63.2
T-shirt 61.9
Team sport 60.7
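
The Google list above reads like Cloud Vision label detection output, with scores shown as percentages. A minimal sketch with the google-cloud-vision client, under the same assumptions (hypothetical file name, application-default credentials):

from google.cloud import vision

# Hypothetical local copy of the scanned photograph; not part of this record.
IMAGE_PATH = "portrait_of_four_children.jpg"

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are floats in [0, 1]; scale to match the percentage-style numbers above.
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")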

Microsoft
created on 2021-12-14

clothing 98.6
person 98.3
text 96.9
outdoor 95.7
smile 93.5
human face 90.7
footwear 89.6
man 75.5
boy 70.5
posing 46.8

Color Analysis

Face analysis

AWS Rekognition

Age 35-51
Gender Male, 90.3%
Calm 87.1%
Sad 4.7%
Surprised 2.5%
Confused 2.4%
Happy 1.6%
Angry 0.8%
Fear 0.4%
Disgusted 0.3%

AWS Rekognition

Age 26-42
Gender Female, 63.8%
Calm 66.4%
Fear 16.9%
Surprised 7%
Sad 5.4%
Angry 1.5%
Happy 1.5%
Confused 0.9%
Disgusted 0.4%

AWS Rekognition

Age 20-32
Gender Female, 85.8%
Fear 84%
Calm 4.1%
Happy 4.1%
Sad 3%
Surprised 2.5%
Confused 1%
Angry 0.7%
Disgusted 0.5%

AWS Rekognition

Age 0-4
Gender Male, 90.1%
Happy 94.7%
Calm 1.9%
Surprised 1.5%
Confused 0.7%
Sad 0.6%
Angry 0.4%
Fear 0.2%
Disgusted 0.1%
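
The four age/gender/emotion blocks above are per-face results of the kind AWS Rekognition's DetectFaces operation returns when all attributes are requested. A minimal sketch with boto3 (file name again hypothetical):

import boto3

IMAGE_PATH = "portrait_of_four_children.jpg"  # hypothetical local copy

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are not returned sorted; order by confidence as in the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")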

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
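
The Google Vision entries report per-face likelihood buckets rather than percentages. A minimal sketch of face detection with the google-cloud-vision client that prints the same six attributes (file name hypothetical):

from google.cloud import vision

IMAGE_PATH = "portrait_of_four_children.jpg"  # hypothetical local copy

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are buckets (VERY_UNLIKELY ... VERY_LIKELY), not percentages.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)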

Feature analysis

Amazon

Person 99.7%
Shoe 82.5%

Categories

Imagga

paintings art 81.7%
people portraits 18%

Text analysis

Amazon

22

Google

ra NAGON-YT3RA2 -MAMT2A3
ra
NAGON-YT3RA2
-MAMT2A3
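
The Google text block above shows a full detected string followed by its individual tokens, which matches how Cloud Vision structures its text detection response: the first annotation is the complete detected text and the remaining annotations are word-level detections. A minimal sketch (file name hypothetical):

from google.cloud import vision

IMAGE_PATH = "portrait_of_four_children.jpg"  # hypothetical local copy

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation holds the full detected string; the remaining
# annotations are the individual words, which is why the Google block
# above shows a full line followed by its fragments.
for annotation in response.text_annotations:
    print(annotation.description)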