Human Generated Data

Title

Untitled (family portrait on bench in garden)

Date

1930

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1910

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.7
Person 99.7
Person 99.6
Person 98.9
Person 98.6
Person 97.1
Person 97.1
Clothing 96.3
Apparel 96.3
Face 94.2
Outdoors 90.8
People 88.8
Grass 87.2
Plant 87.2
Tree 83
Female 82.7
Kid 80.8
Child 80.8
Yard 79.8
Nature 79.8
Vegetation 75.2
Smile 74
Photography 73.1
Photo 73.1
Portrait 72.6
Girl 69.4
Leisure Activities 65.3
Dress 64.5
Sports 63.9
Sport 63.9
Coat 63.4
Meal 63.1
Food 63.1
Shorts 59.1
Play 58.9
Boy 56.6
Golf 56.4
Woman 56.1
Hockey 55.8
Team 55.8
Team Sport 55.8
Pants 55.8
Overcoat 55.3
Suit 55.3
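
A minimal sketch of how label tags like those above can be produced with Amazon Rekognition's DetectLabels call via boto3. The file name, label cap, and confidence threshold are illustrative assumptions, not part of this record.

# Sketch only: produce "Label Confidence" pairs in the shape of the list above.
import boto3

rekognition = boto3.client("rekognition")

with open("family_portrait.jpg", "rb") as f:  # hypothetical local scan of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,      # assumption: roughly the number of tags shown above
    MinConfidence=55,  # assumption: the lowest confidence in the list is about 55
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")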

Clarifai
created on 2023-10-25

people 99.9
child 97.7
group 97.1
group together 97
man 96.8
adult 95.9
wear 95
boy 94.4
family 92.6
retro 92.4
nostalgic 91.5
snapshot 87.8
uniform 87.2
many 87.1
vintage 86.7
military 84.7
leader 83.1
nostalgia 82.9
several 82.8
war 81

Imagga
created on 2021-12-14

brass 58.8
wind instrument 46.2
musical instrument 33.6
man 29.6
people 26.8
silhouette 23.2
male 22.7
sunset 20.7
sport 19.6
athlete 18.9
cornet 18.9
person 18.7
player 17.8
adult 16.9
outdoor 16.8
sky 16.6
active 16.3
men 16.3
beach 16.1
kin 15.8
ballplayer 15
play 14.6
outdoors 14.3
trombone 14.2
summer 14.1
boy 13.9
lifestyle 13.7
child 13
fun 12.7
evening 12.1
black 12
contestant 11.7
leisure 11.6
vacation 11.5
couple 11.3
happy 11.3
dusk 10.5
sun 10.5
love 10.3
sand 9.8
portrait 9.7
together 9.6
day 9.4
winter 9.4
joy 9.2
hand 9.1
bugle 9
team 9
group 8.9
run 8.7
happiness 8.6
sea 8.6
youth 8.5
two 8.5
clouds 8.4
landscape 8.2
water 8
family 8
grass 7.9
outside 7.7
adventure 7.6
walking 7.6
field 7.5
friendship 7.5
ocean 7.5
holding 7.4
park 7.4
exercise 7.3
shadow 7.2
activity 7.2
women 7.1
businessman 7.1
travel 7

Microsoft
created on 2021-12-14

text 99.2
person 96.7
clothing 95.9
posing 85.5
outdoor 85.5
man 84.7
smile 74
child 59.3
human face 57.3
image 46.1
picture frame 7.2

Face analysis

AWS Rekognition

Age 6-16
Gender Male, 96.6%
Calm 81.3%
Sad 8.8%
Confused 3.3%
Angry 2.3%
Fear 2%
Happy 1.3%
Surprised 0.7%
Disgusted 0.4%

AWS Rekognition

Age 50-68
Gender Male, 99.1%
Calm 80.8%
Surprised 16%
Confused 0.9%
Angry 0.9%
Disgusted 0.4%
Sad 0.3%
Fear 0.3%
Happy 0.3%

AWS Rekognition

Age 41-59
Gender Male, 56.6%
Calm 48%
Happy 22.7%
Surprised 10.4%
Fear 8.8%
Sad 5.7%
Confused 2%
Disgusted 1.5%
Angry 0.9%

AWS Rekognition

Age 26-40
Gender Male, 96.1%
Calm 98%
Sad 0.7%
Surprised 0.4%
Fear 0.3%
Happy 0.2%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 0-3
Gender Male, 51.8%
Fear 47.9%
Calm 28.9%
Sad 15.5%
Confused 2.3%
Surprised 2.2%
Happy 1.9%
Disgusted 0.7%
Angry 0.5%

AWS Rekognition

Age 51-69
Gender Male, 52.5%
Calm 55%
Sad 31.7%
Happy 4.2%
Fear 2.6%
Angry 2.3%
Disgusted 2.2%
Surprised 1.1%
Confused 0.9%
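
A minimal sketch of how the per-face age range, gender, and emotion scores above can be obtained with Amazon Rekognition's DetectFaces call via boto3. The file name is an illustrative assumption.

# Sketch only: print one block per detected face, mirroring the entries above.
import boto3

rekognition = boto3.client("rekognition")

with open("family_portrait.jpg", "rb") as f:  # hypothetical local scan
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back with confidences; sort to mirror the listing order.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")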

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
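
A minimal sketch of how the face likelihood ratings above can be obtained with the Google Cloud Vision face detection API (google-cloud-vision). The file name is an illustrative assumption.

# Sketch only: print likelihood ratings (VERY_UNLIKELY ... VERY_LIKELY) per face.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("family_portrait.jpg", "rb") as f:  # hypothetical local scan
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)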

Feature analysis

Amazon

Person 99.7%