Human Generated Data

Title

Untitled (family portrait)

Date

c. 1950

People

Artist: John Howell, American active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.784

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Family 99.5
People 99.5
Human 99.5
Person 99.3
Tie 99.2
Accessories 99.2
Accessory 99.2
Person 99.2
Person 99.1
Person 98.6
Person 98.6
Person 98.4
Animal 93.9
Pet 93.9
Dog 93.9
Mammal 93.9
Canine 93.9
Clothing 80.7
Apparel 80.7
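
The Amazon tags above are the kind of output produced by AWS Rekognition's DetectLabels API, where each label carries a confidence score. A minimal sketch of such a call with boto3 follows; the image file name is a hypothetical placeholder, not the actual collection asset.

import boto3

# Minimal sketch: request labels for a local image file and print each
# label name with its confidence score, as in the tag list above.
# "family_portrait.jpg" is a hypothetical placeholder file name.
client = boto3.client("rekognition")

with open("family_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap on the number of labels returned
    MinConfidence=80.0,  # only return labels scored at 80% or higher
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")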

Clarifai
created on 2023-10-26

people 99.9
group 99.7
child 99.1
group together 98.8
family 97.5
son 96.2
adult 96.1
portrait 95.7
offspring 95.2
woman 95.2
man 94.7
four 94.6
monochrome 94
leader 93.1
sibling 91.1
recreation 91
three 91
several 88.4
war 85.3
five 84.3

Imagga
created on 2022-01-22

kin 100
people 30.1
outdoors 28.3
park 28
family 26.7
child 25
male 24.8
happiness 24.3
man 24.2
portrait 23.9
happy 23.2
autumn 22.8
adult 22.6
love 21.3
outdoor 20.6
lifestyle 19.5
fun 19.4
joy 17.5
parent 17.1
together 16.6
couple 16.5
person 16
kid 15.9
forest 15.7
boy 15.6
summer 15.4
two 15.2
smiling 15.2
smile 14.2
day 14.1
leisure 14.1
men 13.7
daughter 13.4
walking 13.3
holding 13.2
mother 13.1
outside 12.8
face 12.8
girls 12.8
women 12.6
vacation 12.3
cheerful 12.2
casual 11.9
children 11.8
season 11.7
play 11.2
hair 11.1
fall 10.9
active 10.8
cute 10.8
childhood 10.7
father 10.5
relationship 10.3
dad 10.2
clothing 10
attractive 9.8
old 9.7
black 9.6
walk 9.5
grass 9.5
sport 9.5
togetherness 9.4
kids 9.4
beach 9.3
romance 8.9
leaves 8.7
life 8.7
youth 8.5
enjoy 8.5
tree 8.5
pretty 8.4
garden 8.4
field 8.4
rest 8.3
20s 8.2
sibling 8.2
playing 8.2
sunset 8.1
group 8.1
leaf 7.8
sepia 7.8
son 7.6
hand 7.6
relax 7.6
healthy 7.6
head 7.6
serene 7.5
natural 7.4
lake 7.3
countryside 7.3
recreation 7.2
holiday 7.2
trees 7.1
sky 7

Google
created on 2022-01-22

Footwear 98.1
Vertebrate 91.9
Leg 90.9
Standing 86.4
Dress 85.6
Plant 85.3
Tree 81.8
Grass 78.3
People in nature 77.3
Vintage clothing 75.9
Smile 75.1
Suit 74.6
Family reunion 74.1
Toddler 72.8
Shorts 71.6
Monochrome 71.5
Event 71.3
Lap 71.3
Classic 71.1
Sitting 70.7

Microsoft
created on 2022-01-22

outdoor 99.9
grass 99.9
person 99.8
tree 99.7
clothing 99.5
baby 97.6
smile 97.4
human face 97
toddler 96.3
boy 82.9
woman 81.3
footwear 79.4
man 76.3
group 67.7
old 67.1
child 58.9
posing 53.8

Face analysis

AWS Rekognition

Age 0-4
Gender Female, 98.7%
Happy 52.8%
Calm 23%
Surprised 9.2%
Fear 5.8%
Confused 4%
Sad 2.6%
Disgusted 1.6%
Angry 0.9%

AWS Rekognition

Age 41-49
Gender Female, 100%
Calm 98.6%
Confused 0.3%
Surprised 0.3%
Angry 0.2%
Happy 0.2%
Sad 0.2%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 4-10
Gender Female, 100%
Happy 99.7%
Calm 0.1%
Confused 0%
Disgusted 0%
Surprised 0%
Sad 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 6-14
Gender Male, 90.7%
Calm 73.6%
Sad 22.1%
Angry 3.5%
Confused 0.2%
Happy 0.2%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 45-51
Gender Male, 99.8%
Calm 90.6%
Happy 7.5%
Confused 0.6%
Sad 0.4%
Angry 0.4%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 6-12
Gender Male, 93.3%
Happy 99.9%
Confused 0%
Surprised 0%
Calm 0%
Angry 0%
Disgusted 0%
Sad 0%
Fear 0%
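
The age range, gender, and emotion percentages in the AWS Rekognition entries above correspond to the face attributes returned by the DetectFaces API when all attributes are requested. A minimal sketch with boto3 follows; the file name is a hypothetical placeholder.

import boto3

# Minimal sketch: detect faces in a local image and print the estimated
# age range, gender, and per-emotion confidences, mirroring the entries
# above. "family_portrait.jpg" is a hypothetical placeholder file name.
client = boto3.client("rekognition")

with open("family_portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")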

Microsoft Cognitive Services

Age 48
Gender Male

Microsoft Cognitive Services

Age 7
Gender Female

Microsoft Cognitive Services

Age 7
Gender Female

Microsoft Cognitive Services

Age 50
Gender Female

Microsoft Cognitive Services

Age 9
Gender Male

Microsoft Cognitive Services

Age 2
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
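
The Google Vision entries above report likelihood ratings (Very unlikely through Very likely) rather than numeric scores; these map to the likelihood enum returned for each detected face by the Cloud Vision face detection API. A minimal sketch with the google-cloud-vision client follows; the file name is a hypothetical placeholder.

from google.cloud import vision

# Minimal sketch: run face detection on a local image and print the
# likelihood rating for each attribute reported above.
# "family_portrait.jpg" is a hypothetical placeholder file name.
client = vision.ImageAnnotatorClient()

with open("family_portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        # Likelihood values range from VERY_UNLIKELY to VERY_LIKELY.
        print(name, vision.Likelihood(value).name)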

Feature analysis

Amazon

Person 99.3%
Tie 99.2%
Dog 93.9%

Categories

Imagga

people portraits 100%