Human Generated Data

Title

Untitled (three little girls sitting outside)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21682

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.7
Human 99.7
Clothing 99.7
Apparel 99.7
Person 99.6
Person 99
Shorts 97.7
Dog 97.5
Mammal 97.5
Animal 97.5
Canine 97.5
Pet 97.5
Play 97
Shoe 96.4
Footwear 96.4
Shoe 96
Face 95.2
Shoe 94.5
Dress 94.2
Yard 94
Outdoors 94
Nature 94
Person 92.5
Shoe 92.3
Female 91.5
Grass 88.4
Plant 88.4
Kid 85.1
Child 85.1
Girl 80.8
Portrait 72.2
Photography 72.2
Photo 72.2
Boy 66
Helmet 65.5
Smile 65
People 63.2
Woman 57.9
Park 55.2
Lawn 55.2
Tree 55
Shoe 52.8

Clarifai
created on 2023-10-22

child 100
people 99.9
group together 99.3
group 99.1
sibling 98.9
son 98.9
boy 98.7
wear 98.1
three 97.1
family 96.7
two 96.6
offspring 96.1
recreation 95.4
four 95.3
several 94.9
enjoyment 94.7
facial expression 94.4
portrait 94.1
fun 93.2
sports equipment 92

Imagga
created on 2022-03-11

person 27.9
sport 25.4
athlete 24
man 23.5
people 22.9
adult 22.7
male 19.9
runner 18.9
portrait 16.8
beach 16
outdoors 14.4
outdoor 13.8
summer 13.5
sexy 12.8
sunset 12.6
contestant 12.6
silhouette 12.4
lifestyle 12.3
black 12
clothing 12
two 11.9
fun 10.5
child 10.4
outside 10.3
model 10.1
happy 10
active 10
attractive 9.8
sand 9.8
boy 9.6
men 9.4
sea 9.4
youth 9.4
leisure 9.1
fashion 9
body 8.8
together 8.8
love 8.7
play 8.6
covering 8.5
pretty 8.4
ocean 8.4
kin 8.3
healthy 8.2
exercise 8.2
style 8.2
pose 8.1
lady 8.1
posing 8
grass 7.9
autumn 7.9
women 7.9
cute 7.9
couple 7.8
smile 7.8
run 7.7
dark 7.5
girls 7.3
fitness 7.2
day 7.1
happiness 7
travel 7

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

grass 99.4
outdoor 99
person 97.3
text 89.1
clothing 83.6
black and white 83.2
human face 66.1
posing 40.2

Color Analysis

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 78.6%
Happy 27.6%
Calm 26.5%
Sad 16.4%
Surprised 9.6%
Angry 7.9%
Disgusted 4.8%
Fear 4.7%
Confused 2.4%

AWS Rekognition

Age 23-31
Gender Male, 72.7%
Calm 99.2%
Happy 0.5%
Disgusted 0.1%
Surprised 0.1%
Sad 0%
Angry 0%
Fear 0%
Confused 0%

AWS Rekognition

Age 16-22
Gender Female, 87.6%
Happy 93.4%
Surprised 3.3%
Calm 2%
Sad 0.4%
Angry 0.3%
Disgusted 0.3%
Confused 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Dog
Shoe
Helmet
Person 99.7%
Person 99.6%
Person 99%
Person 92.5%
Dog 97.5%
Shoe 96.4%
Shoe 96%
Shoe 94.5%
Shoe 92.3%
Shoe 52.8%
Helmet 65.5%

Text analysis

Amazon

SAL

Google

SAS
SAS