Human Generated Data

Title

Untitled (children playing blind man's bluff)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17611

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.8
Apparel 99.8
Person 99.3
Human 99.3
Person 99.3
Person 98.9
Shorts 98.9
Person 98.9
Person 97.4
Dress 94.6
Female 93.6
Skirt 80.1
Woman 78.5
People 78
Plant 77.1
Kid 75.2
Child 75.2
Person 72.2
Girl 71.9
Face 70.7
Tree 69
Outdoors 67.9
Grass 66.4
Photography 60.7
Photo 60.7
Play 56

Clarifai
created on 2023-10-29

people 99.9
child 99.7
two 97.8
monochrome 94.3
family 93.8
group together 92.7
adult 92.4
boy 92.1
fun 91.7
son 88.5
three 88.1
woman 87
wear 86.8
man 86.6
enjoyment 85.5
recreation 85.4
one 85.2
portrait 84.7
group 84.6
beach 83.7

Imagga
created on 2022-02-26

child 27.9
groom 24.6
people 16.7
adult 16.2
person 15.8
man 15.5
summer 15.4
dress 14.5
portrait 14.2
male 14.2
love 14.2
face 14.2
happy 13.8
outdoor 13.8
hair 13.5
outdoors 12.9
beach 12.8
happiness 12.5
lifestyle 12.3
sand 12
girls 11.8
bride 11.6
holiday 11.5
water 11.3
couple 11.3
leisure 10.8
vacation 10.6
pretty 10.5
attractive 10.5
sexy 10.4
wedding 10.1
model 10.1
human 9.7
kin 9.7
wall 9.6
eyes 9.5
sea 9.4
head 9.2
old 9.1
fun 9
body 8.8
women 8.7
sad 8.7
smiling 8.7
cute 8.6
two 8.5
relaxation 8.4
art 8.2
lady 8.1
wet 8
clothing 7.8
sitting 7.7
men 7.7
lonely 7.7
world 7.6
skin 7.6
joy 7.5
sun 7.2
sunlight 7.1
travel 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

building 99.1
outdoor 98.7
clothing 94.3
person 92.7
footwear 79.7
black and white 73.7
text 69.9
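
The tag lists above pair each label with a confidence score from 0 to 100 reported by the respective vision service. As a minimal sketch of how such labels are typically produced, the following assumes the Amazon tags come from the Rekognition DetectLabels API (called via boto3) and uses a placeholder filename, photo.jpg:

    import boto3

    # Rekognition client; credentials and region come from the standard AWS configuration.
    client = boto3.client("rekognition")

    # Send the image bytes and request up to 30 labels above a 50% confidence floor.
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()}, MaxLabels=30, MinConfidence=50
        )

    # Print "label confidence" pairs in the same shape as the tag list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")

The other providers (Clarifai, Imagga, Google Vision, Microsoft Azure Computer Vision) expose comparable tagging endpoints; their confidence scores are not directly comparable across services.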

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 56%
Calm 82%
Sad 11.9%
Happy 2.3%
Confused 1.9%
Surprised 0.9%
Disgusted 0.4%
Angry 0.4%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
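
A minimal sketch of how the face estimates above are typically obtained, assuming the age range, gender, and emotion scores come from Rekognition's DetectFaces call with the full attribute set requested; photo.jpg is again a placeholder filename:

    import boto3

    client = boto3.client("rekognition")

    # Request all attributes so age, gender, and emotions are included in the response.
    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
        # Emotions are returned unsorted; order by confidence to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

Google Vision reports face attributes as coarse likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why its rows above read "Very unlikely" and "Possible".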

Feature analysis

Amazon

Person
Person 99.3%
Person 99.3%
Person 98.9%
Person 98.9%
Person 97.4%
Person 72.2%
