Human Generated Data

Title

Untitled (baby looking into mirror)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17875

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.2
Human 99.2
Person 97.7
Clothing 97.7
Apparel 97.7
Shorts 93.8
Chair 92.9
Furniture 92.9
Face 91.9
Door 81.8
Pants 79.6
Indoors 79
Baby 76.7
Floor 75.8
Bedroom 75.7
Room 75.7
Kid 74.8
Child 74.8
Bed 70
Flooring 68.9
Portrait 66.9
Photography 66.9
Photo 66.9
Standing 59.7
Boy 59.4
Play 58.5
Finger 56.5
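
The label and confidence pairs above are the sort of output Amazon Rekognition returns from its DetectLabels operation. The sketch below shows one way such tags might be retrieved with boto3; the S3 bucket, object key, and region are placeholders, not the museum's actual configuration.

```python
# Minimal sketch of a Rekognition DetectLabels call (placeholder bucket, key, and region).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.17875.jpg"}},
    MinConfidence=50,
)

# Print "Label Confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```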

Clarifai
created on 2023-10-29

people 99.7
family 99
two 98.9
child 98.6
man 98.6
love 95.1
indoors 95.1
son 94.8
monochrome 93.4
adult 92.3
woman 91.4
room 90.9
couple 89.5
offspring 88.7
three 87.7
fun 84.3
togetherness 83.1
baby 82.5
actor 82.2
window 81.7
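
Concept tags like the Clarifai list above can be requested through Clarifai's v2 predict REST endpoint. The sketch below is a hedged example: the API key, model ID, and image URL are placeholders, and account-specific details such as user/app scoping may differ.

```python
# Hedged sketch of a Clarifai v2 predict request (placeholder key, model ID, and image URL).
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed general-model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concepts carry a 0-1 "value"; scale to percentages to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```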

Imagga
created on 2022-02-26

people 27.9
man 26.2
person 25.8
male 24.2
adult 23.6
happy 20
device 19.6
happiness 19.6
portrait 19.4
dress 17.2
home 16.7
black 16.2
love 15
indoors 14.9
pretty 14.7
men 14.6
smiling 14.5
lifestyle 14.4
smile 14.2
couple 13.9
house 13.4
attractive 13.3
salon 13.1
women 12.6
bride 12.6
brass 12.5
standing 12.2
sexy 12
human 12
wedding 12
wind instrument 11.8
family 11.6
cute 11.5
lady 11.4
room 11.2
casual 11
two 11
face 10.6
interior 10.6
cheerful 10.6
look 10.5
looking 10.4
groom 10.2
professional 10
musical instrument 9.6
married 9.6
model 9.3
cornet 9.2
holding 9.1
gag 8.9
life 8.8
work 8.6
holiday 8.6
husband 8.6
negative 8.6
wife 8.5
modern 8.4
fashion 8.3
inside 8.3
occupation 8.2
style 8.2
window 8
together 7.9
gown 7.8
sitting 7.7
apartment 7.7
child 7.6
elegance 7.6
togetherness 7.5
bouquet 7.5
restraint 7.5
one 7.5
light 7.3
alone 7.3
business 7.3
confident 7.3
suit 7.2
hair 7.1
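
The Imagga tags above resemble the output of Imagga's v2 tagging endpoint. A minimal sketch follows, assuming HTTP basic auth with a placeholder key, secret, and image URL.

```python
# Hedged sketch of an Imagga /v2/tags request (placeholder credentials and image URL).
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
    timeout=30,
)
resp.raise_for_status()

# Each entry pairs a confidence score with a language-keyed tag name.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```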

Microsoft
created on 2022-02-26

text 99
wall 97
person 95.5
man 93.7
window 88.9
clothing 81.3
image 30.1
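
The Microsoft tags above are the kind of result returned by Azure's Computer Vision Analyze Image API. The sketch below assumes the v3.2 REST endpoint with a placeholder resource endpoint, subscription key, and image URL.

```python
# Hedged sketch of an Azure Computer Vision v3.2 Analyze call (placeholder endpoint, key, URL).
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
SUBSCRIPTION_KEY = "YOUR_AZURE_KEY"                               # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": "https://example.org/photo.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Confidence is reported on a 0-1 scale; scale to match the list above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```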

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 99.8%
Happy 92.2%
Surprised 4.3%
Angry 1.6%
Sad 0.6%
Disgusted 0.5%
Confused 0.4%
Calm 0.3%
Fear 0.2%

AWS Rekognition

Age 20-28
Gender Female, 90.2%
Calm 90.9%
Happy 4.7%
Sad 1.3%
Angry 1%
Fear 0.6%
Confused 0.6%
Disgusted 0.5%
Surprised 0.2%
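
Per-face age ranges, gender estimates, and emotion scores like the two blocks above are what Rekognition's DetectFaces operation returns when all attributes are requested. A minimal sketch, with placeholder S3 details:

```python
# Minimal sketch of a Rekognition DetectFaces call with full attributes (placeholder bucket/key).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.17875.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort descending to match the presentation above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```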

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
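
The likelihood ratings above match the per-face annotations produced by the Google Cloud Vision face detection feature. The sketch below assumes the google-cloud-vision 2.x Python client, application default credentials, and a placeholder image URI.

```python
# Hedged sketch of Google Cloud Vision face detection (assumes google-cloud-vision 2.x).
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="gs://example-bucket/photo.jpg"))

response = client.face_detection(image=image)

# Likelihoods are enum values such as VERY_UNLIKELY, POSSIBLE, VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```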

Feature analysis

Amazon

Person
Person 99.2%
Person 97.7%

Categories

Text analysis

Amazon

سبو
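
The detected string above is typical of Rekognition's DetectText output, which returns both full lines and individual words with confidence scores. A minimal sketch, with placeholder S3 details:

```python
# Minimal sketch of a Rekognition DetectText call (placeholder bucket/key).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.17875.jpg"}}
)

# Keep LINE-level detections; WORD-level entries repeat the same text piecewise.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```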