Human Generated Data

Title

Untitled (baby on grandfather's lap)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17143

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.8
Human 98.8
Person 92.5
Table Lamp 89.8
Apparel 87.3
Clothing 87.3
Lamp 70.1
People 68.8
Face 67.1
Indoors 67.0
Living Room 67.0
Room 67.0
Female 63.8
Furniture 63.2
Portrait 62.2
Photo 62.2
Photography 62.2
Couch 61.4
Plant 60.4
Child 60.2
Kid 60.2
Baby 58.5
Girl 56.7
Bowl 55.8

Imagga
created on 2022-02-26

person 33.5
musical instrument 31.6
wind instrument 30.8
man 28.2
adult 26.6
people 24.5
male 19.3
brass 17.7
home 17.5
lifestyle 15.9
sitting 15.5
happy 15
attractive 14.7
accordion 14.5
sexy 14.5
fashion 14.3
senior 14.1
device 13.9
portrait 13.6
house 13.4
holding 13.2
couple 13.1
lady 13
dress 12.6
pretty 12.6
interior 12.4
indoors 12.3
smile 12.1
keyboard instrument 11.6
studio 11.4
musician 11.4
one 11.2
hair 11.1
room 11.1
women 11.1
blond 11
hat 11
model 10.9
smiling 10.8
professional 10.8
black 10.8
teacher 10.6
happiness 10.2
face 9.9
newspaper 9.8
generator 9.7
computer 9.6
elderly 9.6
love 9.5
youth 9.4
casual 9.3
music 9.2
laptop 9.1
concertina 9.1
old 9.1
suit 9
singer 9
technology 8.9
style 8.9
together 8.8
vertical 8.7
boy 8.7
play 8.6
men 8.6
reading 8.6
expression 8.5
business 8.5
modern 8.4
free-reed instrument 8.3
human 8.2
alone 8.2
performer 8
father 7.8
retirement 7.7
notebook 7.6
fun 7.5
mature 7.4
20s 7.3
cheerful 7.3
looking 7.2
chair 7.2
handsome 7.1
family 7.1
job 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 93.8
text 85.5
toddler 70.4
baby 67.3
black and white 66.8
clothing 62.7

Face analysis

Amazon

Google

AWS Rekognition

Age 54-62
Gender Male, 99%
Calm 99.7%
Happy 0.1%
Sad 0.1%
Surprised 0.1%
Disgusted 0%
Confused 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 4-12
Gender Female, 99.1%
Calm 88.6%
Sad 10.2%
Surprised 0.3%
Angry 0.3%
Disgusted 0.2%
Fear 0.2%
Confused 0.2%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Lamp 70.1%

Captions

Microsoft

a person standing in front of a computer 56.5%
a man and a woman standing in front of a computer 38.3%
a person standing in front of a computer 38.2%

Text analysis

Amazon

KODAKSLA

Google

YT3RA2-XAO
YT3RA2-XAO