Human Generated Data

Title

Untitled (man teaching baby to walk)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17817

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.4
Apparel 99.4
Human 97.7
Person 97.7
Shoe 96.1
Footwear 96.1
Shorts 95.9
Shoe 90.5
Face 88.3
Furniture 81.1
Chair 81.1
Portrait 73.1
Photo 73.1
Photography 73.1
Lamp 71.9
Child 68.9
Kid 68.9
People 67.8
Female 65.3
Floor 63.8
Door 63.7
Table Lamp 61
Play 59.2
Couch 58.9
Baby 58.3
Smile 57.7
Boy 56.8
Flooring 55.6
Coat 55.3
Suit 55.3
Overcoat 55.3
Head 55.2

Imagga
created on 2022-02-26

shower cap 64.1
cap 50.3
brass 41.6
headdress 39.8
wind instrument 31
clothing 28.2
person 27.5
people 26.8
adult 25.9
man 22.8
musical instrument 21.2
human 20.2
male 17
attractive 16.8
covering 15.8
consumer goods 14.9
cornet 14.7
lifestyle 14.4
sexy 13.6
pretty 13.3
happy 13.2
professional 12.8
dress 12.6
body 12
hair 11.9
women 11.9
health 11.8
life 11.7
portrait 11.6
smiling 11.6
holding 11.5
medical 11.5
smile 11.4
fashion 11.3
looking 11.2
casual 11
occupation 11
work 11
model 10.9
black 10.8
one 10.4
men 10.3
bass 10
ball 9.8
worker 9.8
lady 9.7
indoors 9.7
trombone 9.4
happiness 9.4
modern 9.1
active 9
working 8.8
equipment 8.8
standing 8.7
light 8.7
face 8.5
nurse 8.4
glasses 8.3
sport 8.2
exercise 8.2
fitness 8.1
activity 8.1
hospital 8
medicine 7.9
cute 7.9
profession 7.7
healthy 7.6
doctor 7.5
baritone 7.5
hat 7.4
mask 7.4
fit 7.4
wig 7.3
sensual 7.3
gorgeous 7.2
pose 7.2
home 7.2
science 7.1
job 7.1
look 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

wall 96.5
clothing 90.9
person 89
hat 88.7
black and white 83.4
human face 78.3
text 76.6
fashion accessory 76.2

Face analysis

AWS Rekognition

Age 6-16
Gender Male, 99%
Calm 72.8%
Sad 9.5%
Fear 7.4%
Surprised 4.2%
Happy 2.6%
Disgusted 1.4%
Angry 1.1%
Confused 1%

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Sad 98.4%
Calm 0.8%
Disgusted 0.2%
Happy 0.2%
Fear 0.2%
Angry 0.1%
Confused 0.1%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.7%
Shoe 96.1%

Captions

Microsoft

a person wearing a hat 65.8%
a person wearing a costume 65.7%
a person wearing a costume 65.6%