Human Generated Data

Title

Untitled (woman helping baby learn to walk)

Date

1962

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16474

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2022-02-11

Apparel 99.8
Clothing 99.8
Chair 99.4
Furniture 99.4
Dress 99.1
Human 97.3
Person 97.3
Female 85.7
Face 85.1
Suit 76.2
Coat 76.2
Overcoat 76.2
Portrait 74.1
Photography 74.1
Photo 74.1
Girl 71.3
Table 70.7
Dining Table 70.7
Indoors 70.5
Woman 66.8
Shorts 66
Room 64.6
Door 63.7
Kid 62.4
Child 62.4
Shoe 61.7
Footwear 61.7
Performer 60.8
Shirt 60.8
Floor 59.8
Baby 55.4

Imagga
created on 2022-02-11

person 34.9
adult 30.4
people 30.1
portrait 21.4
man 20.8
professional 20.6
smiling 17.4
clothing 17
human 16.5
lady 16.2
home 16
fashion 15.8
health 15.3
attractive 14.7
lifestyle 14.5
women 14.2
indoors 14.1
bow tie 13.8
happy 13.8
casual 13.6
male 13.5
smile 12.8
style 12.6
pretty 12.6
medical 12.4
necktie 12
patient 11.8
model 11.7
nurse 11.6
instrument 11.5
interior 11.5
medicine 11.4
newspaper 11.1
blond 10.9
dress 10.8
negative 10.7
laboratory 10.6
one 10.5
room 10.3
sitting 10.3
happiness 10.2
alone 10
face 9.9
life 9.9
standing 9.6
dishwasher 9.4
house 9.2
product 9.2
black 9.1
coat 9.1
holding 9.1
care 9.1
suit 8.9
working 8.8
sexy 8.8
film 8.8
looking 8.8
work 8.8
hospital 8.8
look 8.8
lab 8.7
garment 8.7
couple 8.7
day 8.6
cute 8.6
men 8.6
performer 8.6
walking 8.5
doctor 8.5
worker 8.4
cheerful 8.1
light 8
body 8
job 8
hair 7.9
equipment 7.9
clinic 7.9
business 7.9
vertical 7.9
scientific 7.8
elegant 7.7
apartment 7.7
two 7.6
research 7.6
occupation 7.3
indoor 7.3
creation 7.3
pose 7.2
handsome 7.1
family 7.1
posing 7.1
dancer 7.1
modern 7

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

text 95.3
outdoor 90.2
clothing 84.3
dress 73.5
furniture 72.7
person 64.7
chair 59.6
footwear 56.5
black and white 52.7
posing 38.1

Face analysis

Amazon

AWS Rekognition

Age 35-43
Gender Female, 99.9%
Happy 52.8%
Sad 32.4%
Calm 10.8%
Disgusted 1%
Angry 0.8%
Fear 0.8%
Confused 0.7%
Surprised 0.6%

Feature analysis

Amazon

Chair 99.4%
Person 97.3%
Shoe 61.7%

Text analysis

Amazon

3
MJI7--YT37A--

Google

MJI7--YT37A°2--AGON :::
MJI7--YT37A°2--AGON
:::