Human Generated Data

Title

Untitled (woman holding child's hand and child steps out door)

Date

c. 1946

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14430

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Apparel 99.8
Clothing 99.8
Human 99.3
Person 99.3
Shoe 99
Footwear 99
Person 97.6
Female 96.1
Dress 94.3
Shoe 88.7
Shorts 87.6
Path 86.4
Woman 86
Skirt 82.4
Shoe 77.8
Girl 73.4
Floor 70.3
Face 64.1
Photo 64.1
Portrait 64.1
Photography 64.1
Child 62.5
Kid 62.5
Sidewalk 61
Pavement 61
Flooring 58.5
Walkway 57.5
Teen 56.4
Overcoat 55.4
Coat 51.5

Imagga
created on 2022-01-29

person 28.4
people 27.3
man 22.8
cleaner 20.3
adult 19.7
fashion 18.8
male 17.8
happy 17.5
city 17.5
portrait 16.8
urban 15.7
house 14.2
pretty 14
attractive 14
lifestyle 13.7
window 13.6
home 13.6
human 13.5
women 13.4
model 13.2
life 12.9
business 12.1
men 12
street 12
one 11.9
interior 11.5
smile 11.4
couple 11.3
standing 11.3
modern 11.2
black 11
happiness 11
elegance 10.9
smiling 10.9
cute 10.8
room 10.4
hair 10.3
alone 10
door 10
dress 9.9
lady 9.7
style 9.6
sexy 9.6
looking 9.6
clothing 9.5
casual 9.3
face 9.2
child 9.2
building 9
cheerful 8.9
posing 8.9
shop 8.9
elegant 8.6
walking 8.5
youth 8.5
device 8.5
store 8.5
buy 8.4
leisure 8.3
shopping 8.3
makeup 8.2
indoor 8.2
architecture 7.9
indoors 7.9
bags 7.8
summer 7.7
two 7.6
walk 7.6
hand 7.6
legs 7.5
clothes 7.5
fun 7.5
holding 7.4
care 7.4
inside 7.4
light 7.4
sensuality 7.3
bag 7.2
leg 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

clothing 96.7
footwear 94.3
text 93.9
person 93.7
outdoor 91.4
street 89.1
dress 86.9
standing 79.8
girl 73.6
black and white 66
skirt 65.2
woman 56.4

Face analysis

AWS Rekognition

Age 25-35
Gender Female, 99.6%
Calm 99.6%
Surprised 0.3%
Disgusted 0%
Happy 0%
Angry 0%
Sad 0%
Confused 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Shoe 99%
Coat 51.5%

Captions

Microsoft

a man standing in front of a building 86.5%
a man standing in front of a door 85.6%
a man standing in front of a brick building 78.2%

Text analysis

Amazon

MJI7
MJI7 YE3
YE3

Google

MJI7 YT3RA2 A33A
YT3RA2
MJI7
A33A