Human Generated Data

Title

Untitled (children playing dress-up and walking baby carriage on sidewalk)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15593

Machine Generated Data

Tags

Amazon
created on 2022-03-19

Person 99.1
Human 99.1
Person 99
Person 96.7
Clothing 87.9
Apparel 87.9
Building 84.2
Neighborhood 84.2
Urban 84.2
Face 78.3
Door 70.5
Countryside 69.7
Shelter 69.7
Outdoors 69.7
Nature 69.7
Rural 69.7
Road 67.9
Spoke 64.9
Machine 64.9
People 63.4
Plant 62.7
Photo 61.9
Portrait 61.9
Photography 61.9
Alloy Wheel 61
Stroller 58.9
Path 58.2
Ground 57.7
Yard 55.3
Wheel 53.7
Wheel 53.6

Imagga
created on 2022-03-19

barrow 60.6
wheeled vehicle 57.7
handcart 50.8
vehicle 38.2
man 30.2
conveyance 24.7
outdoors 24
people 23.4
tricycle 23.2
male 21.3
park 18.9
pedestrian 18.2
chair 17.3
seat 16.8
person 16.5
city 15
wheelchair 14.6
snow 14.3
adult 14.3
cold 13.8
outdoor 13.8
winter 13.6
bench 13.4
street 12
old 11.8
lifestyle 11.6
tree 11.5
walk 11.4
landscape 11.2
day 11
active 10.8
trees 10.7
urban 10.5
couple 10.4
walking 10.4
scene 10.4
sitting 10.3
summer 10.3
sport 10
road 9.9
men 9.4
outside 9.4
wood 9.2
autumn 8.8
happy 8.8
boy 8.7
black 8.4
portrait 8.4
building 8.3
silhouette 8.3
care 8.2
grass 7.9
season 7.8
sky 7.7
frozen 7.6
relax 7.6
path 7.6
one 7.5
back 7.3
business 7.3
exercise 7.3
activity 7.2
smile 7.1
scholar 7.1
cool 7.1
love 7.1
work 7.1
businessman 7.1
cleaner 7.1
travel 7
life 7
together 7

Google
created on 2022-03-19

Microsoft
created on 2022-03-19

outdoor 99.6
child 89.9
person 87.1
black and white 87
house 81.6
text 80.1
clothing 75
toddler 59.7
baby 50.1

Face analysis

Amazon

Google

AWS Rekognition

Age 6-14
Gender Male, 80.1%
Calm 84.1%
Sad 14.7%
Angry 0.4%
Confused 0.3%
Fear 0.2%
Disgusted 0.2%
Surprised 0.1%
Happy 0.1%

AWS Rekognition

Age 11-19
Gender Male, 73.4%
Calm 99.5%
Sad 0.3%
Confused 0.1%
Happy 0.1%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 12-20
Gender Female, 85.1%
Sad 73.6%
Calm 21.5%
Happy 2%
Confused 0.8%
Disgusted 0.6%
Surprised 0.5%
Fear 0.5%
Angry 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Wheel 53.7%

Captions

Microsoft

a man riding a skateboard up the side of a building 69%
a young man riding a skateboard up the side of a building 57.7%
a man riding a skateboard down the side of a building 57.6%