Human Generated Data

Title

Untitled (three girls with strollers on sidewalk)

Date

1959, printed later

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.146

Machine Generated Data

Tags (label, confidence score 0–100)

Amazon
created on 2022-01-08

Furniture 100
Person 99.7
Human 99.7
Person 99.6
Person 90
Bike 86.2
Vehicle 86.2
Bicycle 86.2
Transportation 86.2
Wheel 76.2
Machine 76.2
Wheel 72.8
Cradle 70.7
Crib 67.1
Apparel 64
Shorts 64
Clothing 64
People 57.3

Imagga
created on 2022-01-08

bench 69.3
park bench 53.1
seat 44.2
stretcher 42.4
litter 33.8
conveyance 32.6
furniture 24.5
park 23.9
handcart 21.9
snow 19.9
tree 19.2
wheeled vehicle 19.2
outdoor 18.3
outdoors 18
chair 17.5
man 17.5
winter 17
barrow 16.5
landscape 16.4
trees 15.1
wood 15
day 14.9
cold 14.6
sitting 14.6
people 14.5
male 14.2
old 13.9
rural 13.2
summer 12.9
grass 12.6
adult 12.3
forest 12.2
building 11.9
person 11.8
relax 11.8
furnishing 11.7
sky 11.5
season 10.9
vehicle 10.9
shopping cart 10.9
lifestyle 10.8
happy 10.6
sit 10.4
scene 10.4
outside 10.3
field 10
autumn 9.7
empty 9.4
family 8.9
snowy 8.8
lonely 8.7
love 8.7
water 8.7
happiness 8.6
frozen 8.6
garden 8.4
relaxation 8.4
rest 8.3
color 8.3
alone 8.2
relaxing 8.2
country 7.9
couple 7.8
woods 7.6
beach 7.6
path 7.6
fun 7.5
freedom 7.3
child 7.2
fall 7.2
river 7.1
portrait 7.1
machine 7.1
architecture 7
seasonal 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

building 99.1
text 98.2
outdoor 97.4
person 81.7
clothing 72
black and white 65.5
house 64.9

Face analysis

AWS Rekognition

Age 4-12
Gender Female, 52.1%
Happy 64.4%
Disgusted 11.6%
Calm 11.5%
Surprised 4.1%
Confused 2.8%
Angry 2.6%
Fear 2%
Sad 1%

AWS Rekognition

Age 4-10
Gender Female, 99.9%
Happy 55.5%
Confused 16.1%
Sad 8.1%
Calm 7%
Surprised 6.8%
Angry 2.4%
Disgusted 2.2%
Fear 1.9%

AWS Rekognition

Age 4-10
Gender Female, 56%
Fear 30.9%
Confused 19.1%
Sad 15.7%
Angry 10.7%
Surprised 10.6%
Happy 6.8%
Calm 3.7%
Disgusted 2.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Bicycle 86.2%
Wheel 76.2%

Captions

Microsoft

a man standing in front of a building 84.7%
a man sitting in front of a building 78.2%
a man and a woman standing in front of a building 66.2%