Human Generated Data

Title

Untitled (three little girls walking prams on sidewalk)

Date

May 1959

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17976

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Shorts 99.2
Clothing 99.2
Apparel 99.2
Person 99.2
Human 99.2
Person 99.1
Chair 91.8
Furniture 91.8
Wheel 85.5
Machine 85.5
Outdoors 83
Female 82.2
Nature 81.3
Wheel 81
Shelter 80.1
Countryside 80.1
Rural 80.1
Building 80.1
Person 80
Plant 78.9
Tree 76.9
Housing 72.8
Meal 72.2
Food 72.2
Urban 69.2
Train 68.2
Transportation 68.2
Vehicle 68.2
Face 67.7
Woman 66.9
People 66.2
Girl 62.9
Photography 61.8
Photo 61.8
Grass 61.2
Kid 58.1
Child 58.1
Sitting 56.7
Path 56.4
Tire 56.4
Road 55.1
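
The label/confidence pairs above follow the shape of AWS Rekognition's DetectLabels output. Below is a minimal sketch of how such tags could be reproduced with boto3; it is not the museum's actual pipeline, and the filename and thresholds are placeholders.

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

# Hypothetical local copy of the photograph.
with open("sullivan_untitled_1959.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out around 55
)

# Print "Label confidence" pairs in the same style as the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```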

Clarifai
created on 2023-10-29

people 99.9
adult 97.6
two 97.3
vehicle 96.9
group together 96.7
monochrome 96.1
woman 95.9
home 95.7
child 93.8
man 93.6
street 93.6
group 93.6
transportation system 92
bench 91.4
canine 90.8
dog 90.2
family 90.1
cart 89.4
one 88.4
furniture 88
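
Clarifai's general model returns concept/confidence pairs like the list above. The following is a rough sketch against Clarifai's v2 predict REST endpoint using the requests library; the endpoint path, model name, and auth header are assumptions that can differ by account and API version, and the key and image URL are placeholders.

```python
import requests

CLARIFAI_KEY = "YOUR_API_KEY"  # placeholder
image_url = "https://example.org/sullivan_untitled_1959.jpg"  # placeholder

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": image_url}}}]},
)
resp.raise_for_status()

# Concept values come back in the 0-1 range; scale to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```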

Imagga
created on 2022-03-04

musical instrument 24.7
cello 22.2
bowed stringed instrument 21.6
stringed instrument 21.3
dark 20.9
silhouette 19.9
man 18.8
people 17.8
chair 17.5
room 17.4
building 16.5
window 15.8
person 15.7
classroom 13.2
male 12.8
light 11.4
urban 11.4
dirty 10.8
city 10.8
old 10.4
black 10.2
seat 10.1
adult 10
scene 9.5
street 9.2
house 9.2
protection 9.1
portrait 9.1
fashion 9
wind instrument 8.9
night 8.9
women 8.7
water 8.7
sitting 8.6
walk 8.6
one 8.2
style 8.2
shopping cart 8.1
sunset 8.1
body 8
keyboard instrument 8
hair 7.9
couple 7.8
architecture 7.8
spooky 7.8
model 7.8
sky 7.7
relax 7.6
walking 7.6
handcart 7.5
fun 7.5
symbol 7.4
alone 7.3
business 7.3
sexy 7.2
device 7.2
day 7.1
travel 7
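
Imagga's tagging service produces tag/confidence pairs of this kind. Here is a sketch against its documented v2 /tags endpoint with the requests library; the credentials and image URL are placeholders.

```python
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder
image_url = "https://example.org/sullivan_untitled_1959.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP basic auth with API key/secret
)
resp.raise_for_status()

# Each entry pairs an English tag with a confidence score.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```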

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 99.3
black and white 97.5
street 93.9
monochrome 91.1
house 81.9
person 69.1
clothing 52.8
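
The Microsoft tags resemble output from Azure's Computer Vision tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders, and this is not necessarily the exact service version that produced the numbers above.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

endpoint = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
key = "YOUR_KEY"                                                   # placeholder
image_url = "https://example.org/sullivan_untitled_1959.jpg"       # placeholder

client = ComputerVisionClient(endpoint, CognitiveServicesCredentials(key))

# Tag the remote image and print name/confidence pairs as percentages.
result = client.tag_image(image_url)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```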

Color Analysis

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 51%
Calm 60.8%
Sad 35.8%
Angry 0.9%
Fear 0.7%
Surprised 0.5%
Disgusted 0.5%
Confused 0.4%
Happy 0.4%
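
The age range, gender estimate, and emotion scores above match the shape of AWS Rekognition's DetectFaces output when all attributes are requested. A minimal boto3 sketch, with the filename as a placeholder:

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("sullivan_untitled_1959.jpg", "rb") as f:  # hypothetical local file
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    # Emotions come back as TYPE/confidence pairs; print highest confidence first.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```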

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
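
The Google Vision entries report likelihood buckets (Very unlikely through Very likely) for each detected face. A sketch with the google-cloud-vision client library (newer releases, where the Likelihood enum is exposed at the package level); the filename is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes Google Cloud credentials are configured

with open("sullivan_untitled_1959.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a likelihood enum; .name yields e.g. VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```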

Feature analysis

Amazon

Person 99.2%
Person 99.1%
Person 80%
Wheel 85.5%
Wheel 81%
Train 68.2%
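
These feature-analysis percentages correspond to labels for which Rekognition also returns per-object instances with bounding boxes (here Person, Wheel, and Train). A sketch of pulling those instances out of a DetectLabels response; field names follow the boto3 response shape and the filename is a placeholder.

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("sullivan_untitled_1959.jpg", "rb") as f:  # hypothetical local file
    labels = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)["Labels"]

for label in labels:
    # Only some labels carry instance-level detections with bounding boxes.
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # values are ratios of image width/height
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'(left={box["Left"]:.2f}, top={box["Top"]:.2f}, '
              f'width={box["Width"]:.2f}, height={box["Height"]:.2f})')
```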