Human Generated Data

Title

Untitled (little girls pushing carriages, dressed in hats and coats)

Date

1959

People

Artist: Francis J. Sullivan, American, 1916 - 1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18902

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.5
Human 99.5
Person 99.5
Clothing 98
Apparel 98
Shoe 97
Footwear 97
Shoe 93.8
Shoe 93.3
Person 89.7
Shorts 83.4
Outdoors 83.1
Wheel 81.8
Machine 81.8
Tree 81.3
Plant 81.3
Wheel 80.9
Nature 76.8
Wheel 71.8
Bicycle 71.8
Transportation 71.8
Vehicle 71.8
Bike 71.8
Leisure Activities 68.8
People 66.8
Land 63.1
Wheel 62.5
Wheel 60.6
Hat 60.6
Road 58.7
Furniture 58.6
Conifer 57.8
Fir 55.5
Abies 55.5
Wheel 53.9
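
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's object and scene detection. A minimal sketch of how such a list could be produced with the standard boto3 client, assuming a local scan of the photograph (file name and confidence cutoff are illustrative, not part of the record):

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local scan of the photograph; the record does not include the image file.
with open("sullivan_untitled_1959.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # assumed cutoff; the listed labels bottom out around 53.9
    )

# Print name/confidence pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")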

Clarifai
created on 2023-10-22

people 99.7
monochrome 99
group together 97.1
child 96.7
adult 96.5
street 96.1
group 96
bench 95.6
man 94.7
two 93.6
recreation 92.5
tree 92.1
black and white 91.2
woman 91.2
vehicle 90.3
one 90.2
boy 89.4
transportation system 89.3
many 88.4
park 86.2
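
The Clarifai concepts above could have come from Clarifai's v2 prediction endpoint. A sketch under that assumption, using the public REST API via the requests library; the model id, credential scheme, and image URL are placeholders, not values from the record:

import requests

CLARIFAI_API_KEY = "YOUR_KEY"                                  # assumed app-scoped key
MODEL_ID = "general-image-recognition"                         # assumed general model
IMAGE_URL = "https://example.org/sullivan_untitled_1959.jpg"   # placeholder

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Each concept carries a name and a 0-1 value; scale to match the percentages above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")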

Imagga
created on 2022-03-05

handcart 41.4
chair 40.8
seat 37.5
shopping cart 35.2
wheeled vehicle 33.5
bench 21.9
container 20.6
furniture 18.9
folding chair 18.2
barrow 17.5
conveyance 17.2
percussion instrument 16.9
park bench 16.6
park 16.5
tree 15.4
musical instrument 14.6
building 13.5
steel drum 13.1
sky 12.8
grass 12.6
landscape 12.6
old 12.5
trees 12.5
outdoor 12.2
vehicle 12.2
man 11.4
water 11.3
forest 11.3
city 10.8
outdoors 10.4
scene 10.4
empty 10.3
summer 10.3
street 10.1
urban 9.6
winter 9.4
male 9.2
house 9.2
road 9
snow 8.9
sun 8.9
rural 8.8
work 8.7
stretcher 8.6
day 8.6
cold 8.6
path 8.5
furnishing 8.3
lake 8.2
ashcan 8
country 7.9
business 7.9
litter 7.9
architecture 7.8
people 7.8
area 7.7
lonely 7.7
drum 7.6
wheelchair 7.5
wood 7.5
bin 7.4
window 7.3
fence 7.3
yellow 7.3
new 7.3
industrial 7.3
black 7.2
sunlight 7.1
mountain 7.1
river 7.1
horse 7.1
working 7.1
sea 7
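
The Imagga tags above match the shape of Imagga's v2 tagging endpoint. A sketch under that assumption; the credentials and image URL are placeholders:

import requests

IMAGGA_KEY, IMAGGA_SECRET = "YOUR_KEY", "YOUR_SECRET"          # assumed API credentials
IMAGE_URL = "https://example.org/sullivan_untitled_1959.jpg"   # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP basic auth with key/secret
    timeout=30,
)
resp.raise_for_status()

# Each entry pairs an English tag with a confidence score, as in the list above.
for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")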

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 98.5
text 95.3
black and white 93.8
monochrome 75.4
person 64.5
bench 42.4
outdoor object 30.3
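
The Microsoft tags above resemble output from the Azure Computer Vision tagging operation. A sketch using the older azure-cognitiveservices-vision-computervision SDK, assuming an Azure resource endpoint and key (all identifiers are placeholders):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # assumed endpoint
    CognitiveServicesCredentials("YOUR_KEY"),              # assumed key
)

# Tag a publicly reachable copy of the image; the URL is a placeholder.
result = client.tag_image("https://example.org/sullivan_untitled_1959.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")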

Color Analysis

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 90.4%
Fear 27.9%
Happy 18.8%
Sad 17%
Angry 12.7%
Surprised 11.6%
Disgusted 7.1%
Confused 3.6%
Calm 1.3%

AWS Rekognition

Age 34-42
Gender Male, 55.4%
Calm 73.6%
Sad 9.7%
Happy 5.5%
Disgusted 4.2%
Confused 3.2%
Angry 2.1%
Surprised 1.1%
Fear 0.6%
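
The two AWS Rekognition estimates above (age range, gender, ranked emotions) follow the FaceDetails structure returned by Rekognition face detection. A minimal sketch, assuming a local scan of the photograph:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local scan; Attributes=["ALL"] requests age, gender, and emotion estimates.
with open("sullivan_untitled_1959.jpg", "rb") as f:
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetail per detected face, matching the two blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")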

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely
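
The four Google Vision blocks above report per-face likelihood ratings rather than percentages. A sketch of how they could be read back with the google-cloud-vision client, assuming a local scan of the photograph:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local scan; the record does not include the image file.
with open("sullivan_untitled_1959.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One annotation per detected face; likelihoods print as VERY_UNLIKELY, UNLIKELY, LIKELY, etc.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)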

Feature analysis

Amazon

Person 99.5%
Person 99.5%
Person 89.7%
Shoe 97%
Shoe 93.8%
Shoe 93.3%
Wheel 81.8%
Wheel 80.9%
Wheel 71.8%
Wheel 62.5%
Wheel 60.6%
Wheel 53.9%
Bicycle 71.8%
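
The repeated per-object percentages above (three Person detections, six Wheel detections, and so on) correspond to individual instances with bounding boxes inside each Rekognition label. A sketch of how those per-instance confidences could be extracted, again assuming a local scan of the photograph:

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local scan; instance data comes back inside each label.
with open("sullivan_untitled_1959.jpg", "rb") as f:
    labels = rekognition.detect_labels(Image={"Bytes": f.read()})["Labels"]

# Only some labels (e.g. Person, Shoe, Wheel, Bicycle) carry located instances.
for label in labels:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"(w={box['Width']:.2f}, h={box['Height']:.2f})")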

Captions