Human Generated Data

Title

Untitled (girl pushing baby in baby carriage)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17310

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 99.8
Person 99.8
Clothing 98.5
Apparel 98.5
Grass 96.3
Plant 96.3
Tree 89.6
Person 81.7
Shorts 75.8
Baby 74.7
Road 70.3
Photo 69.9
Portrait 69.9
Face 69.9
Photography 69.9
Child 69.4
Kid 69.4
Wheel 67.3
Machine 67.3
Fir 66.5
Abies 66.5
Girl 62.8
Female 62.8
Pants 62.1
Hat 61.8
Wheel 61.2
Dress 59.1
Tarmac 56
Asphalt 56
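
The Amazon labels above pair a tag name with a confidence score and match the output shape of Amazon Rekognition's DetectLabels API. A minimal sketch of such a call, assuming boto3 and a placeholder image file and region that are not part of this record:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    # Read the photograph and ask Rekognition for labels above a confidence floor.
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the lowest score listed above is about 56
        )

    # Print "Name Confidence" pairs in the same style as the list above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')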

Imagga
created on 2022-02-26

wheeled vehicle 31.4
tricycle 29.7
man 26.2
vehicle 21.1
outdoors 20.5
outdoor 19.9
people 19.5
sport 19.2
male 19.1
pedestrian 18.1
bowed stringed instrument 17.7
person 17.5
weapon 16.7
musical instrument 16.4
conveyance 16.3
sunset 16.2
grass 15.8
silhouette 15.7
park 15.6
sky 15.3
stringed instrument 15.1
landscape 14.9
summer 14.1
water 14
bench 13.6
recreation 13.4
bow and arrow 13.4
adult 13
happy 12.5
violin 12.5
sun 12.1
snow 11.7
field 11.7
tree 11.5
winter 11.1
barrow 11
relax 10.9
leisure 10.8
instrument 10.8
cold 10.3
sitting 10.3
child 10.2
vacation 9.8
river 9.8
handcart 9.8
fun 9.7
couple 9.6
scene 9.5
walking 9.5
beach 9.4
relaxation 9.2
lake 9.2
chair 9.1
old 9.1
suit 9
tool 9
brass 8.9
autumn 8.8
boy 8.7
fishing 8.6
day 8.6
men 8.6
device 8.6
outside 8.6
cello 8.5
travel 8.4
portrait 8.4
fisherman 8.3
active 8.3
wind instrument 8.3
protection 8.2
danger 8.2
meadow 8.1
clothing 8
mountain 8
businessman 7.9
rural 7.9
together 7.9
happiness 7.8
destruction 7.8
sea 7.8
dusk 7.6
human 7.5
lawn mower 7.4
holding 7.4
environment 7.4
seat 7.3
dirty 7.2
lifestyle 7.2
trees 7.1
love 7.1
working 7.1
scenic 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 99.9
tree 98.7
grass 97.6
toddler 93.5
black and white 89.1
cart 87.3
baby 83.4
text 80.2
child 76.3
person 74.6
playground 54.6
bench 27.4

Face analysis

AWS Rekognition

Age 18-24
Gender Male, 91.9%
Calm 99.4%
Sad 0.2%
Confused 0.1%
Happy 0.1%
Disgusted 0.1%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Female, 53.5%
Calm 99.9%
Happy 0%
Sad 0%
Fear 0%
Surprised 0%
Disgusted 0%
Confused 0%
Angry 0%
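
The two age, gender, and emotion estimates above follow the structure returned by Amazon Rekognition's DetectFaces API. A minimal sketch, assuming boto3 and placeholder file and region names:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    # Each detected face carries an age range, a gender guess, and emotion scores.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')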

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
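
The likelihood ratings above correspond to the face attributes reported by the Google Cloud Vision API. A minimal sketch, assuming the google-cloud-vision client library and a placeholder image file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each attribute is reported as a likelihood bucket (VERY_UNLIKELY ... VERY_LIKELY).
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)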

Feature analysis

Amazon

Person 99.8%
Wheel 67.3%

Captions

Microsoft

a little boy that is standing in the grass 57.3%
a person holding a kite in a park 44.7%

Text analysis

Amazon

the
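
The single word above is the kind of result returned by Amazon Rekognition's DetectText API. A minimal sketch, assuming boto3 and placeholder file and region names:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # Print each detected word; Rekognition also returns full lines and confidence scores.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])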