Human Generated Data

Title

Untitled (boy in front of cars)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16605

Machine Generated Data

Tags

Amazon
created on 2022-02-12

Person 99.3
Human 99.3
Car 98.5
Vehicle 98.5
Automobile 98.5
Transportation 98.5
Apparel 94.6
Clothing 94.6
Female 73.1
Face 72.2
Outdoors 71
Road 68
Nature 67.5
People 66.4
Photo 66.1
Photography 66.1
Military 65.8
Military Uniform 65.8
Portrait 64
Tire 63.3
Machine 63
Girl 60.2
Asphalt 59.2
Tarmac 59.2
Wheel 59.1
Officer 58.8
Sports Car 58
Car 57.6
Coupe 56.2
Spoke 56
Plant 55.5
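The label/confidence pairs above are the shape of output that AWS Rekognition's DetectLabels API returns. As a minimal sketch of how such a listing could be produced, the snippet below formats a mock response dict into "Name Score" lines; the dict is a hypothetical stand-in, since a real call (`boto3` client `detect_labels`) needs AWS credentials and the image bytes.

```python
# Sketch: turning a Rekognition DetectLabels-style response into the
# "Name Confidence" lines shown above. The response here is a mock,
# not output from a real API call.

def format_labels(response, min_confidence=55.0):
    """Return 'Name Confidence' lines sorted as returned, above a threshold."""
    lines = []
    for label in response.get("Labels", []):
        conf = label["Confidence"]
        if conf >= min_confidence:
            # One decimal place, matching the listing above
            lines.append(f"{label['Name']} {round(conf, 1)}")
    return lines

sample_response = {  # abbreviated, hypothetical DetectLabels response
    "Labels": [
        {"Name": "Person", "Confidence": 99.3},
        {"Name": "Car", "Confidence": 98.5},
        {"Name": "Plant", "Confidence": 55.5},
        {"Name": "Dog", "Confidence": 40.0},  # dropped by the threshold
    ]
}

print(format_labels(sample_response))
```

The ~55% cutoff is an assumption inferred from the lowest score in the list above; Rekognition itself accepts a `MinConfidence` parameter that serves the same purpose server-side.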

Imagga
created on 2022-02-12

child 24.3
man 23.5
person 21
people 20.6
beach 18.7
sunset 18
male 17.1
outdoors 16.4
adult 16.2
world 15.9
park bench 15.6
sand 14.7
bench 14.3
danger 13.6
sky 13.4
outdoor 13
sea 12.5
kin 12.3
water 12
portrait 11.7
landscape 11.2
love 11.1
summer 10.9
protection 10.9
ocean 10.9
clothing 10.8
leisure 10.8
holding 10.7
park 10.7
seat 10.5
happiness 10.2
lifestyle 10.1
industrial 10
silhouette 9.9
juvenile 9.9
vacation 9.8
uniform 9.6
couple 9.6
walking 9.5
outside 9.4
two 9.3
sport 9.3
old 9.1
dirty 9
fun 9
mother 8.9
family 8.9
military uniform 8.8
sexy 8.8
destruction 8.8
toxic 8.8
protective 8.8
dangerous 8.6
relax 8.4
black 8.4
attractive 8.4
sibling 8.4
coast 8.1
autumn 7.9
mask 7.9
radioactive 7.9
day 7.8
radiation 7.8
nuclear 7.8
play 7.8
men 7.7
chemical 7.7
gas 7.7
tree 7.7
youth 7.7
happy 7.5
smoke 7.4
active 7.2
recreation 7.2
religion 7.2
mountain 7.1
grass 7.1
kid 7.1
together 7

Microsoft
created on 2022-02-12

outdoor 97.4
black and white 86.4
text 83.3
land vehicle 71.6
wheel 67.7
car 66.1
vehicle 64.7
person 59.5
clothing 58.7

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 94.9%
Calm 86%
Confused 6.5%
Sad 2.3%
Happy 2.3%
Angry 1.2%
Disgusted 0.8%
Surprised 0.7%
Fear 0.2%
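The age range, gender, and emotion scores above correspond to the fields of a face record from Rekognition's DetectFaces API (called with `Attributes=["ALL"]`). The sketch below summarizes a mock face record into lines like those listed; the record is a hypothetical stand-in for real API output.

```python
# Sketch: summarizing a Rekognition DetectFaces-style face record into the
# "Age / Gender / Emotion" lines shown above. The record is a mock.

def summarize_face(face):
    """Return summary lines: age range, gender, then emotions by confidence."""
    age = face["AgeRange"]
    lines = [f"Age {age['Low']}-{age['High']}"]
    gender = face["Gender"]
    lines.append(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unordered; sort by descending confidence
    for emo in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        lines.append(f"{emo['Type'].capitalize()} {emo['Confidence']:.1f}%")
    return lines

mock_face = {  # hypothetical, abbreviated face record
    "AgeRange": {"Low": 26, "High": 36},
    "Gender": {"Value": "Male", "Confidence": 94.9},
    "Emotions": [
        {"Type": "CALM", "Confidence": 86.0},
        {"Type": "CONFUSED", "Confidence": 6.5},
        {"Type": "SAD", "Confidence": 2.3},
    ],
}

for line in summarize_face(mock_face):
    print(line)
```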

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Car 98.5%
Wheel 59.1%

Captions

Microsoft

a man standing in front of a truck 85.6%
a man that is standing in front of a truck 79.9%
a man standing in front of a car 72.9%

Text analysis

Amazon

ез
KODAKSEELA
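Detected strings like the film edge marking above are the kind of result Rekognition's DetectText API returns. As a sketch, the snippet below pulls line-level detections out of a mock response, skipping the per-word duplicates the API also emits; a real call would be `detect_text(Image=...)` via `boto3`.

```python
# Sketch: extracting detected-text strings from a Rekognition
# DetectText-style response. The response dict is a mock.

def detected_lines(response):
    """Return LINE-level detections only, ignoring individual WORD entries."""
    return [d["DetectedText"]
            for d in response.get("TextDetections", [])
            if d["Type"] == "LINE"]

mock_response = {  # hypothetical, abbreviated DetectText response
    "TextDetections": [
        {"DetectedText": "KODAKSEELA", "Type": "LINE"},
        {"DetectedText": "KODAKSEELA", "Type": "WORD"},  # duplicate at word level
    ]
}

print(detected_lines(mock_response))
```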