Human Generated Data

Title

Untitled (man in warehouse)

Date

c. 1970

People

Artist: Michael Mathers, American, born 1945

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1814

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Walkway 99.9
Path 99.9
Person 99.7
Human 99.7
Pavement 93.6
Sidewalk 93.6
Outdoors 89.8
Brick 89.3
Nature 81.6
Footwear 78.2
Clothing 78.2
Apparel 78.2
Shoe 78.2
Cobblestone 74.7
Countryside 74.3
Shelter 64
Rural 64
Building 64
Coat 61.8
Overcoat 61.8
Wall 57.3
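Each label above is paired with a confidence score (a percentage). As an illustration only — the data is copied from the list above, but the function name and threshold are assumptions, not part of the record — a minimal sketch of filtering such provider output by confidence:

```python
# Amazon tag scores copied verbatim from the record above.
amazon_tags = {
    "Walkway": 99.9, "Path": 99.9, "Person": 99.7, "Human": 99.7,
    "Pavement": 93.6, "Sidewalk": 93.6, "Outdoors": 89.8, "Brick": 89.3,
    "Nature": 81.6, "Footwear": 78.2, "Clothing": 78.2, "Apparel": 78.2,
    "Shoe": 78.2, "Cobblestone": 74.7, "Countryside": 74.3, "Shelter": 64.0,
    "Rural": 64.0, "Building": 64.0, "Coat": 61.8, "Overcoat": 61.8,
    "Wall": 57.3,
}

def confident_tags(tags, threshold=90.0):
    """Return labels at or above the threshold, highest confidence first."""
    # sorted() is stable, so ties keep their original listing order.
    return [label for label, score in
            sorted(tags.items(), key=lambda kv: -kv[1])
            if score >= threshold]

print(confident_tags(amazon_tags))
# -> ['Walkway', 'Path', 'Person', 'Human', 'Pavement', 'Sidewalk']
```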

Imagga
created on 2022-01-22

crutch 38.7
silhouette 30.6
staff 30
sunset 27.9
stick 27.2
man 26.2
people 22.9
person 22.5
black 19.1
adult 18.2
beach 17.7
male 16.4
fashion 15.8
water 15.4
ocean 14.9
sun 14.6
world 14.3
posing 14.2
dark 14.2
sky 14
model 14
skateboard 13
sport 13
light 12.7
pose 12.7
sea 12.5
evening 12.1
human 12
style 11.9
dress 11.7
summer 11.6
lifestyle 11.6
walking 11.4
outdoors 11.2
body 11.2
street 11.1
exercise 10.9
wheeled vehicle 10.9
shadow 10.8
lady 10.6
dusk 10.5
sexy 10.4
portrait 10.4
women 10.3
motion 10.3
action 10.2
sensuality 10
city 10
dirty 9.9
attractive 9.8
dance 9.8
board 9.5
support 9.5
legs 9.4
strength 9.4
outdoor 9.2
step 9.1
leisure 9.1
fitness 9
vacation 9
couple 8.7
performance 8.6
device 8.6
walk 8.6
travel 8.5
sunrise 8.4
relax 8.4
teenager 8.2
landscape 8.2
active 8.1
urban 7.9
men 7.7
moving 7.6
vehicle 7.6
hot 7.5
lamp 7.5
fun 7.5
one 7.5
sports equipment 7.4
cricket bat 7.4
horizon 7.2
coast 7.2
life 7.2
hair 7.1
spotlight 7.1
cool 7.1
performer 7
modern 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

footwear 97.8
outdoor 97.7
clothing 97.3
skating 96.6
person 95.8
street 95.5
man 93.8
black and white 92
text 91.6
way 85.6
monochrome 76.1
trousers 56.1
sidewalk 54.1
shadow 54
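The providers tag the same photograph with overlapping but differently cased vocabularies. A quick case-insensitive intersection (illustrative only; the label sets are copied from the Amazon and Microsoft lists above) shows where two providers agree:

```python
# Label sets copied from the record; "shared" is an illustrative helper.
amazon = {"Walkway", "Path", "Person", "Human", "Pavement", "Sidewalk",
          "Outdoors", "Brick", "Nature", "Footwear", "Clothing", "Apparel",
          "Shoe", "Cobblestone", "Countryside", "Shelter", "Rural",
          "Building", "Coat", "Overcoat", "Wall"}
microsoft = {"footwear", "outdoor", "clothing", "skating", "person",
             "street", "man", "black and white", "text", "way",
             "monochrome", "trousers", "sidewalk", "shadow"}

# Lowercase the Amazon labels so capitalization differences don't hide matches.
shared = {t.lower() for t in amazon} & microsoft
print(sorted(shared))
# -> ['clothing', 'footwear', 'person', 'sidewalk']
```

Note that near-misses like Amazon's "Outdoors" versus Microsoft's "outdoor" still fall outside an exact-string intersection.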

Face analysis

AWS Rekognition

Age 52-60
Gender Male, 99.9%
Calm 51.1%
Sad 18.9%
Disgusted 11.7%
Angry 9.7%
Fear 2.9%
Confused 2.2%
Happy 2%
Surprised 1.5%
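The emotion scores above sum to roughly 100%, so picking the dominant emotion is a simple maximum. A minimal sketch, with the scores copied from the record and the variable names as assumptions:

```python
# AWS Rekognition emotion scores copied from the record above.
emotions = {
    "Calm": 51.1, "Sad": 18.9, "Disgusted": 11.7, "Angry": 9.7,
    "Fear": 2.9, "Confused": 2.2, "Happy": 2.0, "Surprised": 1.5,
}

# The label with the highest score is the dominant emotion.
dominant = max(emotions, key=emotions.get)
print(dominant)  # -> Calm
```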

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 78.2%

Captions

Microsoft

a man riding a skateboard down a sidewalk next to a brick wall 44.3%
a man riding a skateboard down a sidewalk 44.2%
a man riding a skateboard up the side of a building 44.1%