Human Generated Data

Title

New York

Date

1983

People

Artist: Louis Stettner, American, 1922–2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2011.569

Copyright

© Louis Stettner Estate

Machine Generated Data

Tags

Each value is the model's confidence score for that label, in percent.

Amazon
created on 2019-04-08

Person 99.8
Human 99.8
Automobile 98.5
Transportation 98.5
Car 98.5
Vehicle 98.5
Person 98.4
Person 98
Person 97.6
Wheel 97.5
Machine 97.5
Car 97.1
Person 96.9
Person 96.2
Person 95.3
Person 94.5
Person 94.5
Person 94.5
Wheel 94.4
Car 94.3
Tarmac 92.8
Asphalt 92.8
Pedestrian 92.3
Person 91.8
Car 91.6
Person 90.8
Person 87.6
City 85
Urban 85
Town 85
Metropolis 85
Building 85
Road 82.3
Footwear 74.1
Clothing 74.1
Apparel 74.1
Shoe 74.1
Spoke 70.4
Leisure Activities 69.9
Shoe 69.1
People 62.7
Tire 60.2
Sports Car 59.6
Shorts 59.6
Street 57.1
Alloy Wheel 55.3
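The Amazon tag list above is the flattened output of an object-detection pass: each label carries a confidence score, and repeated labels (e.g. the many Person entries) correspond to separate detected instances. A minimal sketch of filtering such a list by confidence, using a few values copied from the list above (the `confident_labels` function name and the 90.0 threshold are illustrative, not part of the original record):

```python
# Sketch: filter machine-generated labels by confidence.
# The data below is copied from the Amazon tag list in this record;
# the 90.0 threshold is an illustrative choice, not from the source.
labels = [
    ("Person", 99.8),
    ("Car", 98.5),
    ("Wheel", 97.5),
    ("Tarmac", 92.8),
    ("Shoe", 74.1),
    ("Tire", 60.2),
]

def confident_labels(labels, threshold=90.0):
    """Return label names whose confidence meets the threshold,
    keeping only the first occurrence of each name."""
    seen = set()
    result = []
    for name, score in labels:
        if score >= threshold and name not in seen:
            seen.add(name)
            result.append(name)
    return result

print(confident_labels(labels))  # → ['Person', 'Car', 'Wheel', 'Tarmac']
```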

Clarifai
created on 2018-02-10

people 99.4
street 97.4
adult 97
man 96.2
one 96
group together 93.6
action 92.7
vehicle 91.2
competition 90.9
monochrome 89.9
wear 88.8
woman 87
two 86.4
many 86.2
road 84.5
pavement 83.1
recreation 82.3
police 82.2
transportation system 81.4
motion 81.2

Imagga
created on 2018-02-10

skateboard 41
sport 38.9
board 29.3
adult 25.9
weapon 25.6
man 25.6
vehicle 23
people 21.8
person 21.5
stick 20.9
fashion 18.9
sword 18.8
outdoors 17.9
outdoor 17.6
street 17.5
lifestyle 16.6
exercise 16.4
fitness 16.3
conveyance 16.2
ball 15.8
leisure 15.8
javelin 15.7
male 15.6
model 15.6
golf 15.3
active 15.3
athlete 15.2
one 14.2
summer 14.2
golfer 14.2
city 14.1
action 13.9
men 13.7
fun 13.5
elegance 13.4
pretty 13.3
attractive 13.3
black 13.2
style 12.6
spear 12.5
game 12.5
player 12.3
competition 11.9
hair 11.9
professional 11.8
dress 11.8
urban 11.4
club 11.3
outside 11.1
sports 11.1
business 10.9
iron 10.9
recreation 10.8
swing 10.7
posing 10.7
sexy 10.4
walking 10.4
legs 10.4
motion 10.3
cute 10.1
playing 10
sunset 9.9
human 9.8
lady 9.7
crutch 9.7
body 9.6
play 9.5
training 9.2
silhouette 9.1
activity 9
hit 8.8
grass 8.7
standing 8.7
athletic 8.6
jeans 8.6
moving 8.6
wall 8.6
staff 8.5
sensuality 8.2
happy 8.2
leg 8.1
suit 8.1
sun 8.1
dancer 8
cool 8
work 7.9
brunette 7.8
golfing 7.8
portrait 7.8
driver 7.8
cleaner 7.7
corporate 7.7
walk 7.6
dance 7.6
relax 7.6
pose 7.3
shadow 7.2
women 7.1
life 7

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

road 99.9
person 99.7
outdoor 99.5
man 94.6
street 92.4
walking 76.3

Face analysis

Amazon

AWS Rekognition

Age 30-47
Gender Male, 99%
Happy 0.3%
Angry 14.8%
Surprised 2.4%
Sad 13.5%
Disgusted 2%
Calm 57.2%
Confused 9.8%

AWS Rekognition

Age 26-43
Gender Male, 50.2%
Disgusted 49.6%
Confused 49.6%
Angry 49.6%
Surprised 49.5%
Happy 49.5%
Sad 50%
Calm 49.7%

AWS Rekognition

Age 26-43
Gender Male, 50.4%
Calm 50%
Happy 49.6%
Disgusted 49.5%
Surprised 49.6%
Angry 49.6%
Sad 49.7%
Confused 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50%
Confused 49.6%
Surprised 49.5%
Disgusted 49.5%
Happy 49.5%
Sad 49.9%
Calm 49.7%
Angry 49.7%

AWS Rekognition

Age 35-52
Gender Female, 50.3%
Calm 49.9%
Surprised 49.5%
Sad 49.7%
Angry 49.7%
Disgusted 49.6%
Confused 49.6%
Happy 49.6%
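Each AWS Rekognition face record above pairs an estimated age range and gender with a per-emotion confidence distribution. A sketch of picking the dominant emotion from one such record, with values copied from the first face block above (the dict layout and `dominant_emotion` helper are illustrative assumptions, not the service's actual response schema):

```python
# Sketch: pick the dominant emotion from a Rekognition-style face record.
# Values are copied from the first face-analysis block in this record;
# the dict structure is an illustrative simplification.
face = {
    "AgeRange": (30, 47),
    "Gender": ("Male", 99.0),
    "Emotions": {
        "Happy": 0.3, "Angry": 14.8, "Surprised": 2.4,
        "Sad": 13.5, "Disgusted": 2.0, "Calm": 57.2, "Confused": 9.8,
    },
}

def dominant_emotion(face):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(face["Emotions"].items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # → ('Calm', 57.2)
```

For the later faces in this record, where every emotion hovers near 50%, the dominant emotion is essentially uninformative, so a consumer of this data would typically also check how far the top score stands above the others.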

Feature analysis

Amazon

Person 99.8%
Car 98.5%
Wheel 97.5%
Shoe 74.1%

Captions

Microsoft

a man walking down the street 97.5%
a man is walking down the street 97.4%
a man walking down a street 97.3%

Text analysis

Amazon

SALD
5393