Human Generated Data

Title

Vicksburg Negroes and shop front, Mississippi

Date

March 1936, printed later

People

Artist: Walker Evans, American, 1903–1975

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, National Endowment for the Arts Grant, P1972.105

Copyright

© Walker Evans Archive, The Metropolitan Museum of Art

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Human 99.9
Person 99.9
Person 99.6
Person 99
Wheel 98.5
Machine 98.5
Person 97.2
Automobile 95.9
Transportation 95.9
Vehicle 95.9
Person 95.4
Footwear 95.3
Apparel 95.3
Shoe 95.3
Clothing 95.3
Model T 95.2
Antique Car 95.2
Building 92.4
Wheel 91.3
Person 88.5
Urban 88.2
Car 81.9
Rural 74.9
Countryside 74.9
Outdoors 74.9
Nature 74.9
Shelter 74.9
Tire 69.1
Neighborhood 56.6
Spoke 55.5
Shoe 50.4

Clarifai
created on 2018-03-16

vehicle 99.9
people 99.9
group together 99.4
transportation system 99.4
group 98.5
adult 97.1
two 97
car 96.5
street 96.5
man 95.9
many 93.9
driver 93.5
four 92.6
nostalgia 92.6
several 90.2
vintage 89.9
one 89.8
monochrome 87.2
wear 86.9
woman 86.3

Imagga
created on 2018-03-16

model t 100
motor vehicle 100
car 100
wheeled vehicle 37.8
vehicle 37.4
transportation 28.7
auto 27.7
old 25.1
automobile 22
transport 21
wheel 18.1
drive 16.1
truck 14.9
motor 14.5
road 13.5
antique 13.3
tire 12.5
retro 12.3
classic 12.1
house 11.7
vintage 11.6
engine 11.5
outdoors 11.2
grass 11.1
machine 10.6
equipment 10.5
outdoor 9.9
driving 9.7
building 9.5
luxury 8.6
travel 8.4
black 8.4
sky 8.3
street 8.3
sport 8.2
work 8
cars 7.8
broken 7.7
door 7.6
city 7.5
man 7.4
speed 7.3
cart 7.2
home 7.2
farm 7.1
summer 7.1

Google
created on 2018-03-16

Microsoft
created on 2018-03-16

outdoor 97.2
road 96.8
store 45.9

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Male, 52.1%
Sad 46.3%
Disgusted 47%
Angry 45.5%
Happy 45.3%
Surprised 45.3%
Calm 50.4%
Confused 45.2%

AWS Rekognition

Age 26-43
Gender Male, 54.4%
Happy 45.1%
Disgusted 45.1%
Calm 51.4%
Surprised 45.1%
Angry 45.4%
Sad 47.8%
Confused 45.2%

AWS Rekognition

Age 27-44
Gender Male, 50.6%
Disgusted 45.5%
Surprised 46.1%
Sad 49.6%
Calm 46.2%
Angry 47%
Happy 45.3%
Confused 45.4%

AWS Rekognition

Age 26-43
Gender Female, 50%
Happy 50.2%
Disgusted 49.5%
Confused 49.5%
Calm 49.5%
Sad 49.6%
Angry 49.6%
Surprised 49.5%

AWS Rekognition

Age 35-53
Gender Male, 54.1%
Sad 45.9%
Surprised 45.4%
Confused 45.4%
Happy 45.1%
Calm 52.3%
Disgusted 45.2%
Angry 45.7%

Feature analysis

Amazon

Person 99.9%
Wheel 98.5%
Shoe 95.3%
Car 81.9%

Captions

Microsoft

a truck is parked in front of a store 84%
a truck is parked in front of a store window 79.9%
a car parked in front of a store 79.8%

Text analysis

Amazon

BARBER
SHOE
SHOE SHIN
SHOF
Camel
SHIN

Google

BARBER
BARBER