Human Generated Data

Title

Untitled (New York City Reformatory, New Hampton, New York)

Date

May 1934–June 1934

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Machine Generated Data

Tags

Amazon

Person 99.8
Human 99.8
Person 99.8
Person 99.7
Person 99.6
Person 99.6
Person 99.5
Nature 99.4
Outdoors 99.4
Person 99.4
Person 99.2
Person 98.6
Agriculture 97.5
Countryside 97.5
Field 97.5
Person 97.2
Soil 93.3
Rural 82.8
Worker 76.9
Farm 75.7
Planting 71.3
Ground 57.3
Harvest 56
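
Each tag line above pairs a label (possibly multi-word, e.g. "group together") with a model confidence score in percent. A minimal Python sketch for parsing such lines into structured records and filtering by confidence; the sample text and the 90% threshold are illustrative, not part of the source data:

```python
# Hypothetical parser for "label score" tag lines like the lists above.
# Assumes each line ends in a numeric confidence (percent) and everything
# before it is the label, which may contain spaces.
def parse_tags(text):
    tags = []
    for line in text.strip().splitlines():
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

sample = """Person 99.8
group together 99.5
Harvest 56"""

tags = parse_tags(sample)
high_confidence = [label for label, score in tags if score >= 90]
print(high_confidence)  # → ['Person', 'group together']
```

Splitting from the right (`rpartition`) rather than the left keeps multi-word labels intact.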

Clarifai

people 99.9
group together 99.5
group 99.5
adult 99.5
many 98.9
military 98.3
war 98.3
man 98.1
cropland 97.9
vehicle 96.2
several 95.8
soldier 95.1
two 94.1
administration 92.3
three 92.3
four 91.9
one 90.7
skirmish 89.9
transportation system 88.9
wear 88.7

Imagga

maze 40.9
landscape 32.7
travel 30.3
sky 28.7
mountain 25.8
tourist 22.8
tourism 22.3
architecture 21.9
trees 20.5
private 20.3
park 18.1
hill 17.8
building 16.8
scenery 16.2
stone 16.1
outdoor 15.3
scenic 14.9
rock 14.8
old 13.9
track 13.9
mountains 13.9
ancient 13.8
summer 13.5
wall 12.8
person 12.7
city 12.5
tree 12.5
people 12.3
fortress 12.1
man 12.1
outdoors 11.9
grass 11.9
vacation 11.5
rural 11.5
engineer 11.2
landmark 10.8
history 10.7
hiking 10.6
valley 10.6
sun 10.5
forest 10.4
walking 10.4
sunny 10.3
clouds 10.1
road 9.9
tower 9.9
canyon 9.7
cloud 9.5
natural 9.4
desert 9.3
famous 9.3
countryside 9.1
sand 8.9
country 8.8
traveler 8.7
destination 8.4
field 8.4
rampart 8.2
national 8.2
water 8
ruin 7.8
hills 7.8
heritage 7.7
outside 7.7
culture 7.7
house 7.7
walk 7.6
temple 7.6
cityscape 7.6
south 7.5
active 7.2
holiday 7.2
river 7.1
male 7.1
palace 7.1
spring 7.1
day 7.1
sea 7
autumn 7

Google

farmworker 73.5
black and white 70.7
track 65.4
monochrome 61.7
rail transport 60.6
soil 59.2
laborer 58
vehicle 56.4
tree 50.8

Microsoft

sky 100
outdoor 99.9
person 93
group 75.9

Face analysis

Amazon

AWS Rekognition

Age 30-47
Gender Male, 50.3%
Angry 49.7%
Confused 49.6%
Disgusted 49.6%
Surprised 49.6%
Sad 49.6%
Calm 49.9%
Happy 49.6%

AWS Rekognition

Age 38-57
Gender Female, 53.4%
Confused 45.3%
Surprised 45.5%
Angry 45.4%
Sad 49.9%
Happy 45.4%
Calm 48.4%
Disgusted 45.2%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Surprised 49.5%
Disgusted 49.5%
Confused 49.5%
Sad 49.6%
Angry 49.6%
Happy 49.5%
Calm 50.3%

AWS Rekognition

Age 26-43
Gender Female, 51.5%
Sad 46.5%
Calm 46.2%
Confused 45.9%
Disgusted 46%
Surprised 45.9%
Happy 46.5%
Angry 48.1%

AWS Rekognition

Age 35-52
Gender Female, 51.5%
Surprised 45.1%
Happy 45.1%
Angry 45.2%
Sad 45.9%
Calm 53.7%
Confused 45%
Disgusted 45%
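
Each AWS Rekognition face block above reports one confidence per emotion, and the face's emotion is conventionally read off as the highest-scoring entry. A minimal sketch, with the scores transcribed from the first face block above (the helper name is illustrative):

```python
# Illustrative: pick the dominant emotion from a Rekognition-style score map.
# Scores (percent) transcribed from the first face block above.
face_1 = {
    "Angry": 49.7, "Confused": 49.6, "Disgusted": 49.6, "Surprised": 49.6,
    "Sad": 49.6, "Calm": 49.9, "Happy": 49.6,
}

def dominant_emotion(scores):
    # max over keys, compared by their confidence values
    return max(scores, key=scores.get)

print(dominant_emotion(face_1))  # → Calm
```

Note that the scores in this block are nearly uniform (all within 0.3 points), so the "Calm" call carries little signal on its own.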

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people standing on top of a dirt field 97.5%
a group of people standing in a field 96.4%
a group of people standing in a dirt field 96.3%