Human Generated Data

Title

Untitled (central Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.582

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Person 98.8
Person 97.9
Person 97.9
Outdoors 97.3
Person 96.7
Machine 96.6
Wheel 96.6
Car 96
Transportation 96
Vehicle 96
Wheel 95.6
Antique Car 93.2
Model T 93.2
Spoke 92.3
Person 88.8
Person 88.6
Person 87.9
Animal 86
Horse 86
Mammal 86
Person 85.7
Person 85
Person 84.4
Car 78.7
Car 78.5
Wheel 77.3
Car 75.9
Person 74.8
Person 73.3
Wheel 72.9
Person 72.2
Nature 71.4
Play Area 68.4
Person 67
Car 65.2
Car 56.6
People 56.5
Car 56.4
Outdoor Play Area 55.5
Grass 55.5
Plant 55.5
Utility Pole 55.5
Tree 55.1
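
The label/confidence pairs above match the shape of output from Amazon Rekognition's DetectLabels operation. The following is a minimal sketch of how similar tags could be generated with boto3; the image file name and confidence threshold are illustrative assumptions, not the actual pipeline behind this record.

```python
# Sketch: produce (label, confidence) tags like the Amazon list above using
# AWS Rekognition's DetectLabels API. Image path and threshold are assumptions.
import boto3

def detect_labels(image_path: str, min_confidence: float = 55.0) -> list[tuple[str, float]]:
    """Return (label, confidence) pairs for a local image file."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a Name, an overall Confidence (0-100), and an Instances
    # list of per-detection bounding boxes; repeated rows above (Person, Wheel,
    # Car) likely correspond to those individual instances.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("untitled_central_ohio.jpg"):
        print(f"{name} {confidence:.1f}")
```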

Clarifai
created on 2018-05-11

people 100
group together 99.7
group 98.9
adult 97.9
man 97.8
many 97.8
vehicle 97.1
military 92.6
soldier 91.9
war 90.9
three 88.3
several 88.2
transportation system 86.1
monochrome 86
two 85.7
child 84.9
recreation 83.5
administration 82.8
cavalry 80.1
campsite 79.8

Imagga
created on 2023-10-06

swing 49.4
mechanical device 37.2
resort area 36.4
plaything 36
area 29
mechanism 27.7
tool 26.9
region 23.7
plow 22.3
outdoor 18.3
chairlift 18.2
grass 17.4
sky 17.2
field 15.9
location 15.4
park 15
landscape 14.9
ski tow 14.5
conveyance 13.8
old 13.2
summer 12.9
outdoors 12.7
rural 12.3
day 11.8
man 11.4
fun 11.2
industry 11.1
work 11.1
tree 10.8
vehicle 10.4
building 10.3
leisure 10
sunset 9.9
farm 9.8
disaster 9.8
country 9.7
play 9.5
cloud 9.5
outside 9.4
travel 9.2
chair 9.2
countryside 9.1
environment 9
meadow 9
mountain 8.9
sun 8.9
destruction 8.8
agriculture 8.8
structure 8.7
construction 8.6
wheeled vehicle 8.3
transport 8.2
industrial 8.2
machine 8.1
tractor 8.1
lawn mower 8.1
light 8
trees 8
sand 7.9
spring 7.9
playground 7.9
forest 7.8
black 7.8
male 7.8
adult 7.8
sunny 7.7
empty 7.7
sport 7.7
winter 7.7
clouds 7.6
beach 7.6
equipment 7.6
wheel 7.5
vacation 7.4
danger 7.3
people 7.3
transportation 7.2
game 7.1
river 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

grass 98.3
outdoor 98
old 77.8
group 66.7
several 10.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 89.1%
Calm 97.5%
Surprised 6.4%
Fear 5.9%
Sad 2.5%
Angry 0.4%
Confused 0.3%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 21-29
Gender Female, 81.4%
Happy 65.9%
Calm 23.7%
Surprised 6.7%
Fear 6%
Sad 6%
Angry 0.8%
Disgusted 0.7%
Confused 0.3%

AWS Rekognition

Age 18-26
Gender Male, 80.2%
Sad 99.8%
Calm 16.2%
Confused 12.1%
Surprised 6.4%
Fear 6%
Angry 1.2%
Disgusted 0.4%
Happy 0.3%

AWS Rekognition

Age 23-33
Gender Male, 99.3%
Calm 42.3%
Happy 37.6%
Sad 12.6%
Surprised 6.8%
Fear 6.2%
Angry 2.2%
Disgusted 1.8%
Confused 1%
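
These per-face results (age range, gender, and independently scored emotions) match the shape of Amazon Rekognition's DetectFaces output when all facial attributes are requested. A minimal sketch with boto3 follows; the image file name is an illustrative assumption, not the museum's actual workflow.

```python
# Sketch: per-face age range, gender, and emotion scores like those above,
# via AWS Rekognition's DetectFaces API. Image path is an assumption.
import boto3

def analyze_faces(image_path: str) -> None:
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotion confidences are scored independently, so they need not sum
        # to 100% (as in the Sad 99.8% / Calm 16.2% face above).
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

if __name__ == "__main__":
    analyze_faces("untitled_central_ohio.jpg")
```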

Feature analysis

Amazon

Person 98.8%
Wheel 96.6%
Car 96%
Horse 86%

Categories