Human Generated Data

Title

Untitled (Maynardville, Tennessee)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1439

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

License Plate 99.9
Transportation 99.9
Vehicle 99.9
Clothing 99.6
Coat 99.6
Adult 98.8
Female 98.8
Person 98.8
Woman 98.8
Adult 98.2
Person 98.2
Male 98.2
Man 98.2
Adult 97.5
Person 97.5
Male 97.5
Man 97.5
Adult 97.2
Female 97.2
Person 97.2
Woman 97.2
Adult 96.9
Adult 96.9
Female 96.9
Female 96.9
Person 96.9
Woman 96.9
Bride 96.9
Wedding 96.9
Person 96.4
People 94.4
Alloy Wheel 93.6
Car Wheel 93.6
Machine 93.6
Spoke 93.6
Tire 93.6
Adult 92.2
Person 92.2
Male 92.2
Man 92.2
Car 91
Person 89.4
Adult 89
Person 89
Male 89
Man 89
Person 86
Wheel 83.2
Wheel 78.9
Face 72.7
Head 72.7
City 71.3
Footwear 66.9
Shoe 66.9
Hat 60
Shoe 57.9
Road 57.8
Overcoat 57.5
Street 57.2
Urban 57.2
Antique Car 56.6
Path 55.7
Sidewalk 55.7
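
Labels like the list above are the kind of output returned by Amazon Rekognition's DetectLabels API; repeated labels at the same confidence (for example, the two Adult 96.9 entries) plausibly correspond to multiple detected instances of the same label. A minimal sketch of such a call with boto3 follows, assuming configured AWS credentials and a hypothetical local copy of the photograph named maynardville.jpg:

    import boto3

    rekognition = boto3.client("rekognition")

    # Send the image bytes directly; an S3 object reference would also work.
    with open("maynardville.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the list above bottoms out around 55.7
        )

    # Print name/confidence pairs in the same style as the tag list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")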

Clarifai
created on 2018-05-11

people 99.9
group together 99.4
vehicle 98.7
war 98
adult 97.2
military 96.8
group 96.4
transportation system 95.7
man 95.3
soldier 95.1
administration 94.8
two 93.5
many 90.5
child 90.4
police 87.9
uniform 87.5
weapon 86.9
interaction 86.9
one 85.9
several 85.2
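
These concepts date from 2018 and match the output shape of Clarifai's general model under the legacy v2 REST client of that era (the clarifai Python package, since superseded by a gRPC client). A hedged sketch, assuming a valid API key and the same hypothetical maynardville.jpg:

    from clarifai.rest import ClarifaiApp

    app = ClarifaiApp(api_key="YOUR_API_KEY")

    # The general model returns roughly 20 broad concepts per image.
    model = app.public_models.general_model
    response = model.predict_by_filename("maynardville.jpg")

    # Concept values are 0-1 floats; scale to match the scores above.
    for concept in response["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")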

Imagga
created on 2023-10-06

car 31.9
snow 25.6
vehicle 25.5
transportation 22.4
stretcher 19.5
road 18.1
litter 16.6
conveyance 16.2
danger 14.5
automobile 14.4
drive 14.2
truck 14
transport 13.7
wheeled vehicle 13.4
auto 13.4
old 13.2
seat 12.9
black 12.6
weather 12.3
travel 12
bench 11.9
industrial 11.8
chair 11.8
building 11.1
winter 11.1
trees 10.7
driving 10.6
work 10.2
city 10
sand 9.9
outdoors 9.7
sky 9.6
shopping cart 9.5
track 9.2
tree 9.2
safety 9.2
wood 9.2
accident 8.8
urban 8.7
man 8.7
motor vehicle 8.7
scene 8.7
cold 8.6
industry 8.5
tool 8.4
people 8.4
handcart 8.3
street 8.3
vintage 8.3
park 8.2
landscape 8.2
metal 8
container 8
machine 8
cars 7.8
architecture 7.8
parking meter 7.7
dangerous 7.6
sign 7.5
environment 7.4
speed 7.3
structure 7.2
history 7.2
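
Imagga exposes its tagger as a plain REST endpoint (v2 /tags) authenticated with an API key/secret pair over HTTP basic auth. A sketch with requests, where the key, secret, and image URL are placeholders:

    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/maynardville.jpg"},
        auth=("API_KEY", "API_SECRET"),  # Imagga uses basic auth
    )
    response.raise_for_status()

    # Each entry carries a confidence score and localized tag names.
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")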

Microsoft
created on 2018-05-11

outdoor 99.6
grass 96.6
person 87.5
people 83.4
group 71.1

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 100%
Calm 95.7%
Surprised 7.8%
Fear 5.9%
Sad 2.2%
Confused 0.5%
Angry 0.3%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 23-31
Gender Male, 99.6%
Confused 58.8%
Calm 33.9%
Surprised 6.5%
Fear 5.9%
Sad 4.9%
Disgusted 0.4%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 19-27
Gender Male, 99.3%
Calm 96.9%
Surprised 6.4%
Fear 5.9%
Sad 2.6%
Confused 0.7%
Angry 0.3%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 16-22
Gender Male, 99.8%
Happy 90.9%
Surprised 7.2%
Fear 6%
Calm 3.1%
Sad 2.3%
Disgusted 1.4%
Confused 1.2%
Angry 0.9%

AWS Rekognition

Age 41-49
Gender Male, 100%
Calm 80.6%
Surprised 9.6%
Sad 6.4%
Fear 6.3%
Happy 2.5%
Angry 0.9%
Confused 0.9%
Disgusted 0.6%

AWS Rekognition

Age 22-30
Gender Male, 76.6%
Calm 59.9%
Disgusted 13.8%
Sad 7.9%
Surprised 7.8%
Fear 7.6%
Happy 4.6%
Confused 3.6%
Angry 1.5%

AWS Rekognition

Age 27-37
Gender Male, 98.4%
Sad 99.7%
Calm 26.6%
Surprised 6.7%
Fear 6%
Angry 1.8%
Happy 1.3%
Disgusted 0.5%
Confused 0.3%

AWS Rekognition

Age 45-53
Gender Female, 78.8%
Happy 71.5%
Calm 25.6%
Surprised 6.7%
Fear 6.1%
Sad 2.4%
Disgusted 0.6%
Angry 0.2%
Confused 0.1%
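
The per-face blocks above (an age range, a gender estimate, and an emotion distribution for each detected face) match the shape of Amazon Rekognition's DetectFaces response when all facial attributes are requested. A minimal boto3 sketch under the same assumptions as before:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("maynardville.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    # One FaceDetail per detected face, mirroring the blocks above.
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

Note that the emotion scores are independent confidences rather than a normalized distribution, which is why one face above can score Sad 99.7% and Calm 26.6% at the same time.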

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 10
Gender Female

Feature analysis

Amazon

Adult 98.8%
Female 98.8%
Person 98.8%
Woman 98.8%
Male 98.2%
Man 98.2%
Bride 96.9%
Car 91%
Wheel 83.2%
Shoe 66.9%
Hat 60%

Text analysis

Amazon

1700
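
The single detected string ("1700", possibly the license plate flagged in the tags above) is the kind of result Amazon Rekognition's DetectText returns. A final sketch, same assumptions as the earlier ones:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("maynardville.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE detections aggregate the WORD detections beneath them.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])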