Human Generated Data

Title

Untitled (Omar, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.364

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

City 100
Road 100
Street 100
Urban 100
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Path 98.7
Sidewalk 98.7
Car 98
Transportation 98
Vehicle 98
Male 97.8
Person 97.8
Boy 97.8
Child 97.8
Adult 97.7
Male 97.7
Man 97.7
Person 97.7
Machine 97.5
Wheel 97.5
Person 97.2
Person 97
Person 96.4
Person 95.7
Person 95.7
Car 94
Wheel 93.1
Clothing 92.6
Coat 92.6
Motorcycle 90.8
Wheel 85.6
Wheel 85.5
Wheel 82.9
Car 79.9
Face 78.7
Head 78.7
Wheel 74.6
Motor 69.7
Outdoors 68.1
Spoke 57.3
License Plate 57
Sidecar 56.9
Hat 55.9
Car 55.8
Smoke 55.1

Clarifai
created on 2018-05-11

people 100
group together 99.7
vehicle 99.5
adult 99.5
transportation system 98.6
man 97.4
wear 96.5
many 96
group 95.9
road 95.4
two 95.2
street 94.7
military 93.6
administration 92
three 91.1
police 90
one 90
four 89.5
veil 89.3
war 88.8

Imagga
created on 2023-10-06

barrow 61.4
handcart 49.7
wheelchair 42.9
wheeled vehicle 42.5
vehicle 38
man 28.9
chair 28.3
people 24.5
seat 23.2
adult 20.7
conveyance 20.7
transportation 20.6
outdoors 18.9
person 18.7
male 17.9
street 17.5
city 15.8
men 14.6
outdoor 13.8
wheel 13.2
old 12.5
walking 12.3
help 12.1
furniture 11.9
snow 11.7
walk 11.4
urban 11.4
couple 11.3
travel 11.3
road 10.8
care 10.7
transport 10
active 9.9
disabled 9.9
black 9.8
work 9.5
bench 9.2
park 9.1
activity 9
landscape 8.9
life 8.8
lifestyle 8.7
cold 8.6
sitting 8.6
industry 8.5
stretcher 8.1
building 8
handicapped 7.9
disability 7.9
scene 7.8
portrait 7.8
winter 7.7
senior 7.5
equipment 7.4
holding 7.4
safety 7.4
water 7.3
pedestrian 7.3
looking 7.2
smile 7.1
love 7.1
worker 7.1
working 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.2
road 98
person 87.8
way 50.9
cart 50.1
pulling 48.1
sidewalk 25.4

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Male, 99.9%
Calm 98.2%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Happy 1.1%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 16-22
Gender Male, 99.9%
Calm 97.7%
Surprised 6.3%
Fear 5.9%
Sad 2.5%
Confused 0.3%
Angry 0.3%
Disgusted 0.2%
Happy 0.2%

AWS Rekognition

Age 2-8
Gender Male, 96.9%
Calm 32.9%
Fear 29.5%
Angry 14.1%
Confused 9.6%
Surprised 7.4%
Sad 4.7%
Disgusted 4%
Happy 3.1%

AWS Rekognition

Age 23-33
Gender Male, 69.4%
Calm 65.3%
Angry 18.6%
Surprised 8.2%
Disgusted 6.1%
Fear 6.1%
Sad 3.2%
Happy 1.7%
Confused 1.3%

AWS Rekognition

Age 16-22
Gender Female, 97.5%
Sad 100%
Surprised 6.3%
Fear 5.9%
Confused 0%
Calm 0%
Angry 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 2-8
Gender Male, 98.4%
Calm 52.4%
Happy 35.8%
Surprised 6.8%
Fear 6.3%
Sad 5.4%
Confused 1.1%
Angry 1.1%
Disgusted 0.9%

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Car 98%
Boy 97.8%
Child 97.8%
Wheel 97.5%
Hat 55.9%

Text analysis

Amazon

239
ESTAURANT
SERVICE
QUICK SERVICE
QUICK
-
DOMES

Google

STAURANT 239
STAURANT
239