Human Generated Data

Title

Untitled (Omar, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1655

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 99.9
Coat 99.9
Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Male 99
Person 99
Boy 99
Child 99
Machine 97.6
Wheel 97.6
Adult 97.1
Male 97.1
Man 97.1
Person 97.1
Person 96.2
Spoke 95.2
Person 95
Hat 92.2
Transportation 90
Vehicle 90
Alloy Wheel 89.2
Car Wheel 89.2
Tire 89.2
Wheel 88.2
Car 85.4
City 83.7
Road 83.7
Street 83.7
Urban 83.7
Car 82.9
Motorcycle 81.5
Face 76.5
Head 76.5
Wheel 76.4
Motor 74.9
Person 71.9
Smoke 57.9
Overcoat 57
Sidecar 56.3
Outdoors 56.3
Photography 56.1
Portrait 56.1
Helmet 55.8
Hardhat 55.6
Formal Wear 55.4
Suit 55.4

Clarifai
created on 2018-05-11

people 100
group together 99.6
adult 99.5
vehicle 99.4
group 98.9
man 98.8
transportation system 98.6
military 97.1
two 96
uniform 95.2
soldier 95
administration 94.3
outfit 94.3
four 94.1
war 93.7
three 93.4
police 92.8
street 92.4
driver 91.8
woman 91.4

Imagga
created on 2023-10-06

wheelchair 46.2
barrow 34
vehicle 33.1
wheeled vehicle 30.1
chair 29.5
seat 27.3
snow 26.6
handcart 26.6
man 24.2
outdoors 22.5
winter 22.1
carriage 21.1
cart 20.7
people 19.5
cold 18.9
old 16.7
wagon 15.9
horse cart 15.8
furniture 15.4
conveyance 14.7
outdoor 14.5
male 14.2
park 14
wheel 13.2
tree 13.1
bench 12.9
person 12.6
transportation 12.5
adult 12.3
city 11.6
rural 11.5
men 11.2
outside 11.1
street 11
machine 10.9
road 10.8
frozen 10.5
walk 10.5
walking 10.4
building 10.4
season 10.1
family 9.8
snowy 9.7
landscape 9.7
senior 9.4
help 9.3
care 9.1
activity 9
sky 8.9
cannon 8.5
mother 8.4
countryside 8.2
happy 8.1
furnishing 8
lifestyle 7.9
disabled 7.9
day 7.8
work 7.8
couple 7.8
smile 7.8
portrait 7.8
travel 7.7
sitting 7.7
storm 7.7
horse 7.6
equipment 7.5
human 7.5
sport 7.4
historic 7.3
speed 7.3
fall 7.2
active 7.2
grass 7.1
trees 7.1
gun 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.5

Face analysis

AWS Rekognition

Age 4-12
Gender Male, 99.9%
Calm 62.2%
Sad 15.1%
Angry 14.3%
Fear 6.5%
Surprised 6.4%
Confused 4.4%
Happy 1.4%
Disgusted 1%

AWS Rekognition

Age 30-40
Gender Male, 99.9%
Calm 92.6%
Surprised 6.4%
Fear 5.9%
Sad 5%
Angry 0.5%
Confused 0.2%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 22-30
Gender Female, 58.3%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 0.8%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.2%
Male 99.2%
Man 99.2%
Person 99.2%
Boy 99%
Child 99%
Wheel 97.6%
Car 85.4%
Motorcycle 81.5%

Text analysis

Amazon

239
QUICK
QUICK SERVICE
ESTAURANT
SERVICE
Ess
VAIOS
HERCANTHE

Google

STAURAN
STAURAN