Human Generated Data

Title

Untitled (Mrs. Yocum and Mr. Reed, Jere, Scotts Run, West Virginia)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1255

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Person 99.1
Formal Wear 98.8
Suit 98.8
Alloy Wheel 98.2
Car Wheel 98.2
Machine 98.2
Spoke 98.2
Tire 98.2
Transportation 98.2
Vehicle 98.2
Coat 97.9
Wheel 97.1
Face 96.4
Head 96.4
Photography 96.4
Portrait 96.4
Jacket 94.6
Wheel 81.8
Motorcycle 76.6
Accessories 73.2
Tie 73.2
Antique Car 71.8
Model T 71.8
Overcoat 67.1
Outdoors 63.2
Blazer 61.7
Car 59.3
Glasses 58.7
City 55.7
Road 55.7
Street 55.7
Urban 55.7
Electrical Device 55.7
Microphone 55.7
Smoke 55.2

Clarifai
created on 2018-05-11

people 100
adult 99.6
group together 98.4
vehicle 97.9
man 97.5
group 96.9
two 96.7
leader 94.7
administration 94.1
transportation system 93.3
three 92.7
military 92.6
four 89.4
child 88.8
several 88.7
woman 87.8
war 87.7
wear 87.2
outfit 83.9
five 83.4

Imagga
created on 2023-10-06

trombone 54.8
brass 47.1
wind instrument 41.4
man 29.6
musical instrument 28.9
male 22
people 18.4
outdoors 18
person 17
couple 13.9
outdoor 13.8
city 13.3
park 13.2
adult 13
sax 12.8
horse 11.4
military 10.6
statue 10.6
old 10.4
bench 10.4
building 10.3
summer 10.3
gun 10
bassoon 9.9
sport 9.9
vacation 9.8
together 9.6
love 9.5
world 9.4
outside 9.4
happy 9.4
two 9.3
protection 9.1
danger 9.1
fun 9
soldier 8.8
destruction 8.8
urban 8.7
grass 8.7
men 8.6
child 8.5
travel 8.4
sky 8.3
grandfather 8.2
playing 8.2
transportation 8.1
pedestrian 8.1
weapon 8
clothing 7.9
day 7.8
architecture 7.8
portrait 7.8
industry 7.7
mask 7.7
cart 7.6
stone 7.6
walking 7.6
rifle 7.5
tourism 7.4
street 7.4
industrial 7.3
landmark 7.2
suit 7.2
history 7.2
family 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 99.9
person 99.8
man 93.4
old 90.3
white 61
older 20.1

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 100%
Calm 62.5%
Sad 16.1%
Confused 13.9%
Surprised 7.1%
Fear 6.1%
Happy 3%
Disgusted 1.8%
Angry 1.3%

AWS Rekognition

Age 60-70
Gender Male, 99.7%
Calm 85.7%
Surprised 7.4%
Fear 6.2%
Confused 6%
Angry 2.8%
Sad 2.8%
Disgusted 0.5%
Happy 0.3%

Microsoft Cognitive Services

Age 32
Gender Male

Microsoft Cognitive Services

Age 54
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Coat 97.9%
Wheel 97.1%
Tie 73.2%
Car 59.3%
Glasses 58.7%