Human Generated Data

Title

Untitled (threshing crew, Virgil Thaxton's farm, near Mechanicsburg, Ohio)

Date

July 1938-August 1938, printed later

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3310

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label, confidence 0-100)

Amazon
created on 2023-10-06

Clothing 99.9
Face 99.3
Head 99.3
Photography 99.3
Portrait 99.3
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Driving 85.1
Transportation 85.1
Vehicle 85.1
Hat 83.4
Car 83.1
Sun Hat 62
Sitting 57.4
Antique Car 57
Model T 57
Firearm 56.8
Weapon 56.8
Outdoors 56.7
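
The labels above are confidence-scored tags of the kind returned by the AWS Rekognition DetectLabels API. A minimal sketch of how comparable tags could be requested with boto3 follows; the file name, label limit, and confidence threshold are assumptions, not part of this record.

import boto3

rekognition = boto3.client("rekognition")

# Hypothetical local copy of the photograph; not a path from this record.
with open("threshing_crew.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # assumed cap; the list above shows about 20 labels
    MinConfidence=50.0,  # assumed threshold; all scores above are >= 56.7
)

# Print "Label confidence" pairs in the same layout as the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')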

Clarifai
created on 2018-05-10

people 99.9
vehicle 99.7
one 99.6
transportation system 99.1
adult 98.8
man 98.5
two 97.3
lid 97.3
watercraft 96
veil 94.6
administration 93.5
military 92.5
portrait 92.4
train 92.3
wear 92.3
railway 91.7
war 91.6
child 91.5
elderly 90.8
woman 90.3

Imagga
created on 2023-10-06

car 34.1
toaster 31.2
kitchen appliance 28.2
vehicle 25.4
home appliance 23.4
automobile 19.1
driver 18.4
appliance 16.3
drive 16.1
window 16
happy 15.6
auto 15.3
person 14.5
man 14.1
smiling 13.7
smile 13.5
driving 13.5
male 13.5
transportation 13.4
old 13.2
sitting 12.9
case 12.6
furniture 12.3
screen 12
people 11.7
windowsill 11.5
passenger 11.4
travel 11.3
joy 10.8
black 10.8
adult 10.3
happiness 10.2
transport 10
road 9.9
portrait 9.7
outdoors 9.7
sill 9.6
looking 9.6
fun 9
china cabinet 8.9
new 8.9
home 8.8
cabinet 8.7
love 8.7
wood 8.3
one 8.2
business 7.9
furnishing 7.9
motor 7.7
pretty 7.7
protective covering 7.7
bride 7.7
device 7.6
building 7.6
frame 7.6
durables 7.6
sit 7.6
communication 7.5
child 7.5
trip 7.5
house 7.5
vintage 7.4
structural member 7.4
inside 7.4
cheerful 7.3
window screen 7.3
metal 7.2
hat 7.1
worker 7.1
work 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

man 93.4
window 92.2
outdoor 91.4
bus 90.5
old 88.6
black 79.2
white 69.6
vintage 25.1

Color Analysis

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 100%
Calm 97.5%
Surprised 6.3%
Fear 5.9%
Sad 2.3%
Happy 1%
Confused 0.3%
Angry 0.3%
Disgusted 0.3%
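
The age range, gender, and emotion percentages above correspond to the face attributes returned by Rekognition's DetectFaces call. A hedged sketch, assuming the same hypothetical local image file and configured AWS credentials:

import boto3

rekognition = boto3.client("rekognition")

with open("threshing_crew.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')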

Microsoft Cognitive Services

Age 43
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
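
The likelihood ratings above match the face annotation fields returned by the Google Cloud Vision API. A minimal sketch, assuming the google-cloud-vision client library is installed and application credentials are configured; the file name is again hypothetical.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("threshing_crew.jpg", "rb") as f:  # hypothetical file name
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Each detected face carries likelihood ratings such as VERY_UNLIKELY or UNLIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)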

Feature analysis

Amazon

Person 99.2%
Adult 99.2%
Male 99.2%
Man 99.2%
Hat 83.4%
Car 83.1%