Human Generated Data

Title

Untitled (threshing crew, Virgil Thaxton's farm, near Mechanicsburg, Ohio)

Date

July 1938–August 1938, printed later

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3531

Copyright

© President and Fellows of Harvard College


Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Sun Hat 99.5
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Face 98.1
Head 98.1
Photography 98.1
Portrait 98.1
Person 97.7
Car 95.8
Transportation 95.8
Vehicle 95.8
Hat 91.7
Antique Car 86.3
Model T 86.3
Machine 85.2
Wheel 85.2
Wheel 73.8
Cap 73.2
Baseball Cap 57.9
Tire 57.7
Spoke 56.5
Alloy Wheel 55.4
Car Wheel 55.4

Clarifai
created on 2018-05-10

people 99.9
man 99
lid 99
two 98.9
adult 98.9
vehicle 98.3
veil 98.1
one 97.8
group 96.4
wear 96
transportation system 95.9
three 95.5
portrait 95.3
woman 94
group together 92.8
recreation 90.2
four 88.9
actor 88.1
driver 87.8
military 86.9

Imagga
created on 2023-10-06

passenger 58.4
man 28.2
male 24.3
happy 22.6
car 22
people 21.8
adult 21.3
person 20.3
smiling 19.5
smile 15.7
outdoors 15.7
happiness 15.7
vehicle 15.5
child 15.1
couple 14.8
kin 14.6
portrait 14.2
men 12.9
love 11.8
face 11.4
seat 11.3
sitting 11.2
wheelchair 10.9
transportation 10.8
old 10.4
senior 10.3
chair 10.3
outside 10.3
emotion 10.1
fashion 9.8
automobile 9.6
auto 9.6
grandfather 9.5
wife 9.5
two 9.3
model t 9.2
fun 9
cheerful 8.9
together 8.8
driver 8.7
lifestyle 8.7
hat 8.6
pair 8.5
mature 8.4
hand 8.4
joy 8.4
leisure 8.3
road 8.1
motor vehicle 8.1
family 8
looking 8
building 7.9
black 7.8
mother 7.8
retired 7.8
travel 7.7
driving 7.7
retirement 7.7
married 7.7
house 7.5
one 7.5
close 7.4
inside 7.4
transport 7.3
home 7.2
father 7.2
interior 7.1
architecture 7
train 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99.4
person 98.9
old 63.1

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Happy 84.7%
Angry 12.1%
Surprised 6.7%
Fear 6.1%
Sad 2.2%
Confused 0.7%
Disgusted 0.3%
Calm 0.2%

AWS Rekognition

Age 37-45
Gender Male, 100%
Calm 96.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 1%
Confused 0.8%
Happy 0.5%
Disgusted 0.4%

Microsoft Cognitive Services

Age 15
Gender Male

Microsoft Cognitive Services

Age 46
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%
Car 95.8%
Hat 91.7%
Wheel 85.2%

Categories

Imagga

people portraits 84.9%
paintings art 12.3%
pets animals 1.7%

Captions