Human Generated Data

Title

Harvest hand and helper on the Virgil Thaxton farm near Mechanicsburg, Ohio

Date

1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3021

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Hat 99.7
Clothing 99.7
Apparel 99.7
Person 99.7
Human 99.7
Wheel 90.2
Machine 90.2
Worker 87.9
Helmet 75.2
Face 64.5
Hardhat 62.5
Pants 59.3
Sun Hat 58.7
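
The Amazon tags above pair a label name with a confidence score, in the form returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such a list could be retrieved with boto3 follows; the local file name, region, and confidence threshold are illustrative assumptions, not values taken from this record.

# Sketch: label tags via Amazon Rekognition DetectLabels (boto3).
# File name, region, and MinConfidence are placeholders for illustration.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_thaxton_farm.jpg", "rb") as f:  # hypothetical local copy of the photograph
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Print "Name Confidence" pairs, mirroring the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")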

Clarifai
created on 2023-10-15

people 99.8
lid 98.8
man 98.6
adult 98.4
gun 98.4
one 97.4
vintage 97.3
portrait 96.8
war 95.9
military 95.8
cowman 95.1
two 94.9
wear 93
weapon 92.8
soldier 91.8
retro 89.5
uniform 89.1
farming 87.1
cannon 86.4
three 85.8
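
The Clarifai tags are concept names with confidence scores from a general image-recognition model. The sketch below shows one way such concepts could be requested; the v2 predict endpoint, model ID, access token, and image URL are assumptions for illustration, not details recorded on this page.

# Sketch: concept tags via Clarifai's v2 predict endpoint (assumed setup).
import requests

CLARIFAI_PAT = "YOUR_PERSONAL_ACCESS_TOKEN"               # placeholder credential
MODEL_ID = "general-image-recognition"                    # assumed general model ID
IMAGE_URL = "https://example.org/shahn_thaxton_farm.jpg"  # hypothetical image URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_PAT}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Each concept carries a name and a 0-1 confidence value; scale to percentages.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")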

Imagga
created on 2021-12-15

jinrikisha 59.8
cart 55.8
wagon 38.1
man 37
wheelchair 35
outdoors 33.7
wheeled vehicle 28.4
chair 24.2
male 24.1
vehicle 22.6
people 20.6
hat 18.8
outside 18.8
adult 18.1
person 18
outdoor 17.6
happy 17.5
seat 16.9
senior 16.9
park 16.5
lifestyle 15.9
couple 15.7
wheel 15.1
car 14.6
carriage 13.8
happiness 13.3
men 12.9
summer 12.9
cowboy 12.3
vacation 12.3
transportation 11.6
portrait 11.6
smiling 11.6
sky 11.5
sitting 11.2
day 11
active 10.9
field 10.9
husband 10.5
one 10.4
horse 10.4
wife 10.4
help 10.2
smile 10
leisure 10
retired 9.7
together 9.6
furniture 9.5
old 9.1
grass 8.7
love 8.7
elderly 8.6
care 8.2
healthy 8.2
road 8.1
bicycle 8.1
working 8
handicapped 7.9
sport 7.9
disabled 7.9
standing 7.8
bike 7.8
riding 7.8
ride 7.8
travel 7.7
cowboy hat 7.7
industry 7.7
retirement 7.7
illness 7.6
relax 7.6
relationship 7.5
street 7.4
countryside 7.3
transport 7.3
clothing 7.2
season 7
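
The Imagga tags are also name/confidence pairs. A sketch of a request against Imagga's v2 tagging endpoint is shown below; the API credentials and image URL are placeholders, not values from this record.

# Sketch: tags via Imagga's v2 tagging endpoint (HTTP basic auth with key/secret).
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/shahn_thaxton_farm.jpg"  # hypothetical image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry holds a confidence score and a tag name keyed by language.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")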

Microsoft
created on 2021-12-15

outdoor 99.8
person 99.4
hat 99.2
clothing 98.1
fashion accessory 96.8
man 96.4
cowboy hat 94.3
text 93.7
sun hat 88.9
fedora 88.9
human face 79.6
standing 79.5
posing 58.8
old 58
smile 55.7
work-clothing 20.6
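
The Microsoft tags correspond to the Tags feature of the Azure Computer Vision image analysis API. The sketch below assumes the v3.2 REST "analyze" endpoint; the resource endpoint, key, and image URL are placeholders for illustration.

# Sketch: tags via the Azure Computer Vision v3.2 analyze endpoint (assumed setup).
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_CV_KEY"
IMAGE_URL = "https://example.org/shahn_thaxton_farm.jpg"  # hypothetical image URL

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

# Confidence values are 0-1; scale to percentages to match the list above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")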

Color Analysis

Face analysis

AWS Rekognition

Age 39-57
Gender Male, 99.8%
Calm 99.7%
Sad 0.1%
Angry 0.1%
Surprised 0%
Happy 0%
Confused 0%
Disgusted 0%
Fear 0%
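
The age range, gender, and per-emotion confidences above are the kind of attributes returned by Rekognition's DetectFaces operation when the full attribute set is requested. A minimal boto3 sketch follows; the file name is a placeholder.

# Sketch: face attributes (age range, gender, emotions) via Rekognition DetectFaces.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_thaxton_farm.jpg", "rb") as f:  # hypothetical local copy of the photograph
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request the full attribute set, including emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back with per-emotion confidence, as in the list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")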

Microsoft Cognitive Services

Age 58
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
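
Google Vision reports face attributes as likelihood levels ("Very unlikely" through "Very likely") rather than numeric scores. A minimal sketch with the Google Cloud Vision client library follows; the file name is a placeholder.

# Sketch: face detection with the Google Cloud Vision client library.
# Likelihoods are enum values (VERY_UNLIKELY ... VERY_LIKELY), matching the list above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_thaxton_farm.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)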

Feature analysis

Amazon

Hat 99.7%
Person 99.7%
Wheel 90.2%

Categories

Imagga

paintings art 98.6%

Captions