Human Generated Data

Title

Farmers at public auction, central Ohio

Date

1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3040

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Clothing 100
Apparel 100
Person 99.8
Human 99.8
Hat 99.8
Hat 99.6
Person 98.8
Person 98.5
Sun Hat 94.8
Cowboy Hat 72.7

Imagga
created on 2021-12-15

hat 100
cowboy hat 98.4
headdress 59.6
clothing 50.3
man 47.7
male 39.8
person 32.2
people 29.6
old-timer 25.9
covering 24.8
consumer goods 24.6
portrait 24.6
senior 22.5
men 21.5
happy 19.4
face 19.2
hand 19
old 18.1
adult 18.1
two 17.8
uniform 17.5
guy 16.5
work 16.5
cowboy 15.9
smile 15.7
couple 15.7
worker 15.3
shirt 14.9
grandfather 13.9
outdoors 13.4
elderly 13.4
look 13.1
mature 13
military uniform 12.9
smiling 12.3
together 12.3
looking 12
one 11.9
love 11.8
professional 11.8
western 11.6
job 11.5
style 11.1
hair 11.1
expression 11.1
occupation 11
beard 10.8
handsome 10.7
fashion 10.6
husband 10.5
wife 10.4
black 10.2
happiness 10.2
emotion 10.1
lifestyle 10.1
attractive 9.8
family 9.8
serious 9.5
pair 9.4
model 9.3
casual 9.3
cowgirl 8.9
building 8.7
affectionate 8.7
married 8.6
loving 8.6
wearing 8.6
close 8.6
horse 8.5
business 8.5
relationship 8.4
engineer 8.3
human 8.2
aged 8.1
interior 8
medical 7.9
standing 7.8
labor 7.8
hug 7.7
affection 7.7
helmet 7.7
outside 7.7
industry 7.7
head 7.6
meeting 7.5
leisure 7.5
vintage 7.4
retro 7.4
friendly 7.3
cheerful 7.3
indoor 7.3
room 7.3
pose 7.3
romance 7.1
posing 7.1
businessman 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

fashion accessory 99.7
fedora 98.5
person 98.4
man 98.2
cowboy hat 97.6
sun hat 96.3
clothing 95.9
text 94.2
human face 93.6
hat 90.5
outdoor 90
old 89.8
white 69.1

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 94.3%
Fear 40.1%
Calm 19.6%
Sad 15.3%
Confused 12.9%
Disgusted 7.5%
Surprised 3.2%
Angry 1.1%
Happy 0.3%

AWS Rekognition

Age 52-70
Gender Male, 99.1%
Calm 94.3%
Sad 3.1%
Confused 1.1%
Happy 1%
Disgusted 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 10-20
Gender Female, 53.2%
Calm 65.3%
Angry 22.7%
Sad 7.2%
Happy 1.4%
Confused 1.2%
Disgusted 1%
Surprised 0.8%
Fear 0.6%

Microsoft Cognitive Services

Age 58
Gender Male

Microsoft Cognitive Services

Age 35
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Hat 99.8%

Captions

Microsoft

a man wearing a hat 94.4%
an old photo of a man wearing a hat 93.6%
an old photo of a man in a hat 93.5%