Human Generated Data

Title

Untitled (public auction, A.H. Buchwalter farm, near Hilliards, Ohio)

Date

August 6, 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.811

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Sun Hat 100
Adult 98
Male 98
Man 98
Person 98
Adult 97.9
Person 97.9
Female 97.9
Woman 97.9
Adult 96.8
Male 96.8
Man 96.8
Person 96.8
Adult 96.6
Male 96.6
Man 96.6
Person 96.6
Person 96.1
Baby 96.1
Person 95.8
Adult 93.4
Male 93.4
Man 93.4
Person 93.4
Adult 93
Male 93
Man 93
Person 93
Person 92.4
Hat 91.8
Person 90.5
Hat 90
Person 89.8
Adult 88.7
Person 88.7
Female 88.7
Woman 88.7
Hat 86.2
Face 86.2
Head 86.2
Hat 84
Person 83.2
Hat 82
Hat 65.6
Hat 65.5
Indoors 57.7
Cap 56.4
Cowboy Hat 55.4
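
The Amazon tags above pair each label with a confidence score on a 0–100 scale. As an illustration only, and not the museums' actual tagging pipeline, the following minimal Python sketch shows how comparable label/confidence pairs can be requested from AWS Rekognition with boto3; the file name, region, and thresholds are assumptions.

```python
# Illustrative sketch only: producing label/confidence pairs like those listed
# above with AWS Rekognition. The image file, region, and thresholds are
# assumptions, not part of the museum record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_buchwalter_auction.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap on the number of labels returned
    MinConfidence=55.0,  # roughly matches the lowest scores shown above
)

# Print "Label Confidence" lines in the same style as the record
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```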

Clarifai
created on 2018-05-11

people 100
group together 99.7
group 99.3
many 99
adult 98.5
man 97
several 94.8
veil 94.3
lid 94.3
wear 93.5
woman 93
outfit 91.7
watercraft 90.2
military 90
vehicle 89.8
recreation 88.8
transportation system 86.4
one 84.2
administration 83.8
chair 81.1

Imagga
created on 2023-10-06

table 33.7
dinner 31.7
restaurant 31
cup 26.5
meal 25.2
container 23.6
setting 22.2
food 21.8
plate 21.6
interior 19.5
banquet 19.4
drink 19.2
party 18.9
luxury 18
dining 17.1
coffee 16.8
reception 16.6
glass 16.3
china 16.2
lunch 16.1
celebration 15.9
wedding 15.6
eat 15.1
event 14.8
napkin 14.7
dish 14.2
service 13.9
breakfast 12.3
room 12.1
wine 12
spoon 12
vessel 11.8
catering 11.7
fork 11.7
black 11.4
elegant 11.1
chair 11
decoration 10.9
silverware 10.8
dine 10.8
tea 10.7
knife 10.6
formal 10.5
tableware 10.4
set 10.2
porcelain 10.2
utensil 10.2
man 10.1
kitchen 10
cups 9.8
business 9.7
drinking 9.6
hotel 9.5
tabletop 9.5
place 9.3
glasses 9.3
cuisine 8.9
decor 8.8
indoors 8.8
arrangement 8.7
serving 8.7
flowers 8.7
hand 8.4
bowl 8.3
furniture 8.2
person 8
close 8
male 7.9
beverage 7.9
hat 7.9
plates 7.9
tablecloth 7.8
nutriment 7.8
saucer 7.8
fancy 7.7
fine 7.6
clothing 7.4
inside 7.4
indoor 7.3
empty 7.3
people 7.2
silver 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.2
people 71.8
group 61.1

Face analysis

AWS Rekognition

Age 6-14
Gender Female, 97.6%
Sad 99.1%
Confused 17.4%
Calm 14.4%
Surprised 6.5%
Fear 6.1%
Happy 3.2%
Angry 2.5%
Disgusted 1.7%

AWS Rekognition

Age 28-38
Gender Male, 74.3%
Calm 95.6%
Surprised 7%
Fear 5.9%
Sad 2.4%
Confused 1.7%
Angry 0.2%
Happy 0.1%
Disgusted 0.1%
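
The age-range, gender, and emotion scores above resemble the output of AWS Rekognition's face detection. As a hedged illustration only (not the museums' pipeline), the sketch below shows how such per-face attributes can be retrieved with boto3; the file name and region are assumptions.

```python
# Illustrative sketch only: retrieving age range, gender, and emotion scores
# similar to those listed above. File name and region are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_buchwalter_auction.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions per face
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions sorted from most to least confident, as in the record
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```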

Microsoft Cognitive Services

Age 46
Gender Male

Feature analysis

Amazon

Adult 98%
Male 98%
Man 98%
Person 98%
Female 97.9%
Woman 97.9%
Baby 96.1%
Hat 91.8%

Categories

Imagga

paintings art 71.5%
food drinks 15.1%
people portraits 10.4%