Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.420

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 99.9
Hat 99.9
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
Photography 98.2
Person 98
Outdoors 97.2
Architecture 96.3
Building 96.3
Shelter 96.3
Shorts 96
Nature 93.5
Face 93.2
Head 93.2
Sun Hat 92.6
Portrait 85.7
Car 81.9
Transportation 81.9
Vehicle 81.9
Countryside 70.6
Hut 70.6
Rural 70.6
Animal 69.5
Mammal 69.5
Bull 57.7
Saddle 56.8
Shirt 56.7
Cap 56
Bag 55.5
Shack 55.1
Pants 55
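
For context, the label/score pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation. A minimal sketch using boto3 (the S3 bucket, object key, and region are hypothetical placeholders, not values taken from this record):

import boto3

# Rekognition client; the region is an assumption for illustration only.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn_county_fair.jpg"}},
    MaxLabels=50,
    MinConfidence=55,  # the lowest scores listed above sit in the mid-50s
)

# Print each label with its confidence score, e.g. "Clothing 99.9"
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")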

Clarifai
created on 2018-05-11

people 99.9
adult 98
man 96.7
one 94.6
two 92.9
group 91
woman 89.5
wear 86
group together 85.9
monochrome 84.3
actor 82.4
administration 80.7
leader 79.9
family 79.3
recreation 78.6
three 75.9
transportation system 75.3
child 73.7
four 73.6
home 73.2

Imagga
created on 2023-10-06

sky 24.2
landscape 21.6
billboard 19.7
structure 19.7
flag 17.9
outdoors 15.7
signboard 15.4
tree 15.4
architecture 15
travel 14.8
shop 14.7
building 14.3
rural 14.1
old 13.9
winter 13.6
emblem 13.6
mountain 13.3
trees 13.3
city 13.3
cold 12.9
snow 12.7
outdoor 12.2
lake 11.9
mercantile establishment 11.5
park 11.5
clouds 11
summer 10.9
newspaper 10.7
water 10.7
scenic 10.5
grass 10.3
countryside 10
road 9.9
tourism 9.9
product 9.4
vacation 9
stall 8.9
bakery 8.9
country 8.8
forest 8.7
natural 8.7
scene 8.7
house 8.4
creation 8.2
sun 8
river 8
world 7.7
outside 7.7
daily 7.7
quiet 7.7
walk 7.6
field 7.5
sign 7.5
hill 7.5
place of business 7.5
environment 7.4
street 7.4
color 7.2
horizon 7.2
day 7.1
season 7

Microsoft
created on 2018-05-11

tree 99.8
outdoor 99.5
text 85.7
sign 24.7

Face analysis

AWS Rekognition

Age 10-18
Gender Female, 83.6%
Sad 94.3%
Calm 46.3%
Surprised 6.3%
Confused 6.3%
Fear 5.9%
Angry 0.5%
Happy 0.2%
Disgusted 0.2%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Calm 98.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.4%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 11-19
Gender Female, 98.9%
Sad 45.5%
Fear 25%
Calm 22.4%
Surprised 9.4%
Disgusted 7.8%
Happy 5.8%
Angry 4.7%
Confused 3.8%
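
The age range, gender, and emotion percentages in the AWS Rekognition entries above are the kind of output returned by the DetectFaces operation when all attributes are requested. A minimal sketch using boto3 (the file name and region are hypothetical):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Read the photograph from a local file; the path is a placeholder.
with open("shahn_county_fair.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")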

Microsoft Cognitive Services

Age 10
Gender Male

Microsoft Cognitive Services

Age 12
Gender Female

Microsoft Cognitive Services

Age 33
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
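
The Google Vision rows report face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision Python client (the file path is hypothetical):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Load the photograph from a local file; the path is a placeholder.
with open("shahn_county_fair.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or VERY_LIKELY.
    attributes = {
        "Surprise": face.surprise_likelihood,
        "Anger": face.anger_likelihood,
        "Sorrow": face.sorrow_likelihood,
        "Joy": face.joy_likelihood,
        "Headwear": face.headwear_likelihood,
        "Blurred": face.blurred_likelihood,
    }
    for name, likelihood in attributes.items():
        print(name, vision.Likelihood(likelihood).name.replace("_", " ").capitalize())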

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
Car 81.9%

Text analysis

Amazon

HORNS
TAILS
RAMO
4° HORNS
RAMO ALIVES
5 & 10
ALIVES
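
The strings above are OCR output of the kind produced by Rekognition's DetectText operation, which returns both full lines and individual words. A minimal sketch using boto3 (the bucket, key, and region are hypothetical):

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn_county_fair.jpg"}}
)

# Rekognition returns both LINE and WORD detections; print each detected string.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])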

Google

RAMO ALIVE 4 HORNS TAILS
RAMO
ALIVE
4
HORNS
TAILS