Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2087

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Hat 100
Architecture 99.6
Building 99.6
Outdoors 99.6
Shelter 99.6
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Sun Hat 99
Person 98.8
Adult 98
Male 98
Man 98
Person 98
Photography 96
Nature 95.3
Face 93
Head 93
Countryside 89
Hut 89
Rural 89
Portrait 88.3
Shorts 61.2
Bag 60.4
Car 59.8
Transportation 59.8
Vehicle 59.8
Cap 57
Animal 56.7
Mammal 56.7
Bull 55.9
Shirt 55.9
Shack 55.8
Baseball Cap 55.6
Coat 55

Clarifai
created on 2018-05-10

people 99.9
adult 98.3
man 96.7
one 96
two 95.6
group 93
woman 91.1
wear 89.3
group together 89.2
actor 84.9
transportation system 84.6
administration 83.9
leader 80.8
vehicle 80
three 77.9
home 76.9
recreation 76.1
furniture 74.8
lid 74.1
four 72.3

Imagga
created on 2023-10-06

shop 22.3
old 21.6
sky 20.4
architecture 20
building 19.1
structure 19
bakery 18.9
mercantile establishment 17.3
travel 15.5
rural 15
winter 14.5
house 14.3
city 14.1
landscape 13.4
trees 12.4
scene 12.1
outdoors 11.9
place of business 11.5
cold 11.2
exterior 11.1
snow 10.8
tree 10.8
tourism 10.7
park 10.4
musical instrument 10
outdoor 9.9
vacation 9.8
country 9.7
door 9.5
flag 9.3
scenic 8.8
roof 8.7
wall 8.7
ancient 8.6
holiday 8.6
wood 8.3
vintage 8.3
lake 8.2
water 8
forest 7.8
brick 7.8
stone 7.6
clouds 7.6
sign 7.5
billboard 7.4
home 7.2
religion 7.2
summer 7.1
day 7.1
wooden 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

tree 99.6
outdoor 98.8
sign 16.4

Face analysis

AWS Rekognition

Age 13-21
Gender Male, 59.8%
Sad 99.5%
Calm 33.7%
Surprised 6.3%
Fear 5.9%
Confused 1.8%
Angry 0.4%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 42-50
Gender Male, 100%
Calm 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 9-17
Gender Female, 99.3%
Sad 53.5%
Calm 30.9%
Fear 15.1%
Disgusted 11.8%
Surprised 7.2%
Angry 5.9%
Happy 3.4%
Confused 2.9%

Microsoft Cognitive Services

Age 9
Gender Female

Microsoft Cognitive Services

Age 8
Gender Male

Microsoft Cognitive Services

Age 41
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
Car 59.8%

Text analysis

Amazon

HORNS
RAMO
TAILS
4° HORNS
RAMO ALIVES
ALIVES
5&10

Google

HORNS TAILS 5 & 10
HORNS
TAILS
5
&
10