Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.397

Copyright

© President and Fellows of Harvard College


Machine Generated Data

Tags

Amazon
created on 2023-10-06

Boy 99.3
Child 99.3
Male 99.3
Person 99.3
Male 98.5
Person 98.5
Adult 98.5
Man 98.5
Male 97
Person 97
Adult 97
Man 97
Face 96.5
Head 96.5
Person 93
Machine 92.5
Wheel 92.5
Animal 90.9
Horse 90.9
Mammal 90.9
Wheel 87
Advertisement 80.7
Transportation 77.7
Vehicle 77.7
Furniture 75.9
Clothing 74.9
Hat 74.9
Photography 68
Portrait 68
Wagon 66.6
Bicycle 60.5
Accessories 57.2
Formal Wear 57.2
Tie 57.2
Horse Cart 56.5
Carriage 55.5
Shorts 55.4
Cycling 55.3
Sport 55.3
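The Amazon tags above are label names paired with confidence scores, which is the shape of an AWS Rekognition DetectLabels response. As a hedged sketch (the response dict below is a small hypothetical excerpt hard-coded for illustration, not a live API call, which would require boto3, credentials, and image bytes via `client.detect_labels(Image={'Bytes': data})`), lines like "Boy 99.3" could be rendered from such a response as follows:

```python
# Hypothetical excerpt of a DetectLabels-style response; values mirror a few
# of the tags listed above, not a real API call.
response = {
    "Labels": [
        {"Name": "Sport", "Confidence": 55.3},
        {"Name": "Boy", "Confidence": 99.3},
        {"Name": "Wheel", "Confidence": 92.5},
        {"Name": "Horse", "Confidence": 90.9},
    ]
}

def format_labels(resp, min_confidence=50.0):
    """Return 'Name confidence' lines sorted by descending confidence."""
    labels = [l for l in resp["Labels"] if l["Confidence"] >= min_confidence]
    labels.sort(key=lambda l: l["Confidence"], reverse=True)
    return [f"{l['Name']} {l['Confidence']:.1f}" for l in labels]

for line in format_labels(response):
    print(line)
```

The `min_confidence` cutoff is an assumption; services typically let the caller set such a threshold (Rekognition's `MinConfidence` parameter) to suppress low-certainty tags like those near the bottom of the list.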

Clarifai
created on 2018-05-11

people 100
one 99.1
two 98.8
adult 98.8
group 98.3
vehicle 98.1
three 97.2
group together 97.1
administration 96.9
leader 95.9
child 95.2
monochrome 95
four 94.8
man 94.7
transportation system 94
several 92.3
five 92.1
woman 91.6
wear 91.2
war 89.6

Imagga
created on 2023-10-06

billboard 39.1
daily 35.6
signboard 30.9
structure 29.5
old 26.5
building 22.4
architecture 20.3
city 19.1
urban 14.9
shop 14.6
house 14.2
barbershop 12.5
ancient 12.1
seller 12
travel 12
paper 11.8
dirty 10.8
bank 10.7
business 10.3
money 10.2
world 10.2
finance 10.1
exterior 10.1
aged 10
newspaper 9.9
home 9.6
wall 9.4
mercantile establishment 9.4
stone 9.3
vintage 9.1
sign 9
history 8.9
financial 8.9
sea 8.6
grunge 8.5
office 8.5
sky 8.3
water 8
art 7.8
antique 7.8
construction 7.7
buildings 7.6
boat 7.4
tourism 7.4
town 7.4
retro 7.4
banking 7.4
design 7.3
black 7.2
horse cart 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 98
person 97.4
old 55.4

Face analysis

AWS Rekognition

Age 16-22
Gender Male, 54.4%
Calm 39.3%
Fear 30.8%
Sad 11.8%
Confused 8.9%
Surprised 7.4%
Angry 4.3%
Disgusted 2.3%
Happy 1.3%

AWS Rekognition

Age 22-30
Gender Male, 100%
Angry 51%
Confused 29.1%
Calm 18.1%
Surprised 6.6%
Fear 6%
Sad 2.3%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 20-28
Gender Male, 94.8%
Calm 86.3%
Surprised 9.9%
Fear 6.1%
Sad 2.8%
Happy 2.2%
Confused 1.5%
Angry 0.9%
Disgusted 0.8%
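Each AWS Rekognition block above (age range, gender with confidence, emotions sorted by confidence) matches the per-face records returned by the DetectFaces API. As a hedged sketch, assuming a hypothetical excerpt of one such record (the values below copy the first face above; a real call would use `client.detect_faces(Image=..., Attributes=['ALL'])`), the printed lines could be rendered like this:

```python
# Hypothetical excerpt of a DetectFaces-style FaceDetails record.
face = {
    "AgeRange": {"Low": 16, "High": 22},
    "Gender": {"Value": "Male", "Confidence": 54.4},
    "Emotions": [
        {"Type": "FEAR", "Confidence": 30.8},
        {"Type": "CALM", "Confidence": 39.3},
        {"Type": "SAD", "Confidence": 11.8},
    ],
}

def format_face(f):
    """Render a face record as 'Age', 'Gender', and emotion lines."""
    lines = [
        f"Age {f['AgeRange']['Low']}-{f['AgeRange']['High']}",
        f"Gender {f['Gender']['Value']}, {f['Gender']['Confidence']:.1f}%",
    ]
    # Emotions are reported in arbitrary order; sort by confidence as above.
    for e in sorted(f["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        lines.append(f"{e['Type'].capitalize()} {e['Confidence']:.1f}%")
    return lines

for line in format_face(face):
    print(line)
```

Note the emotion confidences are independent per-emotion scores, not a distribution, which is why they need not sum to 100%.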

Microsoft Cognitive Services

Age 37
Gender Male

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Boy 99.3%
Child 99.3%
Male 99.3%
Person 99.3%
Adult 98.5%
Man 98.5%
Wheel 92.5%
Horse 90.9%
Hat 74.9%
Tie 57.2%

Categories

Imagga

cars vehicles 100%

Text analysis

Amazon

SEPT.
AND
DAY AND
DAY
NIGHT
SEPT. 7-10
7-10

Google

DAY AND NIGHT
DAY
AND
NIGHT