Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.431

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Photography 99.2
Boy 98.9
Child 98.9
Male 98.9
Person 98.9
Male 98.7
Person 98.7
Adult 98.7
Man 98.7
Face 98.2
Head 98.2
Portrait 98.2
Male 97.8
Person 97.8
Adult 97.8
Man 97.8
Male 96.7
Person 96.7
Adult 96.7
Man 96.7
Animal 94.3
Horse 94.3
Mammal 94.3
Person 92.6
Advertisement 92.1
Furniture 90.3
Machine 87.9
Wheel 87.9
Person 79
Architecture 77.6
Building 77.6
Outdoors 77.6
Shelter 77.6
Transportation 77.1
Vehicle 77.1
Clothing 56.9
Shorts 56.9
Hat 56.9
Wagon 56.1
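
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels API. A minimal, hypothetical sketch of how such tags are typically produced (assuming boto3 is installed, AWS credentials are configured, and a local copy of the image exists; the file name below is made up):

    # Sketch: generate label/confidence tags with AWS Rekognition DetectLabels.
    # Assumes boto3 + AWS credentials; the image path is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("county_fair_central_ohio.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=50,  # drop low-confidence labels
    )

    # Each label carries a name and a 0-100 confidence score,
    # matching the "tag  confidence" pairs listed above.
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')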

Clarifai
created on 2018-05-11

people 100
vehicle 99.7
adult 99.5
one 99.3
transportation system 99.1
watercraft 98.8
two 97.8
group together 97.6
man 96.6
group 96.4
leader 95.6
three 94.4
aircraft 93.8
military 90.3
four 90.2
administration 90
war 84.2
wear 84
monochrome 83.5
woman 82.6

Imagga
created on 2023-10-06

garbage truck 80.6
truck 68.7
motor vehicle 51.2
daily 22.8
city 21.6
architecture 21.1
building 20.1
wheeled vehicle 19.7
old 18.8
industry 17.9
machine 17.4
construction 16.3
billboard 15.7
house 15
work 14.9
structure 14.8
industrial 14.5
transportation 14.3
urban 13.1
signboard 12.7
travel 12.7
steel 12.4
black 10.2
sky 10.2
transport 10
vehicle 10
machinery 9.9
device 9.8
working 9.7
metal 9.7
exterior 9.2
equipment 9.2
car 9
labor 8.8
water 8.7
roof 8.6
stone 8.4
safety 8.3
vintage 8.3
dirty 8.1
newspaper 7.9
office 7.8
boat 7.8
factory 7.7
lamp 7.6
town 7.4
street 7.4
light 7.4
business 7.3
job 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 96.6
outdoor 95.1
old 42.2
posing 35.3

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 100%
Angry 40.8%
Confused 39.4%
Calm 18%
Surprised 6.5%
Fear 6%
Sad 2.4%
Happy 0.2%
Disgusted 0.2%

AWS Rekognition

Age 14-22
Gender Male, 98.8%
Calm 63.8%
Fear 11.2%
Sad 8.1%
Surprised 7.2%
Confused 5.6%
Angry 4.5%
Disgusted 2.9%
Happy 1.2%

AWS Rekognition

Age 26-36
Gender Female, 75.4%
Sad 100%
Surprised 6.3%
Fear 6%
Calm 0.8%
Confused 0.3%
Happy 0.1%
Disgusted 0.1%
Angry 0.1%
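
The three AWS Rekognition face estimates above (age range, gender, and ranked emotion scores) correspond to the FaceDetails structure returned by the DetectFaces API. A minimal sketch of how such estimates are obtained, under the same assumptions as the label sketch above (boto3 configured; image path hypothetical):

    # Sketch: per-face age/gender/emotion estimates with AWS Rekognition DetectFaces.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("county_fair_central_ohio.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]        # {"Low": ..., "High": ...}
        gender = face["Gender"]       # {"Value": ..., "Confidence": ...}
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        print(f'Age {age["Low"]}-{age["High"]}, '
              f'{gender["Value"]} {gender["Confidence"]:.1f}%, '
              f'{top["Type"]} {top["Confidence"]:.1f}%')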

Microsoft Cognitive Services

Age 38
Gender Male

Microsoft Cognitive Services

Age 32
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Boy 98.9%
Child 98.9%
Male 98.9%
Person 98.9%
Adult 98.7%
Man 98.7%
Horse 94.3%
Wheel 87.9%
Hat 56.9%

Text analysis

Amazon

SEPT.
AND
DAY AND
DAY
NIGHT
SEPT. 7-10
7-10
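
The Amazon text entries above are OCR detections of the kind returned by the AWS Rekognition DetectText API, which reports both whole lines and their individual words; that is why "DAY AND" and "DAY" can both appear. A minimal sketch, under the same assumptions as the sketches above:

    # Sketch: extract sign text (lines and words) with AWS Rekognition DetectText.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("county_fair_central_ohio.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Each detection is typed LINE or WORD, with a confidence score.
    for det in response["TextDetections"]:
        print(det["Type"], det["DetectedText"], round(det["Confidence"], 1))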

Google

DAY AND NIGHT
DAY
AND
NIGHT