Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.558

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 99.9
Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Chair 92.4
Furniture 92.4
Person 92.4
Face 86.5
Head 86.5
Machine 86.2
Wheel 86.2
Outdoors 84.1
Chair 83.9
Wheel 82.6
Car 78.9
Transportation 78.9
Vehicle 78.9
Hat 78.9
Car 76.5
Wheel 72.6
Spoke 67.6
Wheel 67
Footwear 64.6
Shoe 64.6
Wheel 63.5
Shoe 58.6
Shirt 57.9
Cap 57.6
Architecture 57.5
Building 57.5
Factory 57.5
Manufacturing 57.5
Hat 57.4
Sun Hat 57.4
Antique Car 56.9
Model T 56.9
Baseball Cap 56.7
Assembly Line 55.1
Motor 55
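
The label/score pairs above follow Amazon Rekognition's label-detection output format (label name plus a confidence percentage). As a rough illustration only, a minimal boto3 sketch along these lines could produce a similar list; the file name and the 55% confidence floor are assumptions, not taken from the source.

    # Minimal sketch: Rekognition-style label tags for a local image.
    # Assumes AWS credentials are configured; "county_fair.jpg" is a placeholder.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("county_fair.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # assumed threshold, roughly the floor of the list above
    )

    # Each label carries a name and a confidence score, as in the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")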

Clarifai
created on 2018-05-11

people 99.9
adult 99.3
group together 98.9
group 98.5
vehicle 97.6
man 97.5
many 95.6
one 93.9
transportation system 93.7
several 93.2
administration 92.1
military 91.6
leader 91.5
two 91
war 90.1
three 86.2
uniform 85.4
wear 83.5
four 83.5
woman 81.7

Imagga
created on 2023-10-06

brass 64.2
wind instrument 48.8
cornet 46.9
musical instrument 35.4
man 24.2
male 19.9
motor vehicle 18.7
trombone 18.2
equipment 17.4
people 16.2
device 15.5
car 15.2
hat 15
uniform 14.9
men 13.7
outdoors 12.8
gun 12.7
golf equipment 12.6
person 12.3
horn 12.3
wheeled vehicle 12.2
industry 11.9
grass 11.9
sport 11.5
work 11
vehicle 10.5
clothing 10.4
old 10.4
model t 10.4
building 10.3
sky 10.2
day 10.2
sports equipment 10.1
outdoor 9.9
park 9.9
transportation 9.9
adult 9.7
horse 9.6
outside 9.4
travel 9.1
city 9.1
leisure 9.1
industrial 9.1
vacation 9
history 8.9
construction 8.6
smile 8.5
drive 8.5
weapon 8.4
house 8.4
transport 8.2
playing 8.2
helmet 8.1
machine 8
steel 8
worker 7.9
rural 7.9
architecture 7.8
machinery 7.8
military 7.7
club 7.5
street 7.4
historic 7.3
road 7.2
job 7.1
military uniform 7
together 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

tree 99.8
outdoor 99.7
person 97.2

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 81.9%
Calm 51.6%
Sad 46.1%
Disgusted 13.3%
Surprised 7.1%
Fear 6.4%
Happy 2.3%
Angry 2.2%
Confused 2.1%

AWS Rekognition

Age 51-59
Gender Male, 99.2%
Calm 62.3%
Happy 23.1%
Fear 6.9%
Surprised 6.6%
Sad 5.9%
Angry 2.6%
Disgusted 0.8%
Confused 0.6%

AWS Rekognition

Age 22-30
Gender Female, 72.1%
Sad 62.8%
Calm 58.4%
Confused 7%
Surprised 6.5%
Fear 6.1%
Disgusted 2.3%
Angry 0.4%
Happy 0.4%
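
The age-range, gender, and emotion estimates above match the shape of Amazon Rekognition's face-detection output. A minimal sketch, assuming boto3 with configured AWS credentials and a placeholder file name, of how such per-face estimates might be requested:

    # Sketch: per-face age range, gender, and emotion scores via Rekognition.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("county_fair.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")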

Microsoft Cognitive Services

Age 26
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
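
The likelihood ratings above (Very unlikely through Very likely) are the format of Google Cloud Vision face annotations. A minimal sketch, assuming the google-cloud-vision 2.x client, application-default credentials, and a placeholder file name:

    # Sketch: face attribute likelihoods from the Cloud Vision API.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("county_fair.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is an enum from VERY_UNLIKELY to VERY_LIKELY.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)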

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%
Chair 92.4%
Wheel 86.2%
Car 78.9%
Hat 78.9%
Shoe 64.6%

Captions

Microsoft
created on 2018-05-11

a man sitting on a bus 47%
a man sitting on a bench 46.9%
a man sitting at a bus stop 46.8%
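
These ranked captions have the shape returned by Azure's Computer Vision "describe" operation (caption text plus a confidence). A minimal sketch against the REST endpoint; the endpoint URL, key, file name, and API version are placeholders, and the captions above date from 2018, so they were presumably produced with an earlier version of the service:

    # Sketch: ranked image captions from the Azure Computer Vision Describe API.
    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    KEY = "<subscription-key>"  # placeholder

    with open("county_fair.jpg", "rb") as f:
        image_bytes = f.read()

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()

    # Each candidate caption carries a confidence, as in the three lines above.
    for caption in response.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")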

Text analysis

Amazon

CANTON
ARD
OHIO
JOHN
JOHN DEERE
DEERE
*500
كللم ARD الوحدة
كللم
الوحدة
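
The detected strings above are typical Amazon Rekognition text-detection output; Rekognition returns both whole lines and the individual words within them, which is why "JOHN DEERE" appears alongside "JOHN" and "DEERE". A minimal boto3 sketch, with a placeholder file name:

    # Sketch: text detection (lines and words) via Rekognition.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("county_fair.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        # Type is "LINE" or "WORD"; low-confidence detections can include
        # spurious strings, such as the non-Latin fragments listed above.
        print(detection["DetectedText"], detection["Type"],
              f"{detection['Confidence']:.1f}%")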