Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.711

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Architecture 99.3
Building 99.3
Outdoors 99.3
Shelter 99.3
Person 99
Person 98.6
Adult 98.6
Bride 98.6
Female 98.6
Wedding 98.6
Woman 98.6
Person 98.5
Person 97.2
Adult 97.2
Bride 97.2
Female 97.2
Woman 97.2
Person 95.4
Face 89.6
Head 89.6
Person 88.4
Adult 88.4
Bride 88.4
Female 88.4
Woman 88.4
Photography 73.4
Hairdresser 68.3
Barbershop 67.9
Indoors 67.9
Text 63.2
Back 56.4
Body Part 56.4
Portrait 55.5
Urban 55.3
Circus 55.3
Leisure Activities 55.3
Neck 55.2

Clarifai
created on 2018-05-11

people 99.9
group 99.2
adult 98.3
group together 97
administration 96.1
woman 96.1
man 95
several 94.3
leader 93.6
music 92.9
war 92.1
many 90.4
vehicle 90
military 89.1
child 88.7
five 88.5
wear 88
three 87.9
one 87.5
four 87.4

Imagga
created on 2023-10-06

billboard 45.8
signboard 37.1
daily 35.2
newspaper 31.6
structure 29.5
product 21.4
stall 20.9
man 18.1
barbershop 17.7
business 17
shop 16.7
creation 16.6
old 13.9
world 13.7
black 13.2
building 12.6
mercantile establishment 12.6
light 12
sky 11.5
text 11.3
person 10.9
male 10.6
finance 10.1
hand 9.9
sign 9.8
businessman 9.7
education 9.5
people 9.5
architecture 9.4
financial 8.9
symbol 8.7
water 8.7
construction 8.6
money 8.5
place of business 8.4
dark 8.3
office 8.3
school 8.1
power 7.6
city 7.5
technology 7.4
student 7.2
success 7.2
looking 7.2
bank 7.2
night 7.1
work 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 93
person 92.2
accessory 34.6

Face analysis

AWS Rekognition

Age 19-27
Gender Male, 99.9%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 19-27
Gender Female, 99.9%
Sad 99.8%
Calm 15.4%
Surprised 6.5%
Fear 6%
Confused 5.9%
Angry 5.9%
Disgusted 1.3%
Happy 0.2%

AWS Rekognition

Age 18-26
Gender Female, 82.5%
Calm 98.2%
Surprised 6.5%
Fear 5.9%
Sad 2.2%
Happy 0.6%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 21-29
Gender Female, 96.9%
Sad 77.9%
Calm 59.9%
Surprised 6.5%
Fear 6.1%
Disgusted 1.8%
Confused 0.7%
Angry 0.5%
Happy 0.5%

Microsoft Cognitive Services

Age 46
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Adult 98.6%
Bride 98.6%
Female 98.6%
Woman 98.6%

Text analysis

Amazon

FROM
STRANGE
PAI
FROM ALL PAI
ALL
ISLAND
STRANGE PEOPLE
SIL
PEOPLE
CIRCLE

Google

SIP STRANGE PEOPLE FROM ALL
SIP
STRANGE
PEOPLE
FROM
ALL