Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3139

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Art 99.2
Painting 99.2
Person 98.8
Person 98.4
Person 98.4
Person 98.3
Adult 98.3
Bride 98.3
Female 98.3
Wedding 98.3
Woman 98.3
Person 97.5
Person 96.5
Adult 96.5
Male 96.5
Man 96.5
Person 93.3
Outdoors 91
Person 86.1
Baby 86.1
Accessories 82.2
Bag 82.2
Handbag 82.2
Person 78.9
Head 73.1
Clothing 69.5
Dress 69.5
Face 69.3
People 67.4
Tent 57.2
Architecture 56.7
Building 56.7
Shelter 56.7
Handbag 56.1
Bus Stop 56.1
Fashion 55.8
Formal Wear 55.8
Gown 55.8
Play Area 55.6
Stage 55
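
The label list above has the shape of AWS Rekognition DetectLabels output: a label name paired with a confidence score. The following is a minimal sketch of how such tags could be produced with boto3; the image path, MaxLabels, and MinConfidence values are assumptions for illustration, not taken from the record.

import boto3

# Hypothetical local path to the digitized photograph; not part of the record.
IMAGE_PATH = "shahn_county_fair.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# MinConfidence of 55 is an assumed cutoff chosen to match the lowest
# score shown above (Stage 55); MaxLabels is likewise an assumption.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,
)

# Each label carries a name and a confidence score, as in the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")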

Clarifai
created on 2018-05-10

people 100
adult 99.2
group 98.2
man 96.7
child 96.6
group together 95.4
war 91.7
woman 91.5
many 89.7
furniture 89.2
sit 88.9
administration 86.9
two 84.8
family 84.6
several 82.4
one 81
home 81
canine 80.1
three 80
wear 79.9

Imagga
created on 2023-10-06

old 24.4
upright 24.1
building 19.9
stall 18.9
architecture 18.7
piano 18.4
house 17.5
window 15.7
wall 15.6
percussion instrument 15.6
stringed instrument 14.2
keyboard instrument 14
door 13.6
structure 13.6
vintage 12.4
antique 12.3
ancient 12.1
city 11.6
interior 11.5
musical instrument 11.4
home 11.2
wood 10.8
abandoned 10.7
history 10.7
light 10.7
travel 10.6
scene 10.4
stone 10.3
grunge 10.2
room 10.1
dirty 9.9
broken 9.6
urban 9.6
sky 9.6
roof 9.5
garage 9.4
construction 9.4
religion 9
metal 8.8
rural 8.8
factory 8.8
industry 8.5
black 8.4
outdoor 8.4
town 8.3
tourism 8.2
environment 8.2
aged 8.1
furniture 7.9
empty 7.7
temple 7.7
windows 7.7
container 7.7
dark 7.5
landscape 7.4
retro 7.4
inside 7.4
art 7.3
shop 7.3
industrial 7.3
steel 7.1
barbershop 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

store 44.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-26
Gender Male, 98.8%
Calm 95.2%
Surprised 6.3%
Fear 5.9%
Disgusted 2.9%
Sad 2.5%
Confused 0.4%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 22-30
Gender Male, 96.3%
Happy 78.7%
Angry 8.5%
Fear 7.1%
Surprised 6.7%
Sad 4.5%
Calm 2.5%
Disgusted 0.9%
Confused 0.2%

AWS Rekognition

Age 28-38
Gender Male, 98.5%
Calm 91.3%
Surprised 6.6%
Fear 6.1%
Sad 3.1%
Happy 2.4%
Angry 1.6%
Disgusted 0.8%
Confused 0.2%

AWS Rekognition

Age 25-35
Gender Male, 93.4%
Sad 95.1%
Fear 48.7%
Surprised 6.5%
Disgusted 6.1%
Calm 2%
Happy 1.7%
Angry 1.2%
Confused 0.4%

AWS Rekognition

Age 18-26
Gender Female, 72.7%
Sad 49.5%
Calm 48.2%
Angry 10.9%
Surprised 7.8%
Fear 7.6%
Happy 3.2%
Disgusted 3%
Confused 1.3%

AWS Rekognition

Age 24-34
Gender Male, 99.4%
Calm 57.2%
Sad 19.1%
Happy 12.6%
Fear 7.7%
Surprised 7.4%
Disgusted 3%
Confused 2.5%
Angry 1.8%
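
Each block above matches the structure of one AWS Rekognition DetectFaces result: an estimated age range, a gender guess with a confidence, and a list of emotion scores. A minimal sketch of retrieving such results with boto3 follows, assuming a local copy of the image under a placeholder path and credentials already configured.

import boto3

rekognition = boto3.client("rekognition")

# Placeholder path, as in the label sketch above.
with open("shahn_county_fair.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] asks Rekognition for age range, gender, and emotions.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unsorted; sort by confidence to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")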

Feature analysis

Amazon

Person 98.8%
Adult 98.3%
Bride 98.3%
Female 98.3%
Woman 98.3%
Male 96.5%
Man 96.5%
Baby 86.1%
Handbag 82.2%

Categories

Text analysis

Amazon

GIRL
PERSO
MAKE
MAKE IT
IT
WILL
CLOTH
MORE
JUST
FAT
PERSO NALITY FAT GIRL
OF
JUST BULT ONE WILL OF MORE CLOTH
NALITY
5%
ONE
CHILESEN
10
BULT
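
The fragments above read like raw AWS Rekognition DetectText output for the signage visible in the photograph, with word- and line-level detections mixed together. A minimal sketch of retrieving such detections with boto3, again using a placeholder image path:

import boto3

rekognition = boto3.client("rekognition")

with open("shahn_county_fair.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Rekognition returns both LINE and WORD detections; print each with its type
# and confidence so the two kinds can be told apart.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}")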