Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.696

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label confidence, 0-100)

Amazon
created on 2023-10-06

Adult 99.6
Female 99.6
Person 99.6
Woman 99.6
Adult 99
Person 99
Male 99
Man 99
Adult 98.1
Female 98.1
Person 98.1
Woman 98.1
Person 97.2
Male 97.2
Boy 97.2
Child 97.2
Person 94.3
Accessories 93.2
Formal Wear 93.2
Tie 93.2
Crowd 90.5
Face 90
Head 90
Person 88.9
Baby 88.9
People 84.6
Clothing 84.6
Shirt 84.6
Hat 67.6
Shop 66.9
Furniture 63.9
Table 63.9
Indoors 62
Jewelry 61.2
Necklace 61.2
Market 60.6
Audience 57.8
Speech 57.6
Pub 57
Architecture 56.9
Building 56.9
Factory 56.9
Electrical Device 56.8
Microphone 56.8
Restaurant 56.6
Clinic 56
Classroom 55.9
Room 55.9
School 55.9
Hospital 55.8
Photography 55.5
Portrait 55.5
Box 55.2
Body Part 55.2
Finger 55.2
Hand 55.2
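
For context, labels of this kind can be reproduced with the AWS Rekognition DetectLabels API. The following is a minimal Python (boto3) sketch, not the pipeline actually used for this record; the file name and region are assumptions for illustration:

import boto3

# Rekognition client; the region is an assumption for this sketch.
client = boto3.client("rekognition", region_name="us-east-1")

# Read the image bytes (hypothetical local file name).
with open("shahn_county_fair.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns labels with confidence scores on a 0-100 scale;
# MinConfidence=55 roughly matches the lowest scores in the list above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')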

Clarifai
created on 2018-05-11

people 99.9
adult 98.5
group 98
monochrome 97.8
child 96.1
group together 96
man 94.6
woman 90.6
two 90.6
several 90.3
three 90.1
recreation 89.8
wear 89.6
administration 89.1
sit 88.3
four 87.4
indoors 86.5
many 84.5
education 84
one 82.9

Imagga
created on 2023-10-06

man 41
seller 36.3
male 32.6
people 29
person 26.1
adult 22.6
smiling 21
happy 20
old 18.8
lifestyle 18.8
sitting 18
men 18
work 17.8
senior 16.9
indoors 16.7
classroom 16.2
smile 15.7
education 15.6
home 15.1
portrait 14.9
marimba 14.4
school 13.7
room 13.6
teacher 13.5
couple 13.1
mature 13
looking 12.8
business 12.7
casual 12.7
happiness 12.5
percussion instrument 12.5
outdoors 11.9
shop 11.9
women 11.9
class 11.6
elderly 11.5
day 11
musical instrument 10.8
office 10.8
restaurant 10.7
loom 10.6
hand 10.6
businessman 10.6
cheerful 10.6
together 10.5
group 10.5
indoor 10
color 10
holding 9.9
stall 9.9
worker 9.9
handsome 9.8
job 9.7
working 9.7
student 9.2
horizontal 9.2
leisure 9.1
aged 9
family 8.9
table 8.8
standing 8.7
mid adult 8.7
30s 8.7
retirement 8.6
college 8.5
textile machine 8.5
counter 8.5
black 8.4
blackboard 8.3
occupation 8.2
machine 8.2
mercantile establishment 8.1
professional 7.9
boy 7.8
hands 7.8
face 7.8
retired 7.8
chair 7.7
house 7.5
relaxed 7.5
service 7.4
clothing 7.4
inside 7.4
interior 7.1

Google
created on 2018-05-11

(no tags recorded)

Microsoft
created on 2018-05-11

outdoor 85.1

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 99%
Confused 43.1%
Sad 27.8%
Calm 24.5%
Surprised 6.9%
Fear 6.7%
Disgusted 4.8%
Angry 3.9%
Happy 0.5%

AWS Rekognition

Age 35-43
Gender Female, 98.4%
Calm 97.2%
Surprised 6.4%
Fear 5.9%
Sad 2.5%
Confused 1.1%
Disgusted 0.1%
Angry 0.1%
Happy 0.1%

AWS Rekognition

Age 16-22
Gender Female, 51.5%
Calm 92.1%
Surprised 6.3%
Fear 5.9%
Sad 5.5%
Angry 0.4%
Happy 0.3%
Confused 0.2%
Disgusted 0.1%
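
Per-face estimates like the three above come from the Rekognition DetectFaces API with full attributes enabled. A minimal sketch under the same boto3 and local-file assumptions as before (file name hypothetical):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_county_fair.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotions
# in addition to the default bounding boxes.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion confidences are reported independently, which is why
    # the percentages above need not sum to 100.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')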

Microsoft Cognitive Services

Age 70
Gender Male

Microsoft Cognitive Services

Age 37
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
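
The Google Vision rows above are likelihood buckets rather than percentages. A minimal sketch of the corresponding face_detection call with the google-cloud-vision client, again assuming a local copy of the image (file name hypothetical):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_county_fair.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum: UNKNOWN, VERY_UNLIKELY,
    # UNLIKELY, POSSIBLE, LIKELY, or VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)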

Feature analysis

Amazon

Adult 99.6%
Female 99.6%
Person 99.6%
Woman 99.6%
Male 99%
Man 99%
Boy 97.2%
Child 97.2%
Tie 93.2%
Baby 88.9%
Necklace 61.2%

Categories

Imagga

paintings art 97.3%
people portraits 1.1%

Text analysis

Amazon

FROM
2
STRANGE
YOU
ENSAT
FORGET
DE
N° 2
STRANGE PE
ENSAT LACTS
FROM AL
PE
LACTS
the
ICL
AL
O
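
Fragments like these are typical raw OCR output on a crowded scene. A minimal sketch of the Rekognition DetectText call that produces such detections, under the same boto3 and local-file assumptions as above:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_county_fair.jpg", "rb") as f:
    image_bytes = f.read()

# DetectText returns both LINE and WORD detections; printing every
# detection yields mixed, overlapping fragments like those listed above.
response = client.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    print(detection["DetectedText"])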

Google

TRANGE PE FROM
TRANGE
PE
FROM