Human Generated Data

Title

Untitled (Eighth Avenue and Forty-second Street, New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2968

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Electrical Device 99.3
Microphone 99.3
People 99.3
Person 99.2
Adult 99.2
Male 99.2
Man 99.2
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99
Adult 99
Male 99
Man 99
Person 97.5
Adult 97.5
Female 97.5
Woman 97.5
Clothing 96.6
Coat 96.6
Person 93
Adult 93
Male 93
Man 93
Face 89.6
Head 89.6
Electronics 86.5
Screen 86.5
Accessories 85.5
Formal Wear 85.5
Tie 85.5
Photography 78.7
Architecture 77.3
Building 77.3
Hospital 77.3
Computer Hardware 70.7
Hardware 70.7
Monitor 70.7
Indoors 67.7
Crowd 67.3
Art 62.1
Painting 62.1
Suit 61
Portrait 60.9
TV 60.3
Speaker 57.5
Classroom 57
Room 57
School 57
Blazer 56.8
Jacket 56.8
Furniture 56.7
Table 56.7
Clinic 56.5
Audience 56
Studio 55.7
Cinema 55.6
Hairdresser 55.5
Captain 55.4
Officer 55.4
Astronomy 55.2
Outer Space 55.2
Photographer 55.1
Speech 55
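The Amazon list above repeats labels such as Person, Adult, and Man once per detected instance, each with its own confidence score. A minimal sketch (plain Python, no AWS call; the sample rows are copied from the list above) of collapsing such a list to each label's highest confidence:

```python
# Collapse repeated (label, confidence) rows to each label's best score.
# Sample rows taken from the Amazon tag list above.
rows = [
    ("Person", 99.2), ("Adult", 99.2), ("Person", 99.1),
    ("Person", 97.5), ("Female", 97.5), ("Tie", 85.5),
]

best = {}
for label, conf in rows:
    # Keep only the highest confidence seen for each label.
    if conf > best.get(label, 0.0):
        best[label] = conf

# Print labels sorted by confidence, highest first.
for label, conf in sorted(best.items(), key=lambda kv: -kv[1]):
    print(f"{label} {conf}")
```

The same approach applies to any of the tag lists in this record, since each service reports flat (label, confidence) pairs.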

Clarifai
created on 2018-05-10

people 100
adult 98.8
group 98.3
two 98.2
man 96.3
administration 94.8
group together 94.8
three 94.5
one 88.9
woman 87.9
four 87.4
leader 86
room 85.6
military 84.9
furniture 84
wear 84
five 82.9
several 82.8
vehicle 82.1
war 81.6

Imagga
created on 2023-10-06

man 33.6
people 32.3
male 31.2
person 25.4
adult 23.8
men 19.7
business 19.4
couple 18.3
businessman 17.6
office 17.1
musical instrument 16.8
dress 16.3
two 16.1
groom 15.9
wind instrument 14.4
modern 14
happy 13.8
room 13.7
brass 13.2
corporate 12.9
indoor 12.8
happiness 12.5
indoors 12.3
teacher 12.2
looking 12
love 11.8
bride 11.7
cornet 11.7
professional 11.7
handsome 11.6
interior 11.5
suit 11.4
fashion 11.3
wedding 11
smiling 10.8
clothing 10.8
group 10.5
black 10.5
building 10.4
home 10.4
casual 10.2
family 9.8
husband 9.7
urban 9.6
women 9.5
sitting 9.4
meeting 9.4
lifestyle 9.4
elegance 9.2
city 9.1
confident 9.1
holding 9.1
chair 9
world 9
team 9
job 8.8
life 8.7
wife 8.5
hand 8.4
window 8.2
together 7.9
day 7.8
standing 7.8
education 7.8
device 7.8
portrait 7.8
shop 7.8
corporation 7.7
pretty 7.7
educator 7.7
marriage 7.6
desk 7.6
bouquet 7.5
style 7.4
inside 7.4
alone 7.3
hairdresser 7.2
board 7.2
celebration 7.2
romance 7.1
face 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

man 98.3
person 96.4
standing 91.3
old 76.4
posing 61.8
clothes 16.5

Face analysis

AWS Rekognition

Age 26-36
Gender Male, 100%
Calm 76.1%
Sad 19.8%
Surprised 6.8%
Fear 6%
Angry 2.6%
Confused 1.8%
Disgusted 0.6%
Happy 0.5%

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Calm 75.1%
Angry 12.8%
Surprised 8.2%
Fear 7.9%
Sad 2.5%
Happy 1.9%
Confused 0.6%
Disgusted 0.5%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Sad 84%
Calm 59.9%
Surprised 6.3%
Fear 5.9%
Confused 0.9%
Angry 0.3%
Disgusted 0.1%
Happy 0.1%

Microsoft Cognitive Services

Age 46
Gender Male

Microsoft Cognitive Services

Age 42
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Adult 99.2%
Male 99.2%
Man 99.2%
Female 97.5%
Woman 97.5%
Tie 85.5%

Text analysis

Amazon

STR
OF
PLO
L
TE STR
RING
TE
S L
NA
S
RING 0
0
00-36
TRAE

Google

0 LTG E=
0
E
=
LTG