Human Generated Data

Title

Untitled (Eighth Avenue and Forty-second Street, New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3037

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Chair 99.1
Furniture 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Adult 98
Male 98
Man 98
Person 98
Person 93.9
Clothing 91.8
Coat 91.8
Person 90.9
Face 88
Head 88
Person 85
Person 81.4
Barbershop 81.4
Indoors 81.4
Person 81.2
Crowd 76.5
Person 72.5
Art 63.2
Painting 63.2
Person 61
Machine 61
Wheel 61
Table 57.2
Audience 56.1
Speech 56.1
Formal Wear 56.1
Suit 56.1
Debate 56.1
Electrical Device 55.6
Microphone 55.6
Desk 55.4

Clarifai
created on 2018-05-10

people 99.9
adult 99.2
group 98.4
man 96.4
group together 96.4
administration 95.5
two 95.1
one 94.2
vehicle 91.7
woman 91.3
war 90.9
three 89
room 87.3
four 87.2
military 86.8
several 85.9
wear 85.6
five 84.2
police 82.9
many 82.7

Imagga
created on 2023-10-07

barbershop 31
man 28.2
shop 25.7
male 25.5
business 23.7
musical instrument 22.7
people 22.3
person 21.3
mercantile establishment 19.7
office 18.3
adult 18.2
businessman 16.8
suit 16.7
device 14.8
corporate 14.6
building 14.1
silhouette 14.1
place of business 13.1
men 12.9
wall 12.8
laptop 12.8
black 12.7
city 12.5
sitting 12
job 10.6
working 10.6
computer 10.5
modern 10.5
urban 10.5
percussion instrument 10.3
street 10.1
lifestyle 10.1
window 10.1
attractive 9.8
style 9.6
looking 9.6
work 9.4
equipment 9.3
alone 9.1
electronic instrument 9.1
music 9.1
fashion 9
room 9
outdoors 9
wind instrument 9
handsome 8.9
technology 8.9
night 8.9
success 8.8
interior 8.8
brass 8.7
executive 8.5
career 8.5
keyboard instrument 8.5
dark 8.3
leisure 8.3
upright 8.1
professional 7.8
portrait 7.8
outside 7.7
worker 7.7
piano 7.6
one 7.5
holding 7.4
single 7.4
light 7.4
indoor 7.3
women 7.1
restaurant 7.1
glass 7.1
day 7.1
architecture 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 93.3
old 40.6

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 29-39
Gender Male, 99.9%
Calm 35.8%
Confused 28.9%
Fear 22.6%
Surprised 8.2%
Sad 4.7%
Angry 1.5%
Happy 0.9%
Disgusted 0.9%

AWS Rekognition

Age 23-31
Gender Female, 89.3%
Sad 99.9%
Angry 16%
Surprised 6.9%
Fear 5.9%
Confused 1.2%
Happy 0.8%
Disgusted 0.5%
Calm 0.4%

AWS Rekognition

Age 48-56
Gender Male, 100%
Calm 48.4%
Surprised 11.7%
Sad 11.4%
Fear 8.5%
Angry 7.9%
Disgusted 7.6%
Confused 5.5%
Happy 4.3%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Surprised 99.5%
Fear 6.1%
Calm 3.2%
Sad 2.2%
Confused 1.7%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 25-35
Gender Female, 90.1%
Calm 52.7%
Surprised 15.8%
Disgusted 13.9%
Confused 11.2%
Fear 7.2%
Sad 3.5%
Angry 2.1%
Happy 1.3%

AWS Rekognition

Age 24-34
Gender Female, 90.5%
Surprised 90%
Fear 29%
Calm 4.3%
Confused 3.7%
Happy 2.6%
Sad 2.6%
Disgusted 1.9%
Angry 1.2%

AWS Rekognition

Age 18-26
Gender Female, 67%
Calm 98.2%
Surprised 6.3%
Fear 5.9%
Sad 2.7%
Angry 0.1%
Happy 0.1%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 23-31
Gender Male, 82.5%
Calm 73.1%
Surprised 11.2%
Fear 6.5%
Disgusted 6.2%
Angry 5.4%
Sad 3%
Happy 2.1%
Confused 1.7%

AWS Rekognition

Age 21-29
Gender Female, 95.7%
Surprised 82.7%
Calm 28.2%
Confused 17.2%
Fear 6.2%
Sad 2.5%
Happy 1.2%
Disgusted 0.9%
Angry 0.9%

Microsoft Cognitive Services

Age 54
Gender Male

Microsoft Cognitive Services

Age 24
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.2%
Male 99.2%
Man 99.2%
Person 99.2%
Chair 99.1%
Coat 91.8%
Wheel 61%

Text analysis

Amazon

INSTITUTE
NATURAL INSTITUTE
NATURAL
OPEN
AIR
OF
TEETH
OPEN AIR UNITED
HEALTH
UNITED