Human Generated Data

Title

Untitled (New Orleans, Louisiana)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1491

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 99
Male 99
Man 99
Person 99
Formal Wear 96.7
Suit 96.7
Adult 93.4
Male 93.4
Man 93.4
Person 93.4
Coat 91.6
Hat 87.8
Face 85.8
Head 85.8
Hat 81.2
Overcoat 77.5
Sun Hat 57.6
Photography 56.3
Architecture 55.9
Balcony 55.9
Building 55.9

Clarifai
created on 2018-05-11

people 100
group together 99.5
adult 99.5
group 98.9
man 98.2
four 97.9
administration 97.2
two 97.1
several 97
three 96.7
outfit 95.8
five 95.6
leader 95.5
wear 95.5
woman 94.1
one 93.6
many 93.1
lid 92.9
veil 91.9
vehicle 89.5

Imagga
created on 2023-10-06

man 36.3
person 27.3
male 26.3
people 24.5
helmet 19.4
clothing 16.7
work 16.5
mask 15.5
ballplayer 13.9
black 13.8
adult 13.8
hat 13.3
player 12.5
portrait 12.3
surgeon 12.2
patient 12
uniform 11.9
nurse 11.8
athlete 11.6
worker 11.4
senior 11.2
men 11.2
equipment 11
world 10.9
industry 10.2
room 10.1
protection 10
old 9.7
business 9.7
military 9.6
shop 9.4
industrial 9.1
human 9
hospital 8.8
medical 8.8
soldier 8.8
indoors 8.8
headdress 8.7
doctor 8.5
weapon 8.4
hand 8.4
contestant 8.3
safety 8.3
occupation 8.2
to 8
job 8
surgery 7.8
war 7.7
health 7.6
cowboy hat 7.6
happy 7.5
city 7.5
restaurant 7.4
religion 7.2
women 7.1
working 7.1
building 7

Microsoft
created on 2018-05-11

person 99.8
standing 75.5
old 67.1

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 100%
Happy 79.9%
Angry 10%
Calm 8%
Surprised 6.7%
Fear 6%
Sad 2.2%
Confused 0.6%
Disgusted 0.1%

AWS Rekognition

Age 54-64
Gender Male, 100%
Happy 92%
Surprised 6.6%
Fear 6.4%
Sad 2.7%
Calm 2.2%
Disgusted 1%
Angry 0.7%
Confused 0.3%

AWS Rekognition

Age 35-43
Gender Male, 99.9%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 0.1%
Happy 0.1%
Confused 0%
Angry 0%
Disgusted 0%

Microsoft Cognitive Services

Age 64
Gender Male

Microsoft Cognitive Services

Age 48
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
Coat 91.6%
Hat 87.8%

Text analysis

Amazon

-
.
War . - ...
...
War