Human Generated Data

Title

Untitled (New Orleans, Louisiana)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1512

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Face 99.8
Head 99.8
Photography 99.8
Portrait 99.8
Sun Hat 99.7
Person 99.1
Child 99.1
Female 99.1
Girl 99.1
Person 98.7
Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Adult 98.7
Male 98.7
Man 98.7
Person 98.6
Hat 98
Person 94.6
Person 94.5
Shirt 90.5
Accessories 81.2
Glasses 81.2
People 78.8
Baseball Cap 69.2
Cap 69.2
Sunglasses 57.4
Earring 56.2
Jewelry 56.2
Crowd 56
Body Part 55.9
Neck 55.9
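
The label/confidence pairs above are the output format of Amazon Rekognition's label detection. A minimal Python sketch of how such tags might be produced, assuming configured AWS credentials and a local scan of the photograph saved as photo.jpg (a hypothetical filename):

    import boto3

    # Assumes AWS credentials and a default region are configured in the environment.
    client = boto3.client("rekognition")

    # "photo.jpg" is a hypothetical local scan of the photograph.
    with open("photo.jpg", "rb") as f:
        image_bytes = f.read()

    # The tag list above bottoms out in the mid-50s, hence MinConfidence=50.
    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")
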

Clarifai
created on 2018-05-11

people 99.9
adult 99.3
group 99.2
man 98.5
group together 97.1
wear 96.9
woman 96.5
veil 95.2
two 95.1
lid 94.3
four 93.7
portrait 92.8
three 92.1
facial expression 90.9
administration 88.8
leader 88.2
five 87.9
several 87.7
outfit 86.7
uniform 86.3
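
The Clarifai concepts above match the response shape of Clarifai's v2 predict endpoint. A sketch using the REST API; the API key and model ID are placeholders (the general image-recognition model's actual ID must be looked up in Clarifai's documentation):

    import base64
    import requests

    API_KEY = "your_clarifai_api_key"  # placeholder
    MODEL_ID = "general-model-id"      # placeholder for the general model's actual ID

    with open("photo.jpg", "rb") as f:  # hypothetical local scan
        encoded = base64.b64encode(f.read()).decode()

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"base64": encoded}}}]},
    )

    # Each concept carries a name and a 0-1 confidence value.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")
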

Imagga
created on 2023-10-05

man 43
male 30.6
barbershop 29.5
person 29.5
people 28.4
shop 23.6
work 23.5
megaphone 22.7
happy 20
acoustic device 19.3
men 18.9
senior 18.7
mercantile establishment 17.9
nurse 17.5
smiling 17.4
portrait 16.8
device 16.3
hairdresser 15.9
worker 15.8
face 15.6
sitting 15.5
adult 15.1
professional 14.8
home 14.4
smile 14.2
hat 13.5
handsome 13.4
black 13.2
lifestyle 13
looking 12.8
elderly 12.4
job 12.4
indoors 12.3
place of business 11.9
old 11.8
happiness 11.7
hand 11.4
couple 11.3
mature 11.2
child 11
occupation 11
look 10.5
standing 10.4
industry 10.2
business 9.7
medical 9.7
casual 9.3
two 9.3
leisure 9.1
grandfather 9.1
salon 9.1
color 8.9
businessman 8.8
hair 8.7
day 8.6
glasses 8.3
fashion 8.3
outdoors 8.2
family 8
kid 8
patient 7.9
life 7.9
student 7.8
restaurant 7.8
clothing 7.8
education 7.8
retired 7.8
room 7.7
world 7.7
profession 7.7
study 7.5
tool 7.4
holding 7.4
camera 7.4
safety 7.4
inside 7.4
guy 7.4
cook 7.3
indoor 7.3
school 7.2
kitchen 7.2
father 7.1
love 7.1
little 7.1
together 7
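
Imagga's tag list, likewise, is the shape returned by its v2 tagging endpoint, which authenticates with HTTP Basic credentials. A sketch, with placeholder credentials:

    import requests

    API_KEY = "your_imagga_api_key"        # placeholder
    API_SECRET = "your_imagga_api_secret"  # placeholder

    # POST the image file to the v2 tagging endpoint; the form field is "image".
    with open("photo.jpg", "rb") as f:  # hypothetical local scan
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )

    # Each result pairs an English tag name with a confidence score.
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
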

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 100
man 97.1
outdoor 88.9
standing 83.8
people 81.1
group 66.3
posing 58
old 50.4
crowd 1.7
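
The Microsoft tags correspond to the Tag Image operation of the Computer Vision REST API. A sketch against the current v3.2 endpoint (the 2018 data above would have used an earlier version); the endpoint and key are placeholders:

    import requests

    ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com"  # placeholder
    KEY = "your_subscription_key"                                     # placeholder

    with open("photo.jpg", "rb") as f:  # hypothetical local scan
        image_bytes = f.read()

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/tag",
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )

    # Confidences come back on a 0-1 scale; rescale to match the list above.
    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
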

Color Analysis

Face analysis

AWS Rekognition

Age 12-20
Gender Female, 100%
Calm 67.8%
Fear 33.1%
Surprised 6.6%
Sad 2.3%
Confused 0.3%
Disgusted 0.1%
Angry 0%
Happy 0%

AWS Rekognition

Age 16-22
Gender Female, 93%
Calm 89.3%
Surprised 6.3%
Fear 6%
Sad 4.9%
Happy 2.7%
Confused 1%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 6-14
Gender Male, 85.4%
Sad 98.5%
Calm 41.5%
Surprised 6.3%
Fear 5.9%
Confused 1.8%
Disgusted 0.2%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 30-40
Gender Male, 92.5%
Calm 73.9%
Sad 14.2%
Fear 7.1%
Surprised 6.4%
Angry 3.8%
Happy 2.9%
Disgusted 1.3%
Confused 0.7%

AWS Rekognition

Age 21-29
Gender Female, 74.8%
Fear 78.4%
Calm 14%
Sad 9.4%
Surprised 7.8%
Confused 4.1%
Happy 3.1%
Angry 2.6%
Disgusted 2.1%
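
The five face records above, each with an age range, a gender estimate, and a ranked emotion distribution, follow the output of Rekognition's detect_faces call with all attributes requested. A sketch:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local scan
        image_bytes = f.read()

    # Attributes=["ALL"] adds age range, gender, and emotions to each face record.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
        # Rank emotions from most to least confident, as in the records above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
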

Microsoft Cognitive Services

Age 29
Gender Female

Microsoft Cognitive Services

Age 14
Gender Female
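
These point age and gender estimates match the Face API's detect operation with returnFaceAttributes=age,gender. Microsoft has since retired these attributes, so this sketch reflects the API roughly as it stood around the 2018 date above; the endpoint and key are placeholders:

    import requests

    ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com"  # placeholder
    KEY = "your_subscription_key"                                     # placeholder

    with open("photo.jpg", "rb") as f:  # hypothetical local scan
        image_bytes = f.read()

    response = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )

    for face in response.json():
        attrs = face["faceAttributes"]
        print(f"Age {attrs['age']:.0f}")
        print(f"Gender {attrs['gender'].title()}")
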

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
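
The likelihood ratings above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred) are the standard fields of Google Vision's face detection, which reports bucketed likelihoods rather than percentages. A sketch assuming a recent google-cloud-vision client (v2+) and application-default credentials:

    from google.cloud import vision

    # Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # hypothetical local scan
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    # Likelihoods are enums: VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)
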

Feature analysis

Amazon

Person 99.1%
Child 99.1%
Female 99.1%
Girl 99.1%
Adult 98.7%
Male 98.7%
Man 98.7%
Hat 98%
Glasses 81.2%

Categories

Imagga

people portraits 98.9%

Text analysis

Amazon

and
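
The single detected word above is typical of Rekognition text detection run on a photograph with incidental text. A sketch using detect_text, with the same hypothetical filename as before:

    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local scan
        image_bytes = f.read()

    response = client.detect_text(Image={"Bytes": image_bytes})

    # LINE detections aggregate WORD detections; print the individual words.
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])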