Human Generated Data

Title

Untitled (New Orleans, Louisiana)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1457

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Boy 98.7
Child 98.7
Male 98.7
Person 98.7
Boy 98.1
Child 98.1
Male 98.1
Person 98.1
Male 98.1
Person 98.1
Adult 98.1
Man 98.1
Person 97.6
Boy 97.3
Child 97.3
Male 97.3
Person 97.3
Male 97.1
Person 97.1
Adult 97.1
Man 97.1
Boy 96.7
Child 96.7
Male 96.7
Person 96.7
Person 96.5
Person 96
Person 95.9
Person 95.2
Person 93.5
Person 92.5
Person 90
Clothing 89.9
Footwear 89.9
Shoe 89.9
Person 87.8
Face 86.8
Head 86.8
Animal 83
Cat 83
Mammal 83
Pet 83
Shoe 82
Person 78.3
Person 77.6
Shoe 77.2
Person 77.1
Bear 75.1
Black Bear 75.1
Wildlife 75.1
Bull 74.6
Shoe 64.6
Accessories 63.9
Formal Wear 63.9
Tie 63.9
Shoe 59.3
Shoe 57.5
Shoe 56.8
Doctor 56.4
Bullfighter 55.2
Bullfighting 55.2
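
The Amazon label/confidence pairs listed above have the shape of an Amazon Rekognition DetectLabels response: a label name plus a 0-100 confidence score, with one entry per detected instance. A minimal sketch of how such tags could be regenerated with boto3 is shown below; the S3 bucket, key, and the MinConfidence floor of 55 are placeholder assumptions, not values taken from this record.

    import boto3

    # Assumed placeholders: replace with a real bucket/key for the digitized photograph.
    BUCKET = "example-bucket"
    KEY = "shahn/new-orleans-1935.jpg"

    rekognition = boto3.client("rekognition")

    # DetectLabels returns label names with confidence scores (0-100),
    # comparable to the "Boy 98.7", "Shoe 89.9", ... values listed above.
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
        MinConfidence=55,  # assumption: roughly the floor of the scores shown here
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")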

Clarifai
created on 2018-05-11

people 100
group 99.4
many 96.5
adult 96.3
child 96.1
group together 95.6
administration 94.7
man 93.7
woman 92.7
recreation 88.9
boy 88.4
wear 87.8
war 87
military 85.7
several 85.3
injury 82.5
offense 81.1
police 79.2
leader 78.6
crowd 76.2

Imagga
created on 2023-10-07

man 30.2
photographer 25.6
people 25.1
group 21.7
person 21.7
male 21.3
men 18
brass 18
business 17.6
businessman 16.8
corporate 15.4
work 15
wind instrument 14.2
job 14.1
trainer 13.8
adult 13.7
trombone 13.5
team 13.4
worker 12.4
office 12
professional 11.6
musical instrument 11.4
meeting 11.3
walk 10.5
communication 10.1
happy 10
travel 9.8
together 9.6
executive 9.6
women 9.5
room 9.4
youth 9.4
uniform 9.2
silhouette 9.1
black 9
standing 8.7
teamwork 8.3
businesswoman 8.2
suit 8.1
success 8
family 8
home 8
to 8
working 7.9
support 7.9
bass 7.9
weapon 7.7
career 7.6
adults 7.6
active 7.5
senior 7.5
indoor 7.3
girls 7.3
student 7.2
patient 7.2
building 7.1
device 7.1
indoors 7
modern 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
group 76.7

Face analysis

AWS Rekognition

Age 1-7
Gender Female, 93.8%
Happy 54.5%
Confused 27.1%
Surprised 7.9%
Fear 6.1%
Angry 5.3%
Calm 5.2%
Sad 3.2%
Disgusted 1.3%

AWS Rekognition

Age 26-36
Gender Female, 100%
Calm 64.7%
Happy 15.8%
Confused 15.2%
Surprised 6.7%
Fear 6.2%
Sad 2.7%
Disgusted 0.6%
Angry 0.4%

AWS Rekognition

Age 48-54
Gender Female, 100%
Sad 100%
Surprised 7%
Fear 6.3%
Confused 3.3%
Calm 2.7%
Happy 2.3%
Disgusted 1.7%
Angry 1.7%

AWS Rekognition

Age 29-39
Gender Male, 100%
Calm 98.9%
Surprised 6.6%
Fear 5.9%
Sad 2.2%
Confused 0.2%
Angry 0.1%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 6-16
Gender Female, 68.2%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 0.4%
Confused 0.1%
Disgusted 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 33-41
Gender Male, 92.8%
Sad 96.7%
Calm 45.1%
Surprised 6.8%
Fear 6%
Confused 2%
Disgusted 0.5%
Angry 0.5%
Happy 0.2%

AWS Rekognition

Age 19-27
Gender Male, 100%
Calm 99.6%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0%
Confused 0%
Angry 0%
Disgusted 0%

AWS Rekognition

Age 29-39
Gender Male, 62.1%
Sad 97.1%
Calm 45.4%
Surprised 6.7%
Fear 6%
Confused 1.3%
Angry 0.4%
Disgusted 0.2%
Happy 0.2%

AWS Rekognition

Age 6-12
Gender Female, 99.8%
Calm 97.3%
Surprised 6.3%
Fear 6.1%
Sad 2.3%
Happy 1%
Angry 0.2%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 12-20
Gender Male, 98.8%
Sad 99.7%
Calm 29.1%
Surprised 6.5%
Fear 6%
Confused 0.8%
Angry 0.5%
Happy 0.4%
Disgusted 0.3%

AWS Rekognition

Age 20-28
Gender Male, 99.2%
Calm 89.5%
Sad 7.2%
Surprised 6.4%
Fear 5.9%
Confused 0.8%
Angry 0.3%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 2-8
Gender Female, 89%
Sad 99.9%
Surprised 26.9%
Fear 6.1%
Calm 1.9%
Happy 0.7%
Confused 0.5%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 24-34
Gender Male, 92.4%
Calm 76.5%
Sad 9.8%
Fear 8%
Surprised 6.6%
Happy 3.3%
Confused 1.8%
Angry 0.9%
Disgusted 0.5%
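
Each AWS Rekognition block above (an age range, a gender with confidence, and per-emotion scores) matches the FaceDetails structure returned by Rekognition's DetectFaces call when all facial attributes are requested. A hedged sketch, reusing the same placeholder S3 location as in the label example above:

    import boto3

    rekognition = boto3.client("rekognition")

    # Attributes=["ALL"] requests AgeRange, Gender, Emotions, and other attributes.
    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn/new-orleans-1935.jpg"}},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]        # e.g. {"Low": 26, "High": 36}
        gender = face["Gender"]       # e.g. {"Value": "Female", "Confidence": 100.0}
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; sort by confidence to mirror the listings above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")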

Microsoft Cognitive Services

Age 69
Gender Male

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 38
Gender Male

Microsoft Cognitive Services

Age 56
Gender Male

Microsoft Cognitive Services

Age 14
Gender Male

Microsoft Cognitive Services

Age 37
Gender Female

Microsoft Cognitive Services

Age 42
Gender Female

Microsoft Cognitive Services

Age 20
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
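
The Google Vision rows above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred, each bucketed from Very unlikely to Very likely) correspond to the likelihood fields on the face_annotations returned by the Cloud Vision face-detection API. A minimal sketch, assuming the google-cloud-vision Python client and a placeholder local path to the digitized image:

    from google.cloud import vision

    # Likelihood enum values index into this tuple (order defined by the API).
    LIKELIHOOD_NAME = ("Unknown", "Very unlikely", "Unlikely",
                       "Possible", "Likely", "Very likely")

    client = vision.ImageAnnotatorClient()

    # Placeholder path: a local copy of the digitized photograph.
    with open("new-orleans-1935.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", LIKELIHOOD_NAME[face.surprise_likelihood])
        print("Anger", LIKELIHOOD_NAME[face.anger_likelihood])
        print("Sorrow", LIKELIHOOD_NAME[face.sorrow_likelihood])
        print("Joy", LIKELIHOOD_NAME[face.joy_likelihood])
        print("Headwear", LIKELIHOOD_NAME[face.headwear_likelihood])
        print("Blurred", LIKELIHOOD_NAME[face.blurred_likelihood])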

Feature analysis

Amazon

Boy 98.7%
Child 98.7%
Male 98.7%
Person 98.7%
Adult 98.1%
Man 98.1%
Shoe 89.9%
Cat 83%
Tie 63.9%
