Human Generated Data

Title

Untitled (Java)

Date

January 26, 1960 – February 2, 1960

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2344

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (confidence, 0–100)

Amazon
created on 2023-10-05

Flower 100
Flower Arrangement 100
Plant 100
Flower Bouquet 100
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Male 98.5
Person 98.5
Boy 98.5
Child 98.5
Adult 97
Male 97
Man 97
Person 97
Person 94.4
Child 94.4
Female 94.4
Girl 94.4
Adult 92.3
Male 92.3
Man 92.3
Person 92.3
Person 91.2
Person 90.5
Face 86
Head 86
Person 79.1
Cup 78.6
Person 71.5
Person 69.4
Formal Wear 65.1
Funeral 57.4
Clothing 57.3
Suit 57.3
Dress 57
Fashion 56.7
Gown 56.7
Coat 56.2
People 55.4
Rose 55.4
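
The label/score pairs above are the shape of an Amazon Rekognition DetectLabels response, which reports a 0–100 confidence per label. A minimal sketch of such a call via the boto3 client follows; the image file name is hypothetical.

import boto3

# Minimal sketch: send image bytes to Amazon Rekognition and print
# each detected label with its 0-100 confidence score.
client = boto3.client("rekognition")

with open("shahn_java_1960.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the lowest score in the list above is 55.4
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')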

Clarifai
created on 2018-05-10

people 99.9
group 99.1
adult 98.7
many 96.9
group together 96.4
man 94.3
administration 93.5
woman 93.2
wear 91.8
military 91.7
several 90.4
war 90.2
child 84.5
recreation 83.4
leader 83.1
outfit 83
one 79.9
two 79.5
music 77.6
vehicle 77
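
The concept/score pairs above match the output of a Clarifai general-model predict call, which reports confidences on a 0–1 scale (shown here scaled to 0–100). A hedged sketch against the plain v2 REST endpoint; the model ID, API key, and image URL are all placeholders, and account setups may differ.

import requests

# Sketch, not Clarifai's official client: POST an image URL to the
# v2 predict endpoint and print each concept with its confidence.
API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed general-model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/image.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # scale the 0-1 confidence to the 0-100 values shown above
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')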

Imagga
created on 2023-10-05

groom 20.2
man 18.8
old 18.1
person 17.8
male 17.7
people 17.3
city 15
world 14.7
adult 14
men 13.7
love 13.4
building 13
sitting 12.9
bride 12.5
couple 12.2
architecture 11.7
statue 11.7
two 11
dress 10.8
sculpture 10.5
portrait 10.3
wedding 10.1
history 9.8
ancient 9.5
happiness 9.4
monument 9.3
television camera 9.3
face 9.2
life 9.2
travel 9.1
religion 9
home 8.8
happy 8.8
women 8.7
bouquet 8.5
fashion 8.3
cheerful 8.1
landmark 8.1
smiling 8
spectator 7.8
black 7.8
scene 7.8
culture 7.7
married 7.7
husband 7.6
marriage 7.6
wife 7.6
house 7.5
religious 7.5
traditional 7.5
outdoors 7.5
vintage 7.4
television equipment 7.4
business 7.3
art 7.2
room 7.1
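
Imagga's tagging endpoint returns the same tag-plus-confidence shape as the list above. A sketch of its v2 /tags call, authenticated with HTTP Basic auth; the key/secret pair and image URL are placeholders.

import requests

# Sketch of the Imagga v2 tagging endpoint: Basic auth with an
# API key/secret pair; tags come back with 0-100 confidences.
AUTH = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")  # placeholders

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/image.jpg"},
    auth=AUTH,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')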

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 96.7
crowd 0.6
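
These tags match the shape of an Azure Computer Vision "analyze" response, which reports 0–1 confidences per tag. A sketch of the REST call; the endpoint and key are placeholders for a real Azure resource.

import requests

# Sketch of the Azure Computer Vision analyze call, requesting tags.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # the service reports 0-1 confidences; scale to match the list above
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')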

Color Analysis

Face analysis

AWS Rekognition

Age 16-22
Gender Female, 89.8%
Calm 94.4%
Surprised 6.3%
Fear 5.9%
Sad 4.4%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Happy 0%

AWS Rekognition

Age 19-27
Gender Female, 100%
Sad 100%
Surprised 6.3%
Fear 6.1%
Calm 1.7%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Happy 0%

AWS Rekognition

Age 33-41
Gender Male, 91.8%
Calm 70.8%
Angry 11.9%
Sad 7.7%
Surprised 6.8%
Fear 6.3%
Happy 2.6%
Disgusted 1.7%
Confused 1.5%

AWS Rekognition

Age 4-10
Gender Male, 99.2%
Sad 97.7%
Calm 44.9%
Surprised 6.6%
Fear 6%
Angry 0.4%
Happy 0.3%
Confused 0.2%
Disgusted 0.2%

AWS Rekognition

Age 26-36
Gender Female, 56.9%
Sad 99.8%
Surprised 24.1%
Fear 6.4%
Calm 3.8%
Angry 3%
Happy 2.2%
Disgusted 1.6%
Confused 0.7%
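
Each block above is one face from an AWS Rekognition DetectFaces response: an estimated age range, a gender guess with confidence, and per-emotion confidences (which, as the data shows, need not sum to 100). A minimal boto3 sketch with a hypothetical file name:

import boto3

# Minimal sketch: request full face attributes from Rekognition and
# print the age range, gender, and emotion confidences for each face.
client = boto3.client("rekognition")

with open("shahn_java_1960.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # the default returns a smaller attribute set
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')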

Microsoft Cognitive Services

Age 7
Gender Female
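
This age/gender estimate is the shape of output from the legacy Microsoft Face API; Microsoft has since retired these attributes, so the sketch below is illustrative only. The endpoint and key are placeholders.

import requests

# Sketch of the legacy Face API v1.0 detect call that produced
# age/gender estimates (attributes now retired by Microsoft).
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/image.jpg"},
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].capitalize()}')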

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
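
Google Vision reports each face attribute as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a numeric confidence, which is why the entries above read "Very unlikely". A sketch using the google-cloud-vision client, with a hypothetical file name:

from google.cloud import vision

# Sketch of a Google Cloud Vision face-detection call; each attribute
# comes back as a Likelihood enum rather than a numeric score.
client = vision.ImageAnnotatorClient()

with open("shahn_java_1960.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)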

Feature analysis

Amazon

Adult 98.9%
Male 98.9%
Man 98.9%
Person 98.9%
Boy 98.5%
Child 98.5%
Female 94.4%
Girl 94.4%
Cup 78.6%
