Human Generated Data

Title

Untitled (New York City Reformatory, New Hampton, New York)

Date

May 1934-June 1934

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2800

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

People 100
Face 100
Head 100
Photography 100
Portrait 100
Person 98.9
Person 98.2
Adult 98.2
Male 98.2
Man 98.2
Person 97.9
Adult 97.9
Male 97.9
Man 97.9
Person 97.4
Person 97.3
Adult 97.3
Male 97.3
Man 97.3
Person 97.2
Adult 97.2
Male 97.2
Man 97.2
Person 96.3
Adult 96.3
Male 96.3
Man 96.3
Person 96.1
Person 96.1
Person 95.1
Person 94.3
Adult 94.3
Bride 94.3
Female 94.3
Wedding 94.3
Woman 94.3
Person 94.2
Person 92.4
Adult 92.4
Male 92.4
Man 92.4
Happy 88
Smile 88
American Football 76.4
American Football (Ball) 76.4
Ball 76.4
Football 76.4
Sport 76.4
Groupshot 71.3
Body Part 69.2
Finger 69.2
Hand 69.2
Baseball Cap 61.3
Cap 61.3
Clothing 61.3
Hat 61.3
Crowd 61.1
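
The Amazon labels above are the kind of per-image output returned by the Rekognition label-detection API: a label name plus a confidence score on a 0-100 scale. A minimal Python sketch of such a call is shown below; it assumes boto3 is installed and AWS credentials are configured, and "photo.jpg" is a placeholder file name, not the museum's image.

# Sketch: Rekognition-style labels for a photograph (boto3 and AWS
# credentials assumed; "photo.jpg" is a placeholder).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},   # image sent inline as raw bytes
        MinConfidence=60,            # drop labels below 60% confidence
    )

# Print "Label confidence" pairs, mirroring the list above (e.g. "Crowd 61.1").
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))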

Clarifai
created on 2018-05-10

people 99.9
group together 99.4
man 99.1
group 98.9
portrait 98.6
many 98.2
adult 96.9
several 93.7
military 92.7
uniform 90.7
four 90.3
retro 90.3
wear 89.7
facial expression 89.4
outfit 86.7
baseball 84.8
war 84.5
woman 83.2
leader 82.6
sitting 82.4

Imagga
created on 2023-10-06

kin 100
family 45.4
father 39.9
happy 37
man 37
mother 36.8
male 36.3
smiling 35.4
child 34.9
together 34.2
couple 33.1
people 32.3
portrait 30.4
daughter 28.7
senior 27.2
parent 26.1
love 25.2
dad 24.9
group 24.2
son 24.1
boy 22.6
park 22.2
home 20.7
children 20
happiness 19.6
outdoors 18.7
adult 18.1
lifestyle 18.1
husband 17.2
men 17.2
sitting 16.3
cheerful 16.3
casual 16.1
clothing 16.1
togetherness 16
person 16
mature 15.8
smile 15.7
enjoying 15.1
grandfather 14.8
elderly 14.4
wife 14.2
fun 14.2
little 14.1
day 14.1
two 13.5
kid 13.3
friends 13.1
military uniform 13.1
60s 12.7
30s 12.5
camera 12
women 11.9
sibling 11.7
loving 11.4
laughing 11.3
looking 11.2
old 11.1
uniform 11
joy 10.9
sixties 10.8
grandmother 10.8
cute 10.8
parents 10.7
childhood 10.7
married 10.5
friendship 10.3
summer 10.3
leisure 10
world 9.8
mom 9.7
affectionate 9.7
generation 9.6
kids 9.4
brother 9.3
horizontal 9.2
outdoor 9.2
girls 9.1
attractive 9.1
baby 8.8
mum 8.8
indoors 8.8
looking camera 8.7
four 8.6
relationship 8.4
autumn 7.9
having fun 7.9
70s 7.9
seniors 7.9
middle aged 7.8
thirties 7.8
couch 7.7
outside 7.7
sofa 7.7
20s 7.3
playing 7.3
relaxing 7.3
school 7.2
face 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.8
outdoor 98.2
posing 89.9
group 79.1
team 62
old 46.8
crowd 1.8

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 100%
Happy 98.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.7%
Confused 0.1%
Calm 0.1%
Disgusted 0%

AWS Rekognition

Age 18-26
Gender Male, 100%
Calm 61.5%
Confused 25.1%
Angry 11.2%
Surprised 6.4%
Fear 5.9%
Sad 2.4%
Disgusted 0.8%
Happy 0.3%

AWS Rekognition

Age 20-28
Gender Male, 99.8%
Happy 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.2%
Confused 0.1%
Disgusted 0%
Calm 0%

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Happy 99.1%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.3%
Calm 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 33-41
Gender Female, 63.4%
Calm 88%
Surprised 6.7%
Fear 6.3%
Happy 3.7%
Sad 3.1%
Confused 1.9%
Angry 1.2%
Disgusted 0.6%

AWS Rekognition

Age 24-34
Gender Male, 100%
Happy 97.8%
Surprised 6.5%
Fear 5.9%
Sad 2.2%
Angry 0.9%
Calm 0.4%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Calm 75.7%
Happy 8.6%
Surprised 6.7%
Fear 6.1%
Angry 5.1%
Confused 4.6%
Disgusted 2.8%
Sad 2.7%

AWS Rekognition

Age 9-17
Gender Female, 62.7%
Happy 79.9%
Fear 11.6%
Surprised 6.6%
Calm 3.3%
Sad 2.6%
Confused 1.7%
Angry 1.2%
Disgusted 0.9%

AWS Rekognition

Age 25-35
Gender Male, 99.8%
Happy 99.2%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.2%
Confused 0.1%
Calm 0.1%
Disgusted 0.1%

AWS Rekognition

Age 24-34
Gender Male, 89.4%
Calm 95.9%
Surprised 6.3%
Fear 5.9%
Sad 3.6%
Angry 0.1%
Disgusted 0.1%
Confused 0.1%
Happy 0%

AWS Rekognition

Age 24-34
Gender Male, 98.5%
Calm 59.7%
Sad 44.9%
Fear 7.1%
Surprised 7.1%
Confused 7%
Happy 1.3%
Disgusted 1.1%
Angry 0.9%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Happy 97.9%
Surprised 6.5%
Fear 5.9%
Sad 2.2%
Calm 1%
Confused 0.4%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 28-38
Gender Male, 97.2%
Happy 99.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0%
Calm 0%
Disgusted 0%
Confused 0%
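
Each AWS Rekognition block above (an age range, a gender estimate, and eight emotion scores) matches the per-face output of the face-detection API when all attributes are requested. The sketch below is illustrative only, under the same assumptions as the label example (boto3, configured credentials, placeholder file name).

# Sketch: per-face age range, gender, and emotion scores as reported above.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 18, "High": 26}
    gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 100.0}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    # Emotions come back unordered; sort by confidence to match the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")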

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 35
Gender Male

Microsoft Cognitive Services

Age 23
Gender Male

Microsoft Cognitive Services

Age 22
Gender Male

Microsoft Cognitive Services

Age 62
Gender Male

Microsoft Cognitive Services

Age 38
Gender Male

Microsoft Cognitive Services

Age 46
Gender Male

Microsoft Cognitive Services

Age 23
Gender Male

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 28
Gender Female

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 51
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male
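
The Microsoft Cognitive Services entries (a single age estimate and a gender label per face) correspond to the Azure Face API detect call of that era with age and gender requested as face attributes; Microsoft has since retired those attributes, so the sketch below is historical and illustrative only. The endpoint region, key, and file name are placeholders.

# Sketch: Face API v1.0 detect call requesting age and gender attributes
# (attributes since retired; endpoint, key, and file name are placeholders).
import requests

endpoint = "https://<region>.api.cognitive.microsoft.com/face/v1.0/detect"
headers = {
    "Ocp-Apim-Subscription-Key": "<subscription-key>",
    "Content-Type": "application/octet-stream",
}
params = {"returnFaceAttributes": "age,gender"}

with open("photo.jpg", "rb") as f:
    faces = requests.post(endpoint, headers=headers, params=params, data=f.read()).json()

for face in faces:
    attrs = face["faceAttributes"]
    print(f"Age {round(attrs['age'])}")              # e.g. "Age 44"
    print(f"Gender {attrs['gender'].capitalize()}")  # e.g. "Gender Male"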

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
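
The Google Vision blocks report categorical likelihoods (Very unlikely through Very likely) rather than numeric scores, which is how the Cloud Vision face-detection API expresses surprise, anger, sorrow, joy, headwear, and blur for each detected face. A sketch, assuming the google-cloud-vision client (v2 or later) with application default credentials and a placeholder file name:

# Sketch: per-face likelihoods as reported above (google-cloud-vision v2+
# and application default credentials assumed; "photo.jpg" is a placeholder).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum: VERY_UNLIKELY, UNLIKELY, POSSIBLE,
    # LIKELY, or VERY_LIKELY, matching the categories listed above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)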

Feature analysis

Amazon

Person 98.9%
Adult 98.2%
Male 98.2%
Man 98.2%
Bride 94.3%
Female 94.3%
Woman 94.3%

Categories

Imagga

people portraits 99.3%