Human Generated Data

Title

Untitled (Fourteenth Street, New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2467

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 98.1
Male 98.1
Man 98.1
Person 98.1
Adult 98
Male 98
Man 98
Person 98
Adult 97.9
Male 97.9
Man 97.9
Person 97.9
Adult 96.9
Male 96.9
Man 96.9
Person 96.9
Face 95.4
Head 95.4
Person 95
Musical Instrument 92.3
Person 92
Person 89.2
Person 87.7
Person 87.1
Person 86.4
Person 86.2
Accordion 84
Person 79.1
People 57
Photography 55.8
Portrait 55.2
Group Performance 55.1
Leisure Activities 55.1
Music 55.1
Music Band 55.1
Musician 55.1
Performer 55.1
City 55.1
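
The scores above match the shape of AWS Rekognition's DetectLabels output. A minimal sketch of how tags like these could be produced with boto3; the file name is a placeholder, and the 55-point confidence floor is an assumption based on the lowest scores in the list:

import boto3

# Assumes AWS credentials are already configured; region and path are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("fourteenth_street.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

# MinConfidence=55 mirrors the apparent cutoff in the tag list above.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

for label in response["Labels"]:
    # Person-like labels can carry multiple per-instance detections, which would
    # explain the repeated Adult/Male/Man/Person entries at slightly different scores.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')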

Clarifai
created on 2018-05-10

people 99.9
group 99.1
adult 98.7
many 97.7
administration 96.6
group together 95.7
man 95.1
leader 95
several 92.3
one 91.2
two 89.4
vehicle 88.2
music 88.1
wear 87.8
war 87.1
musician 85.9
woman 85.3
military 84.9
portrait 83.5
outfit 83.2
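
Clarifai's v2 predict endpoint returns ranked concepts with 0–1 confidence values, which would map to the percentages above. A hedged sketch against the plain REST API; the key, model name, and image URL are all placeholders:

import requests

API_KEY = "YOUR_CLARIFAI_KEY"          # placeholder credential
MODEL = "general-image-recognition"    # assumed general-concepts model name

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/shahn.jpg"}}}]},
)

# Concept values are on a 0-1 scale; scaling by 100 matches the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')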

Imagga
created on 2023-10-05

accordion 100
keyboard instrument 100
wind instrument 100
musical instrument 100
music 31.6
piano 31.5
instrument 27
playing 26.5
keyboard 25.4
play 25
man 23.5
musical 22
male 22
person 20.7
people 20.6
adult 20
musician 18.5
sound 15
key 15
happy 14.4
portrait 14.2
black 13.8
attractive 13.3
indoors 13.2
education 13
hand 12.9
home 12.8
business 12.8
keys 12.7
old 12.5
performance 12.5
child 11.7
practice 10.7
learning 10.3
smiling 10.1
face 9.9
pianist 9.9
one 9.7
technology 9.6
boy 9.6
classical 9.6
women 9.5
learn 9.4
lifestyle 9.4
holding 9.1
human 9
fun 9
working 8.8
song 8.8
together 8.8
closeup 8.8
hands 8.7
artist 8.7
cute 8.6
sitting 8.6
casual 8.5
professional 8.4
senior 8.4
classic 8.4
finger 8.3
laptop 8.2
computer 8
looking 8
chord 7.9
work 7.8
performer 7.8
lesson 7.8
fingers 7.6
player 7.6
retro 7.4
note 7.4
guy 7.4
equipment 7.3
art 7.2
family 7.1
job 7.1
happiness 7.1
modern 7
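
Imagga's /v2/tags endpoint returns a similar tag-and-confidence list. A sketch under the assumption of HTTP Basic auth with an API key/secret pair; credentials and the image URL are placeholders:

import requests

auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # placeholder credentials

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/shahn.jpg"},
    auth=auth,
)

# Confidence values are already on a 0-100 scale, matching the list above.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')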

Microsoft
created on 2018-05-10

person 99.7
man 94.9
accordion 93.2
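
These three tags have the shape of Azure Computer Vision's tag operation, which returns name/confidence pairs on a 0–1 scale. A sketch with placeholder endpoint and key:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder key
    json={"url": "https://example.org/shahn.jpg"},
)

# Scaling the 0-1 confidences by 100 matches the percentages above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')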

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 100%
Calm 91.8%
Surprised 6.5%
Fear 6%
Confused 2.9%
Sad 2.8%
Angry 1.3%
Disgusted 0.9%
Happy 0.2%

AWS Rekognition

Age 25-35
Gender Male, 100%
Sad 75%
Calm 37.6%
Confused 12.2%
Surprised 12%
Fear 6.6%
Angry 3.2%
Happy 1.3%
Disgusted 1.3%

AWS Rekognition

Age 25-35
Gender Male, 99.9%
Calm 88.8%
Surprised 10.3%
Fear 6%
Confused 3.6%
Sad 2.3%
Angry 0.3%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 30-40
Gender Female, 83.3%
Calm 59.5%
Sad 35%
Fear 8.6%
Surprised 7.3%
Confused 4%
Angry 2.7%
Disgusted 2.1%
Happy 1.3%

AWS Rekognition

Age 24-34
Gender Female, 90.5%
Calm 53.7%
Angry 24.7%
Disgusted 8%
Surprised 7.8%
Fear 6.4%
Sad 3.9%
Confused 2.9%
Happy 2.1%

AWS Rekognition

Age 23-33
Gender Male, 93.7%
Calm 65.2%
Sad 13.9%
Happy 13.7%
Surprised 7%
Fear 6.3%
Disgusted 1.7%
Angry 1.7%
Confused 1.3%

AWS Rekognition

Age 7-17
Gender Female, 73.2%
Calm 86.2%
Surprised 10.5%
Fear 5.9%
Disgusted 4.6%
Sad 2.3%
Confused 1.2%
Angry 0.4%
Happy 0.3%
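
The blocks above match the shape of Rekognition's DetectFaces output with Attributes=["ALL"]: an age range, a gender value with confidence, and independent per-emotion scores. The emotion confidences are not a probability distribution, which is why the second face's values sum to more than 100%. A minimal sketch, with the same placeholder file name as before:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("fourteenth_street.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

# Attributes=["ALL"] requests AgeRange, Gender, and Emotions in addition to
# the default bounding-box data.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # independent scores, one per emotion type
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')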

Microsoft Cognitive Services

Age 61
Gender Male

Microsoft Cognitive Services

Age 48
Gender Male

Microsoft Cognitive Services

Age 46
Gender Male

Microsoft Cognitive Services

Age 53
Gender Male
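
Point age estimates plus a gender label match Microsoft's Face API detect operation (its age and gender attributes have since been retired). A sketch with placeholder endpoint, key, and image URL:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder key
    json={"url": "https://example.org/shahn.jpg"},
)

# The response is a list of faces, each with the requested attributes.
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].capitalize()}')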

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
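
Google Cloud Vision reports face attributes as Likelihood enum labels (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which matches the wording above. A minimal sketch assuming application-default credentials and a placeholder file path:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("fourteenth_street.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is a Likelihood enum value; .name recovers the label.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)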

Feature analysis

Amazon

Adult 98.1%
Male 98.1%
Man 98.1%
Person 98.1%
