Human Generated Data

Title

Untitled (Fourteenth Street, New York City)

Date

1932-1934

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2472

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Adult 97.4
Male 97.4
Man 97.4
Person 97.4
Adult 97.2
Male 97.2
Man 97.2
Person 97.2
Face 95.8
Head 95.8
Adult 94.4
Male 94.4
Man 94.4
Person 94.4
Person 93.9
Person 92.9
Musical Instrument 92.4
Person 89.8
Accordion 85.2
Person 66.8
Accessories 60.4
Glasses 60.4
Photography 57.7
Portrait 57.7
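
The Amazon labels above pair a label name with a confidence score. A minimal sketch of how such labels can be requested from Amazon Rekognition with boto3 (AWS credentials are assumed to be configured; "photo.jpg" is a hypothetical local file, not this museum image):

# Sketch: label detection with Amazon Rekognition via boto3.
# Assumes configured AWS credentials; "photo.jpg" is a hypothetical local file.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=50,
    )

# Each label carries a name and a confidence percentage, as listed above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")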

Clarifai
created on 2018-05-10

people 99.9
adult 99.2
group 98.7
group together 97.8
administration 97.7
leader 97.7
man 97.1
portrait 96.5
wear 96.1
one 96
two 95.8
music 95.8
many 95.5
several 93.4
military 90.8
outfit 90.5
musician 89.3
war 88.9
three 85.1
chair 84.8
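
The Clarifai concepts above are the kind of output returned by a Clarifai model-prediction call. A rough sketch against the v2 REST predict endpoint (the API key, model ID, and image URL are placeholders, not values from this record):

# Sketch: concept prediction with the Clarifai v2 REST API.
# The API key, model ID, and image URL below are placeholders.
import requests

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
response.raise_for_status()

# Concepts come back with a name and a 0-1 score, shown above as percentages.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")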

Imagga
created on 2023-10-06

accordion 100
wind instrument 100
musical instrument 100
concertina 91.5
keyboard instrument 86.6
free-reed instrument 73.2
man 34.9
male 28.4
adult 26.5
business 24.3
people 23.4
person 23
businessman 20.3
hand 18.2
device 17.7
music 17.2
face 16.3
office 15.3
instrument 14.9
portrait 14.9
play 14.6
playing 14.6
computer 14.4
work 14.1
corporate 13.7
men 13.7
laptop 13.7
musical 13.4
old 13.2
happy 13.2
keyboard 13.1
sound 13.1
education 13
attractive 11.9
professional 11.8
musician 11.7
smiling 11.6
holding 11.6
working 11.5
indoors 11.4
senior 11.2
casual 11
smile 10.7
looking 10.4
technology 10.4
notebook 10.3
occupation 10.1
businesswoman 10
handsome 9.8
worker 9.8
job 9.7
one 9.7
black 9.6
performance 9.6
businesspeople 9.5
career 9.5
sitting 9.4
lifestyle 9.4
mature 9.3
finger 9.2
piano 8.8
retirement 8.6
paper 8.6
elderly 8.6
employee 8.6
culture 8.6
youth 8.5
one person 8.5
communication 8.4
color 8.3
executive 8.3
fun 8.2
confident 8.2
suit 8.1
boy 7.8
concentration 7.7
confidence 7.7
key 7.5
manager 7.4
entertainment 7.4
indoor 7.3
group 7.3
student 7.2
home 7.2
clothing 7.2
building 7.1
financial 7.1
day 7.1
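
The Imagga tags above match the shape of Imagga's /v2/tags endpoint, which returns tag names with confidence scores. A sketch using HTTP basic auth (the API key, secret, and image URL are placeholders):

# Sketch: image tagging with the Imagga /v2/tags REST endpoint.
# The API key/secret pair and the image URL are placeholders.
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
)
response.raise_for_status()

# Each entry holds a confidence score and the tag text keyed by language.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")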

Microsoft
created on 2018-05-10

accordion 99.1
person 97.5
man 95.4
music 88.8
old 51.6
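
The Microsoft tags above resemble output from the Azure Computer Vision "Tag Image" operation. A hedged sketch against the REST endpoint (the resource endpoint, key, image URL, and API version are placeholders; the version actually used for this 2018 record is not stated):

# Hedged sketch: image tagging with the Azure Computer Vision "Tag Image" REST operation.
# Endpoint, subscription key, and image URL are placeholders.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
response = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
response.raise_for_status()

# Each tag has a name and a 0-1 confidence, shown above as percentages.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")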

Color Analysis

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 98.9%
Fear 52.1%
Confused 29.2%
Calm 10.8%
Angry 7.9%
Surprised 7.9%
Sad 3.1%
Disgusted 2.5%
Happy 1.5%

AWS Rekognition

Age 31-41
Gender Male, 99.9%
Calm 75.3%
Confused 10.1%
Surprised 7.4%
Fear 7%
Sad 4.7%
Angry 2.8%
Disgusted 0.9%
Happy 0.4%

AWS Rekognition

Age 48-54
Gender Male, 99.9%
Calm 97.3%
Surprised 7.3%
Fear 5.9%
Sad 2.2%
Angry 0.5%
Confused 0.2%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 42-50
Gender Male, 99.9%
Fear 78%
Surprised 27%
Calm 9.6%
Sad 5.5%
Angry 1.2%
Disgusted 1.2%
Confused 0.8%
Happy 0.4%

AWS Rekognition

Age 23-33
Gender Male, 100%
Calm 59.2%
Happy 19.3%
Surprised 7.6%
Fear 7.4%
Sad 6%
Confused 3.3%
Disgusted 3%
Angry 1.6%
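
The age ranges, gender estimates, and emotion scores in the AWS Rekognition blocks above follow the shape of Rekognition's face-detection response. A minimal sketch with boto3 (credentials assumed configured; "photo.jpg" is a hypothetical local file):

# Sketch: face analysis with Amazon Rekognition detect_faces via boto3.
# Assumes configured AWS credentials; "photo.jpg" is a hypothetical local file.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned with confidence percentages, listed highest first here.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")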

Microsoft Cognitive Services

Age 45
Gender Male

Microsoft Cognitive Services

Age 72
Gender Male

Microsoft Cognitive Services

Age 37
Gender Male

Microsoft Cognitive Services

Age 46
Gender Male
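
The Microsoft Cognitive Services age and gender estimates above correspond to the Azure Face API "detect" operation with age and gender attributes requested. Those attributes have since been retired from general availability, so this is only an illustrative sketch; the endpoint, key, and image URL are placeholders:

# Hedged sketch: face detection with the Azure Face API REST "detect" operation.
# Endpoint, subscription key, and image URL are placeholders; age and gender
# attributes are no longer generally available from this service.
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
response = requests.post(
    f"{endpoint}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/photo.jpg"},
)
response.raise_for_status()

for face in response.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].title()}")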

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
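
The Google Vision rows above are likelihood ratings from its face-detection feature. A sketch with the google-cloud-vision client library (Google application credentials assumed configured; "photo.jpg" is a placeholder path):

# Sketch: face-detection likelihoods with the Google Cloud Vision client library.
# Assumes Google application credentials are configured; "photo.jpg" is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum values such as VERY_UNLIKELY ... VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)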

Feature analysis

Amazon

Adult 98.7%
Male 98.7%
Man 98.7%
Person 98.7%
Glasses 60.4%

Text analysis

Amazon

14th ff
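
The detected text above ("14th ff") is the kind of result Amazon Rekognition's text-detection call returns for signage in a photograph. A minimal sketch with boto3 (credentials assumed configured; "photo.jpg" is a hypothetical local file):

# Sketch: text detection with Amazon Rekognition via boto3.
# Assumes configured AWS credentials; "photo.jpg" is a hypothetical local file.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections give whole lines (for example, a street sign); WORD detections give single words.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")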