Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2921

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Face 100
Head 100
Photography 100
Portrait 100
Clothing 99.8
Cap 99.8
Person 99.5
Adult 99.5
Male 99.5
Man 99.5
Baseball Cap 97.1
Coat 80.1
Body Part 72.6
Neck 72.6
Machine 69.6
Wheel 69.6
Hat 64.3
Indoors 62.3
Car 57
Transportation 57
Vehicle 57
Computer Hardware 56.2
Electronics 56.2
Hardware 56.2
Shelf 55.5
Screen 55.2
Monitor 55.2
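
The label-and-confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels API. A minimal sketch of how such tags could be reproduced with boto3 (the image file name and region are hypothetical; configured AWS credentials are assumed):

    # Minimal sketch: generate image labels with Amazon Rekognition (boto3).
    # Assumes AWS credentials are configured; "untitled_nyc.jpg" is a hypothetical file.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_nyc.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # roughly matches the lowest scores listed above
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")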

Clarifai
created on 2018-05-10

people 99.8
adult 99
one 98.1
man 97.4
portrait 96.2
administration 95.9
wear 92.1
monochrome 91.8
leader 87.4
offense 82.8
street 80.7
furniture 80.4
military 79.6
veil 79.1
war 78.1
facial hair 77.4
lid 76.1
outfit 74.8
room 73.2
vehicle 72

Imagga
created on 2023-10-07

man 48.4
male 36.2
person 32.6
senior 29
portrait 27.8
people 24.5
old 23.7
adult 23.4
looking 23.2
face 22.7
glasses 22.2
business 21.3
businessman 21.2
mature 20.4
work 20.4
elderly 20.1
office 18
handsome 17.8
men 17.2
old-timer 16.3
happy 16.3
computer 16.3
suit 15.8
serious 15.3
working 15
worker 14.5
grandfather 14.4
black 14.4
smile 14.2
expression 13.6
casual 13.6
laptop 13.2
professional 13.1
eyes 12.9
hair 12.7
gray 12.6
retirement 12.5
lifestyle 12.3
sitting 12
head 11.8
retired 11.6
age 11.4
device 11.3
guy 11
television 11
aged 10.9
smiling 10.9
job 10.6
look 10.5
human 10.5
one 10.4
occupation 10.1
hand 9.9
to 9.7
indoors 9.7
technology 9.6
home 9.6
tie 9.5
equipment 9.4
shirt 9.3
call 9.2
indoor 9.1
modern 9.1
confident 9.1
success 8.8
close 8.6
thinking 8.5
beard 8.5
clothing 8.5
desk 8.5
telephone 8.2
lady 8.1
pensioner 7.8
education 7.8
executive 7.8
older 7.8
corporate 7.7
hat 7.7
intellectual 7.6
one person 7.5
pay-phone 7.5
holding 7.4
alone 7.3
machine 7.2
telecommunication system 7.1
interior 7.1
scholar 7.1
happiness 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 95.5
man 93.1
black 83.6
white 63.9
old 57.8

Color Analysis

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Confused 43%
Sad 33.7%
Angry 12%
Calm 10.4%
Disgusted 9.5%
Surprised 6.7%
Fear 6.5%
Happy 0.6%
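
Figures like the age range, gender, and emotion scores above come from Rekognition's face detection. A hedged sketch under the same assumptions (boto3, configured credentials, hypothetical file name):

    # Minimal sketch: face attributes (age range, gender, emotions) via Rekognition.
    # Assumes AWS credentials are configured; the image file name is hypothetical.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_nyc.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")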

Microsoft Cognitive Services

Age 40
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Adult 99.5%
Male 99.5%
Man 99.5%
Wheel 69.6%
Hat 64.3%
Car 57%

Categories

Imagga

people portraits 50.2%
paintings art 48.7%

Text analysis

Amazon

ED
TS
FRESH
5
FRESH now
...
Brite
now
4
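
Fragments such as "FRESH" and "Brite" above are typical of Rekognition's text detection picking up signage in the photograph. A sketch under the same assumptions as the earlier examples:

    # Minimal sketch: detect text (signage) in the photograph via Rekognition.
    # Assumes AWS credentials are configured; the image file name is hypothetical.
    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("untitled_nyc.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip per-word duplicates
            print(detection["DetectedText"])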

Google

印 arite TS ED n.
n.
arite
TS
ED