Human Generated Data

Title

Untitled (New York City)

Date

1932-1935

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2852

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Clothing 100
Coat 100
Formal Wear 100
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Suit 96.9
Face 96.3
Head 96.3
Photography 95.9
Portrait 95.9
Accessories 93.9
Glasses 93.9
Hat 93.6
Person 88.9
Person 87.1
Bench 84.8
Furniture 84.8
Tie 71.1
Overcoat 69.3
Sitting 65.4
Person 62.2
Shirt 57.6
Text 57.2
Reading 57.2
Jacket 57.1
Blazer 56.5
Tuxedo 55.4
Sun Hat 55.2
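
The Amazon labels above are the kind of output returned by Rekognition's DetectLabels operation. A minimal sketch of how similar tags could be produced, assuming boto3 is installed and AWS credentials are configured; the file name photo.jpg and the thresholds are placeholders, not part of this record:

# Hedged sketch: label tags like the ones listed above, via Amazon Rekognition DetectLabels.
# Assumes boto3 and configured AWS credentials/region; "photo.jpg" stands in for the digitized photograph.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,          # cap on returned labels (illustrative)
    MinConfidence=55.0,    # roughly the lowest score shown above (illustrative)
)

# Print "Name Confidence" pairs in the same style as the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')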

Clarifai
created on 2018-05-10

people 99.9
administration 98.5
adult 98.4
man 96.6
one 96.1
leader 95.1
group 93.5
group together 93.1
many 88.3
chair 88.2
two 87.8
portrait 87.6
woman 82.3
sit 82.3
wear 82
home 82
vehicle 80
outfit 79.4
veil 79.1
war 78.3

Imagga
created on 2023-10-06

man 53.2
business 44.4
businessman 44.2
male 40.4
office 40.4
suit 37.9
professional 34.6
corporate 34.4
people 32.4
executive 31.6
work 29
person 27
happy 26.3
adult 26.1
manager 25.2
handsome 25
men 23.2
businesspeople 22.8
portrait 22
team 21.5
confident 20.9
businesswoman 20.9
meeting 20.7
teamwork 20.4
wagon 19.5
job 19.5
successful 19.2
smiling 18.8
group 18.5
smile 18.5
building 18.2
success 17.7
laptop 17.3
computer 16.8
communication 16.8
colleagues 16.5
corporation 16.4
tie 16.1
working 15.9
wheeled vehicle 15.4
worker 14.5
formal 14.3
guy 14.2
modern 14
together 14
attractive 14
looking 13.6
face 13.5
employee 13.4
sitting 12.9
friendly 12.8
businessmen 12.7
talking 12.4
desk 12.3
bartender 12.1
expression 11.9
technology 11.9
leader 11.6
leadership 11.5
restaurant 11.5
workplace 11.4
career 11.4
senior 11.2
company 11.2
women 11.1
busy 10.6
staff 10.5
boss 10.5
look 10.5
table 10.4
waiter 10.2
lifestyle 10.1
discussing 9.8
partner 9.7
clothing 9.6
partnership 9.6
ethnic 9.5
smart 9.4
container 9.3
phone 9.2
occupation 9.2
indoors 8.8
using 8.7
entrepreneur 8.5
finance 8.4
hand 8.4
notebook 8.4
mature 8.4
service 8.3
glasses 8.3
holding 8.3
human 8.3
necktie 8
coat 7.8
happiness 7.8
standing 7.8
black 7.8
masculine 7.8
conversation 7.8
older 7.8
management 7.7
serious 7.6
casual 7.6
student 7.6
plan 7.6
city 7.5
cheerful 7.3
center 7.3
colleague 7.2
hall 7
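
The Imagga list above is typical of a general image-tagging endpoint. A minimal sketch, assuming Imagga's public v2 tagging API; the credentials, image URL, and response shape below are assumptions for illustration only:

# Hedged sketch: tags like the Imagga list above, from Imagga's v2 tagging API.
# API_KEY, API_SECRET, and IMAGE_URL are placeholders; endpoint and response
# structure are assumptions based on Imagga's public v2 API, not this record.
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')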

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.1
man 98.9
outdoor 96.9
suit 81.6
old 80
black 78.7
white 67.2
posing 38.9

Color Analysis

Face analysis

AWS Rekognition

Age 62-72
Gender Male, 100%
Confused 45.3%
Calm 35.5%
Sad 10.4%
Surprised 8.3%
Fear 6.3%
Angry 1.3%
Disgusted 1.2%
Happy 0.3%

AWS Rekognition

Age 25-35
Gender Female, 96.5%
Fear 98.1%
Surprised 6.3%
Sad 2.2%
Calm 0.1%
Happy 0%
Confused 0%
Disgusted 0%
Angry 0%

AWS Rekognition

Age 2-10
Gender Female, 58.7%
Calm 73.1%
Happy 21%
Surprised 6.9%
Fear 6.8%
Sad 2.5%
Confused 0.7%
Disgusted 0.6%
Angry 0.2%
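
The three AWS Rekognition face records above (age range, gender, emotion scores) match the shape of the DetectFaces operation. A minimal sketch under the same assumptions as the label example, with photo.jpg as a placeholder:

# Hedged sketch: per-face attributes like the AWS Rekognition entries above.
# Assumes boto3 and configured AWS credentials; "photo.jpg" is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unordered; sort to match the high-to-low listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')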

Microsoft Cognitive Services

Age 44
Gender Male

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
Suit 96.9%
Glasses 93.9%
Hat 93.6%

Text analysis

Amazon

AGE
Oranges
9
E
9 NV E PRES
AGE EV
PRES
NV
EV
SUNKIST Oranges
SUNKIST
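
The OCR fragments above (signage such as "SUNKIST Oranges") correspond to the output of Rekognition's DetectText operation. A minimal sketch under the same assumptions as the examples above:

# Hedged sketch: text detections like the fragments listed above.
# Assumes boto3 and configured AWS credentials; "photo.jpg" is a placeholder.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# DetectText returns both LINE and WORD detections; print each detected string.
for detection in response["TextDetections"]:
    print(detection["DetectedText"])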