Human Generated Data

Title

New York City

Date

2001

People

Artist: Louis Stettner, American, 1922–2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2011.567

Copyright

© Louis Stettner Estate

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Person 99.4
Human 99.4
Vehicle 98.3
Automobile 98.3
Car 98.3
Transportation 98.3
Person 97.5
Person 95.7
Silhouette 95.7
Person 93.2
Person 91.5
Sitting 88.9
Person 84.1
Furniture 67.8
Bench 67.8
Person 62.5
Coat 55.9
Suit 55.9
Overcoat 55.9
Clothing 55.9
Apparel 55.9
Fashion 55.4
Robe 55.4
Gown 55.4
Evening Dress 55.4
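
These label/confidence pairs are characteristic of Amazon Rekognition's DetectLabels output. A minimal sketch of how such tags can be reproduced with boto3 follows; AWS credentials are assumed to be configured, and the image file name is hypothetical.

```python
# Minimal sketch: image labels via Amazon Rekognition's DetectLabels API.
# Assumes AWS credentials are configured; the file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("new_york_city_2001.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the listing above bottoms out near 55%
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```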

Clarifai
created on 2018-02-10

people 99.7
man 97.9
adult 97.7
leader 97.1
administration 96.1
two 95.8
one 95.5
woman 94.1
chair 94
indoors 92.3
furniture 92.3
monochrome 91.8
group 91.4
room 90.4
wear 85.7
portrait 85.3
street 84.8
three 82.8
airport 81.7
group together 81.6
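
A hedged sketch of how tags like these are requested from Clarifai's v2 predict endpoint (the REST API in use around 2018). The API key and image URL are placeholders, and the model ID is an assumption based on Clarifai's public general model.

```python
# Hedged sketch of Clarifai's v2 predict REST endpoint (circa 2018).
# API key and image URL are placeholders; the model ID is assumed to be
# Clarifai's public general model.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"                    # placeholder
GENERAL_MODEL = "aaa03c23b3724a16a56b629203edc62c"   # assumed general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{GENERAL_MODEL}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/image.jpg"}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai returns values in [0, 1]; the listing shows percentages
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```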

Imagga
created on 2018-02-10

piano 100
man 34.9
business 34.6
male 31.9
laptop 30.7
computer 30.1
office 29.3
businessman 29.1
people 29
work 28.2
person 26.5
sitting 25.8
meeting 25.4
corporate 22.3
suit 21.9
professional 21.3
adult 21.1
businesswoman 20
happy 18.8
table 18.6
working 18.6
upright 17.8
men 17.2
keyboard 17.1
job 16.8
indoors 16.7
music 16.3
executive 16.2
businesspeople 16.1
team 16.1
group 16.1
worker 15.2
smiling 15.2
instrument 14.5
communication 14.3
hand 13.7
chair 13.4
technology 13.4
desk 13.2
room 13
playing 12.8
home 12.8
black 12.7
teacher 12.3
together 12.3
education 12.1
smile 12.1
success 11.3
looking 11.2
manager 11.2
happiness 11
indoor 11
notebook 10.8
workplace 10.5
talking 10.5
portrait 10.4
play 10.3
mature 10.2
lifestyle 10.1
confident 10
modern 9.8
colleagues 9.7
corporation 9.6
women 9.5
career 9.5
child 9.2
cheerful 8.9
employee 8.8
mid adult 8.7
musical 8.6
learn 8.5
senior 8.4
attractive 8.4
teamwork 8.3
successful 8.2
classroom 7.9
discussing 7.9
hands 7.8
conference 7.8
face 7.8
discussion 7.8
busy 7.7
reading 7.6
sit 7.6
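
Imagga's tagging service is a REST endpoint authenticated with an API key/secret pair. A minimal sketch, assuming Imagga's v2 /tags endpoint and a publicly reachable image URL (the credentials and URL are placeholders):

```python
# Hedged sketch of Imagga's v2 tagging endpoint; the key/secret pair and
# image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/image.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # HTTP Basic auth
)
for tag in resp.json()["result"]["tags"]:
    # Imagga confidences are already on a 0-100 scale
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```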

Google
created on 2018-02-10

Microsoft
created on 2018-02-10

person 99.2
indoor 94.8
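
Microsoft's tags come from the Azure Computer Vision analyze operation. A hedged sketch, assuming the v3.2 REST surface (a 2018 record would have used an earlier API version, but the response shape is similar); the endpoint region, key, and image URL are placeholders:

```python
# Hedged sketch of the Azure Computer Vision "analyze" call for tags;
# endpoint region, subscription key, and image URL are placeholders.
import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/image.jpg"},
)
for tag in resp.json()["tags"]:
    # Azure returns confidences in [0, 1]; the listing shows percentages
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```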

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Male, 50.2%
Happy 49.5%
Surprised 49.5%
Disgusted 49.6%
Sad 50%
Angry 49.6%
Confused 49.6%
Calm 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50%
Confused 49.5%
Happy 49.5%
Surprised 49.5%
Angry 49.5%
Sad 50.2%
Calm 49.7%
Disgusted 49.5%

AWS Rekognition

Age 38-59
Gender Male, 50.3%
Happy 49.6%
Disgusted 49.6%
Confused 49.5%
Angry 49.7%
Surprised 49.5%
Sad 49.8%
Calm 49.7%
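
Each block above corresponds to one entry in the FaceDetails array returned by Amazon Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch (file name hypothetical, AWS credentials assumed configured):

```python
# Minimal sketch of Amazon Rekognition face analysis (DetectFaces with
# all attributes), yielding the age range, gender, and emotion scores
# listed above. The file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition")

with open("new_york_city_2001.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```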

Feature analysis

Amazon

Person 99.4%
Car 98.3%
Bench 67.8%

Captions

Microsoft
created on 2018-02-10

a man sitting on a table 84.8%
a man sitting on a bench 77.4%
a man sitting at a table 77.3%
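
Captions such as these come from the Azure Computer Vision describe operation, which returns ranked caption candidates with confidences. A hedged sketch, again assuming the v3.2 REST surface with placeholder endpoint, key, and image URL:

```python
# Hedged sketch of the Azure Computer Vision "describe" call, which
# returns ranked caption candidates like those above; endpoint region,
# key, and image URL are placeholders.
import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.com/image.jpg"},
)
for caption in resp.json()["description"]["captions"]:
    # Azure returns confidences in [0, 1]; the listing shows percentages
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```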