Human Generated Data

Title

Grand Central Station, NYC

Date

1952

People

Artist: Larry Silver, American, born 1934

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bruce Silverstein, 2018.394

Machine Generated Data

Tags

Amazon
created on 2020-03-11

Person 99.1
Human 99.1
Person 98.6
Person 98.3
Person 97.8
Flooring 97.4
Person 96.7
Musical Instrument 87.5
Musician 87.5
Floor 85.6
Tarmac 85.4
Asphalt 85.4
Person 84.4
Leisure Activities 81.1
Sitting 79.7
Interior Design 75.6
Indoors 75.6
People 59.6
Music Band 59.3
Performer 57.2
Guitar 56.5
Guitarist 56.5
Furniture 56.1
Couch 56.1

Clarifai
created on 2020-03-11

people 100
group together 99.3
leader 98.5
group 98.1
administration 97.9
many 96.6
man 96.3
war 96.1
adult 96.1
military 93.8
music 91.9
vehicle 90
chair 89.2
woman 88.6
outfit 87.8
veil 87.3
soldier 86.8
furniture 86.4
several 86.2
child 84.7

Imagga
created on 2020-03-11

classroom 62.1
room 55.3
people 25.6
business 19.4
man 17
urban 14.8
men 14.6
musical instrument 14.6
chair 14.5
group 14.5
brass 14.3
person 14.2
interior 14.1
city 14.1
women 13.4
wind instrument 12.3
indoors 12.3
black 12.1
travel 12
adult 11.8
office 11
motion 10.3
music 10.1
indoor 10
life 9.9
building 9.8
equipment 9.7
window 9.4
silhouette 9.1
transportation 9
train 8.6
crowd 8.6
work 8.6
transport 8.2
businessman 7.9
table 7.8
seat 7.7
musical 7.7
walking 7.6
fashion 7.5
hall 7.5
technology 7.4
inside 7.4
light 7.3
board 7.3
passenger 7.1
modern 7

Google
created on 2020-03-11

Microsoft
created on 2020-03-11

black and white 96.6
person 96.3
clothing 88.1
man 76

Color Analysis

Face analysis

AWS Rekognition

Age 23-37
Gender Female, 52.3%
Fear 50.1%
Sad 46.1%
Confused 45.7%
Surprised 45.2%
Disgusted 45.2%
Calm 47.5%
Happy 45%
Angry 45.2%

AWS Rekognition

Age 23-35
Gender Male, 99.2%
Disgusted 0.1%
Surprised 0.2%
Happy 0.1%
Sad 0.4%
Confused 0.1%
Calm 98.9%
Fear 0%
Angry 0.3%

AWS Rekognition

Age 13-23
Gender Male, 54%
Sad 45.2%
Happy 46.5%
Disgusted 45.3%
Surprised 48.9%
Calm 48.4%
Fear 45.3%
Angry 45.1%
Confused 45.4%

AWS Rekognition

Age 22-34
Gender Female, 52%
Happy 45.2%
Calm 45.4%
Sad 48.7%
Confused 46.7%
Disgusted 45.2%
Fear 48%
Angry 45.4%
Surprised 45.4%

AWS Rekognition

Age 38-56
Gender Male, 54.6%
Fear 46.2%
Confused 45.1%
Happy 45.1%
Surprised 45.3%
Sad 46%
Angry 46.4%
Calm 50.8%
Disgusted 45.1%

AWS Rekognition

Age 17-29
Gender Male, 54.6%
Surprised 45.5%
Angry 53.4%
Calm 45.1%
Disgusted 45%
Sad 45%
Confused 45.1%
Happy 45.1%
Fear 45.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%