Human Generated Data

Title

New York Waterfront

Date

1951

People

Artist: George Heyer, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dr. Daniel Bell, P1979.53

Machine Generated Data

Tags

Amazon
created on 2019-04-06

Human 99.8
Person 99.8
Person 99.7
Person 99.7
Person 99.1
Person 98.9
Person 96.9
Clothing 96.5
Apparel 96.5
Footwear 94.1
Shoe 94.1
Shoe 90.8
Person 82.5
Pants 69.7
Face 68.8
Outdoors 68.7
People 65
Nature 61.6
Countryside 61.6
Shoe 59.9
Overcoat 59.6
Coat 59.6
Hat 55.2
Shoe 53.5

Clarifai
created on 2018-03-23

people 100
group 99.3
adult 98.7
group together 98.5
man 96.6
administration 95.1
several 92.7
actor 91.5
musician 91.4
three 91
music 90.4
many 90.2
woman 89.1
leader 88.9
five 88.9
outfit 88.6
wear 88.1
portrait 86.8
four 84.2
two 83.8

Imagga
created on 2018-03-23

man 28.9
male 24.8
person 24.6
people 22.3
sax 22.1
silhouette 19
musical instrument 17.8
black 17.7
wind instrument 17.7
business 15.2
stage 14.8
adult 13.9
men 13.7
office 12.8
old 12.5
businessman 12.4
platform 12
symbol 11.4
clothing 11.2
sunset 10.8
dress 9.9
hand 9.9
human 9.7
group 9.7
success 9.7
boy 9.6
uniform 8.8
soldier 8.8
military 8.7
beach 8.4
portrait 8.4
sign 8.3
art 8.1
water 8
accordion 7.9
darkness 7.8
drawing 7.7
world 7.7
guy 7.6
dark 7.5
vintage 7.4
smoke 7.4
brass 7.3
blackboard 7.3
religion 7.2
room 7.1
night 7.1
travel 7

Google
created on 2018-03-23

Microsoft
created on 2018-03-23

person 99.8
standing 97.5
posing 90.7
people 67.8
black 66.6
group 65.1
old 43.8
clothes 15.1

Face analysis


AWS Rekognition

Age 57-77
Gender Male, 53.6%
Angry 45.1%
Sad 51%
Confused 45.2%
Happy 48.3%
Disgusted 45.2%
Surprised 45.1%
Calm 45.1%

AWS Rekognition

Age 26-43
Gender Male, 55%
Surprised 45.1%
Angry 45.3%
Happy 45.1%
Disgusted 53%
Confused 45.2%
Calm 46.1%
Sad 45.3%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Disgusted 45.1%
Confused 45.1%
Angry 45.1%
Sad 53.4%
Calm 46%
Surprised 45.1%
Happy 45.1%

AWS Rekognition

Age 26-43
Gender Male, 54.6%
Confused 45.6%
Calm 46.3%
Sad 45.5%
Surprised 51.9%
Disgusted 45.2%
Happy 45.1%
Angry 45.4%

AWS Rekognition

Age 35-53
Gender Male, 96.5%
Angry 2.2%
Sad 0.9%
Surprised 1.1%
Calm 89%
Happy 1.6%
Confused 2.5%
Disgusted 2.6%

AWS Rekognition

Age 23-38
Gender Male, 51.1%
Disgusted 45%
Sad 54.9%
Confused 45%
Angry 45%
Surprised 45%
Calm 45%
Happy 45%

AWS Rekognition

Age 35-52
Gender Female, 53.4%
Angry 45.2%
Calm 45.5%
Confused 45.1%
Happy 45.1%
Disgusted 53.8%
Surprised 45.2%
Sad 45.1%

Microsoft Cognitive Services

Age 58
Gender Male

Microsoft Cognitive Services

Age 26
Gender Male

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 31
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 94.1%

Text analysis

Amazon

No
TE
INSITH
INSITH BACKANG
BACKANG