Human Generated Data

Title

Untitled (Glad you found photos to keep)

Date

1984

People

Artist: Bill Dane, American, born 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.5272

Copyright

© Bill Dane

Machine Generated Data

Tags

Amazon
created on 2019-11-15

Person 98.6
Human 98.6
Person 98.2
Person 95.8
Musical Instrument 95.7
Musician 95.7
Person 93.6
Apparel 91.7
Clothing 91.7
Person 90.5
Person 90.1
Person 88.4
Leisure Activities 88.1
Music Band 86.9
Horse 86.6
Animal 86.6
Mammal 86.6
Person 84.4
Person 73
Guitar 67.8
Banjo 61.2
Hat 55.9

Clarifai
created on 2019-11-15

people 99.9
group 99
many 97
group together 95.9
outfit 95.3
man 95.1
adult 94.9
military 94.1
uniform 93.1
wear 91.2
woman 90.6
war 89.5
administration 86.8
music 86.2
military uniform 83.8
child 83.2
soldier 83.1
movie 82.1
several 82
gun 80.2

Imagga
created on 2019-11-15

brass 65.2
trombone 59.5
wind instrument 56.4
musical instrument 44.6
people 27.3
man 24.8
person 19.9
male 19.8
urban 18.3
men 18
adult 18
group 17.7
business 16.4
city 15.8
black 14.5
cornet 14.1
outfit 12.7
window 12.1
occupation 11.9
women 11.9
clothing 11.7
crowd 11.5
fashion 11.3
music 11.3
indoors 10.5
modern 10.5
room 10.1
businessman 9.7
style 9.6
chair 9.6
passenger 9.3
life 9.3
musician 9.2
silhouette 9.1
working 8.8
concert 8.7
lifestyle 8.7
work 8.6
building 8.4
human 8.2
suit 8.1
transportation 8.1
interior 8
standing 7.8
architecture 7.8
scene 7.8
travel 7.7
stress 7.7
performance 7.7
walk 7.6
showing 7.5
fun 7.5
leisure 7.5
stringed instrument 7.5
bowed stringed instrument 7.5
inside 7.4
light 7.3
classroom 7.3
protection 7.3
hall 7.3
office 7.2
activity 7.2
handsome 7.1
portrait 7.1
to 7.1
together 7

Google
created on 2019-11-15

Microsoft
created on 2019-11-15

text 99.3
person 97.8
musical instrument 92.7
clothing 89.8
drum 73.1
group 61.2
white 60.9
posing 50.6
old 43.9

Color Analysis

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 31-47
Gender Male, 51.6%
Fear 45%
Surprised 45.1%
Angry 45.1%
Sad 46.4%
Disgusted 45%
Happy 45%
Calm 53.1%
Confused 45.2%

AWS Rekognition

Age 25-39
Gender Female, 50.5%
Fear 45.1%
Confused 46.6%
Happy 45%
Calm 51%
Surprised 45.1%
Sad 45.5%
Angry 46.5%
Disgusted 45.2%

AWS Rekognition

Age 26-40
Gender Male, 53.8%
Happy 45.1%
Confused 45.4%
Calm 48.9%
Fear 45.2%
Angry 47.5%
Surprised 45.2%
Disgusted 47.4%
Sad 45.3%

AWS Rekognition

Age 35-51
Gender Female, 52.2%
Calm 45%
Angry 45%
Surprised 45.2%
Disgusted 45%
Happy 45%
Sad 45.2%
Fear 54.5%
Confused 45%

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 40
Gender Male

Feature analysis

Amazon

Person 98.6%
Horse 86.6%

Text analysis

Google

Ва
Ва