Human Generated Data

Title

Untitled (band playing at wedding)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.432.18

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Human 99.7
Person 99.7
Person 99.2
Person 99.2
Person 99
Person 97.6
Leisure Activities 97.3
Musical Instrument 95.5
Musician 95.5
Person 94.6
Person 94.1
Guitar 93.7
Person 89.5
Person 89.2
Person 88.3
Indoors 87
Interior Design 87
Shoe 83.4
Apparel 83.4
Footwear 83.4
Clothing 83.4
Music Band 79.5
Person 78.3
Person 72.9
Guitarist 62.6
Performer 62.6
Lute 55.4
Person 50.8

Clarifai
created on 2019-03-25

people 100
group 99.6
group together 98.4
adult 98.1
several 97.8
music 97.1
many 97
administration 96.5
instrument 95.4
woman 94.8
musician 94.5
leader 94.3
man 94
recreation 89.2
stringed instrument 87.8
five 86.6
furniture 85.1
wear 84.8
four 84.7
outfit 83.7

Imagga
created on 2019-03-25

stringed instrument 45.7
musical instrument 41.4
bowed stringed instrument 33.7
man 30.2
musician 25.9
violin 24.4
adult 24.1
music 23.7
singer 22.7
person 22.6
male 22
people 20.6
guitar 20.1
dress 19
sexy 18.5
performer 17.2
oboe 17.2
wind instrument 17
play 16.4
fashion 15.8
banjo 15.3
black 15
brass 14.9
portrait 14.9
style 14.8
attractive 14.7
playing 14.6
lifestyle 14.4
handsome 14.3
men 13.7
concert 13.6
model 13.2
couple 13.1
rock 13
barroom 12.8
women 12.6
fun 12
suit 11.7
artist 11.6
hands 11.3
pretty 11.2
body 11.2
hair 11.1
cornet 10.9
instrument 10.8
guitarist 10.8
performance 10.5
cello 10.3
bass 10.3
stage 10.2
entertainment 10.1
elegance 10.1
happy 10
dark 10
smile 10
device 9.9
entertainer 9.7
human 9.7
band 9.7
group 9.7
love 9.5
sitting 9.4
player 9.4
sound 9.4
youth 9.4
face 9.2
posing 8.9
together 8.8
smiling 8.7
guy 8.7
musical 8.6
happiness 8.6
two 8.5
power 8.4
studio 8.4
holding 8.3
room 8.2
sensuality 8.2
life 8.1
romance 8
looking 8
star 7.9
woodwind 7.8
bride 7.7
show 7.6
joy 7.5
emotion 7.4
clothing 7.3
cheerful 7.3
lady 7.3
danger 7.3
stylish 7.2
romantic 7.1
viol 7.1
interior 7.1
indoors 7

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

person 99.9
people 89.4
group 73.5
old 50.4
black and white 11.9
music 9.8

Color Analysis

Face Analysis

AWS Rekognition

Age 35-52
Gender Male, 99.5%
Confused 13.8%
Angry 14%
Surprised 37.2%
Happy 5.3%
Calm 13.9%
Disgusted 5.5%
Sad 10.3%

AWS Rekognition

Age 26-43
Gender Male, 98.2%
Surprised 4.7%
Calm 54.3%
Disgusted 1.9%
Happy 0.8%
Angry 15.3%
Sad 15.6%
Confused 7.4%

AWS Rekognition

Age 35-52
Gender Male, 54.3%
Angry 45.4%
Disgusted 45.4%
Confused 45.3%
Surprised 45.3%
Sad 48.1%
Happy 48.2%
Calm 47.4%

AWS Rekognition

Age 26-43
Gender Female, 53.4%
Confused 45%
Happy 45%
Sad 45.2%
Surprised 45%
Angry 54.7%
Disgusted 45%
Calm 45.1%

AWS Rekognition

Age 26-43
Gender Male, 98.9%
Calm 1.3%
Sad 1%
Surprised 1.1%
Confused 0.9%
Happy 93.5%
Angry 1.3%
Disgusted 1.1%

AWS Rekognition

Age 27-44
Gender Male, 54.5%
Happy 45.4%
Calm 51.2%
Confused 45.2%
Sad 47.3%
Disgusted 45.1%
Surprised 45.2%
Angry 45.6%

AWS Rekognition

Age 26-43
Gender Male, 52.1%
Sad 54.2%
Angry 45.3%
Calm 45.1%
Surprised 45.1%
Happy 45.1%
Confused 45.1%
Disgusted 45.2%

AWS Rekognition

Age 35-52
Gender Female, 53.7%
Surprised 45%
Calm 46.2%
Disgusted 45%
Happy 45.1%
Angry 45.1%
Sad 53.5%
Confused 45.1%

AWS Rekognition

Age 35-55
Gender Male, 52.9%
Angry 45.2%
Happy 53%
Sad 45.4%
Confused 45.1%
Disgusted 45.2%
Surprised 45.3%
Calm 45.8%

AWS Rekognition

Age 6-13
Gender Female, 51.6%
Happy 45.2%
Surprised 45.1%
Disgusted 45.5%
Calm 46%
Sad 52.6%
Confused 45.2%
Angry 45.4%

Microsoft Cognitive Services

Age 37
Gender Male

Microsoft Cognitive Services

Age 42
Gender Male

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 32
Gender Female

Microsoft Cognitive Services

Age 30
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature Analysis

Amazon

Person 99.7%
Shoe 83.4%

Categories

Imagga

people portraits 85.3%
paintings art 13.6%