Human Generated Data

Title

Untitled (two men singing and playing accordion and banjo, Princeton University reunion, Princeton, NJ)

Date

c. 1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4550

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Musical Instrument 99.1
Leisure Activities 99.1
Guitar 99.1
Accessory 98.9
Tie 98.9
Accessories 98.9
Person 98.7
Human 98.7
Person 98.5
Person 97.5
Person 97.3
Person 96.5
Person 95.6
Person 95.3
Clothing 94.5
Apparel 94.5
Hat 94.5
Person 93.8
Musician 93
Person 91.8
Interior Design 91.2
Indoors 91.2
Person 89.3
Face 85.7
Room 80.9
Clinic 73.7
Person 70.9
Accordion 70.1
Music Band 69.6
Person 69.3
Photography 68.8
Photo 68.8
Portrait 68.8
People 62.5
Crowd 61.3
Suit 57.9
Coat 57.9
Overcoat 57.9
Pillow 56
Cushion 56

Clarifai
created on 2023-10-15

people 99.8
music 98
adult 97.9
group 97.7
man 97
musician 96.7
monochrome 95.3
instrument 94
woman 92.3
jazz 88.5
many 86.8
group together 86.4
uniform 84.7
accordion 83.3
veil 79.6
leader 78.2
wear 78.1
band 74.4
lid 74.1
outfit 73

Imagga
created on 2021-12-14

musical instrument 99.7
accordion 82.8
keyboard instrument 65.6
wind instrument 57.3
man 42.3
person 31.4
people 29
male 28.4
business 27.3
computer 27.2
office 26.9
work 25.9
adult 25.4
businessman 24.7
laptop 23.7
professional 21.5
desk 21.2
table 21.1
worker 19.5
men 18.9
businesswoman 17.3
sitting 17.2
businesspeople 17.1
meeting 16.9
working 16.8
job 15.9
team 15.2
looking 15.2
senior 15
room 14.7
communication 14.3
happy 13.8
corporate 13.7
group 13.7
home 13.5
teacher 13.3
indoors 13.2
education 13
technology 12.6
medical 12.4
executive 12.3
teamwork 12
serious 11.4
hand 11.4
percussion instrument 11.2
smiling 10.8
face 10.6
retirement 10.6
elderly 10.5
notebook 10.4
mature 10.2
lifestyle 10.1
confident 10
conference 9.8
women 9.5
chair 9.5
student 9.4
glass 9.3
steel drum 9.3
successful 9.1
device 9.1
one 9
handsome 8.9
paper 8.6
patient 8.6
workplace 8.6
smile 8.5
doctor 8.5
portrait 8.4
modern 8.4
shirt 8.4
study 8.4
old 8.4
occupation 8.2
colleagues 7.8
retired 7.8
planning 7.7
monitor 7.7
casual 7.6
two 7.6
talking 7.6
sit 7.6
human 7.5
manager 7.4
holding 7.4
success 7.2
board 7.2
stringed instrument 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

Color Analysis

Face analysis

AWS Rekognition

Age 24-38
Gender Male, 92.7%
Calm 71.9%
Happy 16.7%
Angry 7.2%
Sad 2.5%
Surprised 0.7%
Disgusted 0.5%
Confused 0.3%
Fear 0.2%

AWS Rekognition

Age 36-52
Gender Male, 89%
Surprised 78.8%
Fear 13.9%
Calm 2.7%
Happy 1.9%
Confused 1.5%
Disgusted 0.5%
Angry 0.4%
Sad 0.3%

AWS Rekognition

Age 26-40
Gender Male, 56.3%
Calm 57.7%
Sad 17.9%
Happy 10.1%
Surprised 5.8%
Fear 3.4%
Confused 2.5%
Angry 2.2%
Disgusted 0.3%

AWS Rekognition

Age 29-45
Gender Female, 63.6%
Calm 99.3%
Surprised 0.4%
Sad 0.1%
Confused 0.1%
Happy 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 22-34
Gender Female, 59.8%
Calm 96%
Happy 2.3%
Sad 0.8%
Surprised 0.3%
Confused 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 13-25
Gender Male, 78.6%
Calm 95.3%
Happy 2.9%
Disgusted 1%
Surprised 0.5%
Angry 0.2%
Sad 0.1%
Confused 0%
Fear 0%

AWS Rekognition

Age 34-50
Gender Male, 89.3%
Calm 82.2%
Confused 5.8%
Angry 5.5%
Surprised 2.7%
Sad 1.2%
Disgusted 1.1%
Fear 0.9%
Happy 0.6%

AWS Rekognition

Age 13-25
Gender Male, 80.3%
Calm 94.2%
Angry 2%
Sad 1.7%
Happy 1%
Fear 0.5%
Surprised 0.3%
Disgusted 0.1%
Confused 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Guitar 99.1%
Tie 98.9%
Person 98.7%
Hat 94.5%

Categories

Text analysis

Amazon

1917
1912
22AG
PSON
22AG YTOTAS 630N3930
YTOTAS
630N3930

Google

1917 191 1917
1917
191