Human Generated Data

Title

Untitled (men and women sitting on bench by pool)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7686

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.3
Human 98.3
Stage 98.3
Person 98.2
Person 98.1
Person 95.2
Shorts 95.2
Clothing 95.2
Apparel 95.2
Person 93.2
Person 84.9
Meal 77.2
Food 77.2
Leisure Activities 75.9
Piano 75.9
Musical Instrument 75.9
Person 72.2
Musician 70.5
Sitting 66.6
Chair 65.5
Furniture 65.5
Dish 59
Flooring 57
Crowd 56.6
Performer 55.4
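
Each entry in the Amazon list above pairs a detected label with a confidence score. As a point of reference, the short Python sketch below shows how comparable label/confidence pairs could be requested from AWS Rekognition's DetectLabels operation through boto3; the image file name is hypothetical and working AWS credentials are assumed.

import boto3

# Sketch only: regenerate label/confidence pairs like the Amazon list above.
# Assumes boto3 is installed and AWS credentials are configured; the file name is hypothetical.
def detect_labels(image_path, min_confidence=55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,  # drop labels below the cutoff, as in the list above
        )
    # Each returned label carries a name and a confidence percentage.
    return [(label["Name"], round(label["Confidence"], 1)) for label in response["Labels"]]

for name, confidence in detect_labels("steinmetz_untitled_pool_bench.jpg"):
    print(name, confidence)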

Imagga
created on 2022-01-09

musical instrument 90.7
percussion instrument 70
marimba 66.2
stringed instrument 36.6
guitar 34.2
man 26.2
music 25.2
musician 23.1
adult 22.3
male 22
people 20.6
person 20.3
portrait 15.5
black 15
instrument 14.9
musical 14.3
bowed stringed instrument 14.3
silhouette 14.1
entertainment 13.8
play 13.8
guitarist 13.8
men 13.7
lifestyle 13.7
playing 13.7
keyboard instrument 13.6
acoustic guitar 13.6
attractive 12.6
violin 12.4
art 12.3
rock 12.1
acoustic 11.8
leisure 11.6
wind instrument 11.5
performer 10.9
accordion 10.8
performance 10.5
women 10.3
youth 10.2
studio 9.9
concert 9.7
banjo 9.2
indoor 9.1
business 9.1
modern 9.1
sport 9
player 8.9
body 8.8
couple 8.7
string 8.7
outdoor 8.4
dark 8.3
sexy 8
handsome 8
love 7.9
boy 7.8
model 7.8
device 7.8
pretty 7.7
grunge 7.7
casual 7.6
happy 7.5
sound 7.5
professional 7.4
style 7.4
window 7.3
office 7.2
active 7.2
steel drum 7.2
interior 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 98.8
clothing 96.4
text 95.9
black and white 95.7
window 95
man 93.5
concert 90.7
monochrome 75.9

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Female, 87.6%
Calm 99.6%
Sad 0.1%
Confused 0.1%
Disgusted 0%
Surprised 0%
Angry 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 42-50
Gender Male, 68.3%
Calm 94.3%
Angry 1.8%
Sad 1.6%
Confused 0.7%
Disgusted 0.7%
Happy 0.4%
Fear 0.3%
Surprised 0.2%

AWS Rekognition

Age 16-24
Gender Male, 88.5%
Fear 61.4%
Calm 15.1%
Sad 15.1%
Confused 3.9%
Surprised 1.3%
Angry 1.1%
Disgusted 1.1%
Happy 1%

AWS Rekognition

Age 48-56
Gender Male, 95.7%
Calm 62%
Happy 22%
Sad 7.5%
Angry 2.7%
Surprised 1.7%
Fear 1.6%
Disgusted 1.5%
Confused 1%

AWS Rekognition

Age 40-48
Gender Male, 99.9%
Sad 51.2%
Calm 28.8%
Angry 8.4%
Disgusted 4.4%
Confused 2.2%
Surprised 2.1%
Fear 1.7%
Happy 1.3%
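
Each face block above combines an estimated age range, a gender guess with its confidence, and a percentage breakdown of emotions. A minimal sketch of how such per-face attributes could be pulled from AWS Rekognition's DetectFaces call follows; as before, boto3, configured credentials, and the file name are assumptions.

import boto3

# Sketch only: per-face age range, gender, and emotion scores in the layout used above.
# Assumes boto3, configured AWS credentials, and a hypothetical file name.
def analyze_faces(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions are sorted highest first to match the presentation above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")

analyze_faces("steinmetz_untitled_pool_bench.jpg")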

Feature analysis

Amazon

Person 98.3%
Piano 75.9%

Captions

Microsoft

a person sitting in front of a window 50.5%
a person standing in front of a window 50.4%
a person that is standing in front of a window 50.3%
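
The three Microsoft captions above are alternative image descriptions, each with its own confidence. One plausible way to obtain such captions is the Describe Image operation of Azure's Computer Vision service; the sketch below uses the azure-cognitiveservices-vision-computervision Python package, and the endpoint, key, and file name are placeholders, not values from this record.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Sketch only: request several caption candidates with confidences, as listed above.
# The endpoint, key, and file name are hypothetical placeholders.
client = ComputerVisionClient(
    "https://example.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR-KEY"),
)
with open("steinmetz_untitled_pool_bench.jpg", "rb") as image_stream:
    description = client.describe_image_in_stream(image_stream, max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")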

Text analysis

Amazon

S
S.S.TIVIVES S
S.S.TIVIVES

Google

TIVINES
TIVINES
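
The text analysis results above are OCR readings of the lettering visible in the photograph. A brief sketch of how line-level text detections of this kind could be produced with AWS Rekognition's DetectText call is given below, under the same assumptions as the earlier sketches (boto3, configured credentials, hypothetical file name).

import boto3

# Sketch only: line-level OCR detections like "S.S.TIVIVES" above.
# Assumes boto3, configured AWS credentials, and a hypothetical file name.
def detect_text(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # Keep only LINE detections; Rekognition also returns individual WORD entries.
    return [d["DetectedText"] for d in response["TextDetections"] if d["Type"] == "LINE"]

for line in detect_text("steinmetz_untitled_pool_bench.jpg"):
    print(line)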