Human Generated Data

Title

Untitled (man on stage playing drums)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5136

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence score, %)

Amazon
created on 2022-01-23

Musician 93.5
Musical Instrument 93.5
Human 93.5
Person 92.6
Guitar 92.2
Leisure Activities 92.2
Drum 73.8
Percussion 73.8
People 61
Performer 55.8
Text 55.4
Drummer 55.3
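
The labels and scores above are the kind of output returned by Amazon Rekognition's label-detection (DetectLabels) API. A minimal sketch of how comparable tags can be retrieved with the boto3 client, assuming configured AWS credentials and a hypothetical local copy of the photograph named "steinmetz_drummer.jpg":

import boto3

# Assumption: AWS credentials are configured and the photograph has been
# saved locally as "steinmetz_drummer.jpg" (hypothetical filename).
rekognition = boto3.client("rekognition")

with open("steinmetz_drummer.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,        # cap the number of labels returned
    MinConfidence=50.0,  # drop labels scored below 50%
)

# Print "Label confidence" pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')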

Clarifai
created on 2023-10-26

monochrome 99
people 98.8
man 97.1
wedding 93.5
bride 92.6
indoors 85.9
adult 84.1
street 82.8
veil 81.9
woman 81.4
group 80.2
technology 78.8
fun 78.6
art 78.1
black and white 77
light 76.8
illustration 75.6
design 75.4
groom 73.9
science 73

Imagga
created on 2022-01-23

people 18.4
man 16.1
business 13.4
person 13.3
male 12.8
adult 12.4
interior 12.4
symbol 11.4
hand 11.4
human 11.2
men 10.3
room 10.3
house 10.1
portrait 9.7
home 9.7
mosquito net 9.7
face 9.2
modern 9.1
old 9.1
dress 9
design 9
businessman 8.8
black 8.4
relaxation 8.4
film 8.4
silhouette 8.3
sign 8.3
art 8.2
light 8
celebration 8
lifestyle 7.9
women 7.9
indoors 7.9
negative 7.8
party 7.7
motion 7.7
building 7.7
window 7.7
drawing 7.5
technology 7.4
office 7.4
event 7.4
work 7.3
wall 7.3
smiling 7.2
travel 7
sky 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 97.1
drawing 80.4
black and white 69.9
person 63.6
cartoon 54.4
clothing 50.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Male, 98.4%
Emotions:
Fear 40.3%
Calm 22.4%
Sad 17.6%
Confused 6.7%
Disgusted 5.9%
Surprised 4.6%
Angry 1.6%
Happy 0.9%
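
The age range, gender, and emotion scores above follow the shape of Rekognition's face-detection (DetectFaces) output. A minimal sketch under the same assumptions as the label-detection sketch earlier (hypothetical filename):

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_drummer.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are returned unsorted; sort by confidence to mirror the list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')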

Feature analysis

Amazon

Person 92.6%
Guitar 92.2%

Captions

Text analysis

Amazon

as
5576
15576
NAMTSA3
LA-ROUV
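
The detected strings above are consistent with Rekognition's text-detection (DetectText) output. A minimal sketch, again assuming the same hypothetical local file:

import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_drummer.jpg", "rb") as f:  # hypothetical filename
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Print each detected line of text; WORD-level detections are skipped.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])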

Google

15576 5576.
15576
5576.