Human Generated Data

Title

Untitled (Ringling performance with men in masks and woman in swim suit)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10647

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 91.3
Human 91.3
Person 80.2
Person 66.2
Leisure Activities 63.5
Clothing 59.4
Apparel 59.4
Stage 57.7
Museum 56.6

Clarifai
created on 2023-10-26

people 99.7
group 96.6
illustration 96.5
monochrome 92.4
interaction 92.3
art 91.9
wear 91.3
man 90.8
adult 90.4
costume 90.3
music 90.1
actor 88.4
theater 88.1
woman 86.2
dancer 86.1
commerce 84
two 80.5
education 78.9
culture 78.4
veil 76.8

Imagga
created on 2022-01-15

menorah 54.8
candelabrum 43.9
candlestick 32.9
fountain 25.9
architecture 25.2
structure 25
holder 21.8
city 20.8
building 18.4
old 18.1
water 16.7
sky 16.6
travel 16.2
history 16.1
famous 15.8
ancient 15.6
statue 13.7
landmark 13.5
blackboard 13.5
urban 13.1
design 12.9
tourism 12.4
art 12.3
historical 12.2
monument 12.1
park 11.5
sculpture 11.3
culture 11.1
holding device 11
business 10.9
symbol 10.8
street 10.1
religion 9
glass 8.9
stone 8.7
bill 8.6
money 8.5
finance 8.4
dollar 8.3
bank 8.3
historic 8.2
cash 8.2
currency 8.1
light 8
river 8
downtown 7.7
grunge 7.7
god 7.6
outdoor 7.6
clouds 7.6
skyline 7.6
pattern 7.5
church 7.4
banking 7.4
detail 7.2
tower 7.2
paper 7.1
modern 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98.3
dance 69.6
black and white 53.8
cartoon 53.1
art 51.1
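
The tag lists above pair a label with a confidence score on a 0-100 scale. As an illustration only (not the museum's documented pipeline), the sketch below shows how label/confidence pairs of this kind can be produced with Amazon Rekognition's DetectLabels API via boto3; the file name, region, and thresholds are assumptions.

```python
# Minimal sketch (illustrative assumptions, not the museum's actual pipeline):
# generating label/confidence pairs like those listed above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph.
with open("steinmetz_ringling_1952.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=10,      # cap the number of labels returned
    MinConfidence=50,  # drop low-confidence guesses
)

# Each label carries a name and a 0-100 confidence score.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```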

Color Analysis

Face analysis


AWS Rekognition

Age 35-43
Gender Female, 93.6%
Calm 37.4%
Surprised 32.1%
Happy 28.4%
Fear 0.8%
Disgusted 0.4%
Angry 0.3%
Sad 0.3%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
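
The age range, gender, and emotion estimates listed under "AWS Rekognition" match the shape of output from Rekognition's DetectFaces API. The sketch below shows such a call via boto3, assuming a local copy of the image; the details are illustrative assumptions rather than the museum's recorded workflow.

```python
# Minimal sketch (assumptions as noted above): face attribute estimates of the
# kind listed under "AWS Rekognition".
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph.
with open("steinmetz_ringling_1952.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotion scores
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```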

Feature analysis

Amazon

Person 91.3%

Categories

Captions

Microsoft
created on 2022-01-15

text 23.1%

Text analysis

Amazon

34984
ESI

Google

34984 -NAGOX
34984
-NAGOX