Human Generated Data

Title

Untitled (Ringling performance with men in masks and woman in swim suit)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10645

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label, confidence %)

Amazon
created on 2022-01-15

Clothing 93.8
Apparel 93.8
Person 92.2
Human 92.2
Leisure Activities 83.8
Dance Pose 82.1
Stage 80.6
Female 75.3
Dress 73.5
Person 72.9
Shorts 69.4
People 60.9
Woman 58.5
Suit 55.3
Coat 55.3
Overcoat 55.3
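
The Amazon values above are label names paired with confidence scores from an image-labeling service. As a rough illustration only, the sketch below shows how comparable label/confidence pairs can be requested from AWS Rekognition with boto3; the file name, region, and confidence threshold are placeholder assumptions, not details taken from this record.

import boto3

# Placeholder client and a hypothetical local scan of the print.
rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("steinmetz_10645.jpg", "rb") as f:
    image_bytes = f.read()

# Request labels; MinConfidence=55 roughly matches the lowest score listed above.
response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")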

Clarifai
created on 2023-10-26

people 99.8
group 97.7
illustration 97.4
interaction 93.7
adult 93.7
man 93.5
monochrome 89.4
two 89
several 88.6
wear 87.4
group together 87.1
music 85.9
art 85.8
commerce 84.7
veil 83.3
three 81.7
one 80.2
woman 80.2
print 79.3
many 76.5

Imagga
created on 2022-01-15

glass 44.1
menorah 43.3
candelabrum 33.9
candlestick 26.3
architecture 22.1
water 18
goblet 17.1
holder 16.8
city 16.6
building 14.8
bridge 14.6
container 14.5
chandelier 14.1
structure 14.1
travel 14.1
sky 14
night 13.3
wine 13
urban 12.2
design 11.8
celebration 11.2
park 10.9
light 10.7
lighting fixture 10.5
old 10.4
party 10.3
tower 9.8
fountain 9.7
table 9.6
cityscape 9.5
glasses 9.2
drink 9.2
reflection 9
history 8.9
digital 8.9
downtown 8.6
cold 8.6
construction 8.5
skyline 8.5
holding device 8.5
fixture 8.3
tourism 8.2
landmark 8.1
business 7.9
holiday 7.9
winter 7.7
place 7.4
ice 7.4
river 7.1
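
Imagga exposes its tagger as a REST endpoint, so a plain HTTP request is enough to retrieve tag/confidence pairs like those listed above. This is a hedged sketch only: the image URL and credentials are placeholders, and the exact request used to produce this record is not documented here.

import requests

IMAGGA_KEY = "your_api_key"        # placeholder credential
IMAGGA_SECRET = "your_api_secret"  # placeholder credential

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz_10645.jpg"},  # hypothetical URL
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence score.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")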

Microsoft
created on 2022-01-15

text 97.9
drawing 94.1
cartoon 77.1
painting 75.8
vase 71.7
clothing 71.3
sketch 71.1
person 64.1
dance 60.7
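
The Microsoft tags above resemble the output of Azure's Computer Vision tagging operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholder assumptions rather than values from this record.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),    # placeholder key
)

# Tag the image by URL; confidences come back in the 0-1 range.
result = client.tag_image("https://example.org/steinmetz_10645.jpg")  # hypothetical URL
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")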

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 51%
Happy 89.8%
Calm 3.9%
Sad 3.4%
Surprised 1.8%
Confused 0.5%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%
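
The age range, gender, and emotion breakdown above correspond to the fields AWS Rekognition returns from face detection when all attributes are requested. A minimal sketch with boto3, using a placeholder file name:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("steinmetz_10645.jpg", "rb") as f:   # hypothetical local scan
    image_bytes = f.read()

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face result.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")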

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
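
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which matches the wording above. A minimal sketch with the google-cloud-vision client, again with a placeholder file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("steinmetz_10645.jpg", "rb") as f:   # hypothetical local scan
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)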

Feature analysis

Amazon

Person 92.2%

Categories

Captions

Microsoft
created on 2022-01-15

text 23%

Text analysis

Amazon

34983
ESC
KODVK-EVEEIA

Google

3
3 46983
46983
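
The strings above are raw OCR detections from the photograph. Both services named in this record expose text detection directly; a minimal sketch of each call, with a placeholder file name:

import boto3
from google.cloud import vision

with open("steinmetz_10645.jpg", "rb") as f:   # hypothetical local scan
    image_bytes = f.read()

# Amazon Rekognition: DetectText returns LINE- and WORD-level detections.
rekognition = boto3.client("rekognition", region_name="us-east-1")
aws_result = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in aws_result["TextDetections"]:
    if detection["Type"] == "LINE":
        print("Amazon:", detection["DetectedText"])

# Google Cloud Vision: the first annotation is the full text block, the rest are tokens.
gcv = vision.ImageAnnotatorClient()
gcv_result = gcv.text_detection(image=vision.Image(content=image_bytes))
for annotation in gcv_result.text_annotations[1:]:
    print("Google:", annotation.description)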