Human Generated Data

Title

Untitled (four women gathered around set dining room table with flowers; woman touches flowers)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5317

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Person 99.4
Person 99.4
Clinic 97.7
Helmet 97.5
Clothing 97.5
Apparel 97.5
Person 95.1
Indoors 90.9
Room 89.7
Chair 84.8
Furniture 84.8
People 78.7
Hospital 74.8
Helmet 74.6
Table 68.8
Dining Table 68.8
Doctor 68
Operating Theatre 64.5
Helmet 61.2

Clarifai
created on 2023-10-26

people 99.3
woman 97.6
man 97.6
adult 97.1
indoors 96.5
furniture 96
chair 94.2
room 93.1
table 92.9
group 92.5
family 91.3
monochrome 91.1
sit 88.1
two 86.2
child 85
couple 84.7
group together 81.9
house 81.6
hospital 81.5
inside 78.5

Imagga
created on 2022-01-22

sketch 82.1
drawing 61.7
representation 49.5
glass 33.9
table 32.2
restaurant 22.6
dinner 21.3
dining 20
wedding 19.3
party 18.9
interior 16.8
drink 16.7
setting 16.4
elegant 16.3
laboratory 15.4
decoration 15.2
decor 15
banquet 14.9
glasses 14.8
wine 14.8
reception 14.7
room 14.6
people 14.5
plate 14.4
medical 14.1
medicine 14.1
person 14
luxury 13.7
chemistry 13.5
celebration 12.8
elegance 12.6
fork 12.5
formal 12.4
research 12.4
biology 12.3
indoors 12.3
service 12
lunch 12
set 11.9
food 11.8
catering 11.8
cutlery 11.7
lab 11.7
knife 11.5
man 11.4
work 11.1
event 11.1
silverware 10.8
biotechnology 10.8
meal 10.7
wineglass 10.7
experiment 10.7
design 10.7
chemical 10.6
home 10.5
bouquet 10.5
science 9.8
silver 9.7
fine 9.5
furniture 9.5
negative 9.4
place 9.3
flower 9.2
dish 9.1
tableware 9
working 8.8
tablecloth 8.8
dine 8.8
napkin 8.8
serve 8.8
arrangement 8.7
test 8.7
development 8.6
marriage 8.5
professional 8.5
hand 8.4
film 7.9
business 7.9
art 7.9
glassware 7.8
flowers 7.8
black 7.8
male 7.8
adult 7.8
fancy 7.7
modern 7.7
chair 7.6
house 7.5
worker 7.5
equipment 7.3
day 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

drawing 98.4
text 97.4
sketch 97.3
person 88
window 86.4
clothing 85.3
cartoon 81.8
old 74.9
group 63.1
woman 61.4

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 26-36
Gender Male, 77.1%
Happy 64.1%
Calm 32.3%
Surprised 1.6%
Confused 0.6%
Sad 0.5%
Disgusted 0.5%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 28-38
Gender Male, 83.6%
Happy 64%
Calm 24.3%
Confused 5.1%
Sad 4.6%
Disgusted 0.7%
Surprised 0.6%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 34-42
Gender Male, 72.6%
Happy 83.4%
Calm 13.9%
Sad 1.5%
Confused 0.3%
Surprised 0.3%
Disgusted 0.3%
Fear 0.2%
Angry 0.1%

AWS Rekognition

Age 25-35
Gender Male, 89%
Calm 99.6%
Sad 0.2%
Angry 0.1%
Surprised 0%
Disgusted 0%
Happy 0%
Confused 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Helmet 97.5%
Chair 84.8%
Dining Table 68.8%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

6498