Human Generated Data

Title

Untitled (two couples seated at table/studio light visible)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5267

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Chair 99.6
Furniture 99.6
Person 99.4
Human 99.4
Person 98.9
Chair 98.7
Room 98.7
Indoors 98.7
Dining Room 98.1
Dining Table 97.9
Table 97.9
Person 96.9
Chair 96.8
Person 94.9
Restaurant 82
Meal 78.2
Food 78.2
People 76.6
Cafeteria 69.9
Housing 57.3
Building 57.3
Workshop 56

Clarifai
created on 2023-10-26

people 98.6
indoors 98
table 96.8
adult 96
man 95.8
chair 95.5
woman 94.1
furniture 92.6
sit 92.6
window 90.5
monochrome 90.3
room 90.1
dining 89.8
dining room 89.5
group 86.7
inside 85.2
family 82.3
group together 81.7
two 80.4
couple 78

Imagga
created on 2022-01-22

interior 53.1
room 46.5
furniture 42
house 40.1
modern 39.3
table 38.6
home 37.8
architecture 28.4
floor 27.9
design 27
decor 26.5
window 25.7
indoors 24.6
apartment 23
3d 22.5
office 21
chair 20.9
contemporary 19.8
indoor 19.2
luxury 18.9
light 18.7
sofa 18.2
comfortable 18.1
wood 17.6
inside 17.5
decoration 17.4
lamp 17.4
wall 17.1
living 17.1
work 15.8
business 15.8
style 15.6
nobody 15.6
seat 15.1
kitchen 14.5
glass 14.2
lifestyle 13.7
building 13.5
people 12.8
elegance 12.6
residential 12.4
dining 12.4
empty 12
domestic 11.8
structure 11.3
render 11.2
computer 11.2
corporate 11.2
furnishing 11.1
person 11
relaxation 10.9
chairs 10.8
flower 10.8
vase 10.8
new 10.5
door 10
case 9.9
residence 9.9
desk 9.8
businessman 9.7
plant 9.7
rendering 9.5
professional 9.5
day 9.4
construction 9.4
cabinet 9.3
hall 8.9
working 8.8
living room 8.8
life 8.7
man 8.7
women 8.7
windows 8.6
estate 8.5
adult 8.5
space 8.5
marble 8.3
laptop 8.2
happy 8.1
open 8.1
team 8.1
group 8.1
ceiling 7.9
real estate 7.8
decorating 7.8
property 7.7
elegant 7.7
drawing 7.7
communication 7.6
businesswoman 7.3
success 7.2
male 7.1
bedroom 7.1

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 77.3%
Happy 80.2%
Sad 9.3%
Surprised 7.8%
Calm 0.8%
Disgusted 0.6%
Angry 0.5%
Fear 0.5%
Confused 0.2%

AWS Rekognition

Age 26-36
Gender Male, 75.9%
Calm 89.5%
Fear 3.7%
Surprised 2.6%
Happy 1.3%
Sad 0.9%
Confused 0.8%
Disgusted 0.7%
Angry 0.5%

AWS Rekognition

Age 26-36
Gender Female, 97.1%
Calm 91.3%
Happy 5.9%
Sad 1.7%
Surprised 0.3%
Angry 0.2%
Disgusted 0.2%
Confused 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 99.6%
Person 99.4%

Categories

Imagga

paintings art 98.5%

Text analysis

Amazon

5381
JUE
JUE OV BE
OV BE

Google

381
381