Human Generated Data

Title

Untitled (men and women eating in dining room, Jos. Wharton Estate (Lippincott))

Date

1940

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5158

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Chair 99.9
Furniture 99.9
Chair 99.1
Room 96.4
Indoors 96.4
Person 94.6
Human 94.6
Person 94.6
Person 92.7
Person 91.8
Dining Room 91.1
Chair 86.5
Person 85.6
Interior Design 80
People 79.8
Person 78.9
Meal 76.2
Food 76.2
Person 75.1
Person 64.6
Person 61.9
Dining Table 59.9
Table 59.9
Chair 57.1
Chair 56.6
Restaurant 55.2
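
The label/confidence pairs above match the shape of Amazon Rekognition's DetectLabels output. Below is a minimal sketch of retrieving such pairs with boto3, assuming configured AWS credentials; the region, confidence threshold, and local file name are assumptions.

```python
import boto3

# Rekognition client; the region is an assumption.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local file name for a scan of the print.
with open("steinmetz_4-2002-5158.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # threshold chosen to roughly match the list above
)

# Print "Name Confidence" pairs in the same shape as the tag list.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```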

Clarifai
created on 2023-10-26

people 99.5
indoors 96.8
man 96.4
group 96.3
chair 95.7
sit 95.3
adult 95.2
room 94.4
dining room 93.5
woman 92.3
table 91.2
many 89.8
furniture 89.2
child 87.5
education 86.8
leader 82.7
family 79.7
league 75.9
monochrome 75.7
meeting 73.1

Imagga
created on 2022-01-23

house 27.1
interior 25.6
architecture 21.1
room 20.8
modern 19.6
decoration 19
table 18.4
home 17.5
art 17.1
design 16.9
decor 15.9
furniture 14.8
marble 14.8
luxury 14.6
hall 13.8
history 13.4
people 13.4
sculpture 12.7
drawing 12.5
urban 12.2
window 12.1
old 11.8
city 11.6
building 11.1
indoor 10.9
business 10.9
chair 10.7
life 10.6
structure 10.4
elegant 10.3
style 9.6
sketch 9.5
construction 9.4
light 9.3
inside 9.2
wedding 9.2
glass 9
new 8.9
indoors 8.8
celebration 8.8
symbol 8.7
scene 8.6
residential 8.6
creation 8.5
flower 8.4
restaurant 8.3
silhouette 8.3
retro 8.2
lifestyle 7.9
women 7.9
antique 7.8
person 7.8
facility 7.6
living 7.6
elegance 7.5
historical 7.5
man 7.5
holiday 7.2
family 7.1
day 7.1
travel 7
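
Tags like the Imagga list above could be requested from Imagga's v2 tagging endpoint. The endpoint path, auth scheme, and response shape in this sketch are assumptions based on Imagga's public REST API and should be checked against current documentation; the credentials and image URL are placeholders.

```python
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder

# Assumed v2 tagging endpoint with HTTP Basic auth.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz_photo.jpg"},  # hypothetical URL
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Assumed response shape: result.tags -> [{"confidence": ..., "tag": {"en": ...}}]
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```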

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

table 98.5
text 97.8
chair 96.7
window 93.5
house 80.9
wedding 77.7
vase 77.1
black 70.3
white 68.8
kitchen & dining room table 64.2
old 59.7
coffee table 56.1
room 43.9
furniture 27.2
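
The Microsoft tags above resemble the output of Azure Computer Vision image tagging. A hedged sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders, and the method name should be verified against the current SDK.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)

# Hypothetical image URL; tag_image returns tag names with 0-1 confidences.
result = client.tag_image("https://example.org/steinmetz_photo.jpg")
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```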

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Female, 65.9%
Sad 92.7%
Confused 2.2%
Calm 2.1%
Happy 1.9%
Disgusted 0.5%
Surprised 0.3%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 25-35
Gender Male, 50.1%
Confused 38.3%
Calm 37.3%
Sad 11.9%
Happy 7.2%
Angry 2.5%
Fear 1.2%
Disgusted 0.8%
Surprised 0.8%

AWS Rekognition

Age 41-49
Gender Female, 51.5%
Calm 73.9%
Sad 11.8%
Confused 6.1%
Disgusted 4.9%
Happy 1.9%
Angry 0.6%
Surprised 0.5%
Fear 0.3%

AWS Rekognition

Age 27-37
Gender Female, 91.1%
Calm 88.4%
Fear 6.6%
Happy 1.7%
Sad 1.3%
Confused 1.1%
Disgusted 0.3%
Surprised 0.3%
Angry 0.2%

AWS Rekognition

Age 31-41
Gender Female, 71.7%
Happy 65.8%
Confused 18.4%
Calm 8.6%
Sad 3.3%
Angry 1.5%
Disgusted 1%
Fear 0.9%
Surprised 0.6%

AWS Rekognition

Age 28-38
Gender Female, 97.8%
Sad 84.8%
Calm 7.6%
Happy 3.3%
Confused 2.7%
Angry 0.5%
Disgusted 0.4%
Fear 0.4%
Surprised 0.4%
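
The six AWS Rekognition blocks above (age range, gender, ranked emotions per face) match the structure of a DetectFaces response. A minimal sketch with boto3, under the same credential and hypothetical file-name assumptions as the label example:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_4-2002-5158.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request AgeRange, Gender, Emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; order them like the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```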

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
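
The Google Vision blocks report per-face likelihood ratings rather than percentages; they correspond to the face_annotations fields of a face detection request. A minimal sketch with the google-cloud-vision client, assuming application default credentials and the same hypothetical file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4-2002-5158.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood enums render as VERY_UNLIKELY, UNLIKELY, POSSIBLE, ...
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```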

Feature analysis

Amazon

Chair 99.9%
Person 94.6%

Categories

Imagga

interior objects 54.5%
paintings art 45%

Text analysis

Amazon

13707
19909.

Google

13707
13707
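
The strings in the Text analysis section (numbers written or stamped on the print) are OCR output. A minimal sketch of reading them back with Google Cloud Vision text detection; Amazon Rekognition's DetectText operation works analogously. Credentials and the file name are the same assumptions as above.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_4-2002-5158.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual tokens.
for text in response.text_annotations[1:]:
    print(text.description)
```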