Human Generated Data

Title

Untitled (two couples seated at table/studio light visible)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5268

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Chair 99.8
Furniture 99.8
Chair 99.6
Chair 99.3
Person 99.1
Human 99.1
Person 98
Person 97.3
Dining Table 97.2
Table 97.2
Person 96.6
Room 94.9
Indoors 94.9
Restaurant 92.1
Dining Room 89.6
Meal 82.8
Food 82.8
People 79.1
Cafeteria 72.7
Food Court 61.5
Sitting 58.4
Senior Citizen 57.5
Dish 55.3
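
The Amazon tags above are label/confidence pairs in the format returned by the AWS Rekognition DetectLabels API. Below is a minimal sketch of such a call, assuming a hypothetical local JPEG copy of the photograph and configured AWS credentials; the museum's actual tagging pipeline is not documented here.

```python
import boto3

# Hypothetical local copy of the photograph; the actual source image is not specified here.
with open("steinmetz_untitled.jpg", "rb") as f:
    image_bytes = f.read()

client = boto3.client("rekognition")

# DetectLabels returns label names with 0-100 confidence scores,
# matching pairs such as "Chair 99.8" and "Dish 55.3" above.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```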

Clarifai
created on 2023-10-26

people 98.9
table 97.4
man 97.4
indoors 97.1
adult 96.9
chair 96.5
dining room 96.1
sit 95.9
dining 95
woman 94.2
furniture 94
room 92.6
monochrome 91
window 88.8
family 87.7
two 86.6
couple 85.8
restaurant 82.7
coffee 82.6
inside 80.9

Imagga
created on 2022-01-22

sketch 30.9
drawing 26.3
people 25.6
person 21.6
representation 20.6
indoors 18.4
room 17.8
man 17.5
office 17
table 16.6
home 16.2
business 15.8
adult 15.7
window 14.9
smile 14.2
sitting 13.7
computer 13.6
male 13.5
working 13.2
smiling 13
clip art 13
design 12.9
work 12.7
modern 12.6
interior 12.4
portrait 12.3
lifestyle 12.3
chair 11.9
house 11.7
furniture 11.4
professional 11.2
men 11.2
team 10.7
art 10.6
health 10.4
teamwork 10.2
space 10.1
businesswoman 10
silhouette 9.9
holding 9.9
worker 9.8
human 9.7
businessman 9.7
desk 9.5
face 9.2
indoor 9.1
laptop 9.1
technology 8.9
style 8.9
color 8.9
group 8.9
happy 8.8
reflection 8.7
women 8.7
creation 8.4
communication 8.4
alone 8.2
cheerful 8.1
light 8
copy 8
bright 7.9
screen 7.7
casual 7.6
case 7.5
facility 7.1
cartoon 7.1
seat 7.1
decor 7.1
medical 7.1
day 7.1
product 7

Google
created on 2022-01-22

Photograph 94.2
Furniture 93
Chair 90.3
Table 89.8
Black-and-white 83.7
Snapshot 74.3
Monochrome photography 73.5
Monochrome 72.3
Vintage clothing 69
Event 68.6
Room 66.8
Dining room 65.1
Art 64.8
Sitting 63.8
Service 63.3
Hat 60.3
Tableware 54.5
Door 54.1
History 52.8
T-shirt 52.5
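
The Google tags above follow the shape of Google Cloud Vision label detection, which returns a description plus a 0–1 score. A minimal sketch, again assuming a hypothetical local copy of the image and configured credentials:

```python
from google.cloud import vision

# Hypothetical local copy of the photograph.
with open("steinmetz_untitled.jpg", "rb") as f:
    content = f.read()

client = vision.ImageAnnotatorClient()
response = client.label_detection(image=vision.Image(content=content))

# Scores are returned as 0-1 floats; the listing above shows them as percentages.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```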

Microsoft
created on 2022-01-22

furniture 98.4
chair 97.4
window 97
text 96.7
clothing 90.9
person 87.9
man 82.7
table 47.4
dining table 27.3

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 90.9%
Happy 49.5%
Calm 22.5%
Surprised 19.4%
Confused 2%
Fear 2%
Angry 1.8%
Sad 1.6%
Disgusted 1.2%

AWS Rekognition

Age 42-50
Gender Female, 72.9%
Calm 91.4%
Happy 6%
Surprised 1.7%
Disgusted 0.3%
Sad 0.2%
Angry 0.2%
Confused 0.1%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Male, 81.9%
Calm 94.1%
Sad 2%
Angry 1%
Happy 1%
Surprised 0.7%
Fear 0.6%
Confused 0.3%
Disgusted 0.3%
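
Age ranges, gender estimates, and ranked emotion scores like the three AWS Rekognition blocks above are what the Rekognition DetectFaces API returns when all facial attributes are requested. A minimal sketch under the same assumptions as the earlier Rekognition example:

```python
import boto3

with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

client = boto3.client("rekognition")
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 31, "High": 41}
    gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 90.9}
    print(f'Age {age["Low"]}-{age["High"]}  Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; the blocks above list them from highest to lowest confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'  {emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```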

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
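
The Google Vision rows report likelihood buckets ("Very unlikely") rather than percentages because the Vision API's face detection returns a Likelihood enum for each attribute. A minimal sketch, assuming a hypothetical local copy of the image:

```python
from google.cloud import vision

with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical local copy of the photograph
    content = f.read()

client = vision.ImageAnnotatorClient()
response = client.face_detection(image=vision.Image(content=content))

# Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY), not a score.
for face in response.face_annotations:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())
```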

Feature analysis

Amazon

Chair 99.8%
Person 99.1%

Categories

Imagga

paintings art 72%
people portraits 26.5%

Text analysis

Amazon

5380

Google

5380
5380
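
The text analysis entries ("5380") are the kind of result produced by OCR-style text detection. A minimal sketch with the Rekognition DetectText API, under the same assumptions as the earlier examples:

```python
import boto3

with open("steinmetz_untitled.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

client = boto3.client("rekognition")
response = client.detect_text(Image={"Bytes": image_bytes})

# LINE-level detections give whole strings such as "5380"; WORD-level entries
# repeat the same text, which is one reason a value can appear more than once.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f'{detection["DetectedText"]} ({detection["Confidence"]:.1f}%)')
```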