Human Generated Data

Title

Untitled (four women standing around set dining room table)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11940

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label, confidence 0–100)

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 99.4
Person 99.4
Person 99.1
Chair 99
Furniture 99
Chair 98.7
Clinic 97.2
Dining Table 94.7
Table 94.7
Room 93.7
Indoors 93.7
Helmet 88.5
Apparel 88.5
Clothing 88.5
Helmet 85.6
Helmet 85.4
Workshop 75.7
Operating Theatre 70.5
Hospital 70.5
People 68.1
Bedroom 63.4
Bed 61.2
Living Room 61
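
These labels have the shape of output from AWS Rekognition's label detection API. A minimal sketch of how such tags could be regenerated with boto3 (the file name and region are placeholders, not part of the record):

```python
import boto3

# Assumes AWS credentials are already configured; region is a placeholder.
client = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_dining_room.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=60,  # the list above bottoms out near 61
)

# Print "Label confidence" pairs, mirroring the format of the tags above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```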

Clarifai
created on 2023-10-25

people 99.7
adult 98.2
furniture 97.6
man 97.5
group together 96.2
woman 95.4
monochrome 95.3
group 93.7
room 92.7
two 92.6
child 91.8
wear 91.4
chair 88.7
indoors 88.3
four 88.1
family 86.8
sit 86.6
recreation 84.9
three 84.9
home 82.6
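
Clarifai concepts like these are typically retrieved from its model-prediction endpoint. A hedged sketch against the plain v2 REST API (the API key, model ID, and image URL are all placeholders, and the exact endpoint shape can vary by API version):

```python
import requests

# Placeholders: a Clarifai API key and the public ID of the general recognition model.
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/steinmetz.jpg"}}}]},
)
response.raise_for_status()

# Concepts carry a 0-1 "value"; scale by 100 to match the percentages above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```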

Imagga
created on 2022-01-15

dishwasher 30.8
room 29.6
interior 26.5
table 25.9
man 23.5
people 23.4
chair 22.6
white goods 20.9
person 20.2
indoors 20.2
home 19.9
work 19.6
kitchen 18.2
home appliance 18
modern 17.5
appliance 16.2
business 15.8
classroom 15.5
male 14.9
women 14.2
house 14.2
adult 14.1
furniture 13.9
lifestyle 13.7
office 13.3
smiling 13
happy 12.5
glass 11.8
team 11.6
restaurant 11.5
together 11.4
luxury 11.1
working 10.6
cheerful 10.6
couple 10.4
shop 10.4
floor 10.2
meal 10.1
indoor 10
holding 9.9
dinner 9.8
decor 9.7
group 9.7
sitting 9.4
architecture 9.4
businessman 8.8
chairs 8.8
food 8.6
businesspeople 8.5
togetherness 8.5
meeting 8.5
design 8.4
desk 8.3
inside 8.3
businesswoman 8.2
computer 8
family 8
decoration 8
job 8
lunch 7.9
day 7.8
smile 7.8
blackboard 7.8
empty 7.7
men 7.7
class 7.7
attractive 7.7
hotel 7.6
two 7.6
dining 7.6
communication 7.6
drink 7.5
mature 7.4
light 7.3
worker 7.2
holiday 7.2
happiness 7
hospital 7
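
Imagga tags of this form come from its /v2/tags endpoint, which authenticates with an API key/secret pair over HTTP basic auth. A minimal sketch (credentials and image URL are placeholders):

```python
import requests

# Placeholders: an Imagga API key/secret pair used for HTTP basic auth.
AUTH = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz.jpg"},
    auth=AUTH,
)
response.raise_for_status()

# Each tag carries a confidence on a 0-100 scale, as in the list above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```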

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 98.2
person 96.5
clothing 94.3
table 94.3
man 85.6
drawing 61
furniture 58.7
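
Microsoft tags like these can be produced by the Azure Computer Vision tagging operation. A sketch against the v3.2 REST endpoint (the endpoint, key, and image URL are placeholders):

```python
import requests

# Placeholders: an Azure Computer Vision resource endpoint and subscription key.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/steinmetz.jpg"},
)
response.raise_for_status()

# Confidence is 0-1; scale by 100 to match the percentages above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```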

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 74.5%
Calm 36.6%
Surprised 31.6%
Happy 31.2%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 29-39
Gender Male, 95%
Calm 96.4%
Happy 2.3%
Sad 0.5%
Confused 0.3%
Surprised 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 29-39
Gender Female, 61.2%
Happy 69.4%
Calm 13.9%
Sad 13.6%
Confused 1%
Surprised 0.9%
Disgusted 0.5%
Fear 0.4%
Angry 0.4%
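
The age-range, gender, and emotion blocks above follow the layout of AWS Rekognition's face detection output. A minimal boto3 sketch (the file name and region are placeholders):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is a placeholder

with open("steinmetz_dining_room.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotions for each detected face.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort descending to match the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```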

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
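
The buckets above match Google Cloud Vision face detection, which reports each attribute as a likelihood (VERY_UNLIKELY through VERY_LIKELY) rather than a percentage. A sketch using the google-cloud-vision client (credentials are assumed to come from GOOGLE_APPLICATION_CREDENTIALS; the file name is a placeholder):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_dining_room.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Render each likelihood enum as "Very unlikely", "Unlikely", etc., as above.
def bucket(value):
    return vision.Likelihood(value).name.replace("_", " ").capitalize()

for face in response.face_annotations:
    print("Surprise", bucket(face.surprise_likelihood))
    print("Anger", bucket(face.anger_likelihood))
    print("Sorrow", bucket(face.sorrow_likelihood))
    print("Joy", bucket(face.joy_likelihood))
    print("Headwear", bucket(face.headwear_likelihood))
    print("Blurred", bucket(face.blurred_likelihood))
```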

Feature analysis

Amazon

Person 99.7%
Chair 99%
Dining Table 94.7%
Helmet 88.5%

Text analysis

Amazon

2059
YT3RA8
38AG YT3RA8 A30H3330
38AG
A30H3330
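
The detected strings appear to be mirror-image film edge markings (the "SAFETY"-style edge printing common on period negatives), which the OCR transcribes literally. A minimal sketch of the corresponding Rekognition text-detection call (the file name and region are placeholders):

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is a placeholder

with open("steinmetz_dining_room.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections correspond to the strings listed above; WORD detections repeat them.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")
```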