Human Generated Data

Title

Untitled (men and women seated around dinner table)

Date

1939

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5009

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.1
Human 99.1
Person 99
Person 98.1
Person 97.5
Person 97.2
Clothing 89.8
Apparel 89.8
People 82.5
Indoors 75.4
Furniture 74.4
Clinic 72.7
Room 72.5
Text 64.3
Sitting 58.3
Suit 57.3
Coat 57.3
Overcoat 57.3
Photography 56.4
Photo 56.4

Clarifai
created on 2023-10-26

indoors 98.2
people 98.2
adult 97.5
man 96.9
woman 91.3
chair 91
sit 89.7
monochrome 88
group 86.8
furniture 84.2
home 81.9
table 81.9
dining room 79.5
window 76.4
horizontal 75.3
side view 74.4
scientist 73.9
togetherness 73.5
room 71.9
three 68.3

Imagga
created on 2022-01-22

counter 42
man 41
people 35.1
male 34
office 32.6
person 31.8
adult 31
businessman 28.3
happy 28.2
business 27.9
clinic 27.3
smiling 26.8
colleagues 26.2
professional 25.6
indoors 25.5
businesspeople 23.7
sitting 23.2
meeting 22.6
shop 22.5
men 22.3
group 21.8
businesswoman 20.9
team 20.6
room 19.9
women 19.8
job 19.5
couple 19.2
work 18.8
patient 18.7
teamwork 18.5
corporate 18
cheerful 17.9
indoor 17.3
30s 17.3
barbershop 16.9
computer 16.8
modern 16.8
together 16.6
occupation 16.5
casual 16.1
home 16
associates 15.7
mid adult 15.4
talking 15.2
mercantile establishment 15.2
doctor 15
senior 15
smile 15
day 14.9
table 14.9
20s 14.7
laptop 14.6
lifestyle 14.5
worker 14.3
portrait 14.2
working 14.1
executive 13.6
medical 13.2
mature 13
color 12.8
coworkers 12.8
horizontal 12.6
holding 12.4
restaurant 12.4
happiness 11.8
two people 11.7
desk 11.5
coat 11.5
bright 11.4
face 11.4
looking 11.2
hospital 11
conference 10.7
discussion 10.7
four 10.5
clothing 10.2
case 10.2
two 10.2
lab coat 10.1
place of business 10.1
suit 9.9
attractive 9.8
40s 9.7
interior 9.7
health 9.7
building 9.7
corporation 9.6
education 9.5
manager 9.3
teacher 9.1
success 8.8
casual clothing 8.8
staff 8.6
nurse 8.6
ethnic 8.6
communication 8.4
employee 8.3
care 8.2
handsome 8
30 35 years 7.9
good mood 7.8
standing 7.8
middle aged 7.8
cooperation 7.7
females 7.6
service 7.6
technology 7.4
specialist 7.4
salon 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

window 95.7
person 93.8
text 85.9
vase 81.7
furniture 80.6
house 79.1
table 78
clothing 70.7
man 50.4
old 42.1

Face analysis

AWS Rekognition

Age 47-53
Gender Female, 57.7%
Calm 89.6%
Confused 4.5%
Happy 2.3%
Sad 2%
Surprised 0.7%
Disgusted 0.5%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 27-37
Gender Male, 97.9%
Calm 93.9%
Happy 2.3%
Surprised 1.1%
Confused 0.9%
Sad 0.5%
Angry 0.5%
Fear 0.4%
Disgusted 0.3%

AWS Rekognition

Age 29-39
Gender Female, 88.4%
Calm 99.3%
Happy 0.3%
Sad 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 31-41
Gender Male, 99.4%
Calm 36.3%
Sad 36.1%
Confused 12.2%
Happy 6.3%
Angry 4.4%
Fear 2.6%
Surprised 1.5%
Disgusted 0.8%

AWS Rekognition

Age 28-38
Gender Male, 99.2%
Calm 65.2%
Surprised 18.1%
Happy 7.1%
Sad 4.8%
Angry 1.5%
Fear 1.2%
Disgusted 1.1%
Confused 1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%

Categories

Imagga

interior objects 99.8%

Text analysis

Amazon

11418
ar
NAGOY

Google

|| 4 18-
||
4
18-