Human Generated Data

Title

Untitled (men and women seated around dinner table)

Date

1939

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5010

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.6
Human 99.6
Chair 99.5
Furniture 99.5
Person 98.8
Person 98.4
Chair 86.4
Chair 85.8
Room 85.1
Indoors 85.1
People 81.7
Clinic 79.1
Person 77.6
Text 65.8
Living Room 64.4
Clothing 59.3
Apparel 59.3
Dining Room 57
Cushion 56.5
Sitting 55.4

Clarifai
created on 2023-10-26

indoors 98.5
people 98.3
adult 97
table 95.9
dining room 95.7
man 95.6
chair 95.3
woman 94.5
sit 91.3
furniture 89.8
dining 88.3
restaurant 83.4
monochrome 81.4
window 79
room 77.6
home 76.5
group 74.6
side view 74.5
education 73.3
family 72.6

Imagga
created on 2022-01-22

man 43.7
office 36.2
businessman 33.6
people 32.4
indoors 31.6
male 31.2
room 31.1
person 31
business 29.8
counter 29.3
adult 27.3
computer 26.5
meeting 26.4
table 26
businesspeople 24.7
sitting 24.1
smiling 23.9
colleagues 23.3
modern 23.1
laptop 21.1
indoor 21
home 20.8
happy 20.7
men 20.6
businesswoman 20
desk 20
professional 19.7
group 19.4
interior 18.6
working 18.6
casual 17.8
executive 17.5
women 17.4
lifestyle 16.6
corporate 16.3
team 16.1
communication 16
chair 15
together 14.9
shop 14.6
talking 14.3
work 14.2
mature 14
teamwork 13.9
restaurant 13.8
associates 13.8
conference 13.7
patient 13.6
furniture 13.4
day 13.3
teacher 13.3
20s 12.8
horizontal 12.6
job 12.4
couple 12.2
cheerful 12.2
looking 12
coworkers 11.8
color 11.7
mid adult 11.6
worker 11.6
holding 11.6
30s 11.5
barbershop 11.4
senior 11.3
manager 11.2
board 11
clinic 10.7
two people 10.7
portrait 10.4
house 10
explaining 9.8
handsome 9.8
discussion 9.7
decor 9.7
technology 9.7
mercantile establishment 9.5
hospital 9.5
happiness 9.4
screen 9.4
light 9.4
company 9.3
smile 9.3
face 9.2
classroom 9.1
hall 9.1
cooperation 8.7
leader 8.7
education 8.7
four 8.6
boss 8.6
bright 8.6
space 8.5
doctor 8.5
salon 8.4
presentation 8.4
seat 8.4
clothing 8.4
building 8.2
confident 8.2
suit 8.1
idea 8
crew 7.9
seminar 7.9
40s 7.8
3d 7.7
concentration 7.7
busy 7.7
apartment 7.7
formal 7.6
sit 7.6
keyboard 7.5
occupation 7.3
medical 7.1

Microsoft
created on 2022-01-22

text 96.1
furniture 93.5
table 92.8
person 88.2
house 86.9
window 85.7
vase 78
wedding 76.4
chair 68.7
clothing 63

Face analysis

Amazon

Google

AWS Rekognition

Age 50-58
Gender Male, 95.7%
Calm 93.9%
Sad 2.9%
Surprised 1.4%
Confused 1%
Angry 0.3%
Happy 0.2%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 30-40
Gender Male, 91.5%
Happy 88%
Calm 3.3%
Surprised 3.3%
Confused 1.6%
Angry 1.5%
Disgusted 1.5%
Sad 0.5%
Fear 0.4%

AWS Rekognition

Age 41-49
Gender Female, 53.5%
Calm 91.9%
Happy 3.6%
Sad 1.6%
Confused 0.9%
Angry 0.9%
Surprised 0.5%
Disgusted 0.3%
Fear 0.3%

AWS Rekognition

Age 27-37
Gender Male, 95.7%
Calm 39.8%
Sad 17.7%
Happy 14.7%
Surprised 11.5%
Confused 6.6%
Fear 4.5%
Angry 2.6%
Disgusted 2.5%

AWS Rekognition

Age 45-51
Gender Male, 96%
Calm 48.1%
Confused 21.5%
Happy 11.9%
Sad 10.9%
Angry 2.8%
Surprised 2.4%
Fear 1.3%
Disgusted 1.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Chair 99.5%

Categories

Imagga

interior objects 98.8%

Text analysis

Amazon

11419.
ar
NAGOX
NAGOX YT37A0 MAMTBA3
YT37A0
MAMTBA3

Google

I419. I419.
I419.