Human Generated Data

Title

Untitled (two seated men in suits, table with food and coffee)

Date

1936

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4763

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.4
Human 99.4
Person 97.4
Chair 96.5
Furniture 96.5
Clinic 92
Hospital 68.2
Shoe 60.8
Clothing 60.8
Footwear 60.8
Apparel 60.8
Operating Theatre 58.4
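
The label and confidence pairs above are typical of the output of AWS Rekognition's DetectLabels operation. The sketch below shows how comparable tags could be produced with boto3; the file name, region, and confidence threshold are assumptions, and the museum's actual tagging pipeline is not documented here.

import boto3

# Rekognition client; the region is an assumption.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# Load the photograph as raw bytes; the file name is hypothetical.
with open("steinmetz_untitled.jpg", "rb") as image_file:
    image_bytes = image_file.read()

# DetectLabels returns object and scene labels with confidences on a 0-100 scale.
response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

for label in response["Labels"]:
    # Prints pairs comparable to the list above, e.g. "Person 99.4".
    print(f'{label["Name"]} {label["Confidence"]:.1f}')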

Clarifai
created on 2023-10-27

people 99.8
adult 98.7
man 98.6
chair 98.4
sit 97
group 96.6
medical practitioner 95.9
two 92.7
three 91.7
woman 91.6
hospital 90.7
leader 90.6
indoors 89.9
group together 89.6
monochrome 89
furniture 88.5
ailment 88.5
medicine 87.8
sitting 86.9
table 84.8
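
Clarifai reports each concept with a value between 0 and 1, shown here on a 0-100 scale. The sketch below is a hedged example against Clarifai's v2 "outputs" (predict) REST endpoint; the key, model identifier, image URL, and exact payload shape are assumptions to be checked against Clarifai's current documentation.

import requests

CLARIFAI_API_KEY = "YOUR_KEY"           # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed id of the general concept model
IMAGE_URL = "https://example.org/steinmetz_untitled.jpg"  # hypothetical URL

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    # Scale the 0-1 value to 0-100 to match the list above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')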

Imagga
created on 2022-01-29

people 29.6
person 28.2
man 27.6
senior 24.4
seller 24.3
sitting 24.1
male 23.4
adult 23.4
home 23.1
smiling 22.4
indoors 22
couple 20.9
happy 20.7
groom 19.3
together 16.6
men 16.3
worker 15.6
room 15.1
day 14.9
old 14.6
cheerful 14.6
30s 13.5
family 13.3
waiter 13.2
mature 13
table 13
indoor 12.8
women 12.7
happiness 12.5
clothing 12.4
businessman 12.4
portrait 12.3
lifestyle 12.3
business 12.1
looking 12
professional 12
two 11.9
coat 11.5
shop 11.5
elderly 11.5
work 11.2
casual 11
20s 11
colleagues 10.7
working 10.6
enjoying 10.4
restaurant 10.3
drink 10
color 10
smile 10
clinic 9.7
40s 9.7
chair 9.7
businesspeople 9.5
love 9.5
meeting 9.4
employee 9.4
face 9.2
mother 9.2
lab coat 9.1
businesswoman 9.1
dining-room attendant 9
suit 9
patient 8.9
to 8.9
medical 8.8
60s 8.8
two people 8.7
older 8.7
retired 8.7
mercantile establishment 8.7
mid adult 8.7
using 8.7
retirement 8.6
husband 8.6
food 8.5
horizontal 8.4
holding 8.3
dress 8.1
team 8.1
computer 8
bright 7.9
50s 7.8
casual clothing 7.8
office 7.7
attractive 7.7
bride 7.7
dinner 7.7
talking 7.6
wife 7.6
adults 7.6
relaxed 7.5
clothes 7.5
tradition 7.4
inside 7.4
wedding 7.4
laptop 7.3
relaxing 7.3
aged 7.2
interior 7.1
metropolitan 7
nurse 7
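
Imagga's tags and confidences come from its hosted auto-tagging service. Below is a hedged sketch against Imagga's v2 /tags endpoint using HTTP basic authentication; the credentials and image URL are placeholders, and the response field names should be verified against Imagga's documentation.

import requests

IMAGGA_KEY = "YOUR_KEY"        # placeholder credentials
IMAGGA_SECRET = "YOUR_SECRET"
IMAGE_URL = "https://example.org/steinmetz_untitled.jpg"  # hypothetical URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # basic auth with API key and secret
)
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    # Each entry pairs a 0-100 confidence with a language-keyed tag name.
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')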

Microsoft
created on 2022-01-29

furniture 97.9
person 96.6
chair 96.5
sitting 94.7
text 93.2
table 85
clothing 83.2
old 79.4
black 66.5
posing 36.9
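
The Microsoft tags correspond to the image-tagging feature of Azure's Computer Vision service. A minimal sketch against the v3.2 REST /tag endpoint follows; the resource endpoint, key, API version, and image URL are assumptions.

import requests

AZURE_ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_KEY"                                                # placeholder
IMAGE_URL = "https://example.org/steinmetz_untitled.jpg"              # hypothetical URL

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
    json={"url": IMAGE_URL},
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Confidence is 0-1 in the API; scale to 0-100 to match the list above.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')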

Face analysis

AWS Rekognition

Age 45-53
Gender Male, 99.5%
Calm 57.1%
Sad 20.6%
Surprised 6.4%
Confused 6.1%
Disgusted 4.6%
Happy 3.1%
Angry 1.3%
Fear 0.8%
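
The age range, gender, and emotion scores above match the face attributes returned by AWS Rekognition's DetectFaces operation. A boto3 sketch follows; the file name and region are assumptions.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("steinmetz_untitled.jpg", "rb") as image_file:  # hypothetical file name
    image_bytes = image_file.read()

# Attributes=["ALL"] requests age range, gender, emotions, and other attributes.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types with confidences, e.g. Calm 57.1%.
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')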

Feature analysis

Amazon

Person 99.4%
Chair 96.5%
Shoe 60.8%

Categories

Imagga

paintings art 99.1%

Text analysis

Amazon

in
1
ال٣.
2371072

Google

CTURE TOTTEN
CTURE
TOTTEN
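
The strings listed under Amazon in the text analysis are the kind of output produced by AWS Rekognition's DetectText operation (the Google results would come from a separate OCR service). A boto3 sketch follows; the file name and region are assumptions.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("steinmetz_untitled.jpg", "rb") as image_file:  # hypothetical file name
    image_bytes = image_file.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # LINE entries are whole detected lines; WORD entries are individual tokens.
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')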