Human Generated Data

Title

Untitled (men and women seated at outdoor table)

Date

1941

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5342

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Apparel 99.5
Clothing 99.5
Person 99.5
Human 99.5
Person 99.2
Furniture 99
Person 98.6
Person 98.2
Person 97.4
Person 94.4
Home Decor 91.2
Linen 88.1
Face 85.5
Chair 85.4
Chair 85.2
Meal 84.1
Food 84.1
Crowd 79.4
Tie 78
Accessories 78
Accessory 78
Sitting 76.4
Tablecloth 76.3
Table 75.7
Shirt 73.5
Suit 73.1
Overcoat 73.1
Coat 73.1
Chair 71.8
People 70
Hat 68.2
Female 67.8
Person 67.7
Portrait 66.9
Photography 66.9
Photo 66.9
Person 63.5
Dining Table 59.9
Outdoors 59.2
Person 44.6

Clarifai
created on 2023-10-26

people 99.8
man 98.3
group 97.9
group together 97.5
adult 96.8
chair 95.5
leader 95.4
wear 93.1
many 91.5
administration 88.3
several 86.5
retro 80.6
military 79.4
league 78.9
sit 77.9
sitting 75.2
war 68.8
handshake 65.4
five 64.6
three 63.5

Imagga
created on 2022-01-22

man 50.4
male 45.4
person 40.1
businessman 38.9
professional 36.1
business 35.9
people 32.4
office 32.2
adult 31.1
job 30.1
doctor 30.1
coat 29.8
lab coat 28.3
medical 26.5
work 25.9
hospital 25.4
team 24.2
meeting 23.6
colleagues 23.3
patient 23
businesspeople 22.8
corporate 21.5
mature 21.4
businesswoman 20.9
desk 20.8
men 20.6
worker 20
working 19.5
occupation 19.3
looking 19.2
happy 18.8
group 18.6
20s 18.3
table 18.2
specialist 18.1
smiling 18.1
health 18.1
casual 17.8
teamwork 17.6
talking 17.1
medicine 16.7
clinic 16.7
indoor 16.4
senior 15.9
indoors 15.8
room 15.7
care 15.6
portrait 15.5
laptop 15.5
computer 15.2
manager 14.9
sitting 14.6
suit 14.5
handsome 14.3
associates 13.8
clothing 13.7
nurse 13.4
together 13.2
writing 13.1
bright 12.9
color 12.8
practitioner 12.8
two 12.7
40s 12.7
discussion 12.7
executive 12.5
30s 12.5
profession 12.4
ethnic 12.4
smile 12.1
stethoscope 12
brass 11.9
document 11.9
building 11.9
coworkers 11.8
engineer 11.6
lifestyle 11.6
wind instrument 11.6
musical instrument 11.5
uniform 11.5
serious 11.5
face 11.4
plan 11.4
modern 11.2
company 11.2
case 11.1
cornet 10.8
discussing 10.8
attractive 10.5
illness 10.5
technology 10.4
women 10.3
day 10.2
treatment 10.1
garment 10
business people 9.9
businessmen 9.8
four 9.6
career 9.5
sax 9.3
horizontal 9.2
successful 9.2
letter 9
success 8.9
forties 8.8
examination 8.8
lab 8.7
couple 8.7
standing 8.7
mid adult 8.7
corporation 8.7
laboratory 8.7
busy 8.7
exam 8.6
elderly 8.6
twenties 8.6
confident 8.2
new 8.1
builder 8
secretary 7.9
boardroom 7.9
doctors 7.9
25 30 years 7.8
conference 7.8
staff 7.8
designer 7.7
cooperation 7.7
check 7.7
concentration 7.7
architect 7.7
diversity 7.7
student 7.7
life 7.4
focus 7.4
planner 7.4
employee 7.3
surgeon 7.2
look 7
marimba 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 97.7
person 96.6
outdoor 94.7
clothing 89.7
man 86.3
black and white 68.1
posing 61.7
white 60.3
old 59.4
table 53.4

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 99.9%
Sad 76.8%
Disgusted 7.9%
Confused 4.6%
Calm 4.5%
Surprised 3.5%
Angry 1%
Fear 0.9%
Happy 0.9%

AWS Rekognition

Age 34-42
Gender Male, 98.5%
Calm 80.1%
Happy 6.5%
Surprised 5.3%
Sad 2.6%
Angry 1.7%
Confused 1.6%
Fear 1.2%
Disgusted 1%

AWS Rekognition

Age 23-33
Gender Male, 99.9%
Calm 48.8%
Sad 43.6%
Confused 5.1%
Happy 0.6%
Surprised 0.5%
Fear 0.5%
Angry 0.5%
Disgusted 0.3%

AWS Rekognition

Age 29-39
Gender Male, 84.6%
Calm 74.5%
Angry 9.6%
Happy 6.1%
Surprised 3.8%
Sad 2.6%
Confused 1.6%
Disgusted 1%
Fear 0.8%

AWS Rekognition

Age 22-30
Gender Female, 74%
Calm 95.9%
Sad 2.6%
Angry 0.5%
Happy 0.4%
Confused 0.3%
Disgusted 0.2%
Fear 0.1%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Chair 85.4%
Tie 78%

Categories

Imagga

paintings art 99.1%

Text analysis

Amazon

17541.

Google

.י5 ר! .ו5 רן
.י5
.ו5
רן
ר!