Human Generated Data

Title

Untitled (two couples seated at dining room table)

Date

1937

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5303

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Furniture 99.7
Table 99.6
Dining Table 99.6
Person 99.4
Human 99.4
Person 98.4
Room 98.2
Indoors 98.2
Dining Room 98.2
Person 96.3
Chair 92.2
Meal 90.6
Food 90.6
People 87.7
Dish 77.7
Person 75.8
Senior Citizen 69
Dinner 68.3
Supper 68.3
Photography 66.3
Face 66.3
Portrait 66.3
Photo 66.3

Imagga
created on 2022-01-22

teacher 45.9
man 43
male 39
adult 38.6
person 38.4
senior 34.7
educator 34.6
people 32.4
professional 31.6
indoors 26.4
home 24.7
mature 24.2
sitting 24.1
looking 24
elderly 23.9
businessman 23.8
laptop 23.7
office 21.8
couple 20.9
business 20.7
together 20.2
men 19.7
smiling 19.5
happy 19.4
computer 18.4
room 18.2
old 18.1
table 17.5
retirement 17.3
work 17.3
portrait 16.8
retired 16.5
casual 16.1
meeting 16
face 15.6
lifestyle 15.2
indoor 14.6
desk 14.4
group 13.7
lab coat 13.6
technology 13.4
specialist 13.4
job 13.3
medical 13.2
camera 12.9
women 12.7
older 12.6
executive 12.5
husband 12.4
businesspeople 12.3
doctor 12.2
brass 11.9
pensioner 11.8
horizontal 11.7
colleagues 11.7
team 11.7
to 11.5
working 11.5
hospital 11.5
patient 11.4
wife 11.4
coat 11.2
newspaper 11.2
teamwork 11.1
day 11
businesswoman 10.9
worker 10.8
60s 10.7
two people 10.7
modern 10.5
talking 10.5
health 10.4
plan 10.4
education 10.4
20s 10.1
color 10
aged 10
clinic 9.9
handsome 9.8
look 9.6
wind instrument 9.6
corporate 9.5
happiness 9.4
nurse 9.3
smile 9.3
inside 9.2
alone 9.1
classroom 9
case 8.8
40s 8.8
discussion 8.8
life 8.7
mid adult 8.7
using 8.7
project 8.7
bright 8.6
reading 8.6
relaxed 8.4
notebook 8.4
clothing 8.4
hand 8.4
glasses 8.3
care 8.2
cheerful 8.1
suit 8.1
success 8
medicine 7.9
70s 7.9
coworkers 7.9
client 7.8
paper 7.8
writing 7.8
designer 7.7
concentration 7.7
1 7.7
serious 7.6
student 7.6
age 7.6
jacket 7.5
human 7.5
manager 7.4
holding 7.4
focus 7.4
active 7.2
musical instrument 7.1
interior 7.1
cornet 7.1

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.1
table 96.6
window 95.5
person 94.1
candle 86
vase 70.9
wedding cake 69
old 59.4
people 56.3
black and white 52.4

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 99.6%
Calm 96%
Happy 1.7%
Sad 1.3%
Surprised 0.3%
Angry 0.3%
Confused 0.2%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 29-39
Gender Female, 95.7%
Calm 94.6%
Sad 2.8%
Happy 1%
Confused 0.5%
Surprised 0.3%
Fear 0.3%
Disgusted 0.2%
Angry 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people sitting at a table in front of a window 88.3%
a group of people sitting at a table 88.2%
an old photo of a group of people sitting at a table 88.1%

Text analysis

Amazon

4925
BOLEJA
BOLEJA BVDE
BVDE

Google

4925
4925