Human Generated Data

Title

Untitled (group of men and women eating in dining room)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5153

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Chair 99.8
Furniture 99.8
Person 98.8
Human 98.8
Person 98
Person 97.3
Person 97.3
Person 97.1
Chair 96.6
Chair 96.4
Restaurant 95.8
Person 93.9
Person 93.4
Room 92.6
Indoors 92.6
Chair 91.7
Person 88.3
Person 88.2
Person 84.3
Dining Room 83.3
Table 79
Dining Table 79
Person 78.9
Cafeteria 76.7
Meal 75
Food 75
Person 74
People 71.3
Text 61
Food Court 57.4

Imagga
created on 2022-01-23

man 30.9
male 29.1
people 27.3
person 26.4
business 24.9
room 23.8
businessman 22.9
adult 21.7
women 20.5
couple 20
indoors 19.3
table 18.3
group 17.7
classroom 17.4
sitting 17.2
men 17.2
teacher 16.7
meeting 16
office 15.4
sax 15.2
together 14.9
home 14.3
senior 14
kin 13.8
smiling 13.7
interior 13.3
mature 13
cheerful 13
businesswoman 12.7
wind instrument 12.6
chair 12.5
talking 12.3
newspaper 12.2
brass 11.8
professional 11.7
team 11.6
family 11.6
30s 11.5
work 11.4
businesspeople 11.4
happy 11.3
20s 11
day 11
lifestyle 10.8
colleagues 10.7
job 10.6
love 10.2
product 10.2
two 10.2
nurse 10
casual clothing 9.8
blackboard 9.8
40s 9.7
discussion 9.7
patient 9.4
casual 9.3
manager 9.3
planner 9.1
indoor 9.1
modern 9.1
portrait 9.1
educator 9
desk 8.7
child 8.7
four 8.6
plan 8.5
communication 8.4
restaurant 8.2
girls 8.2
suit 8.1
creation 8.1
working 7.9
worker 7.9
discussing 7.9
smile 7.8
education 7.8
grandfather 7.8
two people 7.8
musical instrument 7.8
life 7.7
class 7.7
drawing 7.7
old 7.7
mother 7.6
females 7.6
friends 7.5
human 7.5
friendship 7.5
teamwork 7.4
executive 7.4
inside 7.4
relaxing 7.3
success 7.2
to 7.1
architecture 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.7
table 97.2
person 94.2
furniture 91.6
chair 83.9
man 73.8
clothing 62.8
people 61.2
dining table 11.2

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 95.8%
Calm 95.6%
Happy 1.9%
Surprised 1%
Sad 0.6%
Confused 0.4%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 49-57
Gender Male, 68.7%
Calm 95.3%
Surprised 2.4%
Sad 0.6%
Angry 0.5%
Confused 0.5%
Fear 0.4%
Disgusted 0.2%
Happy 0.2%

AWS Rekognition

Age 50-58
Gender Female, 57.3%
Calm 44%
Sad 27.9%
Happy 11.4%
Confused 7.1%
Angry 2.9%
Surprised 2.5%
Fear 2.2%
Disgusted 2.1%

AWS Rekognition

Age 48-56
Gender Male, 90.6%
Happy 99.4%
Calm 0.3%
Sad 0.1%
Surprised 0.1%
Disgusted 0%
Angry 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 83.7%
Calm 91.9%
Happy 3.5%
Sad 2.3%
Surprised 0.6%
Fear 0.6%
Angry 0.5%
Confused 0.4%
Disgusted 0.3%

AWS Rekognition

Age 41-49
Gender Female, 63.4%
Calm 88.2%
Sad 7.4%
Happy 2.7%
Fear 0.5%
Confused 0.3%
Surprised 0.3%
Angry 0.3%
Disgusted 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 99.8%
Person 98.8%

Captions

Microsoft

a group of people sitting at a table 92.3%
a group of people sitting around a table 92.2%
a group of people sitting on a table 85.8%

Text analysis

Amazon

13708
13708.
-YY3RAS
MAMTSA3
NAMOR -YY3RAS MAMTSA3
NAMOR

Google

13708. 1370
1370
13708.