Human Generated Data

Title

Untitled (women working at long rectangular tables)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5324

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Human 99.6
Person 99.6
Person 99.5
Person 99.2
Person 98.6
Person 97.3
Indoors 97
Room 97
Person 96.1
Person 94.5
Person 91.8
Person 89.9
Interior Design 88.3
Furniture 86.6
Person 83.5
Person 82.9
Person 82.3
Meal 72.9
Food 72.9
People 70.6
Living Room 62.6
Dish 62.3
Apparel 59
Clothing 59
Person 58.7
Classroom 57.8
School 57.8
Workshop 56.5
Bed 56
Kindergarten 55.5
Person 46.6
Person 42
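
The tag/score pairs above have the shape of the output returned by AWS Rekognition's DetectLabels operation: one label name per entry with a percentage confidence. A minimal Python sketch of reproducing such a list with boto3 follows; the file name, region, and confidence floor are illustrative assumptions, not details taken from this record.

import boto3

# Hypothetical local copy of the photograph; not part of the museum record.
IMAGE_PATH = "steinmetz_untitled.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=40,  # the list above includes labels down to ~42%
    )

# Print each label with its confidence, matching the layout of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')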

Imagga
created on 2022-01-22

room 49.8
shop 43.6
classroom 42.7
barbershop 35.8
mercantile establishment 31.5
interior 23
blackboard 22.5
table 21
place of business 20.9
people 19.5
restaurant 17.3
man 16.2
business 15.2
house 14.2
modern 14
person 14
indoor 13.7
home 13.5
adult 12.8
male 12
work 11.8
decoration 11.6
design 10.7
decor 10.6
indoors 10.5
group 10.5
old 10.4
establishment 10.3
counter 10.2
furniture 10.1
food 9.7
dining 9.5
men 9.4
meeting 9.4
architecture 9.4
case 9
professional 9
building 8.9
new 8.9
worker 8.9
teacher 8.9
businessman 8.8
setting 8.7
party 8.6
dinner 8.5
inside 8.3
retro 8.2
happy 8.1
chair 8.1
clinic 8
family 8
women 7.9
office 7.8
ancient 7.8
diagram 7.7
money 7.6
finance 7.6
student 7.6
contemporary 7.5
service 7.4
wedding 7.4
occupation 7.3
success 7.2
smiling 7.2
celebration 7.2
kitchen 7.1
day 7.1
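
Imagga exposes its tagger as a REST endpoint rather than an SDK, so a comparable tag list can be fetched with plain HTTP. A minimal sketch using the requests library; the credentials and file name are placeholders, and nothing below comes from the record itself.

import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder

with open("steinmetz_untitled.jpg", "rb") as f:  # placeholder file name
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(API_KEY, API_SECRET),
        files={"image": f},
    )

# Each entry pairs an English tag with a confidence score, as listed above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')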

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 95.6
clothing 83.8
person 80.2
man 73.6
black and white 52.6
old 47.3

Face analysis

AWS Rekognition

Age 24-34
Gender Male, 95.3%
Calm 93.1%
Sad 3.5%
Surprised 0.8%
Happy 0.8%
Angry 0.6%
Disgusted 0.5%
Fear 0.4%
Confused 0.4%

AWS Rekognition

Age 29-39
Gender Male, 93.3%
Calm 90.9%
Sad 5.2%
Confused 1.3%
Happy 1.2%
Disgusted 0.5%
Angry 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 26-36
Gender Male, 98.3%
Sad 86.5%
Confused 7.2%
Calm 4.9%
Happy 0.4%
Fear 0.4%
Angry 0.3%
Disgusted 0.2%
Surprised 0.1%

AWS Rekognition

Age 20-28
Gender Male, 54.5%
Confused 28.5%
Disgusted 27.4%
Sad 19.4%
Calm 16.5%
Happy 2.8%
Angry 2.6%
Surprised 1.5%
Fear 1.3%
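
Each block above follows the shape of one FaceDetails entry from AWS Rekognition's DetectFaces operation: an estimated age range, a gender guess with confidence, and a ranked set of emotion scores. A minimal boto3 sketch, again with a placeholder file name and region:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_untitled.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age, gender, and emotion estimates
    )

# Render each detected face in the same layout as the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')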

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people in a room 92.1%
a group of people posing for the camera 79.1%
a group of people posing for a photo 73.1%
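
Captions like these are the kind of result returned by Azure Computer Vision's describe operation, which produces candidate sentences with 0-1 confidences. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<resource-name>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_key"  # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("steinmetz_untitled.jpg", "rb") as f:  # placeholder file name
    description = client.describe_image_in_stream(f, max_candidates=3)

# Each candidate caption carries a 0-1 confidence; scale it to match the display above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")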

Text analysis

Amazon

IL
12:45
10
12476.
12:45 1.30
+5
1.30
19
... 10
.
Close
10:41-11:30
DC
U
IL45
hund
...
Time
TO
7
LE
Saluding
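
The fragments above are line-level OCR detections of the timetable text visible in the photograph. A minimal sketch of producing such a list with AWS Rekognition's DetectText operation, with a placeholder file name and region:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_untitled.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections correspond to the entries listed above; WORD detections
# repeat the same content token by token, so they are skipped here.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])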

Google

n.
10
I1
15
Salmda so 104-30 n. 10 I1 15 Cise 12:45 1:30 It45 12476.
104-30
It45
Salmda
12:45
12476.
so
Cise
1:30