Human Generated Data

Title

Untitled (office workers, typewriter)

Date

1953

People

Artist: Harris & Ewing, American 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22302
Machine Generated Data

Tags

Amazon
created on 2022-03-11

Workshop 99
Person 98.9
Human 98.9
Person 98.8
Person 98.3
Person 83.2
Person 74.4
Machine 74.3
Clinic 73.7
Furniture 73.2
Indoors 72.2
Building 68.7
Room 68.7
Factory 68.2
Table 67.8
Wood 62
Clothing 62
Apparel 62
Lab 59.9
Plywood 58.4
Screen 55
Electronics 55

Imagga
created on 2022-03-11

room 57.5
classroom 47.8
office 40.9
table 39
desk 34.9
meeting 33.9
business 31.6
people 31.2
man 27.8
businessman 26.5
person 26.5
team 26
work 25.1
male 24.8
executive 24.6
businesswoman 23.6
group 23.4
indoors 22.8
interior 22.1
corporate 21.5
computer 20.9
professional 20
teamwork 19.5
laptop 19.4
sitting 18.9
conference 18.6
working 17.7
happy 17.5
smiling 17.4
chair 17.3
businesspeople 17.1
adult 16.8
restaurant 16
communication 16
presentation 15.8
workplace 15.2
women 15
modern 14.7
suit 14.4
together 14
men 13.7
home 13.6
smile 13.5
worker 13.5
teacher 12.9
talking 12.4
education 12.1
coffee 12
discussion 11.7
colleagues 11.7
lifestyle 11.6
director 11.5
document 11.2
furniture 11.2
manager 11.2
indoor 11
job 10.6
dinner 10.2
happiness 10.2
glass 10.1
board 10.1
center 10
debate 9.9
success 9.7
looking 9.6
dining 9.5
student 8.9
paper 8.6
screen 8.4
black 8.4
hall 8.4
house 8.4
successful 8.2
technology 8.2
hospital 7.9
coworkers 7.9
portrait 7.8
party 7.7
gesture 7.6
career 7.6
meal 7.5
contemporary 7.5
life 7.5
company 7.4
food 7.4
occupation 7.3
kitchen 7.3
cheerful 7.3

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

indoor 96.4
text 88.5
person 83.9
table 35.6
worktable 14.5

Face analysis

AWS Rekognition

Age 35-43
Gender Female, 50.2%
Calm 72.6%
Sad 17.4%
Happy 8%
Surprised 0.5%
Confused 0.5%
Angry 0.5%
Fear 0.3%
Disgusted 0.2%

AWS Rekognition

Age 33-41
Gender Male, 98.3%
Calm 98.8%
Surprised 0.7%
Sad 0.3%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%
Happy 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a man and a woman standing in front of a mirror 49.2%
a person standing in front of a mirror 49.1%
a person standing in front of a mirror posing for the camera 49%

Text analysis

Amazon

h
KUDOK-COVEL