Human Generated Data

Title

Untitled (women typing)

Date

1951

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20160

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Electronics 99.9
Pc 99.9
Computer 99.9
Human 99.3
Person 99.3
Monitor 99.1
LCD Screen 99.1
Screen 99.1
Display 99.1
Person 99
Table 98.9
Furniture 98.9
Laptop 98.7
Tabletop 80.2
Apparel 71.9
Clothing 71.9
Photography 64.1
Face 64.1
Photo 64.1
Portrait 64.1
Desk 51.7
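The tag lists above pair each label with a confidence score expressed as a percentage. As a minimal sketch (assuming the raw lines follow the "Label score" format shown, where the label itself may contain spaces), they can be parsed into `(label, confidence)` pairs:

```python
def parse_tag(line: str) -> tuple[str, float]:
    """Split a 'Label score' line into its label and numeric confidence.

    The label may contain spaces (e.g. 'LCD Screen 99.1'), so only the
    final whitespace-separated token is treated as the score.
    """
    label, score = line.rsplit(" ", 1)
    return label, float(score)

# A few lines taken from the Amazon tag list above.
tags = [parse_tag(t) for t in ["Electronics 99.9", "LCD Screen 99.1", "Desk 51.7"]]
```

The same format is used by the Imagga and Microsoft tag lists, so the sketch applies to those sections as well.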

Imagga
created on 2022-03-05

office 41.9
man 40.3
businessman 35.3
business 34.6
people 34
male 31.2
meeting 30.1
adult 29.4
person 28.1
room 27
professional 26.9
computer 24.9
happy 23.8
work 23.7
businesswoman 23.6
sitting 23.2
men 23.2
worker 23
teacher 22.9
desk 22.8
smiling 22.4
team 22.4
table 22.2
laptop 21.1
modern 21
executive 20.8
teamwork 20.4
indoors 20.2
women 19.8
businesspeople 19
job 18.6
group 18.5
communication 18.5
home 18.3
corporate 18
working 17.7
together 17.5
colleagues 17.5
talking 17.1
interior 15.9
lifestyle 15.9
couple 15.7
conference 15.6
mature 14.9
indoor 14.6
smile 14.2
classroom 14.2
suit 13.5
two 12.7
educator 12.5
portrait 12.3
looking 12
technology 11.9
chair 11.6
mid adult 11.6
30s 11.5
cheerful 11.4
education 11.3
salon 11.2
presentation 11.2
clothing 11.1
20s 11
happiness 11
confident 10.9
coworkers 10.8
employee 10.3
manager 10.2
company 10.2
successful 10.1
handsome 9.8
attractive 9.8
discussion 9.7
staff 9.6
color 9.5
casual 9.3
house 9.2
furniture 9.2
holding 9.1
student 9
success 8.8
associates 8.8
40s 8.8
restaurant 8.8
leader 8.7
boss 8.6
workplace 8.6
keyboard 8.4
senior 8.4
clinic 8.4
hospital 8.3
kitchen 8
building 7.9
shop 7.9
two people 7.8
busy 7.7
ethnic 7.6
career 7.6
togetherness 7.6
study 7.5
board 7.4
life 7.3
nurse 7.3
idea 7.1
face 7.1
information 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

window 98.8
indoor 88.7
computer 86
person 76.7
laptop 75.2
black and white 75.1
text 70.1

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 73.4%
Calm 88.6%
Sad 9.3%
Happy 1%
Angry 0.4%
Disgusted 0.3%
Confused 0.2%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 21-29
Gender Female, 99.9%
Sad 81.2%
Calm 11.1%
Happy 3.5%
Fear 1.9%
Surprised 1%
Angry 0.8%
Disgusted 0.4%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Desk 51.7%

Captions

Microsoft

a person sitting at a desk in front of a window 90.4%
a person sitting at a desk in front of a window 84.7%
a person sitting at a table in front of a window 84.5%

Text analysis

Amazon

es
YТ37-X