Human Generated Data

Title

Untitled (women typing)

Date

1951

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20161

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.5
Person 99.5
Person 99.4
Accessories 94
Accessory 94
Sunglasses 94
Electronics 86.7
Home Decor 84.4
LCD Screen 82.9
Display 82.9
Screen 82.9
Monitor 82.9
Furniture 75.6
Table 75.6
Pc 73.3
Computer 73.3
Desk 71.2
Interior Design 69.4
Indoors 69.4
Girl 63.7
Female 63.7
Face 63.7
Portrait 62.9
Photography 62.9
Photo 62.9
Brick 55.6
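
A minimal sketch of how label/confidence pairs like those above are typically produced with AWS Rekognition's label detection. The filename and the MinConfidence threshold are illustrative assumptions, not values recorded in this catalog entry.

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical filename for this photograph.
with open("untitled_women_typing.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed cutoff; the lowest score listed above is Brick at 55.6
)

# Each label carries a name and a 0-100 confidence score, matching the
# "label score" pairs listed in this section.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```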

Imagga
created on 2022-03-05

man 34.9
office 29.4
people 29
male 25.5
person 25.2
business 24.9
businessman 23.8
room 22.3
table 21.8
meeting 21.6
work 21.2
men 19.7
desk 18.9
nurse 18.6
adult 18.5
indoors 17.5
worker 17.1
team 17
smiling 16.6
happy 16.3
home 15.9
working 15.9
teamwork 15.7
sitting 15.4
businesswoman 15.4
kitchen 14.7
group 14.5
lifestyle 13.7
colleagues 13.6
modern 13.3
businesspeople 13.3
job 13.3
professional 13
computer 12.9
executive 12.5
talking 12.3
laptop 12
corporate 12
house 11.7
medical 11.5
together 11.4
women 11.1
conference 10.7
smile 10.7
mid adult 10.6
30s 10.6
chair 10.3
holding 9.9
restaurant 9.8
coworkers 9.8
cheerful 9.7
clinic 9.7
portrait 9.7
doctor 9.4
furniture 9.3
manager 9.3
presentation 9.3
building 9.2
education 8.6
happiness 8.6
casual 8.5
senior 8.4
mature 8.4
coat 8.3
color 8.3
student 8.3
20s 8.2
patient 8.1
suit 8.1
success 8
hospital 8
interior 8
two people 7.8
glass 7.8
employee 7.7
chemistry 7.7
four 7.7
plan 7.6
communication 7.5
horizontal 7.5
teacher 7.4
coffee 7.4
successful 7.3
indoor 7.3
new 7.3
classroom 7.2
looking 7.2
machine 7.1
science 7.1
face 7.1
shop 7
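
A minimal sketch of how a tag list like Imagga's above is commonly retrieved. This assumes Imagga's v2 REST tagging endpoint with HTTP basic auth; the credentials, image URL, and the exact response shape should be treated as assumptions to verify against Imagga's documentation.

```python
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/untitled_women_typing.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # basic auth with key/secret pair
)
response.raise_for_status()

# Tags come back with an English label and a 0-100 confidence score,
# which is the "tag score" format shown above.
for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```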

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

window 99.9
computer 98
laptop 94.3
indoor 93.5
person 82.8
text 74.8
desk 70
black and white 62.8
clothing 60.5
office 55.6
table 28
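
A minimal sketch of how the Microsoft tag list above is commonly produced with Azure's Computer Vision service (the same service also generates the captions shown later in this record). The endpoint, key, and image URL are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_key"  # placeholder
IMAGE_URL = "https://example.org/untitled_women_typing.jpg"  # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Each tag carries a name and a 0-1 confidence, printed here in the
# percentage format used above.
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```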

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 99.8%
Happy 87.5%
Calm 5.2%
Disgusted 1.7%
Sad 1.7%
Confused 1.2%
Angry 1.2%
Surprised 0.8%
Fear 0.6%

AWS Rekognition

Age 22-30
Gender Female, 99.8%
Calm 93.6%
Sad 2.7%
Surprised 1.4%
Fear 0.9%
Angry 0.5%
Happy 0.5%
Disgusted 0.2%
Confused 0.1%
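
A minimal sketch of how the age-range, gender, and emotion estimates above are typically obtained from AWS Rekognition's face analysis. The filename is a placeholder.

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical filename for this photograph.
with open("untitled_women_typing.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

# One FaceDetails entry per detected face; the two entries here would
# correspond to the two AWS Rekognition blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are reported with 0-100 confidences, listed here from
    # strongest to weakest as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```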

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
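
A minimal sketch of how the likelihood ratings above are typically produced with the Google Cloud Vision API; the two identical blocks presumably correspond to the two detected faces. The filename is a placeholder, and credentials are assumed to be configured in the environment.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical filename for this photograph.
with open("untitled_women_typing.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation reports an enum likelihood (VERY_UNLIKELY ...
# VERY_LIKELY) for the attributes listed above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```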

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a person standing in front of a window 89.5%
a person sitting at a desk in front of a window 89.4%
a person standing next to a window 85.7%
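
A minimal sketch of how these candidate captions are commonly generated, again with Azure's Computer Vision service; the endpoint, key, and image URL are the same kind of placeholders used in the tagging sketch above.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "your_key"  # placeholder
IMAGE_URL = "https://example.org/untitled_women_typing.jpg"  # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Request several candidate captions; each carries a 0-1 confidence,
# shown above as a percentage.
description = client.describe_image(IMAGE_URL, max_candidates=3)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```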

Text analysis

Amazon

18
KODAK-SLA

Google

YT37A2-XA
YT37A2-XA
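
A minimal sketch of how text detections like those above are typically extracted with AWS Rekognition; the filename is a placeholder. Google's results would instead come from the Vision API's text_detection method, which can report the same string more than once (as a full text block and again as a line), which may explain the repeated YT37A2-XA.

```python
import boto3

client = boto3.client("rekognition")

# Hypothetical filename for this photograph.
with open("untitled_women_typing.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# LINE detections correspond to the strings listed above; WORD
# detections break those lines into individual pieces.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```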