Human Generated Data

Title

Untitled (woman typing)

Date

1970s

People

Artist: Susan Meiselas, American, born 1948

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1855

Copyright

© Susan Meiselas / Magnum

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Home Decor 99.9
Person 99.7
Human 99.7
Bedroom 95
Indoors 95
Room 95
Furniture 94.6
Desk 94.4
Table 94.4
Dorm Room 89.1
LCD Screen 86
Electronics 86
Screen 86
Display 86
Window 74.8
Machine 59.6
Office 57.7
Computer 57.4
Monitor 51.6
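
The Amazon tags above pair each label with a confidence score on a 0-100 scale, which is the format returned by Amazon Rekognition's label detection. A minimal sketch of reproducing such tags with boto3 follows; the file name and the MinConfidence threshold are illustrative assumptions, not details recorded on this page.

    import boto3

    # Rekognition label detection returns label names with 0-100 confidence scores,
    # matching the label/score pairs listed above.
    client = boto3.client("rekognition")

    with open("woman_typing.jpg", "rb") as f:   # hypothetical local copy of the image
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,                   # assumed cutoff; the page does not state one
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")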

Imagga
created on 2022-01-22

office 94.2
room 58.8
classroom 47.9
table 46.5
desk 41.9
computer 40.5
business 34.6
indoors 31.7
businessman 30
laptop 29.1
furniture 26.5
interior 25.7
people 25.1
working 23.9
chair 23.7
meeting 23.6
male 23.4
man 22.9
sitting 22.3
work 22
group 21.8
modern 21.8
home 21.6
professional 21.5
businesswoman 20.9
businesspeople 19.9
adult 19.3
person 18.5
corporate 18.1
team 17.9
executive 17.9
indoor 17.4
house 16.7
smiling 16.7
happy 15.7
technology 15.6
education 15.6
colleagues 15.6
talking 15.2
job 15.1
manager 14.9
men 14.6
workplace 14.3
floor 14
teamwork 13.9
confident 13.7
looking 13.6
communication 13.5
monitor 13.4
worker 13.4
design 13
success 12.9
suit 12.6
together 12.3
keyboard 12.2
women 11.9
coworkers 11.8
discussion 11.7
lifestyle 11.6
window 11.5
successful 11
associates 10.8
conference 10.8
light 10.7
30s 10.6
cheerful 10.6
career 10.4
contemporary 10.4
empty 10.3
inside 10.1
smile 10
cooperation 9.7
teacher 9.6
personal computer 9.6
reading 9.5
happiness 9.4
student 9.4
nobody 9.3
furnishing 9.3
center 9.3
wood 9.2
kitchen 9
desktop computer 8.9
decor 8.8
discussing 8.8
businessmen 8.8
seat 8.7
corporation 8.7
glass 8.6
learning 8.5
showing 8.4
presentation 8.4
mature 8.4
phone 8.3
occupation 8.3
file 8.2
board 8.1
machine 8.1
boardroom 7.9
30-35 years 7.9
notebook 7.8
color 7.8
partners 7.8
conversation 7.8
portrait 7.8
couch 7.7
employee 7.7
class 7.7
casual 7.6
pen 7.6
senior 7.5
screen 7.5
service 7.4
equipment 7.4
book 7.3
digital computer 7
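
The Imagga tags follow the same label/confidence pattern and could be reproduced through Imagga's /v2/tags REST endpoint. The sketch below uses the requests library; the API credentials and file name are placeholders.

    import requests

    API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"   # placeholder credentials

    # Imagga's tagging endpoint accepts an uploaded image and returns English tag
    # names with confidence scores on a 0-100 scale, as listed above.
    with open("woman_typing.jpg", "rb") as f:         # hypothetical local copy of the image
        resp = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )

    for item in resp.json()["result"]["tags"]:
        print(f"{item['tag']['en']} {item['confidence']:.1f}")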

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

indoor 99.8
window 99.5
wall 99.3
black and white 91.8
computer 91
table 88
room 78.9
text 75.6
office building 75.5
office 74.5
person 66.5
furniture 19.6
cluttered 13.8
desk 7.4
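
The Microsoft tags are the kind of result returned by the Azure Computer Vision Analyze endpoint with the Tags visual feature; confidences come back on a 0-1 scale, so they are scaled by 100 to match the list above. The endpoint, key, file name, and v3.2 API version in the sketch below are assumptions.

    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"   # placeholder resource
    KEY = "YOUR_KEY"                                                 # placeholder key

    with open("woman_typing.jpg", "rb") as f:                        # hypothetical local copy
        resp = requests.post(
            f"{ENDPOINT}/vision/v3.2/analyze",
            params={"visualFeatures": "Tags"},
            headers={"Ocp-Apim-Subscription-Key": KEY,
                     "Content-Type": "application/octet-stream"},
            data=f.read(),
        )

    # Scale 0-1 confidences to the 0-100 figures shown above.
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")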

Face analysis

AWS Rekognition

Age 50-58
Gender Female, 100%
Angry 99.3%
Confused 0.3%
Calm 0.2%
Disgusted 0.1%
Sad 0%
Surprised 0%
Fear 0%
Happy 0%
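
The age range, gender, and ranked emotion scores above correspond to the face attributes returned by Amazon Rekognition's face detection. A minimal boto3 sketch follows; the file name is a placeholder, and requesting Attributes=["ALL"] is an assumption about how the analysis was run.

    import boto3

    client = boto3.client("rekognition")

    with open("woman_typing.jpg", "rb") as f:    # hypothetical local copy of the image
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],                  # returns AgeRange, Gender, Emotions, etc.
        )

    face = response["FaceDetails"][0]
    print(f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")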

Microsoft Cognitive Services

Age 61
Gender Female
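
A single age and gender estimate like the one above matches the output of the classic Azure Face detect API when age and gender attributes are requested; the page does not record which Microsoft service was actually used, so the sketch below is only a plausible reconstruction, with the endpoint, key, and file name as placeholders.

    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"   # placeholder resource
    KEY = "YOUR_KEY"                                                 # placeholder key

    with open("woman_typing.jpg", "rb") as f:                        # hypothetical local copy
        resp = requests.post(
            f"{ENDPOINT}/face/v1.0/detect",
            params={"returnFaceAttributes": "age,gender"},
            headers={"Ocp-Apim-Subscription-Key": KEY,
                     "Content-Type": "application/octet-stream"},
            data=f.read(),
        )

    for face in resp.json():
        attrs = face["faceAttributes"]
        print(f"Age {attrs['age']:.0f}")
        print(f"Gender {attrs['gender'].title()}")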

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
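
Google reports likelihood buckets rather than percentages, which is how the Cloud Vision face detection API expresses emotion and attribute estimates. A sketch with the google-cloud-vision client library (2.x or later assumed) follows; the file name is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("woman_typing.jpg", "rb") as f:    # hypothetical local copy of the image
        image = vision.Image(content=f.read())

    face = client.face_detection(image=image).face_annotations[0]

    # Each attribute is a Likelihood enum: VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)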

Feature analysis

Amazon

Person 99.7%
Monitor 51.6%

Captions

Microsoft

a person sitting at a desk in a room 98.3%
a person sitting at a desk in front of a window 94.5%
a person sitting at a table in a room 94.4%
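
The ranked captions above are characteristic of the Description feature of the same Azure Computer Vision Analyze endpoint sketched under the Microsoft tags; a short variant requesting only the description is shown here, again with placeholder endpoint, key, and file name.

    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"   # placeholder resource
    KEY = "YOUR_KEY"                                                 # placeholder key

    with open("woman_typing.jpg", "rb") as f:                        # hypothetical local copy
        resp = requests.post(
            f"{ENDPOINT}/vision/v3.2/analyze",
            params={"visualFeatures": "Description"},
            headers={"Ocp-Apim-Subscription-Key": KEY,
                     "Content-Type": "application/octet-stream"},
            data=f.read(),
        )

    # Each caption candidate carries a 0-1 confidence, scaled here to match the list above.
    for caption in resp.json()["description"]["captions"]:
        print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")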