Human Generated Data

Title

Untitled (man sitting at desk)

Date

1957

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20102
Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 97.8
Human 97.8
Furniture 97.7
Clothing 92.4
Apparel 92.4
Chair 89.3
Sitting 88
Monitor 84.9
Display 84.9
Electronics 84.9
Screen 84.9
LCD Screen 80.6
Desk 73.5
Table 73.5
Pc 69.6
Computer 69.6
Photo 67.6
Portrait 67.6
Photography 67.6
Face 67.6
Shoe 64
Footwear 64
Text 63.2
Indoors 61.3
Suit 60
Coat 60
Overcoat 60
Shelf 59.5
Studio 57.2
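The Amazon tags above are label/confidence pairs of the kind Rekognition's `DetectLabels` returns (each label carries a `Name` and a percent `Confidence`). A minimal sketch of filtering such output by a confidence threshold, using a few of the values listed above as sample data:

```python
# Sample labels in the Rekognition DetectLabels response shape,
# taken from the tag list above (hypothetical hand-built data,
# not a live API response).
labels = [
    {"Name": "Person", "Confidence": 97.8},
    {"Name": "Desk", "Confidence": 73.5},
    {"Name": "Shoe", "Confidence": 64.0},
]

def confident_labels(labels, threshold=90.0):
    """Keep label names at or above the given confidence (percent)."""
    return [l["Name"] for l in labels if l["Confidence"] >= threshold]

print(confident_labels(labels))  # only "Person" clears the 90% bar
```

Lowering the threshold admits the weaker guesses ("Desk", "Shoe"), which is why the raw lists above include low-confidence entries down to roughly 57%.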

Imagga
created on 2022-03-05

man 34.9
adult 33.2
office 31.6
people 30.7
business 30.4
male 27
businessman 26.5
indoors 26.4
person 26
sax 24.5
working 23.9
computer 23.3
work 22.8
professional 21.5
room 20.7
corporate 19.8
happy 19.4
men 18.9
laptop 18.5
teacher 17.8
smiling 17.4
businesswoman 17.3
sitting 17.2
worker 16.5
shop 16.3
businesspeople 16.1
meeting 16
job 15.9
desk 15.5
executive 15.4
lifestyle 15.2
smile 15
women 14.2
table 14.1
building 13.7
indoor 13.7
modern 13.3
portrait 12.9
group 12.9
success 12.9
casual 12.7
communication 12.6
chair 12.6
looking 12
newspaper 11.8
interior 11.5
education 11.3
home 11.2
mature 11.2
teamwork 11.1
occupation 11
team 10.7
monitor 10.7
career 10.4
barbershop 9.8
cheerful 9.7
health 9.7
library 9.7
technology 9.6
talking 9.5
happiness 9.4
keyboard 9.4
educator 9.3
manager 9.3
classroom 9.2
horizontal 9.2
successful 9.1
confident 9.1
attractive 9.1
mercantile establishment 9
handsome 8.9
conference 8.8
colleagues 8.7
mid adult 8.7
day 8.6
two 8.5
wind instrument 8.3
training 8.3
suit 8.3
student 8.3
20s 8.2
alone 8.2
life 8.1
product 7.9
clinic 7.9
together 7.9
face 7.8
discussion 7.8
color 7.8
two people 7.8
pretty 7.7
30s 7.7
college 7.6
females 7.6
one person 7.5
senior 7.5
restaurant 7.5
company 7.4
holding 7.4
phone 7.4
patient 7.4
back 7.3
equipment 7.2
bright 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

man 98.8
text 97.3
person 94.3
clothing 88
working 61
male 17.4

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 86%
Calm 65%
Happy 16.3%
Sad 6.6%
Surprised 4.7%
Confused 2.4%
Disgusted 2.3%
Fear 1.3%
Angry 1.3%
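The emotion percentages above follow the shape of Rekognition's `DetectFaces` output, where each face carries a list of `{Type, Confidence}` emotion entries. A hedged sketch of picking the dominant emotion from such a list, using the values reported above as sample data:

```python
# Emotion entries in the Rekognition DetectFaces response shape,
# hand-built from the percentages above (not a live API response).
emotions = [
    {"Type": "CALM", "Confidence": 65.0},
    {"Type": "HAPPY", "Confidence": 16.3},
    {"Type": "SAD", "Confidence": 6.6},
    {"Type": "SURPRISED", "Confidence": 4.7},
]

# The dominant emotion is simply the entry with the highest confidence.
dominant = max(emotions, key=lambda e: e["Confidence"])
print(dominant["Type"])  # CALM
```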

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.8%
Shoe 64%

Captions

Microsoft

a man sitting at a table 65.3%
a man sitting on a table 53.3%
a man sitting in a room 53.2%

Text analysis

Amazon

2
FILM
SAFETY
SAFETY KODAK
KODAK