Human Generated Data

Title

Untitled (boy at desk reading atlas)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17537

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 99.8
Table 99.1
Desk 99
Human 98.3
Person 98.3
Screen 91.8
Monitor 91.8
Display 91.8
Electronics 91.8
Computer 91.2
LCD Screen 89.9
Transportation 85.4
Automobile 85.4
Car 85.4
Vehicle 85.4
Face 80.5
Pc 73.3
Clothing 72.1
Apparel 72.1
Photo 55.3
Portrait 55.3
Photography 55.3
Machine 55
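
The scores above are confidence values (percentages) from an automated labeling service. A minimal sketch of how such labels can be requested from AWS Rekognition with boto3 follows; the local file name and thresholds are illustrative assumptions, not details taken from this record.

    import boto3

    # Assumed local path to the digitized photograph; not part of the record.
    IMAGE_PATH = "4.2002.17537.jpg"

    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns labels such as "Furniture", "Desk", "Person",
    # each with a confidence score expressed as a percentage.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55.0,  # roughly matches the lowest score listed above
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')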

Imagga
created on 2022-02-26

computer 38.9
working 34.5
work 33
people 31.8
laptop 29.5
person 29.4
man 28.9
home 28.7
male 27.7
desk 27.2
office 27
adult 26.3
indoors 25.5
technology 21.5
sitting 21.5
room 20.4
business 20
job 19.5
table 19.4
monitor 18.1
smiling 17.4
equipment 17.3
worker 16.9
happy 16.3
hospital 16.2
portrait 16.2
keyboard 16
machine 15.4
dishwasher 15.3
home appliance 15.1
house 15
professional 14.3
interior 14.2
engineer 14.1
appliance 14
businessman 13.2
senior 13.1
kitchen 12.9
student 12.8
women 12.7
device 12.5
furniture 12.4
specialist 12.3
smile 12.1
white goods 12.1
lifestyle 11.6
modern 11.2
notebook 11.2
one 11.2
looking 11.2
men 11.2
patient 11.1
executive 11.1
businesswoman 10.9
using 10.6
workplace 10.5
clinic 10.1
horizontal 10
indoor 10
surfing 9.8
medical 9.7
30s 9.6
education 9.5
industry 9.4
health 9
success 8.9
browsing 8.8
medicine 8.8
studying 8.6
screen 8.5
face 8.5
casual 8.5
doctor 8.5
attractive 8.4
study 8.4
mature 8.4
camera 8.3
occupation 8.2
team 8.1
paper 7.9
couple 7.8
corporate 7.7
elderly 7.7
university 7.6
illness 7.6
center 7.6
communication 7.6
learning 7.5
holding 7.4
book 7.3
cheerful 7.3
school 7.2
to 7.1
day 7.1
together 7
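
The Imagga scores above come from a commercial tagging API. A hedged sketch of a request against Imagga's v2 tagging endpoint follows; the endpoint, credentials, and file name are assumptions based on Imagga's public API, not values recorded here.

    import requests

    # Assumed credentials and image path; placeholders only.
    API_KEY = "your_imagga_api_key"
    API_SECRET = "your_imagga_api_secret"
    IMAGE_PATH = "4.2002.17537.jpg"

    # POST the image to the v2 /tags endpoint with HTTP basic auth.
    with open(IMAGE_PATH, "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )

    # Each tag carries a confidence score comparable to the values listed above.
    for tag in response.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))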

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

appliance 99.1
indoor 90.3
person 86.8
text 81.7
computer 77.9
black and white 53.4
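
The Microsoft tags above are consistent with Azure Computer Vision image-tagging output. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and file path are placeholders, not values from this record.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "your_azure_key"  # placeholder
    IMAGE_PATH = "4.2002.17537.jpg"  # assumed local file name

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Tag the image from a local stream; each tag has a 0-1 confidence,
    # shown above as a percentage (e.g. "appliance 99.1").
    with open(IMAGE_PATH, "rb") as f:
        result = client.tag_image_in_stream(f)

    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))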

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 91.1%
Calm 94.8%
Sad 1.2%
Happy 1%
Angry 0.9%
Fear 0.6%
Surprised 0.6%
Confused 0.4%
Disgusted 0.3%

AWS Rekognition

Age 34-42
Gender Male, 89.8%
Happy 99.5%
Sad 0.1%
Surprised 0.1%
Calm 0.1%
Angry 0%
Fear 0%
Disgusted 0%
Confused 0%
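
The two face records above (estimated age range, gender, and per-emotion percentages) match the shape of AWS Rekognition DetectFaces output. A sketch of retrieving the same attributes with boto3, under the same assumed file path, follows.

    import boto3

    IMAGE_PATH = "4.2002.17537.jpg"  # assumed local file name

    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        # Emotions are returned unsorted; sort to match the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')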

Feature analysis

Amazon

Person 98.3%
Car 85.4%

Captions

Microsoft

a person standing in a room 63.7%
a person on the machine 42.1%
a person standing in a room 42%
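
The candidate captions and their confidences above resemble Azure Computer Vision's image description output. A brief sketch follows, reusing the same placeholder endpoint, key, and file name as in the tagging example; these are assumptions, not values from this record.

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
    KEY = "your_azure_key"  # placeholder
    IMAGE_PATH = "4.2002.17537.jpg"  # assumed local file name

    client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

    # Ask for several candidate captions; each comes with a 0-1 confidence.
    with open(IMAGE_PATH, "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=3)

    for caption in description.captions:
        print(caption.text, round(caption.confidence * 100, 1))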

Text analysis

Amazon

16
place
for
in
12
65
it's
every ag in it's place
1
10
ace for everythin
every
BLAKE
J
.
ace
everythin
K
ag
I
KODAK--2-ITW
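
The fragments above (e.g. "every ag in it's place", "KODAK--2-ITW") are raw OCR detections and are reproduced as returned. A sketch of the equivalent AWS Rekognition DetectText call with boto3, under the same assumed file path:

    import boto3

    IMAGE_PATH = "4.2002.17537.jpg"  # assumed local file name

    rekognition = boto3.client("rekognition")

    with open(IMAGE_PATH, "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # Each detection is either a LINE or one of its WORDs; the fragments
    # listed above correspond to the raw DetectedText values.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))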

Google

A°2--AGO
MJI7--
ver ans place MJI7-- YT A°2--AGO
ver
ans
YT
place
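
The Google fragments above match the shape of Cloud Vision text detection output. A minimal sketch with the google-cloud-vision client library, again with an assumed file name:

    from google.cloud import vision

    IMAGE_PATH = "4.2002.17537.jpg"  # assumed local file name

    client = vision.ImageAnnotatorClient()

    with open(IMAGE_PATH, "rb") as f:
        image = vision.Image(content=f.read())

    # The first annotation is the full detected block of text; the rest are
    # individual fragments like those listed above.
    response = client.text_detection(image=image)

    for annotation in response.text_annotations:
        print(annotation.description)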