Human Generated Data

Title

Untitled (boy playing game at table)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17054

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.6
Human 99.6
Indoors 81
Room 70.7
Photography 64.4
Portrait 64.4
Photo 64.4
Face 64.4
Tub 64.3
Baby 61.3
Confectionery 60.4
Sweets 60.4
Food 60.4
Meal 59.4
Finger 58.4
Floor 57.6
Bathtub 55.6

Imagga
created on 2022-02-26

laptop 59.2
computer 51.1
call 40.5
office 36.1
business 35.2
man 33.6
male 33.5
work 30.6
people 30.1
person 28.1
notebook 27.7
working 27.4
businessman 25.6
technology 25.2
adult 24.2
home 23.9
desk 22.7
looking 22.4
grandfather 20.7
sitting 18.9
job 18.6
professional 18.2
senior 17.8
happy 16.3
one 15.7
businesswoman 15.5
smiling 15.2
lifestyle 15.2
keyboard 15.2
communication 15.1
women 15
indoors 14.9
mature 14.9
portable computer 14.7
corporate 13.8
telephone 13.7
student 13.6
smile 13.5
face 13.5
suit 12.9
worker 12.5
education 12.1
men 12
phone 12
room 12
indoor 11.9
house 11.7
table 11.3
executive 11.2
old 11.2
personal computer 11
handsome 10.7
monitor 10.6
employee 10.6
elderly 10.5
look 10.5
child 10.5
serious 10.5
wireless 10.5
businesspeople 10.4
happiness 10.2
alone 10
interior 9.7
portrait 9.7
retirement 9.6
studying 9.6
workplace 9.5
talking 9.5
meeting 9.4
casual 9.3
focus 9.3
confident 9.1
attractive 9.1
cheerful 8.9
success 8.9
chair 8.8
together 8.8
boy 8.7
career 8.5
shirt 8.5
learning 8.5
relax 8.4
modern 8.4
study 8.4
mouse 8.4
hand 8.4
glasses 8.3
inside 8.3
occupation 8.3
lady 8.1
kid 8
couple 7.8
black 7.8
talk 7.7
finance 7.6
sit 7.6
digital computer 7.4
successful 7.3
group 7.3
family 7.1
day 7.1
newspaper 7

Microsoft
created on 2022-02-26

person 96.9
human face 95
black and white 93.7
window 90.1
text 86.1
clothing 82.9
man 77.6
monochrome 56.2

Face analysis

Amazon

AWS Rekognition

Age 18-26
Gender Female, 75.1%
Calm 54.6%
Happy 38.3%
Surprised 4.8%
Disgusted 0.9%
Sad 0.5%
Angry 0.4%
Confused 0.3%
Fear 0.2%

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a man sitting in front of a window 83.8%
a man standing in front of a window 83.7%
a man sitting in a chair in front of a window 82.1%

Text analysis

Amazon

Readers
5
6
Readers Digest
Digest
MJI7--YT37A°--X

Google

AGO
2--
MJI7--YT3RA 2-- AGO
MJI7--YT3RA