Human Generated Data

Title

Untitled (boy sitting at desk)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17606

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 99.5
Person 99.5
Apparel 98.2
Clothing 98.2
Furniture 85.2
Finger 84.3
Hat 70.3
Person 69.7
Face 65.5
Bonnet 58.8
Chair 56.6
Meal 56.4
Food 56.4
Sleeve 55.7

Imagga
created on 2022-02-26

man 38.3
senior 34.7
person 34
people 30.7
male 29.8
adult 29.4
elderly 27.7
happy 26.3
percussion instrument 26.1
laptop 24.8
computer 23.3
sitting 23.2
home 22.3
mature 22.3
musical instrument 22.1
working 21.2
old 20.9
smiling 20.2
office 20.2
indoors 19.3
portrait 18.8
retirement 18.2
looking 16.8
retired 16.5
work 16.5
business 15.8
worker 15.7
room 15.3
grand piano 15.3
smile 15
holding 14.8
lifestyle 14.4
casual 14.4
men 13.7
grandma 13.6
technology 13.4
professional 12.9
hair 12.7
stringed instrument 12.6
piano 12.4
businessman 12.4
steel drum 12.3
couple 12.2
camera 12
day 11.8
happiness 11.7
grandfather 11.4
glasses 11.1
aged 10.9
gray hair 10.8
older 10.7
face 10.6
modern 10.5
communication 10.1
70s 9.8
handsome 9.8
keyboard instrument 9.5
age 9.5
adults 9.5
corporate 9.4
active 9.3
house 9.2
joy 9.2
alone 9.1
businesswoman 9.1
human 9
one 9
lady 8.9
table 8.9
color 8.9
family 8.9
job 8.8
women 8.7
using 8.7
teacher 8.5
one person 8.5
executive 8.4
horizontal 8.4
pensioner 8.2
suit 8.1
child 8
together 7.9
casual clothing 7.8
60s 7.8
half length 7.8
education 7.8
concentration 7.7
husband 7.6
screen 7.6
wife 7.6
sit 7.6
meeting 7.5
doctor 7.5
hospital 7.4
care 7.4
occupation 7.3
cheerful 7.3
gray 7.2
team 7.2
bright 7.1
medical 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 92.8
text 92.6
black and white 83.3
white 61.2
human face 60.5
clothing 58.8
man 50.7

Face analysis

Amazon

AWS Rekognition

Age 42-50
Gender Female, 94%
Happy 96.2%
Calm 1.3%
Surprised 0.6%
Sad 0.6%
Confused 0.5%
Disgusted 0.3%
Angry 0.3%
Fear 0.2%

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a person sitting in front of a window 64.6%
a person sitting at a table in front of a window 63.2%
a person standing in front of a window 63.1%

Text analysis

Amazon

MJIA
YT3RAS
MJIA YT3RAS ACCHA
ACCHA

Google

YT3RA2
MJI7 YT3RA2
MJI7