Human Generated Data

Title

Untitled (woman sitting on stool)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19470

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.3
Human 98.3
Chair 98
Furniture 98
Clothing 80.7
Apparel 80.7
Reading 78.7
Sitting 73.3
Photography 61.4
Face 61.4
Portrait 61.4
Photo 61.4
Nature 60.1
Text 57.3

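Each machine-generated tag above pairs a label with a confidence score. A minimal sketch of filtering such label/confidence pairs at a cutoff (the `tags` list is taken from the Amazon values above; the threshold is illustrative):

```python
# Filter machine-generated labels by confidence score.
# Sample data: the Amazon tag list above.
tags = [
    ("Person", 98.3), ("Human", 98.3), ("Chair", 98.0),
    ("Furniture", 98.0), ("Clothing", 80.7), ("Apparel", 80.7),
    ("Reading", 78.7), ("Sitting", 73.3), ("Photography", 61.4),
    ("Face", 61.4), ("Portrait", 61.4), ("Photo", 61.4),
    ("Nature", 60.1), ("Text", 57.3),
]

def confident_labels(tags, min_confidence=75.0):
    """Return label names whose confidence meets the cutoff."""
    return [name for name, score in tags if score >= min_confidence]

print(confident_labels(tags))
# → ['Person', 'Human', 'Chair', 'Furniture', 'Clothing', 'Apparel', 'Reading']
```

Raising the cutoff to 98.0 keeps only the four highest-confidence labels.
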
Imagga
created on 2022-03-05

crutch 95.9
staff 74.2
stick 55.9
adult 29.3
person 28
fashion 26.4
people 25.1
sexy 24.9
attractive 24.5
portrait 23.9
chair 22.8
model 21
body 20.8
sitting 19.7
lifestyle 18.8
hair 18.2
pretty 18.2
lady 17.8
erotic 17
one 15.7
style 15.6
posing 15.1
sensual 14.5
dress 14.5
legs 14.1
happy 13.8
day 13.3
sport 13.3
seat 12.9
casual 12.7
fitness 12.6
blond 12.6
happiness 12.5
studio 12.2
man 12.1
face 12.1
black 12
exercise 11.8
pose 11.8
elegance 11.8
smiling 11.6
smile 11.4
elegant 11.1
women 11.1
street 11
sensuality 10.9
indoors 10.5
human 10.5
brunette 10.5
outdoors 10.4
health 10.4
leisure 10
exercising 9.6
dance 9.5
passion 9.4
male 9.2
summer 9
active 9
looking 8.8
dancer 8.8
nude 8.7
skin 8.5
clothes 8.4
furniture 8.4
dark 8.3
color 8.3
fit 8.3
20s 8.2
fun 8.2
full 8.2
alone 8.2
equipment 8.2
teenager 8.2
relaxing 8.2
water 8
lovely 8
cute 7.9
standing 7.8
device 7.8
full length 7.8
wall 7.7
performance 7.7
fashionable 7.6
relaxation 7.5
enjoy 7.5
life 7.5
city 7.5
leg 7.5
action 7.4
rocking chair 7.3
cheerful 7.3
indoor 7.3
sun 7.2
stylish 7.2
bright 7.1
cool 7.1
clothing 7
look 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 96
black and white 90
outdoor 88.9
person 87.3
black 87.2
clothing 83.2
white 83
man 65.7
old 41.2
posing 36.5

Face analysis

Amazon

AWS Rekognition

Age 35-43
Gender Female, 76.2%
Happy 67%
Calm 16.1%
Surprised 7.2%
Angry 3.8%
Confused 2.1%
Disgusted 1.6%
Sad 1.1%
Fear 1%
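
The emotion scores above can be reduced to a single dominant emotion by taking the highest-confidence entry. A minimal sketch using the AWS Rekognition values listed above:

```python
# Pick the dominant emotion from the face-analysis scores above.
emotions = {
    "Happy": 67.0, "Calm": 16.1, "Surprised": 7.2, "Angry": 3.8,
    "Confused": 2.1, "Disgusted": 1.6, "Sad": 1.1, "Fear": 1.0,
}

# max over the keys, ranked by their confidence values
dominant = max(emotions, key=emotions.get)
print(dominant)  # → Happy
```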

Feature analysis

Amazon

Person 98.3%

Captions

Microsoft

a person posing for the camera 77.8%
an old photo of a person 60.7%
a person standing in front of a building 50.1%

Text analysis

Amazon

I2S

Google

NAGON-
-MAMTZA3
YT37A2
NAGON- YT37A2 -MAMTZA3