Human Generated Data

Title

Untitled (woman sitting on dirt next to wall)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19460

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.2
Human 99.2
Shorts 97.1
Clothing 97.1
Apparel 97.1
Nature 94.9
Outdoors 94.2
Face 92
Rock 83.7
Urban 82.2
Chair 80.7
Furniture 80.7
Mountain 79.3
Portrait 71.4
Photo 71.4
Photography 71.4
Kid 67.6
Child 67.6
Building 66.7
Female 65.4
Town 64.9
City 64.9
Girl 63.6
Wall 62.9
Landscape 59.8
Standing 59
Road 55.8
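
The Amazon tags above are shaped like AWS Rekognition label-detection output, where each label carries a confidence score. A minimal sketch of filtering such output by a confidence threshold, assuming a response shaped like Rekognition's `detect_labels` result (the sample values here are copied from the list above for illustration, not the actual API response for this image):

```python
# Sketch: filtering label-detection output by confidence, assuming a
# response shaped like AWS Rekognition's detect_labels result.
# Sample values are illustrative, taken from the tag list above.

sample_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.2},
        {"Name": "Shorts", "Confidence": 97.1},
        {"Name": "Road", "Confidence": 55.8},
    ]
}

def labels_above(response, min_confidence):
    """Return (name, confidence) pairs at or above the threshold,
    sorted by descending confidence."""
    pairs = [(l["Name"], l["Confidence"]) for l in response["Labels"]]
    return sorted(
        [(n, c) for n, c in pairs if c >= min_confidence],
        key=lambda nc: -nc[1],
    )

print(labels_above(sample_response, 90))
```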

Imagga
created on 2022-03-05

sexy 32.9
body 32
model 26.5
adult 25.9
person 25.5
attractive 25.2
people 25.1
portrait 22.7
fashion 22.6
pretty 21.7
hair 21.4
skin 19.6
clothing 19.5
one 19.4
happy 16.9
lifestyle 16.6
sensuality 16.4
erotic 16.2
legs 16
garment 15.4
fitness 15.4
blond 14.9
wheeled vehicle 14.7
sitting 14.6
posing 14.2
man 14.1
swimsuit 13.8
lady 13.8
face 13.5
world 13.4
child 13.4
passion 13.2
tricycle 13.1
bikini 13
smile 12.8
black 12.7
splashes 12.7
style 12.6
male 12.5
dark 12.5
wet 12.5
rain 12.3
human 12
athlete 11.9
women 11.9
sensual 11.8
studio 11.4
water 11.3
sport 10.8
brunette 10.5
health 10.4
enjoy 10.4
covering 10.1
smiling 10.1
exercise 10
outdoor 9.9
vehicle 9.7
fun 9.7
shower 9.7
naked 9.7
looking 9.6
couple 9.6
standing 9.6
love 9.5
drops 9.4
cute 9.3
lips 9.3
elegance 9.2
pose 9.1
dress 9
consumer goods 8.8
passionate 8.8
sexual 8.7
seductive 8.6
pleasure 8.5
beach 8.4
vessel 8.3
slim 8.3
fit 8.3
healthy 8.2
active 8.1
wall 7.7
expression 7.7
outdoors 7.6
clothes 7.5
strength 7.5
leisure 7.5
knee pad 7.4
20s 7.3
full 7.3
happiness 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 97.8
clothing 96
human face 94.4
outdoor 88.7
person 88.4
smile 75.3
black and white 74.4
footwear 71.4
girl 65.9
old 57.3
posing 48.9

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 99%
Surprised 99.9%
Fear 0.1%
Angry 0%
Happy 0%
Disgusted 0%
Confused 0%
Calm 0%
Sad 0%
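
The emotion scores above follow the shape of Rekognition face-detection output, where each detected face carries a list of emotion/confidence pairs. A minimal sketch of extracting the dominant emotion from a face record shaped like a `detect_faces` result, using illustrative values copied from the scores above:

```python
# Sketch: picking the dominant emotion from a detect_faces-style face record.
# Values are illustrative, copied from the face-analysis scores above.

face = {
    "AgeRange": {"Low": 29, "High": 39},
    "Gender": {"Value": "Female", "Confidence": 99.0},
    "Emotions": [
        {"Type": "SURPRISED", "Confidence": 99.9},
        {"Type": "FEAR", "Confidence": 0.1},
        {"Type": "CALM", "Confidence": 0.0},
    ],
}

def dominant_emotion(face):
    """Return the (type, confidence) pair with the highest confidence."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # → ('SURPRISED', 99.9)
```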

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a person sitting on a bench in front of a building 73.7%
a girl sitting on a bench in front of a building 61.6%
a person sitting on a bench 61.5%

Text analysis

Amazon

P
MAMT2A3