Human Generated Data

Title

Untitled (boy with fishing rod)

Date

1956

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.645

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Water 99.8
Outdoors 99.8
Human 99.2
Fishing 99.2
Person 99
Angler 93.5
Leisure Activities 93.5
Painting 60.2
Art 60.2
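
The Amazon tags above are the kind of label/confidence output returned by Amazon Rekognition's DetectLabels operation. A minimal boto3 sketch follows; the image path is a placeholder and AWS credentials are assumed to be configured in the environment.

```python
import boto3

rekognition = boto3.client("rekognition")

# Placeholder path; any local JPEG/PNG can be passed as raw bytes.
with open("boy_with_fishing_rod.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,
)

# Rekognition reports confidence on a 0-100 scale, matching the values above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```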

Clarifai
created on 2023-10-26

people 99.8
monochrome 98.9
portrait 98.6
adult 98.3
one 97.9
wear 95
two 94.5
recreation 94
river 93.6
man 92.7
child 92.4
art 92.3
lid 91.9
woman 90.7
veil 89.7
fishing rod 88.3
water 87.9
fisherman 86.3
lake 85.9
sepia 85.6
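
The Clarifai concepts above resemble output from Clarifai's general image recognition model. The sketch below follows the gRPC client pattern from Clarifai's documentation; the personal access token, user/app IDs, model ID, and image URL are all placeholders or assumptions.

```python
from clarifai_grpc.channel.clarifai_channel import ClarifaiChannel
from clarifai_grpc.grpc.api import resources_pb2, service_pb2, service_pb2_grpc

stub = service_pb2_grpc.V2Stub(ClarifaiChannel.get_grpc_channel())
metadata = (("authorization", "Key YOUR_PAT"),)  # placeholder personal access token

request = service_pb2.PostModelOutputsRequest(
    # Assumed IDs for Clarifai's hosted general model; adjust to your app.
    user_app_id=resources_pb2.UserAppIDSet(user_id="clarifai", app_id="main"),
    model_id="general-image-recognition",
    inputs=[
        resources_pb2.Input(
            data=resources_pb2.Data(
                image=resources_pb2.Image(url="https://example.com/image.jpg")  # placeholder
            )
        )
    ],
)
response = stub.PostModelOutputs(request, metadata=metadata)

# Concept values are 0-1; scale to percentages to match the list above.
for concept in response.outputs[0].data.concepts:
    print(f"{concept.name} {concept.value * 100:.1f}")
```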

Imagga
created on 2022-01-09

model 36.6
sexy 36.2
attractive 32.2
portrait 29.1
adult 27.2
body 27.2
swing 26.9
rustic 26.6
fashion 26.4
people 24.6
person 24.4
sensuality 22.7
pretty 21.7
hair 21.4
posing 20.4
one 20.2
lady 19.5
black 19.5
erotic 19.3
style 19.3
mechanical device 18.8
plaything 18
face 17.8
studio 16.7
dark 16.7
water 16.7
skin 16.1
women 15.8
sensual 15.5
dress 15.4
lifestyle 15.2
elegance 15.1
sport 14
mechanism 14
wet 13.4
passion 13.2
happy 13.2
human 12.8
seductive 12.4
rain 12.3
looking 12
expression 11.9
bikini 11.9
splashes 11.7
clothing 11.4
sitting 11.2
blond 10.9
man 10.8
smile 10.7
sexual 10.6
fun 10.5
male 10.4
child 10.3
love 10.3
cute 10
sex 9.7
shower 9.7
nude 9.7
underwear 9.7
naked 9.7
desire 9.6
brunette 9.6
drops 9.4
world 9.4
slim 9.2
outdoor 9.2
lingerie 9.1
make 9.1
summer 9
cover girl 8.7
enjoy 8.5
outdoors 8.3
vintage 8.3
pose 8.2
fitness 8.1
umbrella 8.1
active 8.1
recreation 8.1
romantic 8
elegant 7.7
healthy 7.6
lips 7.4
vacation 7.4
hat 7.4
sunset 7.2
romance 7.1
lovely 7.1
happiness 7.1
sea 7
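
The Imagga tags above are the shape of result returned by Imagga's v2 tagging endpoint. A hedged sketch over plain HTTP follows; the API key, secret, and image URL are placeholders, and the response layout is assumed from Imagga's documentation.

```python
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/image.jpg"},  # placeholder
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each entry carries a confidence (0-100) and a localized tag name.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```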

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.2
person 95.9
clothing 86.7
black and white 82.6
woman 75.8
human face 57.2
water 51.8
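
The Microsoft tags above resemble output from Azure Computer Vision's image tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder key
)

result = client.tag_image("https://example.com/image.jpg")  # placeholder URL

# Tag confidences are 0-1; scale to percentages to match the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```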

Color Analysis

Face analysis

AWS Rekognition

Age 6-12
Gender Female, 99%
Calm 97.9%
Happy 1.6%
Sad 0.2%
Confused 0.1%
Disgusted 0.1%
Surprised 0%
Angry 0%
Fear 0%
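
The age range, gender, and emotion scores above match the output of Rekognition's DetectFaces operation when all facial attributes are requested. A minimal boto3 sketch with a placeholder image path:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("boy_with_fishing_rod.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

faces = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    # Emotions come back unsorted; sort by confidence to mirror the list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```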

Microsoft Cognitive Services

Age 5
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
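
The likelihood ratings above correspond to the face annotation fields returned by the Google Cloud Vision API. A sketch using the google-cloud-vision client follows; the image path is a placeholder and application default credentials are assumed.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("boy_with_fishing_rod.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Vision reports bucketed likelihoods (VERY_UNLIKELY ... VERY_LIKELY)
# rather than numeric confidences.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```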

Feature analysis

Amazon

Person 99%
Painting 60.2%

Captions

Microsoft
created on 2022-01-09

a person sitting on a couch 68.8%
a person sitting on a bed 59.6%
a person sitting on a boat 45.2%
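
Ranked captions with confidences like those above are what Azure Computer Vision's describe-image operation returns. A hedged sketch follows; the endpoint, key, and image URL are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder key
)

description = client.describe_image(
    "https://example.com/image.jpg",  # placeholder URL
    max_candidates=3,
)

# Caption confidences are 0-1; scale to percentages to match the list above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```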